WorldWideScience

Sample records for current risk estimates

  1. Eruptive history, current activity and risk estimation using geospatial information in the Colima volcano, Mexico

    Science.gov (United States)

    Suarez-Plascencia, C.; Camarena-Garcia, M.; Nunez-Cornu, F. J.; Flores-Peña, S.

    2013-12-01

    Colima volcano, also known as Volcan de Fuego (19°30.696' N, 103°37.026' W), is located on the border between the states of Jalisco and Colima and is the most active volcano in Mexico. On January 20, 1913, Colima produced its largest explosion of the twentieth century, a VEI 4 event, after the volcano had been dormant for almost 40 years. In 1961, a dome reached the northeastern edge of the crater and started a new lava flow, and the volcano has maintained constant activity since that date. On February 10, 1999, a new explosion occurred at the summit dome. The activity during the 2001-2005 period was the most intense, but did not exceed VEI 3; it resulted in the formation of domes and their destruction in explosive events. The explosions generated eruptive columns reaching altitudes between 4,500 and 9,000 masl, as well as pyroclastic flows reaching distances of up to 3.5 km from the crater. During the explosive events, ash emissions were generated in all directions, reaching distances of up to 100 km and slightly affecting the nearby villages of Tuxpan, Tonila, Zapotlan, Cuauhtemoc, Comala, Zapotitlan de Vadillo and Toliman. From 2005 to July 2013 the volcano showed intense effusive-explosive activity, similar to that of the period 1890 through 1905, which preceded the Plinian eruption of 1913, when pyroclastic flows reached a distance of 15 km from the crater. In this paper we estimate the risk of Colima volcano through the analysis of the vulnerability, hazard and exposure variables, for which we use satellite imagery, recurring overflights with the Fenix helicopter of the Jalisco state government, Google Earth images and the 2010 INEGI population census. With this information we identified changes in economic activities, development, and land use. The expansion of the agricultural frontier on the lower flanks of Colima volcano, with the advancement of traditional crops of sugar cane and corn, increased the growth of

  2. Unbiased risk estimation method for covariance estimation

    CERN Document Server

    Lescornel, Hélène; Chabriac, Claudie

    2011-01-01

    We consider a model selection estimator of the covariance of a random process. Using the Unbiased Risk Estimation (URE) method, we build an estimator of the risk which allows us to select an estimator from a collection of models. Then, we present an oracle inequality which ensures that the risk of the selected estimator is close to the risk of the oracle. Simulations show the efficiency of this methodology.
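    The record does not spell out the criterion. As a hedged, generic illustration of the URE principle (a textbook Gaussian-sequence-model version with projection estimators, not necessarily the authors' construction for covariance estimation), model selection proceeds by minimizing an unbiased estimate of the quadratic risk:

```latex
% Generic URE sketch. Assumptions (not from the record): Gaussian sequence
% model Y_i = theta_i + sigma*eps_i, projection estimator \hat\theta_m onto a
% model m of dimension D_m, collection of models \mathcal{M}.
\[
  \widehat{R}(m) \;=\; \lVert Y - \hat\theta_m \rVert^2 \;+\; 2\sigma^2 D_m \;-\; n\sigma^2 ,
  \qquad
  \mathbb{E}\,\widehat{R}(m) \;=\; \mathbb{E}\,\lVert \hat\theta_m - \theta \rVert^2 ,
\]
\[
  \hat m \;=\; \arg\min_{m \in \mathcal{M}} \widehat{R}(m) .
\]
% The oracle inequality then bounds the risk of \hat\theta_{\hat m} by the best
% risk over \mathcal{M} plus a remainder term.
```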

  3. Multispacecraft current estimates at swarm

    DEFF Research Database (Denmark)

    Dunlop, M. W.; Yang, Y.-Y.; Yang, J.-Y.

    2015-01-01

    During the first several months of the three-spacecraft Swarm mission all three spacecraft came repeatedly into close alignment, providing an ideal opportunity for validating the proposed dual-spacecraft method for estimating current density from the Swarm magnetic field data. Two of the Swarm...... orbit the use of time-shifted positions allows stable estimates of current density to be made and can verify temporal effects as well as validating the interpretation of the current components as arising predominantly from field-aligned currents. In the case of four-spacecraft configurations we can resolve...... the full vector current and therefore can check the perpendicular as well as parallel current density components directly, together with the quality factor for the estimates (for the first time in situ at low Earth orbit)....

  4. Risk estimates for bone

    Energy Technology Data Exchange (ETDEWEB)

    Schlenker, R.A.

    1981-01-01

    The primary sources of information on the skeletal effects of internal emitters in humans are the US radium cases with occupational and medical exposures to ²²⁶Ra and ²²⁸Ra and the German patients injected with ²²⁴Ra primarily for treatment of ankylosing spondylitis and tuberculosis. During the past decade, dose-response data from both study populations have been used by committees, e.g., the BEIR committees, to estimate risks at low dose levels. NCRP Committee 57 and its task groups are now engaged in making risk estimates for internal emitters. This paper presents brief discussions of the radium data, the results of some new analyses and suggestions for expressing risk estimates in a form appropriate to radiation protection.

  5. Estimating Risk Parameters

    OpenAIRE

    Aswath Damodaran

    1999-01-01

    Over the last three decades, the capital asset pricing model has occupied a central and often controversial place in most corporate finance analysts’ tool chests. The model requires three inputs to compute expected returns – a riskfree rate, a beta for an asset and an expected risk premium for the market portfolio (over and above the riskfree rate). Betas are estimated, by most practitioners, by regressing returns on an asset against a stock index, with the slope of the regression being the b...

  6. Cardiovascular risk estimation in older persons

    DEFF Research Database (Denmark)

    Cooney, Marie Therese; Selmer, Randi; Lindman, Anja

    2016-01-01

    AIMS: Estimation of cardiovascular disease risk, using SCORE (Systematic COronary Risk Evaluation), is recommended by European guidelines on cardiovascular disease prevention. Risk estimation is inaccurate in older people. We hypothesized that this may be due to the assumption, inherent in current...... risk estimation systems, that risk factors function similarly in all age groups. We aimed to derive and validate a risk estimation function, SCORE O.P., solely from data from individuals aged 65 years and older. METHODS AND RESULTS: 20,704 men and 20,121 women, aged 65 and over and without pre...... model and were included in the SCORE O.P. model were: age, total cholesterol, high-density lipoprotein cholesterol, systolic blood pressure, smoking status and diabetes. SCORE O.P. showed good discrimination; area under receiver operator characteristic curve (AUROC) 0.74 (95% confidence interval: 0......

  7. Quantitative estimation of genetic risk for atypical scrapie in French sheep and potential consequences of the current breeding programme for resistance to scrapie on the risk of atypical scrapie

    Directory of Open Access Journals (Sweden)

    Laurent Pascal

    2010-05-01

    Abstract Background Since 2002, active surveillance programmes have detected numerous atypical scrapie (AS) and classical scrapie (CS) cases in French sheep with almost all the PrP genotypes. The aim of this study was (1) to quantify the genetic risk of AS in French sheep and to compare it with the risk of CS, and (2) to quantify the risk of AS associated with the increase of the ARR allele frequency as a result of the current genetic breeding programme against CS. Methods We obtained genotypes at codons 136, 141, 154 and 171 of the PRNP gene for representative samples of 248 AS and 245 CS cases. We used a random sample of 3,317 scrapie-negative animals genotyped at codons 136, 154 and 171 and we made inferences on position 141 by multiple imputation, using external data. To estimate the risk associated with PrP genotypes, we fitted multivariate logistic regression models and we estimated the prevalence of AS for the different genotypes. Then, we used the risk of AS estimated for the ALRR-ALRR genotype to analyse the risk of detecting an AS case in a flock homogeneous for this genotype. Results Genotypes most at risk for AS were those including an AFRQ or ALHQ allele, while genotypes including a VLRQ allele were less commonly associated with AS. Compared to ALRQ-ALRQ, the ALRR-ALRR genotype was significantly at risk for AS and was very significantly protective for CS. The prevalence of AS among ALRR-ALRR animals was 0.6‰ and was not different from the prevalence in the general population. Conclusion In conclusion, further selection of ALRR-ALRR animals will not result in an overall increase of AS prevalence in the French sheep population, although this genotype is clearly susceptible to AS. However, the probability of detecting AS cases in flocks participating in the genetic breeding programme against CS should be considered.

  8. [Medical insurance estimation of risks].

    Science.gov (United States)

    Dunér, H

    1975-11-01

    The purpose of insurance medicine is to make a prognostic estimate of medical risk factors in persons who apply for life, health, or accident insurance. Established risk groups with a calculated average mortality and morbidity form the basis for premium rates and insurance terms. In most cases the applicant is accepted for insurance after a self-assessment of his health. Only around one per cent of the applications are refused, but there are cases in which the premium is raised, temporarily or permanently. It is often a matter of rough estimation, since knowledge of the long-term prognosis for many diseases is incomplete. The insurance companies' rules for the estimation of risk are revised at intervals of three or four years. The estimation of risk as regards life insurance has been gradually liberalised, while the medical conditions for health insurance have become stricter owing to an increase in the claims rate.

  9. Estimating risks of perinatal death.

    Science.gov (United States)

    Smith, Gordon C S

    2005-01-01

    The relative and absolute risks of perinatal death that are estimated from observational studies are used frequently in counseling about obstetric intervention. The statistical basis for these estimates therefore is crucial, but many studies are seriously flawed. In this review, a number of aspects of the approach to the estimation of the risk of perinatal death are addressed. Key factors in the analysis include (1) the definition of the cause of the death, (2) differentiation between antepartum and intrapartum events, (3) the use of the appropriate denominator for the given cause of death, (4) the assessment of the cumulative risk where appropriate, (5) the use of appropriate statistical tests, (6) the stratification of analysis of delivery-related deaths by gestational age, and (7) the specific features of multiple pregnancy, which include the correct determination of the timing of antepartum stillbirth and the use of paired statistical tests when outcomes are compared in relation to the birth order of twin pairs.

  10. Software risk estimation and management techniques at JPL

    Science.gov (United States)

    Hihn, J.; Lum, K.

    2002-01-01

    In this talk we discuss how uncertainty has been incorporated into the JPL software model to produce probabilistic estimates, how risk is addressed, and how cost risk is currently being explored via a variety of approaches, from traditional risk lists to detailed WBS-based risk estimates to the Defect Detection and Prevention (DDP) tool.

  11. Estimating the Risks of Breast Cancer Radiotherapy

    DEFF Research Database (Denmark)

    Taylor, Carolyn; Correa, Candace; Duane, Frances K

    2017-01-01

    Purpose: Radiotherapy reduces the absolute risk of breast cancer mortality by a few percentage points in suitable women but can cause a second cancer or heart disease decades later. We estimated the absolute long-term risks of modern breast cancer radiotherapy. Methods: First, a systematic literature...... and cause-specific mortality and excess RRs (ERRs) per Gy for incident lung cancer and cardiac mortality. Smoking status was unavailable. Third, the lung or heart ERRs per Gy in the trials and the 2010 to 2015 doses were combined and applied to current smoker and nonsmoker lung cancer and cardiac mortality.......06) ERR per Gy whole-heart dose. Estimated absolute risks from modern radiotherapy were as follows: lung cancer, approximately 4% for long-term continuing smokers and 0.3% for nonsmokers; and cardiac mortality, approximately 1% for smokers and 0.3% for nonsmokers. Conclusion: For long-term smokers......

  12. Current Chemical Risk Reduction Activities

    Science.gov (United States)

    EPA's existing chemicals programs address pollution prevention, risk assessment, hazard and exposure assessment and/or characterization, and risk management for chemical substances in commercial use.

  13. Current Source Density Estimation for Single Neurons

    Directory of Open Access Journals (Sweden)

    Dorottya Cserpán

    2014-03-01

    Recent developments of multielectrode technology made it possible to measure the extracellular potential generated in the neural tissue with spatial precision on the order of tens of micrometers and on a submillisecond time scale. Combining such measurements with imaging of single neurons within the studied tissue opens up new experimental possibilities for estimating the distribution of current sources along a dendritic tree. In this work we show that if we are able to relate part of the recording of extracellular potential to a specific cell of known morphology, we can estimate the spatiotemporal distribution of transmembrane currents along it. We present here an extension of the kernel CSD method (Potworowski et al., 2012) applicable in such a case. We test it on several model neurons of progressively complicated morphologies, from ball-and-stick to realistic, up to analysis of simulated neuron activity embedded in a substantial working network (Traub et al., 2005). We discuss the caveats and possibilities of this new approach.

  14. GlobCurrent- Multisensor Synergy for Surface Current Estimation

    Science.gov (United States)

    Johannessen, J. A.; Chapron, B.; Collard, F.; Rio, M.-H.; Piolle, J.-F.; Gaultier, L.; Quartly, G.; Shutler, J.; Escola, R.; Raj, R. P.; Donlon, C.; Danielson, R.; Korosov, A.; Nencioli, F.; Kudryavtsev, V.; Roca, M.; Tournadre, J.; Larnicol, G.; Guitton, G.; Miller, P.; Warren, M.; Hansen, M.

    2016-08-01

    The GlobCurrent project (http://www.globcurrent.org) aims to: (i) advance the quantitative estimation of ocean surface currents from satellite sensor synergy; and (ii) demonstrate impact in user-led scientific, operational and commercial applications that, in turn, will improve and strengthen the uptake of satellite measurements. It is often demonstrated that sharp gradients in the sea surface temperature (SST) and current fields and the ocean surface chlorophyll-a distribution are spatially correlated with the sea surface roughness anomaly fields at small spatial scales, from the submesoscale (1-10 km) to the mesoscale (30-80 km). At the larger mesoscale range (>50 km), information derived from radar altimeters often depicts the presence of coherent structures and eddies. The variability often appears largest in regions where intense surface current regimes (>100-200 km) are found. These 2-dimensional structures manifested in the satellite observations represent evidence of the upper ocean (100-200 m) dynamics. Whereas the quasi-geostrophic assumption is valid for the upper ocean dynamics at the larger scale (>100 km), possible triggering mechanisms for the expressions at the mesoscale to submesoscale may include spiraling tracers of inertial motion and the interaction of the wind-driven Ekman layer with the quasi-geostrophic current field. The latter, in turn, produces bands of downwelling (convergence) and upwelling (divergence) near fronts. A regular utilization of the sensor synergy approach combining Sentinel-3, Sentinel-2 and Sentinel-1 together with other satellite missions will provide a highly valuable data set for further research and development to better relate the 2-dimensional surface expressions to the upper ocean dynamics.

  15. Current best estimates of planet populations

    Science.gov (United States)

    Rogers, Leslie A.

    2016-05-01

    Exoplanets are revolutionizing planetary science by enabling statistical studies of a large number of planets. Empirical measurements of planet occurrence rates inform our understanding of the ubiquity and efficiency of planet formation, while the identification of sub-populations and trends in the distribution of observed exoplanet properties provides insights into the formation and evolution processes that are sculpting distant Solar Systems. In this paper, we review the current best estimates of planet populations. We focus in particular on η⊕, the occurrence rate of habitable zone rocky planets, since this factor strongly influences the design of future space based exoplanet direct detection missions.

  16. Sovereign Risk and Currency Returns

    DEFF Research Database (Denmark)

    Della Corte, Pasquale; Sarno, Lucio; Schmeling, Maik

    We empirically investigate the relation between sovereign risk and exchange rates for a broad set of currencies. An increase in the credit default swap (CDS) spread of a country is accompanied by a significant depreciation of the exchange rate. More generally, CDS spread changes have substantial...... explanatory power for currency returns which is largely driven by shocks to global credit risk. Consistent with the notion that sovereign risk is priced, we find that a country's exposure to global credit risk forecasts excess returns to trading exchange rates as well as to trading on the volatility, skewness......, and kurtosis of currency returns....

  17. THE CURRENT STATE OF RISK-CONTROLLING

    Directory of Open Access Journals (Sweden)

    Orlov A. I.

    2014-04-01

    The concept of risk-controlling is based on the general theory of risk. The current state of risk management in our country is reviewed. We also discuss the research on risk-controlling carried out in the BMSTU Laboratory of economic-mathematical methods in controlling.

  18. Sovereign Risk and Currency Returns

    DEFF Research Database (Denmark)

    Della Corte, Pasquale; Sarno, Lucio; Schmeling, Maik

    We empirically investigate the relation between sovereign risk and exchange rates for a broad set of currencies. An increase in the credit default swap (CDS) spread of a country is accompanied by a significant depreciation of the exchange rate. More generally, CDS spread changes have substantial...

  19. Annual effective dose due to residential radon progeny in Sweden: Evaluations based on current risk projection models and on risk estimates from a nation-wide Swedish epidemiological study

    Energy Technology Data Exchange (ETDEWEB)

    Doi, M. [National Inst. of Radiological Sciences, Chiba (Japan); Lagarde, F. [Karolinska Inst., Stockholm (Sweden). Inst. of Environmental Medicine; Falk, R.; Swedjemark, G.A. [Swedish Radiation Protection Inst., Stockholm (Sweden)

    1996-12-01

    The effective dose per unit radon progeny exposure for the Swedish population in 1992 is estimated using current risk projection models and a risk model based on the nation-wide Swedish epidemiological study of radon and lung cancer. The resulting values range from 1.29-3.00 mSv/WLM and 2.58-5.99 mSv/WLM, respectively. Assuming a radon concentration of 100 Bq/m³, an equilibrium factor of 0.4 and an occupancy factor of 0.6 in Swedish houses, the annual effective dose for the Swedish population is estimated to be 0.43-1.98 mSv/year, which should be compared to the value of 1.9 mSv/year according to the UNSCEAR 1993 report. 27 refs, tabs, figs.
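    The dose arithmetic summarized above can be reproduced with conventional conversion constants. In the hedged sketch below, the values 1 WL ≈ 3700 Bq/m³ of equilibrium-equivalent concentration and 170 hours per working level month are standard assumptions, not figures taken from the report:

```python
# Hedged sketch reproducing the annual effective dose range quoted above.
# The conversion constants (1 WL corresponds to roughly 3700 Bq/m^3 of
# equilibrium-equivalent concentration; 170 h of exposure defines 1 WLM)
# are conventional assumptions, not values stated in the record.

RADON_CONC_BQ_M3 = 100.0      # assumed indoor radon concentration
EQUILIBRIUM_FACTOR = 0.4      # radon progeny equilibrium factor
OCCUPANCY = 0.6               # fraction of the year spent indoors
HOURS_PER_YEAR = 8760.0
BQ_M3_PER_WL = 3700.0         # EEC corresponding to 1 working level
HOURS_PER_WLM = 170.0         # exposure time defining 1 WLM

# Annual exposure in working level months (WLM)
eec = RADON_CONC_BQ_M3 * EQUILIBRIUM_FACTOR                      # Bq/m^3 EEC
exposure_wlm = (eec / BQ_M3_PER_WL) * (OCCUPANCY * HOURS_PER_YEAR / HOURS_PER_WLM)

# Dose conversion factors taken from the record (mSv per WLM)
for label, msv_per_wlm in [("lowest", 1.29), ("highest", 5.99)]:
    print(f"{label}: {exposure_wlm:.2f} WLM/yr -> {exposure_wlm * msv_per_wlm:.2f} mSv/yr")
# Prints roughly 0.43 and 2.0 mSv/yr, consistent with the 0.43-1.98 mSv/yr
# range quoted above (the small difference comes from rounding of constants).
```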

  20. Risk Probability Estimating Based on Clustering

    DEFF Research Database (Denmark)

    Chen, Yong; Jensen, Christian D.; Gray, Elizabeth

    2003-01-01

    In this paper we propose a dynamic mechanism for estimating the risk probability of a certain interaction in a given environment using hybrid neural networks. We argue that traditional risk assessment models...... from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, which is based on pattern matching, classification and prediction procedures. This mechanism uses an estimator of risk probability, which is based...... of prior experiences, recommendations from a trusted entity or the reputation of the other entity.

  1. Estimating Cascading Failure Risk with Random Chemistry

    CERN Document Server

    Rezaei, Pooya; Eppstein, Margaret J

    2014-01-01

    The potential for cascading failure in power systems adds substantially to overall reliability risk. Monte Carlo sampling can be used with a power system model to estimate this impact, but doing so is computationally expensive. This paper presents a new approach to estimating the risk of large cascading blackouts triggered by multiple contingencies. The method uses a search algorithm (Random Chemistry) to identify blackout-causing contingencies, and then combines the results with outage probabilities to estimate overall risk. Comparing this approach with Monte Carlo sampling for two test cases (the IEEE RTS-96 and a 2383 bus model of the Polish grid) suggests that the new approach is at least two orders of magnitude faster than Monte Carlo, without introducing measurable bias. Moreover, the approach enables one to compute the contribution of individual component-failure probabilities to overall blackout risk, allowing one to quickly identify low-cost strategies for reducing risk. By computing the sensitivity ...
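    As a minimal sketch of the risk-aggregation step described above (the contingency sets, outage probabilities and blackout sizes below are hypothetical, and the Random Chemistry search itself is not reproduced), overall risk is the probability-weighted sum of blackout consequences over the identified blackout-causing contingencies:

```python
# Hedged sketch: combine blackout-causing contingency sets (as would be found
# by a Random Chemistry search) with component outage probabilities to
# estimate overall blackout risk. All names and numbers are hypothetical.
from math import prod

# Each entry: (set of simultaneously failed components, blackout size in MW shed)
blackout_contingencies = [
    ({"line_12", "line_47"}, 850.0),
    ({"line_12", "gen_3", "line_90"}, 2300.0),
    ({"line_05", "line_47"}, 400.0),
]

# Independent per-component outage probabilities (hypothetical)
p_fail = {"line_12": 1e-3, "line_47": 2e-3, "line_05": 5e-4,
          "gen_3": 1e-3, "line_90": 8e-4}

# Risk = sum over contingencies of P(contingency) * consequence
risk = sum(prod(p_fail[c] for c in cset) * size
           for cset, size in blackout_contingencies)
print(f"estimated blackout risk: {risk:.3e} MW expected load shed")

# Sensitivity of the risk to one component's failure probability, useful for
# ranking low-cost mitigation options (partial derivative of the sum above).
def risk_sensitivity(component):
    return sum(prod(p_fail[c] for c in cset if c != component) * size
               for cset, size in blackout_contingencies if component in cset)

print(f"d(risk)/d(p_fail[line_12]) = {risk_sensitivity('line_12'):.3e}")
```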

  2. Current error estimates for LISA spurious accelerations

    Energy Technology Data Exchange (ETDEWEB)

    Stebbins, R T [NASA Goddard Space Flight Center, Greenbelt, MD 20771 (United States); Bender, P L [JILA-University of Colorado, Boulder, CO (United States); Hanson, J [Stanford University, Stanford, CA (United States); Hoyle, C D [University of Trento, Trento (Italy); Schumaker, B L [Jet Propulsion Laboratory, Pasadena, CA (United States); Vitale, S [University of Trento, Trento (Italy)

    2004-03-07

    The performance of the LISA gravitational wave detector depends critically on limiting spurious accelerations of the fiducial masses. Consequently, the requirements on allowable acceleration levels must be carefully allocated based on estimates of the achievable limits on spurious accelerations from all disturbances. Changes in the allocation of requirements are being considered, and are proposed here. The total spurious acceleration error requirement would remain unchanged, but a few new error sources would be added, and the allocations for some specific error sources would be changed. In support of the recommended revisions in the requirements budget, estimates of plausible acceleration levels for 17 of the main error sources are discussed. In most cases, the formula for calculating the size of the effect is known, but there may be questions about the values of various parameters to use in the estimates. Different possible parameter values have been discussed, and a representative set is presented. Improvements in our knowledge of the various experimental parameters will come from planned experimental and modelling studies, supported by further theoretical work.

  3. Estimates of current debris from flux models

    Energy Technology Data Exchange (ETDEWEB)

    Canavan, G.H.

    1997-01-01

    Flux models that balance accuracy and simplicity are used to predict the growth of space debris to the present. Known and projected launch rates, decay models, and numerical integrations are used to predict distributions that closely resemble the current catalog, particularly in the regions containing most of the debris.

  4. Current features on risk perception and risk communication of radiation

    Energy Technology Data Exchange (ETDEWEB)

    Kusama, Tomoko [Tokyo Univ. (Japan). Faculty of Medicine]

    1997-03-01

    Health effects and risks of radiation and radionuclides are misunderstood by many members of the general public, and many people have fears and anxieties about radiation. The health effects of radiation at low dose and low dose rate have so far not been clarified on biological grounds. Therefore, from the viewpoint of radiation protection, health risks of low-dose radiation have been estimated quantitatively on the basis of a linear dose-response relationship without threshold, by using both epidemiological data, such as those from atomic bomb survivors, and various models and assumptions. It is important for researchers and other persons involved in radiation protection to understand the process of radiation risk estimation and to communicate accurate knowledge of radiation risks to members of the public. (author)

  5. Risk Probability Estimating Based on Clustering

    DEFF Research Database (Denmark)

    Chen, Yong; Jensen, Christian D.; Gray, Elizabeth;

    2003-01-01

    Ubiquitous computing environments are highly dynamic, with new unforeseen circumstances and constantly changing environments, which introduces new risks that cannot be assessed through traditional means of risk analysis. Mobile entities in a ubiquitous computing environment require the ability...... to perform an autonomous assessment of the risk incurred by a specific interaction with another entity in a given context. This assessment will allow a mobile entity to decide whether sufficient evidence exists to mitigate the risk and allow the interaction to proceed. Such evidence might include records...... from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, which is based on pattern matching, classification and prediction procedures. This mechanism uses an estimator of risk probability, which is based...

  6. An immunity based network security risk estimation

    Institute of Scientific and Technical Information of China (English)

    LI Tao

    2005-01-01

    According to the relationship between antibody concentration and pathogen intrusion intensity, we present an immunity-based model for network security risk estimation (Insre). In Insre, the concepts and formal definitions of self, nonself, antibody, antigen and lymphocyte in the network security domain are given. Then the mathematical models of self-tolerance, clonal selection, the lifecycle of the mature lymphocyte, immune memory and immune surveillance are established. Building upon the above models, a quantitative computation model for network security risk estimation, which is based on the calculation of antibody concentration, is then presented. By using Insre, the types and intensity of network attacks, as well as the risk level of network security, can be calculated quantitatively and in real time. Our theoretical analysis and experimental results show that Insre is a good solution for real-time risk evaluation of network security.

  7. Security Events and Vulnerability Data for Cybersecurity Risk Estimation.

    Science.gov (United States)

    Allodi, Luca; Massacci, Fabio

    2017-08-01

    Current industry standards for estimating cybersecurity risk are based on qualitative risk matrices as opposed to quantitative risk estimates. In contrast, risk assessment in most other industry sectors aims at deriving quantitative risk estimations (e.g., Basel II in Finance). This article presents a model and methodology to leverage the large amount of data available from the IT infrastructure of an organization's security operation center to quantitatively estimate the probability of attack. Our methodology specifically addresses untargeted attacks delivered by automatic tools that make up the vast majority of attacks in the wild against users and organizations. We consider two-stage attacks whereby the attacker first breaches an Internet-facing system, and then escalates the attack to internal systems by exploiting local vulnerabilities in the target. Our methodology factors in the power of the attacker as the number of "weaponized" vulnerabilities he/she can exploit, and can be adjusted to match the risk appetite of the organization. We illustrate our methodology by using data from a large financial institution, and discuss the significant mismatch between traditional qualitative risk assessments and our quantitative approach. © 2017 Society for Risk Analysis.

  8. Spatial ascariasis risk estimation using socioeconomic variables.

    Science.gov (United States)

    Valencia, Luis Iván Ortiz; Fortes, Bruno de Paula Menezes Drumond; Medronho, Roberto de Andrade

    2005-12-01

    Frequently, disease incidence is mapped as area data, for example, census tracts, districts or states. Spatial disease incidence can be highly heterogeneous inside these areas. Ascariasis is a highly prevalent disease, which is associated with poor sanitation and hygiene. Geostatistics was applied to model the spatial distribution of Ascariasis risk and socioeconomic risk events in a poor community in Rio de Janeiro, Brazil. Data were gathered from a coproparasitologic and a domiciliary survey of 1550 children aged 1-9. Ascariasis risk and socioeconomic risk events were spatially estimated using Indicator Kriging. Cokriging models with a Linear Model of Coregionalization incorporating one socioeconomic variable were implemented. A housewife having attended school for less than four years, the non-use of a home water filter, a household density greater than one, and a household income lower than one Brazilian minimum wage all increased the risk of Ascariasis. Cokriging improved spatial estimation of Ascariasis risk areas when compared to Indicator Kriging and detected more Ascariasis very-high-risk areas than the GIS Overlay method.
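    A minimal sketch of the Indicator Kriging step is given below, assuming the PyKrige library and synthetic household coordinates and infection indicators; the study's cokriging with a Linear Model of Coregionalization is not reproduced here:

```python
# Hedged sketch of Indicator Kriging of infection status, assuming the PyKrige
# library (pip install pykrige). Coordinates and outcomes are synthetic; the
# study's cokriging with a Linear Model of Coregionalization is not reproduced.
import numpy as np
from pykrige.ok import OrdinaryKriging

rng = np.random.default_rng(0)
x = rng.uniform(0, 1000, 200)          # household easting (m), synthetic
y = rng.uniform(0, 1000, 200)          # household northing (m), synthetic
infected = (rng.random(200) < 0.3)     # Ascaris infection indicator (0/1), synthetic

# Indicator Kriging = ordinary kriging of the 0/1 indicator variable; the
# kriged surface is interpreted as the local probability (risk) of infection.
ok = OrdinaryKriging(x, y, infected.astype(float),
                     variogram_model="spherical", verbose=False)
gridx = np.linspace(0, 1000, 50)
gridy = np.linspace(0, 1000, 50)
risk_map, kriging_variance = ok.execute("grid", gridx, gridy)

print("estimated risk range:", float(risk_map.min()), float(risk_map.max()))
```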

  9. Risk estimation in partial duration series

    Science.gov (United States)

    Rasmussen, Peter Funder; Rosbjerg, Dan

    1989-11-01

    The estimation of design floods is, in practice, often based on small samples of data, which may cause a severe uncertainty. For a particular version of the partial duration series (exponentially distributed exceedances and Poissonian occurrence times) the distribution of the T-year design estimate x̂_T is derived along with the distribution of R_T, defined as the true risk of exceeding x̂_T within a given disposal period. For a fixed flood level the distributions of the return period estimator T̂ and the estimator of the risk in lifetime R̂ are also presented. Analytical closed-form expressions for mean value and standard deviation are derived for these variables, except for T̂, which does not possess moments. The concept of "expected risk" is introduced, and an analytical expression describing this property is derived. A risk-based design technique, which is essentially different from the traditional procedure, is presented, and its applicability is verified using Monte Carlo simulation.
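    For reference, under the stated assumptions (Poissonian occurrence times with rate λ and exponentially distributed exceedances with mean β above a threshold u) the textbook partial-duration-series relations behind these quantities are as follows; this is a generic reconstruction for known parameters, not a formula copied from the paper:

```latex
% Partial duration series with Poisson occurrence rate \lambda (events/year)
% and exponential exceedances with mean \beta above the threshold u.
\[
  x_T \;=\; u + \beta \ln(\lambda T)
  \qquad \text{($T$-year event: exceeded on average once every $T$ years),}
\]
\[
  R \;=\; 1 - \exp\!\bigl(-\lambda L \, e^{-(x_T - u)/\beta}\bigr) \;=\; 1 - e^{-L/T}
  \qquad \text{(risk of at least one exceedance of $x_T$ in a disposal period of $L$ years).}
\]
% The paper studies how these quantities behave when \lambda and \beta are
% replaced by estimates from a small sample, so that \hat{x}_T and the true
% risk of exceeding it become random variables.
```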

  10. Induction Motor Speed Estimation by Using Spectral Current Analysis

    OpenAIRE

    2009-01-01

    An interesting application of FFT analysis is induction motor speed estimation based on spectral analysis of the stator current. The paper presents the possibility of estimating the induction motor speed by using the current harmonics generated by the rotor slots and by eccentricity.
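    A minimal, self-contained sketch of the idea follows, using a synthetic current signal; the eccentricity sideband relation f_side = f_s·(1 + (1 − s)/p) is a textbook assumption, not a formula quoted from the paper:

```python
# Hedged sketch: estimate induction motor speed from an eccentricity-related
# sideband in the stator current spectrum. The signal is synthetic, and the
# sideband relation f_side = f_s * (1 + (1 - s)/p) is a textbook assumption.
import numpy as np

fs_elec = 50.0          # supply frequency (Hz)
pole_pairs = 2
slip_true = 0.03        # slip used only to synthesize the test signal
f_side_true = fs_elec * (1.0 + (1.0 - slip_true) / pole_pairs)

sample_rate = 5000.0
t = np.arange(0, 10.0, 1.0 / sample_rate)
current = (np.sin(2 * np.pi * fs_elec * t)
           + 0.02 * np.sin(2 * np.pi * f_side_true * t)
           + 0.01 * np.random.default_rng(0).standard_normal(t.size))

# FFT and peak search in the band where the eccentricity sideband is expected
spectrum = np.abs(np.fft.rfft(current * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, d=1.0 / sample_rate)
band = (freqs > fs_elec + 5) & (freqs < fs_elec + 40)
f_side = freqs[band][np.argmax(spectrum[band])]

# Invert the sideband relation for the slip, then mechanical speed in rpm
slip = 1.0 - pole_pairs * (f_side / fs_elec - 1.0)
speed_rpm = 60.0 * fs_elec * (1.0 - slip) / pole_pairs
print(f"sideband {f_side:.2f} Hz -> slip {slip:.3f}, speed {speed_rpm:.0f} rpm")
```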

  11. Risk Estimates and Risk Factors Related to Psychiatric Inpatient Suicide

    DEFF Research Database (Denmark)

    Madsen, Trine; Erlangsen, Annette; Nordentoft, Merete

    2017-01-01

    People with mental illness have an increased risk of suicide. The aim of this paper is to provide an overview of suicide risk estimates among psychiatric inpatients based on the body of evidence found in scientific peer-reviewed literature; primarily focusing on the relative risks, rates, time...... trends, and socio-demographic and clinical risk factors of suicide in psychiatric inpatients. Psychiatric inpatients have a very high risk of suicide relative to the background population, but it remains challenging for clinicians to identify those patients that are most likely to die from suicide during...... admission. Most studies are based on low power, thus compromising quality and generalisability. The few studies with sufficient statistical power mainly identified non-modifiable risk predictors such as male gender, diagnosis, or recent deliberate self-harm. Also, the predictive value of these predictors......

  12. Estimating Terrorist Risk with Possibility Theory

    Energy Technology Data Exchange (ETDEWEB)

    J.L. Darby

    2004-11-30

    This report summarizes techniques that use possibility theory to estimate the risk of terrorist acts. These techniques were developed under the sponsorship of the Department of Homeland Security (DHS) as part of the National Infrastructure Simulation Analysis Center (NISAC) project. The techniques have been used to estimate the risk of various terrorist scenarios to support NISAC analyses during 2004. The techniques are based on the Logic Evolved Decision (LED) methodology developed over the past few years by Terry Bott and Steve Eisenhawer at LANL. [LED] The LED methodology involves the use of fuzzy sets, possibility theory, and approximate reasoning. LED captures the uncertainty due to vagueness and imprecision that is inherent in the fidelity of the information available for terrorist acts; probability theory cannot capture these uncertainties. This report does not address the philosophy supporting the development of nonprobabilistic approaches, and it does not discuss possibility theory in detail. The references provide a detailed discussion of these subjects. [Shafer] [Klir and Yuan] [Dubois and Prade] Suffice to say that these approaches were developed to address types of uncertainty that cannot be addressed by a probability measure. An earlier report discussed in detail the problems with using a probability measure to evaluate terrorist risk. [Darby Methodology]. Two related techniques are discussed in this report: (1) a numerical technique, and (2) a linguistic technique. The numerical technique uses traditional possibility theory applied to crisp sets, while the linguistic technique applies possibility theory to fuzzy sets. Both of these techniques as applied to terrorist risk for NISAC applications are implemented in software called PossibleRisk. The techniques implemented in PossibleRisk were developed specifically for use in estimating terrorist risk for the NISAC program. The LEDTools code can be used to perform the same linguistic evaluation as

  13. Risk Estimation Methodology for Launch Accidents.

    Energy Technology Data Exchange (ETDEWEB)

    Clayton, Daniel James; Lipinski, Ronald J.; Bechtel, Ryan D.

    2014-02-01

    As compact and light weight power sources with reliable, long lives, Radioisotope Power Systems (RPSs) have made space missions to explore the solar system possible. Due to the hazardous material that can be released during a launch accident, the potential health risk of an accident must be quantified, so that appropriate launch approval decisions can be made. One part of the risk estimation involves modeling the response of the RPS to potential accident environments. Due to the complexity of modeling the full RPS response deterministically on dynamic variables, the evaluation is performed in a stochastic manner with a Monte Carlo simulation. The potential consequences can be determined by modeling the transport of the hazardous material in the environment and in human biological pathways. The consequence analysis results are summed and weighted by appropriate likelihood values to give a collection of probabilistic results for the estimation of the potential health risk. This information is used to guide RPS designs, spacecraft designs, mission architecture, or launch procedures to potentially reduce the risk, as well as to inform decision makers of the potential health risks resulting from the use of RPSs for space missions.
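    A heavily simplified sketch of the probabilistic structure described above is given below; the scenario probabilities, release fractions and dose-to-risk factors are purely hypothetical placeholders and are not values from the methodology:

```python
# Hedged sketch of the Monte Carlo structure described above: sample an
# accident environment, sample the RPS response (release), convert the release
# to a consequence, and weight by scenario likelihood. All numbers are made up.
import random

random.seed(1)

# Hypothetical accident scenarios: (name, scenario probability, nominal release fraction)
scenarios = [
    ("on-pad explosion",   1e-4, 0.02),
    ("early-ascent abort", 5e-5, 0.05),
    ("reentry breakup",    1e-5, 0.10),
]

DOSE_PER_RELEASE = 3.0e2     # person-Sv per unit release fraction (hypothetical)
RISK_PER_SV = 5.0e-2         # health-risk coefficient per Sv (hypothetical)
N = 100_000                  # Monte Carlo trials per scenario

expected_risk = 0.0
for name, p_scenario, nominal_release in scenarios:
    total = 0.0
    for _ in range(N):
        # Stochastic RPS response: lognormal spread around the nominal release
        release = min(random.lognormvariate(0.0, 0.8) * nominal_release, 1.0)
        total += release * DOSE_PER_RELEASE * RISK_PER_SV
    mean_consequence = total / N
    expected_risk += p_scenario * mean_consequence
    print(f"{name}: mean conditional consequence {mean_consequence:.3f}")

print(f"total expected health risk: {expected_risk:.2e}")
```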

  14. MARKET RISK ESTIMATION IN (T+)-TRANSACTIONS

    Directory of Open Access Journals (Sweden)

    Radik B. Begov

    2016-01-01

    Market risk analysis and estimation are presented for T+ transactions as they are used within the Moscow Exchange. There is a need to do so as a result of the introduction of a new REPO product with a Central Counterparty (CCP). Here the repurchase agreement goes through the National Clearing Center (NCC), the latter being a bank and a clearing structure within the Moscow Exchange group. The NCC acts as an intermediary (the so-called "Central Counterparty") between trading participants. REPOs with CCP raise contractor claims and commitments to the CCP, which takes the risk of default on commitments from an unfair contract side. The introduction of the REPO with CCP prepared a technological platform to implement T+2 trades at the Moscow Exchange. As a result, it became possible to enter security purchase/sell contracts that are only partially collateralized. All these transactions (the REPO with CCP, T+) made it necessary to determine security market risks. The paper is aimed at presenting VaR-like risk estimates. The methods used are from computational finance. An unusual TS rate-of-return indicator is proposed and applied to find optimal portfolios under the Markowitz approach and their VaRs (loss forecasts) given real "big" share price data and various horizons. Portfolio extreme rate and loss forecasting is our goal. To this end the forecasts are computed for three horizons (2, 5 and 10 days) and for three significance levels. R-, Excel- and Bloomberg-based software tools were developed as needed. The whole range of proposed computing steps and the tables with charts may be considered as candidates to be included in future market risk standards. The results permit capital market participants to choose the correct (as to the required risk level) common stocks.
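    A compact sketch of the computation chain outlined above, minimum-variance portfolio weights followed by horizon-dependent parametric VaR forecasts, is shown below with synthetic returns; the paper's TS rate-of-return indicator and its R/Excel/Bloomberg tooling are not reproduced:

```python
# Hedged sketch: minimum-variance (Markowitz-style) portfolio weights followed
# by parametric VaR forecasts for several horizons and significance levels.
# Returns are synthetic; the paper's TS indicator and tooling are not reproduced.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
n_days, n_assets = 750, 4
daily_returns = rng.multivariate_normal(
    mean=[5e-4, 4e-4, 3e-4, 6e-4],
    cov=np.diag([0.02, 0.015, 0.01, 0.025]) ** 2,   # uncorrelated, for simplicity
    size=n_days,
)

# Global minimum-variance weights: w = Sigma^{-1} 1 / (1' Sigma^{-1} 1)
cov = np.cov(daily_returns, rowvar=False)
ones = np.ones(n_assets)
w = np.linalg.solve(cov, ones)
w /= ones @ w

port = daily_returns @ w
mu, sigma = port.mean(), port.std(ddof=1)

# Parametric (normal) VaR, scaled to longer horizons with the square-root-of-time rule
for horizon in (2, 5, 10):
    for alpha in (0.95, 0.99, 0.999):
        var = -(mu * horizon + norm.ppf(1 - alpha) * sigma * np.sqrt(horizon))
        print(f"{horizon:2d}-day VaR at {alpha:.1%}: {var:6.2%}")
```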

  15. Current status data with competing risks: Consistency and rates of convergence of the MLE

    NARCIS (Netherlands)

    Groeneboom, P.; Maathuis, M.H.; Wellner, J.A.

    2008-01-01

    We study nonparametric estimation of the sub-distribution functions for current status data with competing risks. Our main interest is in the nonparametric maximum likelihood estimator (MLE), and for comparison we also consider a simpler “naive estimator.” Both types of estimators were studied by Je

  16. Leakage Current Estimation of CMOS Circuit with Stack Effect

    Institute of Scientific and Technical Information of China (English)

    Yong-Jun Xu; Zu-Ying Luo; Xiao-Wei Li; Li-Jian Li; Xian-Long Hong

    2004-01-01

    Leakage current of CMOS circuits increases dramatically as the technology scales down and has become a critical issue for high performance systems. Subthreshold, gate and reverse-biased junction band-to-band tunneling (BTBT) leakages are considered the three main determinants of total leakage current. Up to now, how to accurately estimate the leakage current of large-scale circuits within acceptable time has remained unsolved, even though accurate leakage models have been widely discussed. In this paper, the authors first examine the stack effect of CMOS technology and propose a new simple gate-level leakage current model. Then, a table-lookup based total leakage current simulator is built according to the model. To validate the simulator, accurate leakage current is simulated at circuit level using the popular simulator HSPICE for comparison. Some further studies such as maximum leakage current estimation, minimum leakage current generation and a high-level average leakage current macromodel are introduced in detail. Experiments on ISCAS85 and ISCAS89 benchmarks demonstrate that the two proposed leakage current estimation methods are very accurate and efficient.

  17. The complex model of risk and progression of AMD estimation

    Directory of Open Access Journals (Sweden)

    V. S. Akopyan

    2012-01-01

    Purpose: to develop a method and a statistical model to estimate the individual risk of AMD and the risk of progression to advanced AMD using clinical and genetic risk factors. Methods: A statistical risk assessment model was developed using stepwise binary logistic regression analysis. To estimate population differences in the prevalence of allelic variants of genes and to develop models adapted to the population of the Moscow region, genotyping and assessment of the influence of other risk factors were performed in two groups: patients with different stages of AMD (n = 74) and a control group (n = 116). Genetic risk factors included in the study: polymorphisms in the complement system genes (C3 and CFH), genes at the 10q26 locus (ARMS2 and HtRA1), and a polymorphism in the mitochondrial gene Mt-ND2. Clinical risk factors included in the study: age, gender, high body mass index, smoking history. Results: A comprehensive analysis of genetic and clinical risk factors for AMD in the study group was performed. A statistical model for the assessment of the individual risk of AMD was compiled; the sensitivity of the model was 66.7%, specificity 78.5%, AUC = 0.76. For risk factors of late AMD, a statistical model describing the probability of late AMD was compiled; the sensitivity of the model was 66.7%, specificity 78.3%, AUC = 0.73. The developed system allows determining the most likely form of late AMD: dry or wet. Conclusion: the developed test system and the mathematical algorithm for determining the risk of AMD and of progression to advanced AMD have fair diagnostic informativeness and are promising for use in clinical practice.

  18. The complex model of risk and progression of AMD estimation

    Directory of Open Access Journals (Sweden)

    V. S. Akopyan

    2014-07-01

    Purpose: to develop a method and a statistical model to estimate the individual risk of AMD and the risk of progression to advanced AMD using clinical and genetic risk factors. Methods: A statistical risk assessment model was developed using stepwise binary logistic regression analysis. To estimate population differences in the prevalence of allelic variants of genes and to develop models adapted to the population of the Moscow region, genotyping and assessment of the influence of other risk factors were performed in two groups: patients with different stages of AMD (n = 74) and a control group (n = 116). Genetic risk factors included in the study: polymorphisms in the complement system genes (C3 and CFH), genes at the 10q26 locus (ARMS2 and HtRA1), and a polymorphism in the mitochondrial gene Mt-ND2. Clinical risk factors included in the study: age, gender, high body mass index, smoking history. Results: A comprehensive analysis of genetic and clinical risk factors for AMD in the study group was performed. A statistical model for the assessment of the individual risk of AMD was compiled; the sensitivity of the model was 66.7%, specificity 78.5%, AUC = 0.76. For risk factors of late AMD, a statistical model describing the probability of late AMD was compiled; the sensitivity of the model was 66.7%, specificity 78.3%, AUC = 0.73. The developed system allows determining the most likely form of late AMD: dry or wet. Conclusion: the developed test system and the mathematical algorithm for determining the risk of AMD and of progression to advanced AMD have fair diagnostic informativeness and are promising for use in clinical practice.

  19. Reconstruction of financial networks for robust estimation of systemic risk

    Science.gov (United States)

    Mastromatteo, Iacopo; Zarinelli, Elia; Marsili, Matteo

    2012-03-01

    In this paper we estimate the propagation of liquidity shocks through interbank markets when the information about the underlying credit network is incomplete. We show that techniques such as maximum entropy currently used to reconstruct credit networks severely underestimate the risk of contagion by assuming a trivial (fully connected) topology, a type of network structure which can be very different from the one empirically observed. We propose an efficient message-passing algorithm to explore the space of possible network structures and show that a correct estimation of the network degree of connectedness leads to more reliable estimations for systemic risk. Such an algorithm is also able to produce maximally fragile structures, providing a practical upper bound for the risk of contagion when the actual network structure is unknown. We test our algorithm on ensembles of synthetic data encoding some features of real financial networks (sparsity and heterogeneity), finding that more accurate estimations of risk can be achieved. Finally we find that this algorithm can be used to control the amount of information that regulators need to require from banks in order to sufficiently constrain the reconstruction of financial networks.

  20. Variance computations for functionals of absolute risk estimates.

    Science.gov (United States)

    Pfeiffer, R M; Petracci, E

    2011-07-01

    We present a simple influence function based approach to compute the variances of estimates of absolute risk and functions of absolute risk. We apply this approach to criteria that assess the impact of changes in the risk factor distribution on absolute risk for an individual and at the population level. As an illustration we use an absolute risk prediction model for breast cancer that includes modifiable risk factors in addition to standard breast cancer risk factors. Influence function based variance estimates for absolute risk and the criteria are compared to bootstrap variance estimates.

  1. Simplifying cardiovascular risk estimation using resting heart rate.

    LENUS (Irish Health Repository)

    Cooney, Marie Therese

    2010-09-01

    Elevated resting heart rate (RHR) is a known, independent cardiovascular (CV) risk factor, but is not included in risk estimation systems, including Systematic COronary Risk Evaluation (SCORE). We aimed to derive risk estimation systems including RHR as an extra variable and assess the value of this addition.

  2. Current status data with competing risks: limiting distribution of the MLE

    NARCIS (Netherlands)

    Groeneboom, P.; Maathuis, M.H.; Wellner, J.A.

    2008-01-01

    We study nonparametric estimation for current status data with competing risks. Our main interest is in the nonparametric maximum likelihood estimator (MLE), and for comparison we also consider a simpler “naive estimator.” Groeneboom, Maathuis and Wellner [Ann. Statist. (2008) 36 1031–1063] proved t

  3. Estimation of earthquake risk curves of physical building damage

    Science.gov (United States)

    Raschke, Mathias; Janouschkowetz, Silke; Fischer, Thomas; Simon, Christian

    2014-05-01

    In this study, a new approach to quantifying seismic risks is presented. Here, the earthquake risk curves for the number of buildings with a defined physical damage state are estimated for South Africa. Therein, we define the physical damage states according to the current European macro-seismic intensity scale (EMS-98). The advantage of this kind of risk curve is that its plausibility can be checked more easily than that of other types. The earthquake risk curve for physical building damage can be compared with historical damage and the corresponding empirical return periods. The number of damaged buildings from historical events is generally explored and documented in more detail than the corresponding monetary losses. The latter are also influenced by different economic conditions, such as inflation and price hikes. Furthermore, the monetary risk curve can be derived from the developed risk curve of physical building damage. The earthquake risk curve can also be used for the validation of underlying sub-models such as the hazard and vulnerability modules.

  4. Estimating Worker Risk Levels Using Accident/Incident Data

    Energy Technology Data Exchange (ETDEWEB)

    Kenoyer, Judson L.; Stenner, Robert D.; Andrews, William B.; Scherpelz, Robert I.; Aaberg, Rosanne L.

    2000-09-26

    The purpose of the work described in this report was to identify methods that are currently being used in the Department of Energy (DOE) complex to identify and control hazards/risks in the workplace, evaluate them in terms of their effectiveness in reducing risk to the workers, and to develop a preliminary method that could be used to predict the relative risks to workers performing proposed tasks using some of the current methodology. This report describes some of the performance indicators (i.e., safety metrics) that are currently being used to track relative levels of workplace safety in the DOE complex, how these fit into an Integrated Safety Management (ISM) system, some strengths and weaknesses of using a statistically based set of indicators, and methods to evaluate them. Also discussed are methods used to reduce risk to the workers and some of the techniques that appear to be working in the process of establishing a condition of continuous improvement. The results of these methods will be used in future work involved with the determination of modifying factors for a more complex model. The preliminary method to predict the relative risk level to workers during an extended future time period is based on a currently used performance indicator that uses several factors tracked in the CAIRS. The relative risks for workers in a sample (but real) facility on the Hanford site are estimated for a time period of twenty years and are based on workforce predictions. This is the first step in developing a more complex model that will incorporate other modifying factors related to the workers, work environment and status of the ISM system to adjust the preliminary prediction.

  5. Future flood risk estimates along the river Rhine

    Directory of Open Access Journals (Sweden)

    A. H. te Linde

    2011-02-01

    In Europe, water management is moving from flood defence to a risk management approach, which takes both the probability and the potential consequences of flooding into account. It is expected that climate change and socio-economic development will lead to an increase in flood risk in the Rhine basin. To optimize spatial planning and flood management measures, studies are needed that quantify future flood risks and estimate their uncertainties. In this paper, we estimated the current and future fluvial flood risk in 2030 for the entire Rhine basin in a scenario study. The change in value at risk is based on two land-use projections derived from a land-use model representing two different socio-economic scenarios. Potential damage was calculated by a damage model, and changes in flood probabilities were derived from two climate scenarios and hydrological modeling. We aggregated the results into seven sections along the Rhine. It was found that the annual expected damage in the Rhine basin may increase by between 54% and 230%, of which the major part (~three-quarters) can be accounted for by climate change. The highest current potential damage can be found in the Netherlands (110 billion €), compared with the second (80 billion €) and third (62 billion €) highest values in two areas in Germany. Results further show that the area with the highest fluvial flood risk is located in the Lower Rhine in Nordrhein-Westfalen in Germany, and not in the Netherlands, as is often perceived. This is mainly due to the higher flood protection standards in the Netherlands as compared to Germany.
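    As a generic illustration of how an annual expected damage figure of this kind is typically obtained (the damage-probability pairs below are hypothetical, not results from this study), the expected annual damage is the integral of damage over annual exceedance probability, often approximated with the trapezoidal rule:

```python
# Hedged sketch: expected annual damage (EAD) as the integral of flood damage
# over annual exceedance probability, approximated with the trapezoidal rule.
# The damage-probability pairs below are hypothetical, not results of the study.
import numpy as np

# Annual exceedance probabilities (1/return period) and modelled damages (billion EUR)
exceedance_prob = np.array([1/10, 1/50, 1/200, 1/500, 1/1250])
damage = np.array([0.0, 2.0, 15.0, 40.0, 80.0])

# Integrate damage over probability (probabilities must be ascending for np.trapz)
order = np.argsort(exceedance_prob)
ead = np.trapz(damage[order], exceedance_prob[order])
print(f"expected annual damage: {ead:.3f} billion EUR/year")

# Climate and land-use scenarios enter by shifting the probabilities (hazard)
# and the damages (exposure and vulnerability) and recomputing the integral.
```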

  6. An application of extreme value theory in estimating liquidity risk

    Directory of Open Access Journals (Sweden)

    Sonia Benito Muela

    2017-09-01

    The last global financial crisis (2007–2008) has highlighted the weaknesses of value at risk (VaR) as a measure of market risk, as this metric by itself does not take liquidity risk into account. To address this problem, the academic literature has proposed incorporating liquidity risk into estimations of market risk by adding the VaR of the spread to the risk price. The parametric model is the standard approach used to estimate liquidity risk. As this approach does not generate reliable VaR estimates, we propose estimating liquidity risk using more sophisticated models based on extreme value theory (EVT). We find that the approach based on conditional extreme value theory outperforms the standard approach in terms of accurate VaR estimates and the market risk capital requirements of the Basel Capital Accord.
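    A minimal sketch of the EVT step is given below under stated assumptions: a generalized Pareto distribution is fitted to bid-ask spread exceedances over a high threshold, and the resulting extreme spread quantile is added (as a half-spread) to the price VaR. The synthetic data and the half-spread convention are illustrative choices, not taken from the article:

```python
# Hedged sketch: liquidity-adjusted VaR via extreme value theory. A generalized
# Pareto distribution (GPD) is fitted to bid-ask spread exceedances over a high
# threshold; the resulting extreme spread quantile is added as a half-spread to
# the price VaR. Data and the half-spread convention are illustrative choices.
import numpy as np
from scipy.stats import genpareto, norm

rng = np.random.default_rng(7)
returns = rng.normal(0.0, 0.015, 1000)                     # synthetic daily returns
spreads = rng.lognormal(mean=-5.5, sigma=0.6, size=1000)   # synthetic relative spreads

alpha = 0.99

# 1) Ordinary parametric VaR of the mid-price return
price_var = -(returns.mean() + norm.ppf(1 - alpha) * returns.std(ddof=1))

# 2) Peaks-over-threshold (EVT) quantile of the spread
u = np.quantile(spreads, 0.90)                   # high threshold
excesses = spreads[spreads > u] - u
xi, _, beta = genpareto.fit(excesses, floc=0)    # fit GPD to the exceedances
p_exceed = excesses.size / spreads.size
if abs(xi) > 1e-6:                               # POT quantile formula
    spread_var = u + (beta / xi) * (((1 - alpha) / p_exceed) ** (-xi) - 1)
else:                                            # exponential-tail limit of the GPD
    spread_var = u + beta * np.log(p_exceed / (1 - alpha))

# 3) Liquidity-adjusted VaR: add half of the extreme spread to the price VaR
lvar = price_var + 0.5 * spread_var
print(f"price VaR {price_var:.4f}, extreme spread {spread_var:.4f}, LVaR {lvar:.4f}")
```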

  7. Current practices by forensic anthropologists in adult skeletal age estimation.

    Science.gov (United States)

    Garvin, Heather M; Passalacqua, Nicholas V

    2012-03-01

    When determining an age estimate from adult skeletal remains, forensic anthropologists face a series of methodological choices. These decisions, such as which skeletal region to evaluate, which methods to apply, what statistical information to use, and how to combine information from multiple methods, ultimately impact the final reported age estimate. In this study, a questionnaire was administered to 145 forensic anthropologists, documenting current trends in adult age-at-death estimation procedures used throughout the field. Results indicate that the Suchey-Brooks pubic symphysis method (1990) remains the most highly favored aging technique, with cranial sutures and dental wear being the least preferred, regardless of experience. The majority of respondents stated that they vary their skeletal age estimation process case by case and ultimately present to officials both a narrow and a broad possible age range. Overall, respondents displayed a very high degree of variation in how they generate their age estimates, and indicated that experience and expertise play a large role in skeletal age estimates.

  8. Estimation of volumetric breast density for breast cancer risk prediction

    Science.gov (United States)

    Pawluczyk, Olga; Yaffe, Martin J.; Boyd, Norman F.; Jong, Roberta A.

    2000-04-01

    Mammographic density (MD) has been shown to be a strong risk predictor for breast cancer. Compared to subjective assessment by a radiologist, computer-aided analysis of digitized mammograms provides a quantitative and more reproducible method for assessing breast density. However, the current methods of estimating breast density based on the area of bright signal in a mammogram do not reflect the true, volumetric quantity of dense tissue in the breast. A computerized method to estimate the amount of radiographically dense tissue in the overall volume of the breast has been developed to provide an automatic, user-independent tool for breast cancer risk assessment. The procedure for volumetric density estimation consists of first correcting the image for inhomogeneity, then performing a volume density calculation. First, optical sensitometry is used to convert all images to the logarithm of relative exposure (LRE), in order to simplify the image correction operations. The field non-uniformity correction, which takes into account the heel effect, the inverse square law, path obliquity and intrinsic field and grid non-uniformity, is obtained by imaging a spherical-section PMMA phantom. The processed LRE image of the phantom is then used as a correction offset for actual mammograms. From information about the thickness and placement of the breast, as well as the parameters of a breast-like calibration step wedge placed in the mammogram, the MD of the breast is calculated. Post-processing and a simple calibration phantom enable user-independent, reliable and repeatable volumetric estimation of density in breast-equivalent phantoms. Initial results obtained on known-density phantoms show the estimation to vary less than 5% in MD from the actual value. This can be compared to estimated mammographic density differences of 30% between the true and non-corrected values. Since a more simplistic breast density measurement based on the projected area has been shown to be a strong indicator

  9. Radiation risk estimation based on measurement error models

    CERN Document Server

    Masiuk, Sergii; Shklyar, Sergiy; Chepurny, Mykola; Likhtarov, Illya

    2017-01-01

    This monograph discusses statistics and risk estimates applied to radiation damage under the presence of measurement errors. The first part covers nonlinear measurement error models, with a particular emphasis on efficiency of regression parameter estimators. In the second part, risk estimation in models with measurement errors is considered. Efficiency of the methods presented is verified using data from radio-epidemiological studies.

  10. Remote sensing techniques for vegetation moisture and fire risk estimation

    Science.gov (United States)

    Dasgupta, Swarvanu

    Moisture Experiment 2003 field campaign. Finally, as an alternative to current subjective fire risk indices, a new Fire Susceptibility Index (FSI) based on the physical concept of pre-ignition energy was proposed. FSI uses remotely sensed estimates of fuel temperature and LFMC. Its physical basis is expected to allow computation of ignition probabilities and fire spread rates. FSI can be used to compare fire risk across ecoregions, yet has the flexibility to be localized to an ecoregion for improved performance. Good agreement with the well-tested FPI (Fire Potential Index) over Georgia suggests the validity of FSI as a fire risk estimator. These new approaches would be helpful in fire risk monitoring, agriculture and climate studies.

  11. An integrated risk estimation methodology: Ship specific incident type risk

    NARCIS (Netherlands)

    S. Knapp (Sabine)

    2013-01-01

    Shipping activity has increased worldwide, including in parts of Australia, and maritime administrations are trying to gain a better understanding of total risk exposure in order to mitigate risk. Total risk exposure integrates risk at the individual ship level, risk due to vessel traffic d ...

  12. Estimation of current density distribution under electrodes for external defibrillation

    Directory of Open Access Journals (Sweden)

    Papazov Sava P

    2002-12-01

    Background: Transthoracic defibrillation is the most common life-saving technique for the restoration of the heart rhythm of cardiac arrest victims. The procedure requires adequate application of large electrodes on the patient chest, to ensure low-resistance electrical contact. The current density distribution under the electrodes is non-uniform, leading to muscle contraction and pain, or risks of burning. The recent introduction of automatic external defibrillators and even wearable defibrillators presents new demanding requirements for the structure of electrodes. Method and Results: Using the pseudo-elliptic differential equation of Laplace type with appropriate boundary conditions and applying finite element method modeling, electrodes of various shapes and structure were studied. The non-uniformity of the current density distribution was shown to be moderately improved by adding a low resistivity layer between the metal and tissue and by a ring around the electrode perimeter. The inclusion of openings in long-term wearable electrodes additionally disturbs the current density profile. However, a number of small-size perforations may result in acceptable current density distribution. Conclusion: The current density distribution non-uniformity of circular electrodes is about 30% less than that of square-shaped electrodes. The use of an interface layer of intermediate resistivity, comparable to that of the underlying tissues, and a high-resistivity perimeter ring, can further improve the distribution. The inclusion of skin aeration openings disturbs the current paths, but an appropriate selection of number and size provides a reasonable compromise.

  13. An application of extreme value theory in estimating liquidity risk

    OpenAIRE

    Sonia Benito Muela; Carmen López Martín; Raquel Arguedas Sanz

    2017-01-01

    The last global financial crisis (2007–2008) has highlighted the weaknesses of value at risk (VaR) as a measure of market risk, as this metric by itself does not take liquidity risk into account. To address this problem, the academic literature has proposed incorporating liquidity risk into estimations of market risk by adding the VaR of the spread to the risk price. The parametric model is the standard approach used to estimate liquidity risk. As this approach does not generate reliable VaR ...

  14. Genetic risk estimation by healthcare professionals

    NARCIS (Netherlands)

    B. Bonke (Benno); A. Tibben (Arend); D. Lindhout (Dick); A.J. Clarke (Angus); Th. Stijnen (Theo)

    2005-01-01

    OBJECTIVES: To assess whether healthcare professionals correctly incorporate the relevance of a favourable test outcome in a close relative when determining the level of risk for individuals at risk for Huntington's disease. DESIGN AND SETTING: Survey of clinical ...

  15. Estimation of radiation cancer risk in CT-KUB

    Science.gov (United States)

    Karim, M. K. A.; Hashim, S.; Bakar, K. A.; Bradley, D. A.; Ang, W. C.; Bahrudin, N. A.; Mhareb, M. H. A.

    2017-08-01

    The increased demand for computed tomography (CT) in radiological scanning examinations raises the question of a potential health impact from the associated radiation exposures. Focusing on CT kidney-ureter-bladder (CT-KUB) procedures, this work was aimed at determining organ equivalent doses using a commercial CT dose calculator and providing an estimate of cancer risks. The study, which included 64 patients (32 males and 32 females, mean age 55.5 years and age range 30-80 years), involved use of a calibrated CT scanner (Siemens-Somatom Emotion 16-slice). The CT exposure parameters, including tube potential, pitch factor, tube current, volume CT dose index (CTDIvol) and dose-length product (DLP), were recorded and analyzed using CT-EXPO (Version 2.3.1, Germany). Patient organ doses, including those for the stomach, liver, colon, bladder, red bone marrow, prostate and ovaries, were calculated and converted into cancer risks using age- and sex-specific data published in the Biological Effects of Ionizing Radiation (BEIR) VII report. With a median scan range of 36.1 cm, the CTDIvol, DLP, and effective dose were found to be 10.7 mGy, 390.3 mGy cm and 6.2 mSv, respectively. The mean cancer risks for males and females were estimated to be 25 and 46 per 100,000 procedures, respectively, with effective doses between 4.2 mSv and 10.1 mSv. Given the increased cancer risks from current CT-KUB procedures compared to conventional examinations, we propose that low dose protocols for unenhanced CT procedures be taken into consideration before establishing imaging protocols for CT-KUB.

  16. Treadmill motor current value based walk phase estimation.

    Science.gov (United States)

    Ohki, Eiichi; Nakashima, Yasutaka; Ando, Takeshi; Fujie, Masakatsu G

    2009-01-01

    We have developed a gait rehabilitation robot for hemiplegic patients using a treadmill. The walk phase, which includes the time balance of the stance and swing legs, is one of the most basic indexes for evaluating a patient's gait. In addition, the walk phase is one of the indexes used to control our robotic rehabilitation system. However, conventional methods of measuring the walk phase require additional systems such as foot switches or a force plate. In this paper, an original algorithm is proposed to estimate the walk phase of a person on a treadmill using only the current value of the DC motor that controls the treadmill velocity. The algorithm was verified in experiments on five healthy subjects, and the walk phase of four subjects could be estimated within 0.2 s error. However, the algorithm erroneously identified a period of the stance phase as swing phase when little body weight was loaded on the subject's leg. Because periods with little body weight on the affected leg are often observed in hemiplegic walking, the proposed algorithm might fail to properly estimate the walk phase of hemiplegic patients. Nevertheless, the algorithm could be used to estimate the time when body weight is loaded on the patient's legs, and thus could serve as a new quantitative evaluation index.

  17. Bank Liquidity Risk: Analysis and Estimates

    Directory of Open Access Journals (Sweden)

    Meilė Jasienė

    2012-12-01

    In today’s banking business, liquidity risk and its management are some of the most critical elements that underlie the stability and security of the bank’s operations, profit-making and clients’ confidence, as well as many of the decisions that the bank makes. Managing liquidity risk in a commercial bank is not something new, yet scientific literature has not focused enough on different approaches to liquidity risk management and assessment. Furthermore, models, methodologies or policies of managing liquidity risk in a commercial bank have never been examined in detail either. The goal of this article is to analyse the liquidity risk of commercial banks as well as the possibilities of managing it and to build a liquidity risk management model for a commercial bank. The development, assessment and application of the commercial bank liquidity risk management model were based on an analysis of scientific resources, a comparative analysis and mathematical calculations.

  18. High-dimensional inference by unbiased risk estimation

    Science.gov (United States)

    Chetelat, Didier

    This thesis derives natural and efficient solutions of three high-dimensional statistical problems by exploiting unbiased risk estimation. They exemplify a general methodology that provides attractive estimators in situations where classical theory is unsuccessful, and that could be exploited in many other problems. First, we extend the classical James-Stein shrinkage estimator to the context where the number of covariates is larger than the sample size and the covariance matrix is unknown. The construction is obtained by manipulating an unbiased risk estimator and shown to dominate the maximum likelihood estimator in invariant squared loss. The estimator is interpreted as performing shrinkage only in the random subspace spanned by the sample covariance matrix. Second, we investigate the estimation of a covariance and precision matrix, and discriminant coefficients, of linearly dependent data in a normal framework. By bounding the difference in risk over classes of interest using unbiased risk estimation, we construct interesting estimators and show domination over naive solutions. Finally, we explore the problem of estimating the noise coefficient in the spiked covariance model. By decomposing an unbiased risk estimator and minimizing its dominant part using calculus of variations, we obtain an estimator in closed form that approximates the optimal solution. Several attractive properties are proven about the proposed construction. We conclude by showing that the associated spiked covariance estimators possess excellent behavior under the Frobenius loss.
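    For orientation, the sketch below shows the classical ingredients this line of work builds on: positive-part James-Stein shrinkage and Stein's unbiased risk estimate (SURE) for a normal mean with known variance. It is a minimal illustration of unbiased risk estimation, not the unknown-covariance, p > n estimators constructed in the thesis.

```python
import numpy as np

def james_stein(x, sigma2=1.0):
    """Positive-part James-Stein shrinkage of a p-dimensional normal mean toward zero.

    Assumes x ~ N(theta, sigma2 * I) with known sigma2; this is the textbook
    setting, not the unknown-covariance, p > n setting studied in the thesis.
    """
    p = x.size
    shrink = max(0.0, 1.0 - (p - 2) * sigma2 / np.dot(x, x))
    return shrink * x

def sure_squared_error(x, estimate, divergence, sigma2=1.0):
    """Stein's unbiased risk estimate of E||estimate - theta||^2.

    `divergence` is the divergence (sum of partial derivatives) of the
    estimator, evaluated at x.
    """
    p = x.size
    return -p * sigma2 + np.sum((estimate - x) ** 2) + 2 * sigma2 * divergence

# toy check: SURE of the identity estimator equals p * sigma2
rng = np.random.default_rng(0)
x = rng.normal(size=50) + 2.0
print(sure_squared_error(x, x, divergence=x.size))  # 50.0
print(james_stein(x)[:3])
```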

  19. Panel data nonparametric estimation of production risk and risk preferences

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    We apply nonparametric panel data kernel regression to investigate production risk, output price uncertainty, and risk attitudes of Polish dairy farms based on a firm-level unbalanced panel data set that covers the period 2004–2010. We compare different model specifications and different ... approaches for obtaining firm-specific measures of risk attitudes. We found that Polish dairy farmers are risk averse regarding production risk and price uncertainty. According to our results, Polish dairy farmers perceive the production risk as being more significant than the risk related to output price ...

  20. Estimation and reduction of harmonic currents from power converters

    DEFF Research Database (Denmark)

    Asiminoaei, Lucian

    of estimation and reduction of the harmonic currents from industrial Adjustable Speed Drives, in order to come up with an optimized solution for customers. The work is structured in two main directions. The first direction consists of analyzing the mechanism of the harmonic current generation from ASD ... and harmonic rotating frame transformations, generalized harmonic integrators. Extensive simulations are employed in Matlab/Simulink and a laboratory stand is built to test the shunt APF topology. The actual implementation is done in the fundamental rotating frame, where a new selective harmonic integrator ... Although the purpose of the APF study was not to find a solution directly applicable to the Heat Power Station, it influenced the way the research was subsequently approached. As the cost of the APF is still relatively high, its utilization becomes more efficient for medium- or high-...

  1. Genetic risk estimation by healthcare professionals

    OpenAIRE

    Bonke, Benno; Tibben, Arend; Lindhout, Dick; Clarke, Angus; Stijnen, Theo

    2005-01-01

    OBJECTIVES: To assess whether healthcare professionals correctly incorporate the relevance of a favourable test outcome in a close relative when determining the level of risk for individuals at risk for Huntington's disease. DESIGN AND SETTING: Survey of clinical geneticists and genetic counsellors from 12 centres of clinical genetics (United Kingdom, 6; The Netherlands, 4; Italy, 1; Australia, 1) in May-June 2002. Participants were asked to assess risk of specific individuals in ...

  2. Can we trust current estimates for biological nitrogen fixation?

    Science.gov (United States)

    Bellenger, Jean-Philippe; Kraepiel, Anne

    2016-04-01

    Biological nitrogen fixation (BNF) is the reduction of atmospheric dinitrogen (N2) into bioavailable ammonium. This reaction accounts for up to 97% of nitrogen (N) input in unmanaged terrestrial ecosystems. Closing the N budget is a long-standing challenge in many ecosystems. Recent studies have highlighted that current methods used to assess BNF are affected by critical biases. These findings challenge our confidence in many N budgets and call for a profound reconsideration of our methodological approaches. Besides these methodological issues, our ability to properly assess BNF might be further altered as a result of a misconception regarding the importance of BNF enzymatic diversity in nature. BNF is catalyzed by the enzyme nitrogenase (Nase), for which three isoforms have been identified so far: the molybdenum (Mo), vanadium (V) and iron-only (Fe) isoforms. Currently, BNF is mostly considered to depend primarily on the Mo isoform. The contribution of the alternative Nases (V and Fe isoforms) to BNF in natural habitats has been mostly overlooked. However, recent findings have challenged this traditional view of the Nase hierarchy (Mo isoform predominance), with deep implications for BNF assessment in the field. Here, I will present an overview of recent findings, provided by various research groups, challenging current methods used to assess BNF. I will also present a summary of recent studies highlighting the importance of alternative Nases in nature. I will finally illustrate how altering our view on the Mo-Nase predominance can deeply affect our confidence in current BNF estimates. I will conclude by presenting new methodological approaches that will contribute to significantly improve our ability to understand and estimate BNF in the field by improving our capacity to assess BNF spatio-temporal variability and enzymatic diversity.

  3. Estimating risks for water-quality exceedances of total-copper from highway and urban runoff under predevelopment and current conditions with the Stochastic Empirical Loading and Dilution Model (SELDM)

    Science.gov (United States)

    Granato, Gregory; Jones, Susan C.; Dunn, Christopher N.; Van Weele, Brian

    2017-01-01

    The stochastic empirical loading and dilution model (SELDM) was used to demonstrate methods for estimating risks for water-quality exceedances of event-mean concentrations (EMCs) of total copper. Monte Carlo methods were used to simulate stormflow, total-hardness, suspended-sediment, and total-copper EMCs as stochastic variables. These simulations were done for the Charles River Basin upstream of Interstate 495 in Bellingham, Massachusetts. The hydrology and water quality of this site were simulated with SELDM by using data from nearby, hydrologically similar sites. Three simulations were done to assess the potential effects of the highway on receiving-water quality with and without highway-runoff treatment by a structural best-management practice (BMP). In the low-development scenario, total copper in the receiving stream was simulated by using a sediment transport curve, sediment chemistry, and sediment-water partition coefficients. In this scenario, neither the highway runoff nor the BMP effluent caused concentration exceedances in the receiving stream that exceeded the once-in-three-year threshold (about 0.54 percent). In the second scenario, without the highway, runoff from the large urban areas in the basin caused exceedances in the receiving stream in 2.24 percent of runoff events. In the third scenario, which included the effects of the urban runoff, neither the highway runoff nor the BMP effluent increased the percentage of exceedances in the receiving stream. Comparison of the simulated geometric mean EMCs with data collected at a downstream monitoring site indicates that these simulated values are within the 95-percent confidence interval of the geometric mean of the measured EMCs.

  4. Estimated Historical and Current Nitrogen Balances for Illinois

    Directory of Open Access Journals (Sweden)

    Mark B. David

    2001-01-01

    The Midwest has large riverine exports of nitrogen (N), with the largest flux per unit area to the Mississippi River system coming from Iowa and Illinois. We used historic and current data to estimate N inputs, outputs, and transformations for Illinois, where human activity (principally agriculture and associated landscape drainage) has had a dominant impact. Presently, ~800,000 Mg of N is added each year as fertilizer and another 420,000 Mg is biologically fixed, primarily by soybean (Glycine max L. Merr.). These annual inputs are greater than exports in grain, which results in surplus N throughout the landscape. Rivers within the state export approximately 50% of this surplus N, mostly as nitrate, and the remainder appears to be denitrified or temporarily incorporated into the soil organic matter pool. The magnitudes of N losses for 1880, 1910, 1950, and 1990 are compared. Initial cultivation of the prairies released large quantities of N (~500,000 Mg N per year), and resulted in riverine N transport during the late 19th century that appears to have been on the same order of magnitude as contemporary N losses. Riverine flux was estimated to have been at a minimum in about 1950, due to diminished net mineralization and low fertilizer inputs. Residual fertilizer N from corn (Zea mays L.), biological N fixed by soybean, short-circuiting of soil water through artificial drainage, and decreased cropping-system diversity appear to be the primary sources for current N export.

  5. METHODOLOGICAL PROBLEMS OF PRACTICAL RADIOGENIC RISK ESTIMATIONS

    Directory of Open Access Journals (Sweden)

    A. Т. Gubin

    2014-01-01

    Mathematical relations were established following the description of the calculation procedure for the values of the nominal risk coefficient given in the ICRP Recommendations 2007. It is shown that lifetime radiogenic risk is a linear functional of the distribution of dose over time, with a multiplier that decreases with age. As a consequence, applying the nominal risk coefficient in risk calculations is justified when prolonged exposure is distributed practically evenly in time, but gives a significant deviation for a single exposure. When the additive model of radiogenic risk proposed in the UNSCEAR Report 2006 is used for solid cancers, this multiplier decreases almost linearly with age, which is convenient for practical application.
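    A schematic rendering of the statement above is given below; the symbols are illustrative shorthand and are not taken directly from the ICRP or UNSCEAR publications.

```latex
% R: lifetime radiogenic risk; \dot{D}(a): dose rate received at age a;
% r(a): age-dependent risk multiplier, decreasing in a.
R \;=\; \int_{a_0}^{a_{\max}} r(a)\,\dot{D}(a)\,\mathrm{d}a
\;\approx\; r_{\mathrm{nom}} \int_{a_0}^{a_{\max}} \dot{D}(a)\,\mathrm{d}a
% the second form (a constant nominal risk coefficient r_nom) is adequate only
% when the exposure is spread nearly evenly over time, as the abstract notes.
```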

  6. Analysis and estimation of risk management methods

    Directory of Open Access Journals (Sweden)

    Kankhva Vadim Sergeevich

    2016-05-01

    At present, risk management is an integral part of state policy in all countries with developed market economies. Companies providing consulting services and implementing risk management systems have carved out a niche. Unfortunately, conscious preventive risk management in Russia is still far from being a standardized part of construction company activity, which often leads to scandals and disapproval when projects are implemented unprofessionally. The authors present the results of an investigation into the modern understanding of existing classifications of risk management methods and offer their own concept of a classification matrix of risk management methods. The matrix is built by analysing each method in the context of its incoming and outgoing transformed information, which may include different elements of the risk control stages. The proposed approach thus allows the possibilities of each method to be analysed.

  7. Holding Period Return-Risk Modeling: Ambiguity in Estimation

    NARCIS (Netherlands)

    W.G.P.M. Hallerbach (Winfried)

    2003-01-01

    In this paper we explore the theoretical and empirical problems of estimating average (excess) return and risk of US equities over various holding periods and sample periods. Our findings are relevant for performance evaluation, for estimating the historical equity risk premium, and for ...

  9. Temporal evolution of risk estimates for presumed human teratogens.

    Science.gov (United States)

    Koebert, M K; Haun, J M; Pauli, R M

    1993-01-01

    We present preliminary data assessing a previously untried method of deriving estimates of risk from case reports on presumed human teratogens. We postulated that we could take advantage of biases inherent to case reports in order to generate one or more families of temporal curves that could be used to estimate the "true" risk of teratogenic exposure. Using this method (which we refer to as the "case-cumulative method") we found that two agents (parvovirus B19 and isotretinoin) demonstrated a logarithmic decrease in the estimated risk over time, as intuitively expected, while trimethadione and the coumarin derivatives showed a more complex pattern over time. Analysis of estimated risks quoted by reviews and large studies for these four agents showed large variability from estimate to estimate and no discernible temporal pattern. With further analysis of other agents, the case-cumulative method might eventually prove to be useful in teratogen counseling.

  10. Estimating the concordance probability in a survival analysis with a discrete number of risk groups.

    Science.gov (United States)

    Heller, Glenn; Mo, Qianxing

    2016-04-01

    A clinical risk classification system is an important component of a treatment decision algorithm. A measure used to assess the strength of a risk classification system is discrimination, and when the outcome is survival time, the most commonly applied global measure of discrimination is the concordance probability. The concordance probability represents the pairwise probability of lower patient risk given longer survival time. The c-index and the concordance probability estimate have been used to estimate the concordance probability when patient-specific risk scores are continuous. In the current paper, the concordance probability estimate and an inverse probability censoring weighted c-index are modified to account for discrete risk scores. Simulations are generated to assess the finite sample properties of the concordance probability estimate and the weighted c-index. An application of these measures of discriminatory power to a metastatic prostate cancer risk classification system is examined.
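    The sketch below is a plain (unweighted) concordance calculation for discrete risk groups, counting only usable pairs in which the shorter observed time is an event and the two risk groups differ. It illustrates the quantity being estimated; it is not the inverse-probability-censoring-weighted c-index or the modified concordance probability estimate developed in the paper.

```python
import numpy as np

def concordance_discrete(time, event, risk_group):
    """Naive concordance probability for discrete risk groups.

    Counts usable pairs (the smaller observed time is an event and the two
    risk groups differ) and the fraction in which the shorter survivor has
    the higher risk group. This is a plain c-index sketch, not the
    inverse-probability-censoring-weighted estimator of the paper.
    """
    time = np.asarray(time, float)
    event = np.asarray(event, bool)
    g = np.asarray(risk_group)
    concordant = usable = 0
    n = len(time)
    for i in range(n):
        for j in range(n):
            if event[i] and time[i] < time[j] and g[i] != g[j]:
                usable += 1
                if g[i] > g[j]:          # higher risk group failed earlier
                    concordant += 1
    return concordant / usable if usable else np.nan

# toy example: three risk groups, a higher group number meaning higher risk
t = [2.1, 3.0, 5.5, 6.2, 8.0, 9.1]
d = [1, 1, 0, 1, 1, 0]                   # 1 = event, 0 = censored
grp = [3, 3, 2, 2, 1, 1]
print(round(concordance_discrete(t, d, grp), 3))
```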

  11. Current Perspectives on Occupational Cancer Risks.

    Science.gov (United States)

    Boffetta; Kogevinas; Simonato; Wilbourn; Saracci

    1995-10-01

    On the basis of the International Agency for Research on Cancer's evaluations of occupational exposures, 22 occupational agents are classified as human carcinogens and an additional 22 agents as probable human carcinogens. In addition, evidence of increased risk of cancer was associated with particular industries and occupations, although no specific agents could be identified as etiologic factors. The main problem in the construction and interpretation of such lists is the lack of detailed qualitative and quantitative knowledge about exposures to known or suspected carcinogens. The recent examples of recognized occupational carcinogens, such as cadmium, beryllium, and ethylene oxide, stress the importance of the refinement in the methods for exposure assessment and for statistical analysis on the one hand and the potential benefits from the application of biomarkers of exposure and early effect on the other hand. Other trends that may be identified include the increasing practice of multicentric studies and investigations of exposures relevant to white collar workers and women. Finally, there is a need for investigation of occupational cancer risks in developing countries.

  12. Current methods for estimating the rate of photorespiration in leaves.

    Science.gov (United States)

    Busch, F A

    2013-07-01

    Photorespiration is a process that competes with photosynthesis, in which Rubisco oxygenates, instead of carboxylates, its substrate ribulose 1,5-bisphosphate. The photorespiratory metabolism associated with the recovery of 3-phosphoglycerate is energetically costly and results in the release of previously fixed CO2. The ability to quantify photorespiration is gaining importance as a tool to help improve plant productivity in order to meet the increasing global food demand. In recent years, substantial progress has been made in the methods used to measure photorespiration. Current techniques are able to measure multiple aspects of photorespiration at different points along the photorespiratory C2 cycle. Six different methods used to estimate photorespiration are reviewed, and their advantages and disadvantages discussed.

  13. Preoperative Evaluation: Estimation of Pulmonary Risk.

    Science.gov (United States)

    Lakshminarasimhachar, Anand; Smetana, Gerald W

    2016-03-01

    Postoperative pulmonary complications (PPCs) are common after major non-thoracic surgery and associated with significant morbidity and high cost of care. A number of risk factors are strong predictors of PPCs. The overall goal of the preoperative pulmonary evaluation is to identify these potential, patient and procedure-related risks and optimize the health of the patients before surgery. A thorough clinical examination supported by appropriate laboratory tests will help guide the clinician to provide optimal perioperative care.

  14. Estimation of insurance premiums for coverage against natural disaster risk: an application of Bayesian Inference

    Directory of Open Access Journals (Sweden)

    Y. Paudel

    2013-03-01

    This study applies Bayesian inference to estimate flood risk for 53 dyke ring areas in the Netherlands, and focuses particularly on the data scarcity and extreme behaviour of catastrophe risk. The probability density curves of flood damage are estimated through Monte Carlo simulations. Based on these results, flood insurance premiums are estimated using two different practical methods that each account in different ways for an insurer's risk aversion and the dispersion rate of loss data. This study is of practical relevance because insurers have been considering the introduction of flood insurance in the Netherlands, which is currently not generally available.
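    As a rough illustration of the premium-setting step, the sketch below simulates annual flood losses by Monte Carlo and applies two simple loading principles; every parameter and distribution choice is a hypothetical placeholder, not the dyke-ring data or the Bayesian posterior used in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_annual_losses(p_flood, mu_log, sigma_log, n_years=100_000):
    """Monte Carlo annual flood losses: Bernoulli occurrence times lognormal damage.

    All parameters are illustrative placeholders, not the dyke-ring estimates
    of the paper (which come from a Bayesian posterior).
    """
    occurs = rng.random(n_years) < p_flood
    damage = rng.lognormal(mu_log, sigma_log, n_years)
    return occurs * damage

losses = simulate_annual_losses(p_flood=1 / 1250, mu_log=20.0, sigma_log=1.0)

expected_loss = losses.mean()
# two simple premium principles that load for risk aversion / dispersion
premium_std = expected_loss + 0.5 * losses.std()       # standard-deviation loading
premium_quantile = np.quantile(losses, 0.995)          # quantile (VaR-style) loading
print(expected_loss, premium_std, premium_quantile)
```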

  15. Impact of microbial count distributions on human health risk estimates

    DEFF Research Database (Denmark)

    Ribeiro Duarte, Ana Sofia; Nauta, Maarten

    2015-01-01

    -lognormal distribution. We show that the impact of the choice of different probability distributions to describe concentrations at retail on risk estimates is dependent both on concentration and prevalence levels. We also show that the use of an LOQ should be done consciously, especially when zero-inflation is not used ... on risk estimates, at two different concentration scenarios and at a range of prevalence levels. By using five different parametric distributions, we investigate whether different characteristics of a good fit are crucial for an accurate risk estimate. Among the factors studied are the importance ... -inflated Poisson-lognormal distributed data and an existing QMRA model from retail to consumer level, it was possible to assess the difference between the expected risk and the risk estimated using a lognormal, a zero-inflated lognormal, a Poisson-gamma, a zero-inflated Poisson-gamma and a zero-inflated Poisson ...

  16. Credit risk estimate using internal explicit knowledge

    Directory of Open Access Journals (Sweden)

    Abdallah Al-Shawabkeh

    2017-03-01

    Jordanian banks traditionally use a set of indicators, based on their internal explicit knowledge, to examine the credit risk caused by default loans of individual borrowers. The banks rely on the personal and financial information of the borrowers, obtained by knowing them, often referred to as internal explicit knowledge. Internal explicit knowledge characterizes both financial and non-financial indicators of individual borrowers, such as loan amount, educational level, occupation, income, marital status, age, and gender. The authors studied 2755 default or non-performing personal loan profiles obtained from Jordanian banks over the period 1999 to 2014. The results show that low-earning unemployed borrowers are very likely to default and contribute to non-performing loans by increasing the chances of credit risk. In addition, it was found that unmarried, younger borrowers and moderate loan amounts increase the probability of non-performing loans. On the contrary, borrowers employed in the private sector and educated to at least degree level are most likely to mitigate the credit risk. The study suggests improving the decision-making process of Jordanian banks by making it more quantitative and dependable, instead of relying only on a subjective or judgement-based understanding of borrowers.

  17. ESTIMATION OF RISK NEUTRAL MEASURE FOR POLISH STOCK MARKET

    Directory of Open Access Journals (Sweden)

    Paweł Kliber

    2014-08-01

    In this paper we present the application of risk neutral measure estimation to the analysis of the WIG20 index from the Polish stock market. The risk neutral measure is calculated from the prices of options on that index. We assume that the risk neutral measure is a mixture of lognormal distributions. The parameters of the distributions are estimated by minimizing the sum of squares of the pricing errors. The obtained results are then compared with a model based on a single lognormal distribution. As an example, we consider changes in the risk neutral distribution at the beginning of March 2014, after the outbreak of the political crisis in Crimea.
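    A minimal sketch of the calibration idea: price European calls under a two-component lognormal mixture for the terminal index level and fit the mixture by least squares against observed option prices. The strike range, parameter values and synthetic "observed" prices below are placeholders, not WIG20 option data.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def call_price_mixture(strikes, params, r=0.0, T=30 / 365):
    """European call prices when the risk-neutral terminal price is a
    two-component lognormal mixture. params = (w, m1, s1, m2, s2), with w the
    weight of the first component and (m, s) the log-mean / log-sd of S_T.
    """
    w, m1, s1, m2, s2 = params
    weights = np.array([w, 1.0 - w])
    prices = np.zeros_like(strikes, dtype=float)
    for wk, m, s in zip(weights, (m1, m2), (s1, s2)):
        d1 = (m + s**2 - np.log(strikes)) / s
        d2 = d1 - s
        prices += wk * (np.exp(m + 0.5 * s**2) * norm.cdf(d1) - strikes * norm.cdf(d2))
    return np.exp(-r * T) * prices

def fit_risk_neutral_density(strikes, observed, x0):
    """Least-squares calibration of the mixture to observed option prices."""
    loss = lambda p: np.sum((call_price_mixture(strikes, p) - observed) ** 2)
    bounds = [(0.01, 0.99), (None, None), (1e-3, None), (None, None), (1e-3, None)]
    return minimize(loss, x0, bounds=bounds, method="L-BFGS-B")

# synthetic usage: strikes around a forward level of ~2400 (an index-like scale)
strikes = np.linspace(2200, 2600, 9)
true = call_price_mixture(strikes, (0.7, np.log(2400), 0.05, np.log(2300), 0.10))
fit = fit_risk_neutral_density(strikes, true, x0=(0.5, np.log(2400), 0.08, np.log(2350), 0.08))
print(fit.x)
```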

  18. The New Frontier in Risk Assessment: Estimation of Corporate ...

    African Journals Online (AJOL)

    The New Frontier in Risk Assessment: Estimation of Corporate Credit Rating ... to signify the point in the business cycle during which a rating change was made. However, this ...

  19. Resources for global risk assessment: the International Toxicity Estimates for Risk (ITER) and Risk Information Exchange (RiskIE) databases.

    Science.gov (United States)

    Wullenweber, Andrea; Kroner, Oliver; Kohrman, Melissa; Maier, Andrew; Dourson, Michael; Rak, Andrew; Wexler, Philip; Tomljanovic, Chuck

    2008-11-15

    The rate of chemical synthesis and use has outpaced the development of risk values and the resolution of risk assessment methodology questions. In addition, available risk values derived by different organizations may vary due to scientific judgments, the mission of the organization, or the use of more recently published data. Further, each organization derives values for a unique chemical list, so it can be challenging to locate data on a given chemical. Two Internet resources are available to address these issues. First, the International Toxicity Estimates for Risk (ITER) database (www.tera.org/iter) provides chronic human health risk assessment data from a variety of organizations worldwide in a side-by-side format, explains differences in risk values derived by different organizations, and links directly to each organization's website for more detailed information. It is also the only database that includes risk information from independent parties whose risk values have undergone independent peer review. Second, the Risk Information Exchange (RiskIE) is a database of in-progress chemical risk assessment work, and includes non-chemical information related to human health risk assessment, such as training modules, white papers and risk documents. RiskIE is available at http://www.allianceforrisk.org/RiskIE.htm, and will join ITER on the National Library of Medicine's TOXNET (http://toxnet.nlm.nih.gov/). Together, ITER and RiskIE provide risk assessors essential tools for easily identifying and comparing available risk data, for sharing in-progress assessments, and for enhancing interaction among risk assessment groups to decrease duplication of effort and to harmonize risk assessment procedures across organizations.

  20. Thinking Concretely Increases the Perceived Likelihood of Risks: The Effect of Construal Level on Risk Estimation.

    Science.gov (United States)

    Lermer, Eva; Streicher, Bernhard; Sachs, Rainer; Raue, Martina; Frey, Dieter

    2016-03-01

    Recent findings on construal level theory (CLT) suggest that abstract thinking leads to a lower estimated probability of an event occurring compared to concrete thinking. We applied this idea to the risk context and explored the influence of construal level (CL) on the overestimation of small and underestimation of large probabilities for risk estimates concerning a vague target person (Study 1 and Study 3) and personal risk estimates (Study 2). We were specifically interested in whether the often-found overestimation of small probabilities could be reduced with abstract thinking, and the often-found underestimation of large probabilities was reduced with concrete thinking. The results showed that CL influenced risk estimates. In particular, a concrete mindset led to higher risk estimates compared to an abstract mindset for several adverse events, including events with small and large probabilities. This suggests that CL manipulation can indeed be used for improving the accuracy of lay people's estimates of small and large probabilities. Moreover, the results suggest that professional risk managers' risk estimates of common events (thus with a relatively high probability) could be improved by adopting a concrete mindset. However, the abstract manipulation did not lead managers to estimate extremely unlikely events more accurately. Potential reasons for different CL manipulation effects on risk estimates' accuracy between lay people and risk managers are discussed.

  1. STATISTICAL APPROACH TO ESTIMATION OF LOGISTICAL RISKS OF INDUSTRIAL ORGANIZATIONS

    Directory of Open Access Journals (Sweden)

    Maria G. Polikarpova

    2013-01-01

    The article offers a methodology for the statistical study of the risks of logistical systems, aimed at improving the stability and efficiency of Russian industrial enterprises. The application of this methodology is illustrated by estimating the risks of late shipment of goods at OAO Magnitogorsk Iron and Steel Works, JSC.

  2. SEMI PARAMETRIC ESTIMATION OF RISK-RETURN RELATIONSHIPS

    OpenAIRE

    Juan Carlos Escanciano; Juan Carlos Pardo-Fernández; Ingrid Van Keilegom

    2013-01-01

    This article proposes semi-parametric least squares estimation of parametric risk-return relationships, i.e. parametric restrictions between the conditional mean and the conditional variance of excess returns given a set of unobservable parametric factors. A distinctive feature of our estimator is that it does not require a parametric model for the conditional mean and variance. We establish consistency and asymptotic normality of the estimates. The theory is non-standard due to the presence ...

  3. Model-based Small Area Estimates of Cancer Risk Factors and Screening Behaviors - Small Area Estimates

    Science.gov (United States)

    These model-based estimates use two surveys, the Behavioral Risk Factor Surveillance System (BRFSS) and the National Health Interview Survey (NHIS). The two surveys are combined using novel statistical methodology.

  4. A weighted genetic risk score using all known susceptibility variants to estimate rheumatoid arthritis risk

    Science.gov (United States)

    Yarwood, Annie; Han, Buhm; Raychaudhuri, Soumya; Bowes, John; Lunt, Mark; Pappas, Dimitrios A; Kremer, Joel; Greenberg, Jeffrey D; Plenge, Robert; Worthington, Jane; Barton, Anne; Eyre, Steve

    2015-01-01

    Background: There is currently great interest in the incorporation of genetic susceptibility loci into screening models to identify individuals at high risk of disease. Here, we present the first risk prediction model including all 46 known genetic loci associated with rheumatoid arthritis (RA). Methods: A weighted genetic risk score (wGRS) was created using 45 RA non-human leucocyte antigen (HLA) susceptibility loci, imputed amino acids at HLA-DRB1 (positions 11, 71 and 74), HLA-DPB1 (position 9), HLA-B (position 9) and gender. The wGRS was tested in 11 366 RA cases and 15 489 healthy controls. The risk of developing RA was estimated using logistic regression by dividing the wGRS into quintiles. The ability of the wGRS to discriminate between cases and controls was assessed by receiver operating characteristic analysis and discrimination improvement tests. Results: Individuals in the highest risk group showed significantly increased odds of developing anti-cyclic citrullinated peptide-positive RA compared to the lowest risk group (OR 27.13, 95% CI 23.70 to 31.05). The wGRS was validated in an independent cohort that showed similar results (area under the curve 0.78, OR 18.00, 95% CI 13.67 to 23.71). Comparison of the full wGRS with a wGRS in which HLA amino acids were replaced by an HLA tag single-nucleotide polymorphism showed a significant loss of sensitivity and specificity. Conclusions: Our study suggests that in RA, even when using all known genetic susceptibility variants, prediction performance remains modest; while this is insufficiently accurate for general population screening, it may prove of more use in targeted studies. Our study has also highlighted the importance of including HLA variation in risk prediction models. PMID:24092415
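    A minimal sketch of the scoring step: a wGRS formed as the weighted sum of risk-allele counts (weights being per-locus log odds ratios), evaluated with a rank-based AUC. The effect sizes, genotypes and case labels below are simulated placeholders; the paper's score additionally includes imputed HLA amino acids and gender.

```python
import numpy as np

def weighted_grs(genotypes, log_odds_ratios):
    """Weighted genetic risk score: risk-allele counts (0/1/2) times per-locus
    log odds ratios, summed over loci."""
    return np.asarray(genotypes) @ np.asarray(log_odds_ratios)

def auc(scores, labels):
    """Rank-based AUC: probability a random case outscores a random control
    (ties are not averaged, which is adequate for near-continuous scores)."""
    scores, labels = np.asarray(scores, float), np.asarray(labels, bool)
    order = scores.argsort()
    ranks = np.empty_like(order, dtype=float)
    ranks[order] = np.arange(1, len(scores) + 1)
    n_pos, n_neg = labels.sum(), (~labels).sum()
    return (ranks[labels].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# toy data: 1000 subjects, 45 loci with small hypothetical effects
rng = np.random.default_rng(1)
log_or = rng.normal(0.1, 0.05, 45)
geno = rng.binomial(2, 0.3, size=(1000, 45))
risk = weighted_grs(geno, log_or)
prob = 1 / (1 + np.exp(-(risk - risk.mean())))   # hypothetical case model
cases = rng.random(1000) < prob
print(round(auc(risk, cases), 3))
```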

  5. Validating diagnoses from hospital discharge registers change risk estimates for acute coronary syndrome

    DEFF Research Database (Denmark)

    Joensen, Albert Marni; Schmidt, Erik Berg; Dethlefsen, Claus

    2007-01-01

    of acute coronary syndrome (ACS) diagnoses identified in a hospital discharge register changed the relative risk estimates of well-established risk factors for ACS. Methods: All first-time ACS diagnoses (n=1138) in the Danish National Patient Registry were identified among male participants in the Danish ... cohort study "Diet, Cancer and Health" (n=26 946). Medical records were retrieved and reviewed using current European Society of Cardiology criteria for ACS. The ACS diagnosis was confirmed in a total of 781 participants. Results: The relative risk estimates of ACS for a range of well ...

  6. Multivariate Risk-Return Decision Making Within Dynamic Estimation

    Directory of Open Access Journals (Sweden)

    Josip Arnerić

    2008-10-01

    Risk management in this paper is focused on multivariate risk-return decision making under time-varying estimation. Empirical research in risk management has shown that the static "mean-variance" methodology of portfolio optimization is very restrictive, with unrealistic assumptions. The objective of this paper is the estimation of time-varying portfolio stock weights subject to constraints on a risk measure. Hence, dynamic estimation of the risk measure is used for risk control. Through risk control, the manager frees supplementary capital for new investments. A univariate modeling approach is not appropriate, even when portfolio returns are treated as a single variable. Portfolio weights are time-varying, and therefore it is necessary to re-estimate the whole model over time. Under the assumption of a bivariate Student's t-distribution in multivariate GARCH(p,q) models, it becomes possible to forecast time-varying portfolio risk much more precisely. The complete analysis procedure is illustrated on data from the Zagreb Stock Exchange, using daily observations of Pliva and Podravka stocks.

  7. Adult Cigarette Smoking in the United States: Current Estimates

    Science.gov (United States)

    [CDC web page: current cigarette smoking among adults in the United States, with 2015 national estimates reported by gender; current smokers are adults who reported smoking every day or some days.]

  8. Estimating cancer risks to adults undergoing body CT examinations.

    Science.gov (United States)

    Huda, Walter; He, Wenjun

    2012-06-01

    The purpose of the study is to estimate cancer risks from the amount of radiation used to perform body computed tomography (CT) examinations. The ImPACT CT Patient Dosimetry Calculator was used to compute values of organ doses for adult body CT examinations. The radiation used to perform each examination was quantified by the dose-length product (DLP). Patient organ doses were converted into corresponding age- and sex-dependent cancer risks using data from BEIR VII. Results are presented for cancer risks per unit DLP and unit effective dose for 11 sensitive organs, as well as estimates of the contribution from 'other organs'. For patients who differ from a standard-sized adult, correction factors based on the patient weight and antero-posterior dimension are provided to adjust organ doses and the corresponding risks. At constant incident radiation intensity, for CT examinations that include the chest, risks for females are markedly higher than those for males, whereas for examinations that include the pelvis, risks in males were slightly higher than those in females. In abdominal CT scans, risks for male and female patients are very similar. For abdominal CT scans, increasing the patient age from 20 to 80 resulted in a reduction in patient risks of nearly a factor of 5. The average cancer risk for chest/abdomen/pelvis CT examinations was ∼26% higher than the cancer risk attributed to the 'sensitive organs' alone. Doses and radiation risks in 80 kg adults were ∼10% lower than those in 70 kg patients. Cancer risks in body CT can be estimated from the examination DLP by accounting for sex, age, as well as patient physical characteristics.
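    A back-of-the-envelope version of the DLP-based estimate described above is sketched below; the conversion coefficient, risk coefficient and size correction are illustrative placeholders, not the values tabulated in the paper or in BEIR VII.

```python
def cancer_risk_from_dlp(dlp_mgy_cm, k_region=0.015, risk_coeff_per_msv=5e-5,
                         size_correction=1.0):
    """Rough risk estimate from a CT dose-length product.

    effective dose [mSv] ~= k_region [mSv / (mGy*cm)] * DLP, then the risk is
    scaled by an age- and sex-dependent coefficient per mSv and a patient-size
    correction. All default numbers are illustrative placeholders only.
    """
    effective_dose_msv = k_region * dlp_mgy_cm * size_correction
    return effective_dose_msv * risk_coeff_per_msv, effective_dose_msv

risk, dose = cancer_risk_from_dlp(dlp_mgy_cm=400)
print(f"~{dose:.1f} mSv, lifetime risk on the order of {risk:.1e}")
```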

  9. Wind Plant Preconstruction Energy Estimates. Current Practice and Opportunities

    Energy Technology Data Exchange (ETDEWEB)

    Clifton, Andrew [National Renewable Energy Lab. (NREL), Golden, CO (United States); Smith, Aaron [National Renewable Energy Lab. (NREL), Golden, CO (United States); Fields, Michael [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-04-19

    Understanding the amount of energy that will be harvested by a wind power plant each year and the variability of that energy is essential to assessing and potentially improving the financial viability of that power plant. The preconstruction energy estimate process predicts the amount of energy--with uncertainty estimates--that a wind power plant will deliver to the point of revenue. This report describes the preconstruction energy estimate process from a technical perspective and seeks to provide insight into the financial implications associated with each step.

  10. Audit Risk Assessment in the Light of Current European Regulations

    OpenAIRE

    Ciprian-Costel Munteanu

    2015-01-01

    Recent European reforms on audit regulations have been motivated by efforts to increase audit quality, functioning and performance. We believe the adoption of Directive 2014/56 and Regulation 537/2014 strengthened the role of independent audit and risk committees, which will positively contribute towards audit quality. This paper aims to critically assess the status quo of audit risk assessment in current European standards and regulations, by conducting a theoretical analysis of ...

  11. Model Averaging Software for Dichotomous Dose Response Risk Estimation

    Directory of Open Access Journals (Sweden)

    Matthew W. Wheeler

    2008-02-01

    Model averaging has been shown to be a useful method for incorporating model uncertainty in quantitative risk estimation. In certain circumstances this technique is computationally complex, requiring sophisticated software to carry out the computation. We introduce software that implements model averaging for risk assessment based upon dichotomous dose-response data. This software, which we call Model Averaging for Dichotomous Response Benchmark Dose (MADr-BMD), fits the quantal response models that are also used in the US Environmental Protection Agency benchmark dose software suite, and generates a model-averaged dose-response model from which benchmark dose and benchmark dose lower bound estimates are computed. The software fulfills a need for risk assessors, allowing them to go beyond a single model in risk assessments based on quantal data by focusing on a set of models that describes the experimental data.
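    The sketch below illustrates the model-averaging idea on toy quantal data: fit two quantal models by maximum likelihood, weight them by Akaike weights, and solve for the benchmark dose at a 10% extra risk from the averaged curve. It uses only two models and omits the benchmark dose lower bound; it is not the MADr-BMD implementation.

```python
import numpy as np
from scipy.optimize import minimize, brentq
from scipy.stats import norm

# toy quantal data: dose, number tested, number responding (illustrative only)
dose = np.array([0.0, 10.0, 25.0, 50.0, 100.0])
n = np.array([50, 50, 50, 50, 50])
y = np.array([1, 4, 9, 20, 38])

def p_logistic(d, a, b):
    return 1.0 / (1.0 + np.exp(-(a + b * d)))

def p_probit(d, a, b):
    return norm.cdf(a + b * d)

def fit(model):
    """Binomial maximum likelihood fit; returns (parameters, AIC with k = 2)."""
    def nll(theta):
        p = np.clip(model(dose, *theta), 1e-9, 1 - 1e-9)
        return -np.sum(y * np.log(p) + (n - y) * np.log(1 - p))
    res = minimize(nll, x0=np.array([-2.0, 0.05]), method="Nelder-Mead")
    return res.x, 2 * 2 + 2 * res.fun

models = {"logistic": p_logistic, "probit": p_probit}
fits = {name: fit(m) for name, m in models.items()}
aic = np.array([fits[name][1] for name in models])
w = np.exp(-0.5 * (aic - aic.min()))
w /= w.sum()                                   # Akaike weights

def averaged_extra_risk(d):
    er = 0.0
    for weight, (name, m) in zip(w, models.items()):
        theta = fits[name][0]
        p0, pd = m(0.0, *theta), m(d, *theta)
        er += weight * (pd - p0) / (1.0 - p0)
    return er

# model-averaged BMD at a 10% benchmark response (no BMDL / bootstrap here)
bmd = brentq(lambda d: averaged_extra_risk(d) - 0.10, 1e-6, dose.max())
print(dict(zip(models, w.round(3))), round(bmd, 2))
```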

  12. Estimating urban flood risk - uncertainty in design criteria

    Science.gov (United States)

    Newby, M.; Franks, S. W.; White, C. J.

    2015-06-01

    The design of urban stormwater infrastructure is generally performed assuming that climate is static. In engineering practice, stormwater infrastructure is designed using a peak flow method, such as the Rational Method outlined in the Australian Rainfall and Runoff (AR&R) guidelines, together with estimates of design rainfall intensities. Changes to Australian rainfall intensity design criteria have been made through updated releases of the AR&R77, AR&R87 and the recent 2013 AR&R Intensity Frequency Distributions (IFDs). The primary focus of this study is to compare the three IFD sets at 51 locations Australia-wide. Since the release of the AR&R77 IFDs, the length and number of rainfall data records have increased and techniques for data analysis have changed. Updated terminology coinciding with the 2013 IFD release has also resulted in a practical change to the design rainfall. For example, infrastructure designed for a 1:5 year ARI corresponds to an 18.13% AEP; however, for practical purposes, hydraulic guidelines have been updated to use the more intuitive 20% AEP. The evaluation of design rainfall variation across Australia indicates that the changes are dependent upon location, recurrence interval and rainfall duration. The changes to design rainfall IFDs are due to the application of differing data analysis techniques, the length and number of data sets, and the change in terminology from ARI to AEP. Such changes mean that existing infrastructure has been designed to a range of different design criteria, indicating the likely inadequacy of earlier developments relative to current estimates of flood risk. In many cases, the under-design of infrastructure is greater than the expected impact of increased rainfall intensity under climate change scenarios.
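    The 18.13% figure quoted above follows from the standard conversion between an average recurrence interval and an annual exceedance probability, assuming Poisson event arrivals; a minimal sketch:

```python
import math

def ari_to_aep(ari_years):
    """Convert an average recurrence interval (ARI, years) to an annual
    exceedance probability (AEP), assuming Poisson-distributed event arrivals."""
    return 1.0 - math.exp(-1.0 / ari_years)

print(f"{ari_to_aep(5):.2%}")    # ~18.13%, the figure quoted for a 1:5 year ARI
print(f"{ari_to_aep(100):.2%}")  # ~1.00% for a 1:100 year ARI
```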

  13. Risk Estimate of Diseases in Scale-Free Networks

    Institute of Scientific and Technical Information of China (English)

    ZHANG Li-Jie; XU Xin-Jian

    2008-01-01

    We investigate the effect of risk estimation on the spread of diseases using the standard susceptible-infected-susceptible (SIS) model. The perception of the risk of being infected is modelled as cutting off links among individuals, either healthy or infected. We study this simple dynamics on scale-free networks by analytical methods and computer simulations. We obtain the self-consistency form for the infection prevalence in steady states. For a given transmission rate, there exists a linear relationship between the reciprocal of the density of infected nodes and the estimate parameter. We confirm all the results by sufficient numerical simulations.
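    A heterogeneous mean-field sketch of the steady-state self-consistency calculation is given below; the risk-perception mechanism is crudely represented by removing a fixed fraction of every node's links, which is an assumption made here for illustration and not the exact cutting rule analysed in the paper.

```python
import numpy as np

def sis_prevalence(lam, gamma=2.5, k_min=3, k_max=1000, cut_fraction=0.0, tol=1e-10):
    """Heterogeneous mean-field SIS prevalence on a scale-free network
    P(k) ~ k^(-gamma), found by fixed-point iteration of the standard
    self-consistency equation."""
    k = np.arange(k_min, k_max + 1, dtype=float)
    pk = k ** (-gamma)
    pk /= pk.sum()
    k_eff = (1.0 - cut_fraction) * k            # links kept after risk-driven cutting
    mean_k = np.sum(k_eff * pk)

    theta = 0.5                                  # prob. a link points to an infected node
    for _ in range(10_000):
        rho_k = lam * k_eff * theta / (1.0 + lam * k_eff * theta)
        new_theta = np.sum(k_eff * pk * rho_k) / mean_k
        if abs(new_theta - theta) < tol:
            break
        theta = new_theta
    return float(np.sum(pk * rho_k))             # overall infected density

for cut in (0.0, 0.3, 0.6):
    print(cut, round(sis_prevalence(lam=0.2, cut_fraction=cut), 4))
```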

  14. Harmonizing and comparing single-type natural hazard risk estimations

    Directory of Open Access Journals (Sweden)

    Kevin Fleming

    2016-05-01

    Single-type hazard and risk assessment is the usual framework followed by disaster risk reduction (DRR) practitioners. There is therefore a need to present and compare the results arising from different hazard and risk types. Here we describe a simple method for combining risk curves arising from different hazard types in order to gain a first impression of the total risk. We show how the resulting total (and individual) risk estimates can be examined and compared using so-called risk matrices, a format preferred by some DRR practitioners. We apply this approach to Cologne, Germany, which is subject to floods, windstorms and earthquakes. We then use a new series of risk calculations that consider epistemic uncertainty. The Mann-Whitney test is applied to determine if the losses arising from pairs of hazards are comparable for a given return period. This benefits decision makers as it allows a ranking of hazards with respect to expected damage. Such a comparison would assist planners in the allocation of resources towards the most efficient mitigation actions. However, the results are dependent upon the distribution of estimates (i.e., level of uncertainty), which is in turn a function of our state of knowledge.
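    A minimal sketch of the comparison step: synthetic loss ensembles for three hazards at one return period, a naive combination by summing medians (ignoring correlation between hazards), and a Mann-Whitney U test of whether two hazards' losses are comparable. All numbers are placeholders, not the Cologne results.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(7)

# epistemic ensembles of simulated losses (arbitrary units) at one fixed return
# period, one array per hazard; purely synthetic stand-ins for the real results
losses = {
    "flood":      rng.lognormal(mean=4.0, sigma=0.4, size=200),
    "windstorm":  rng.lognormal(mean=4.1, sigma=0.5, size=200),
    "earthquake": rng.lognormal(mean=3.2, sigma=0.9, size=200),
}

# a naive "total risk" first impression: sum the median losses per hazard
total_median = sum(np.median(v) for v in losses.values())
print("combined median loss:", round(total_median, 1))

# Mann-Whitney U test: are two hazards' losses comparable at this return period?
for a, b in [("flood", "windstorm"), ("flood", "earthquake")]:
    stat, p = mannwhitneyu(losses[a], losses[b], alternative="two-sided")
    print(f"{a} vs {b}: U={stat:.0f}, p={p:.3g}")
```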

  15. Effect size estimates: current use, calculations, and interpretation.

    Science.gov (United States)

    Fritz, Catherine O; Morris, Peter E; Richler, Jennifer J

    2012-02-01

    The Publication Manual of the American Psychological Association (American Psychological Association, 2001, American Psychological Association, 2010) calls for the reporting of effect sizes and their confidence intervals. Estimates of effect size are useful for determining the practical or theoretical importance of an effect, the relative contributions of factors, and the power of an analysis. We surveyed articles published in 2009 and 2010 in the Journal of Experimental Psychology: General, noting the statistical analyses reported and the associated reporting of effect size estimates. Effect sizes were reported for fewer than half of the analyses; no article reported a confidence interval for an effect size. The most often reported analysis was analysis of variance, and almost half of these reports were not accompanied by effect sizes. Partial η2 was the most commonly reported effect size estimate for analysis of variance. For t tests, 2/3 of the articles did not report an associated effect size estimate; Cohen's d was the most often reported. We provide a straightforward guide to understanding, selecting, calculating, and interpreting effect sizes for many types of data and to methods for calculating effect size confidence intervals and power analysis.
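    For reference, the two effect size estimates most often reported in the surveyed articles can be computed as in the sketch below (the data and the reported F statistic are made up for illustration).

```python
import numpy as np

def cohens_d(x, y):
    """Cohen's d for two independent groups, using the pooled standard deviation."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2)
    return (x.mean() - y.mean()) / np.sqrt(pooled_var)

def partial_eta_squared(f_value, df_effect, df_error):
    """Partial eta squared recovered from a reported F statistic and its dfs."""
    return (f_value * df_effect) / (f_value * df_effect + df_error)

rng = np.random.default_rng(3)
treatment = rng.normal(0.5, 1.0, 40)
control = rng.normal(0.0, 1.0, 40)
print(round(cohens_d(treatment, control), 2))
print(round(partial_eta_squared(f_value=5.6, df_effect=1, df_error=78), 3))
```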

  16. Estimation of Cardiovascular Risk in Patients with Type 2 Diabetes

    Directory of Open Access Journals (Sweden)

    Belkis Vicente Sánchez

    2015-09-01

    Background: Diabetes mellitus accelerates atherosclerotic changes throughout the vascular tree and consequently increases the risk of developing fatal acute events. Objective: To estimate the global cardiovascular risk in patients with type 2 diabetes mellitus. Method: A cross-sectional study of a series of type 2 diabetic patients from the People's Council of Constancia, Abreus municipality, Cienfuegos province was conducted from July to December 2012. The universe comprised the 180 people with diabetes in the area. Variables studied were: age, sex, body mass index, nutritional assessment, blood pressure, toxic habits, associated chronic diseases, blood levels of glucose, lipids (total cholesterol and triglycerides) and microalbuminuria. World Health Organization/International Society of Hypertension prediction charts specific to the region of the Americas, in which Cuba is included, were used to estimate the cardiovascular risk. Results: Mean age was 61.63 years and females predominated. Relevant risk factors were hypertension followed by obesity, smoking and dyslipidemia. Mean body mass index was 27.66 kg/m2; waist circumference was 94.45 cm in women and 96.86 cm in men. Thirty point six percent had more than two uncontrolled risk factors and 28.3% of the total presented a high to very high cardiovascular risk. Conclusions: Cardiovascular risk prediction charts are helpful tools for making clinical decisions, but their interpretation must be flexible and allow the intervention of clinical reasoning.

  17. Implementation of probabilistic risk estimation for VRU safety

    NARCIS (Netherlands)

    Nunen, E. van; Broek, T.H.A. van den; Kwakkernaat, M.R.J.A.E.; Kotiadis, D.

    2011-01-01

    This paper describes the design, implementation and results of a novel probabilistic collision warning system. To obtain reliable results for risk estimation, preprocessing sensor data is essential. The work described herein presents all the necessary preprocessing steps such as filtering, sensor fu

  18. Risk estimation in non-line-of-sight scenarios

    NARCIS (Netherlands)

    Nunen, E. van; Broek, T.H.A. van den; Wolferen, J. van

    2011-01-01

    In this paper, a collision warning application is described. The collision warning application is in particular useful when the driver’s view is blocked due to a large vehicle. The risk estimation algorithm, which is originally designed and validated in scenarios with vulnerable road users, is here

  20. Estimation and reduction of harmonic currents from power converters

    DEFF Research Database (Denmark)

    Asiminoaei, Lucian

    Power electronics is entering more and more products, which inevitably increases the number of non-linear loads installed on the power system. The major concern with non-linear loads is the emission of non-sinusoidal currents into the supply. Circulation of harmonic currents in power systems ... creates losses, thus necessitating overrating of the power system. Furthermore, the harmonic currents cause harmonic voltage distortion, which is detrimental to all equipment connected to the power system, such as capacitors, AC machines, control and protection equipment, measuring devices and electronic ... power supplies. Although their design takes into account a certain level of harmonic voltage distortion, there are many real-life cases in which equipment experiences abnormal operation, malfunction or failure. One such case appeared at a local company in Denmark, a Heat Power Station, where due ...

  1. Quantitative Risk reduction estimation Tool For Control Systems, Suggested Approach and Research Needs

    Energy Technology Data Exchange (ETDEWEB)

    Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi

    2006-03-01

    For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand-alone control system risk reduction estimation tool to provide owners and operators of control systems with a more useable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for cost-benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models, we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques [17] [18]; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculation engines (e.g. attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a useable quantitative risk reduction estimation tool is not beyond reach.

  2. Neoplastic potential of gastric irradiation. IV. Risk estimates

    Energy Technology Data Exchange (ETDEWEB)

    Griem, M.L.; Justman, J.; Weiss, L.

    1984-12-01

    No significant tumor increase was found in the initial analysis of patients irradiated for peptic ulcer and followed through 1962. A preliminary study was undertaken 22 years later to estimate the risk of cancer due to gastric irradiation for peptic ulcer disease. A population of 2,049 irradiated patients and 763 medically managed patients has been identified. A relative risk of 3.7 was found for stomach cancer and an initial risk estimate of 5.5 × 10^-6 excess stomach cancers per person-rad was calculated. A more complete follow-up is in progress to further elucidate this observation and decrease the ascertainment bias; however, preliminary data are in agreement with the Japanese atomic bomb reports.

  3. Uncertainties in Estimates of the Risks of Late Effects from Space Radiation

    Science.gov (United States)

    Cucinotta, F. A.; Schimmerling, W.; Wilson, J. W.; Peterson, L. E.; Saganti, P.; Dicelli, J. F.

    2002-01-01

    The health risks faced by astronauts from space radiation include cancer, cataracts, hereditary effects, and non-cancer morbidity and mortality risks related to the diseases of old age. Methods used to project risks in low-Earth orbit are of questionable merit for exploration missions because of the limited radiobiology data and knowledge of galactic cosmic ray (GCR) heavy ions, which cause estimates of the risk of late effects to be highly uncertain. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Within the linear-additivity model, we use Monte-Carlo sampling from subjective uncertainty distributions in each factor to obtain a Maximum Likelihood estimate of the overall uncertainty in risk projections. The resulting methodology is applied to several human space exploration mission scenarios including ISS, lunar station, deep space outpost, and Mars missions with durations of 360, 660, and 1000 days. The major results are the quantification of the uncertainties in current risk estimates, the identification of factors that dominate risk projection uncertainties, and the development of a method to quantify candidate approaches to reduce uncertainties or mitigate risks. The large uncertainties in GCR risk projections lead to probability distributions of risk that mask any potential risk reduction using the "optimization" of shielding materials or configurations. In contrast, the design of shielding optimization approaches for solar particle events and trapped protons can be made at this time, and promising technologies can be shown to have merit using our approach. The methods used also make it possible to express risk management objectives in terms of quantitative objectives, i.e., the number of days in space without exceeding a given risk level within well-defined confidence limits.
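
    A rough sketch of the Monte-Carlo uncertainty propagation idea described above: risk is written as a product of factors, each factor is given a subjective uncertainty distribution, and repeated sampling yields a distribution (and confidence interval) for the projected risk. The factor names, distribution shapes and parameters below are illustrative assumptions, not the values used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 100_000

# Hypothetical multiplicative factors with subjective uncertainty distributions
# (illustrative shapes/parameters only; not the study's actual factors).
dose = rng.normal(loc=0.4, scale=0.05, size=n_trials)                     # Sv, physics factor
quality_factor = rng.lognormal(mean=np.log(10), sigma=0.5, size=n_trials)
risk_coeff = rng.lognormal(mean=np.log(0.05), sigma=0.4, size=n_trials)   # risk per Sv
ddref = rng.uniform(1.0, 2.5, size=n_trials)                              # dose-rate reduction factor

# Product of factors -> distribution of projected risk across trials.
projected_risk = dose * quality_factor * risk_coeff / ddref

for q in (2.5, 50.0, 97.5):
    print(f"{q:>5.1f}th percentile of projected risk: "
          f"{np.percentile(projected_risk, q):.3f}")
```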

  4. Estimation of cancer risks and benefits associated with a potential increased consumption of fruits and vegetables.

    Science.gov (United States)

    Reiss, Richard; Johnston, Jason; Tucker, Kevin; DeSesso, John M; Keen, Carl L

    2012-12-01

    The current paper provides an analysis of the potential number of cancer cases that might be prevented if half the U.S. population increased its fruit and vegetable consumption by one serving each per day. This number is contrasted with an upper-bound estimate of concomitant cancer cases that might be theoretically attributed to the intake of pesticide residues arising from the same additional fruit and vegetable consumption. The cancer prevention estimates were derived using a published meta-analysis of nutritional epidemiology studies. The cancer risks were estimated using U.S. Environmental Protection Agency (EPA) methods, cancer potency estimates from rodent bioassays, and pesticide residue sampling data from the U.S. Department of Agriculture (USDA). The resulting estimates are that approximately 20,000 cancer cases per year could be prevented by increasing fruit and vegetable consumption, while up to 10 cancer cases per year could be caused by the added pesticide consumption. These estimates have significant uncertainties (e.g., potential residual confounding in the fruit and vegetable epidemiologic studies and reliance on rodent bioassays for cancer risk). However, the overwhelming difference between benefit and risk estimates provides confidence that consumers should not be concerned about cancer risks from consuming conventionally-grown fruits and vegetables.

  5. Estimating the risk of a scuba diving fatality in Australia.

    Science.gov (United States)

    Lippmann, John; Stevenson, Christopher; McD Taylor, David; Williams, Jo

    2016-12-01

    There are few data available on which to estimate the risk of death for Australian divers. This report estimates the risk of a scuba diving fatality for Australian residents, international tourists diving in Queensland, and clients of a large Victorian dive operator. Numerators for the estimates were obtained from the Divers Alert Network Asia-Pacific dive fatality database. Denominators were derived from three sources: Participation in Exercise, Recreation and Sport Surveys, 2001-2010 (Australian resident diving activity data); Tourism Research Australia surveys of international visitors to Queensland 2006-2014 and a dive operator in Victoria 2007-2014. Annual fatality rates (AFR) and 95% confidence intervals (95% CI) were calculated using an exact binomial test. Estimated AFRs were: 0.48 (0.37-0.59) deaths per 100,000 dives, or 8.73 (6.85-10.96) deaths per 100,000 divers for Australian residents; 0.12 (0.05-0.25) deaths per 100,000 dives, or 0.46 (0.20-0.91) deaths per 100,000 divers for international visitors to Queensland; and 1.64 (0.20-5.93) deaths per 100,000 dives for the dive operator in Victoria. On a per-diver basis, Australian residents are estimated to be almost twenty times more likely to die whilst scuba diving than are international visitors to Queensland; the difference falls to less than fourfold on a per-dive basis. On a per-dive basis, divers in Victoria are fourteen times more likely to die than are Queensland international tourists. Although some of the estimates are based on potentially unreliable denominator data extrapolated from surveys, the diving fatality rates in Australia appear to vary by State, being considerably lower in Queensland than in Victoria. These estimates are similar to or lower than comparable overseas estimates, although the reliability of all such measurements varies with study size and the accuracy of the data available.
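
    The fatality rates above are reported with exact binomial confidence intervals; a minimal sketch of that calculation (a Clopper-Pearson interval for deaths per number of dives) is given below, with hypothetical counts rather than the study's data.

```python
from scipy.stats import beta

def exact_rate_ci(deaths: int, exposure: float, per: float = 1e5, conf: float = 0.95):
    """Clopper-Pearson (exact binomial) CI for a fatality rate.

    `exposure` is the estimated number of dives (or divers); the rate is
    reported per `per` units. Counts below are hypothetical, not the study's data.
    """
    a = (1 - conf) / 2
    lo = 0.0 if deaths == 0 else beta.ppf(a, deaths, exposure - deaths + 1)
    hi = beta.ppf(1 - a, deaths + 1, exposure - deaths)
    point = deaths / exposure
    return point * per, lo * per, hi * per

rate, lo, hi = exact_rate_ci(deaths=12, exposure=2_500_000)  # hypothetical counts
print(f"{rate:.2f} deaths per 100,000 dives (95% CI {lo:.2f}-{hi:.2f})")
```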

  6. Method for Estimating Low-Frequency Return Current of DC Electric Railcar

    Science.gov (United States)

    Hatsukade, Satoru

    The estimation of the harmonic current of railcars is necessary for achieving compatibility between train signaling systems and railcar equipment. However, although several theoretical analysis methods exist for estimating the harmonic current of railcars using switching functions, there is no theoretical analysis method for estimating low-frequency currents at frequencies below the power converter's carrier frequency. This paper describes a method for estimating the spectrum (frequency and amplitude) of the low-frequency return current of DC electric railcars. First, relationships between the return current and characteristics of the DC electric railcars, such as mass and acceleration, are determined. Then, the mathematical (not numerical) calculation results for the low-frequency current are obtained from the time-current curve for a DC electric railcar by using Fourier series expansions. Finally, the measurement results clearly show the effectiveness of the estimation method developed in this study.
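
    As a rough illustration of the spectrum estimation described above, the sketch below builds a hypothetical time-current curve for one drive cycle and extracts its low-frequency components with a discrete Fourier transform, standing in for the paper's closed-form Fourier series expansion. The waveform shape and all numbers are illustrative assumptions only.

```python
import numpy as np

# Hypothetical time-current curve for one repeating drive cycle of a DC railcar
# (shape and numbers are illustrative, not measured data).
T = 60.0                       # s, duration of the cycle
fs = 200.0                     # Hz, sampling rate of the synthetic curve
t = np.arange(0.0, T, 1.0 / fs)
current = np.where(t < 20.0, 200.0 * t / 20.0,        # ramp up while accelerating
                   np.where(t < 40.0, 200.0, 0.0))    # constant, then coasting

# Fourier-series style spectrum: treat the cycle as one period and take the
# DFT to obtain amplitude versus frequency (spacing 1/T, i.e. low frequencies).
spectrum = np.fft.rfft(current) / len(current)
freqs = np.fft.rfftfreq(len(current), d=1.0 / fs)
amplitudes = 2.0 * np.abs(spectrum)
amplitudes[0] /= 2.0                                  # DC component is not doubled

for f, a in zip(freqs[:6], amplitudes[:6]):
    print(f"{f:6.3f} Hz : {a:7.2f} A")
```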

  7. Audit Risk Assessment in the Light of Current European Regulations

    Directory of Open Access Journals (Sweden)

    Ciprian-Costel Munteanu

    2015-06-01

    Full Text Available Recent European reforms on audit regulations have been motivated by efforts to increase audit quality, functioning and performance. We believe the adoption of Directive 2014/56 and Regulation 537/2014 strengthened the role of independent audit and risk committees, which will positively contribute towards audit quality. This paper aims to critically assess the status quo of audit risk assessment in current European standards and regulations, by conducting a theoretical analysis of different aspects of audit risk. Our main objective is to stress the importance of detecting inherent and control risk, which lead to material misstatement at the assertion level. They need to be assessed so as to determine the nature, timing and extent of further audit procedures necessary to obtain sufficient appropriate audit evidence. These pieces of evidence enable the auditor to express an opinion on the financial statements at an acceptably low level of audit risk. Therefore, we point to the fact that researchers as well as practitioners and policymakers have to be careful when using audit tools and assessing risk levels, as their conclusions continuously shape the regulations.

  8. Global uniform risk bounds for wavelet deconvolution estimators

    CERN Document Server

    Lounici, Karim; 10.1214/10-AOS836

    2011-01-01

    We consider the statistical deconvolution problem where one observes $n$ replications from the model $Y=X+\epsilon$, where $X$ is the unobserved random signal of interest and $\epsilon$ is an independent random error with distribution $\phi$. Under weak assumptions on the decay of the Fourier transform of $\phi$, we derive upper bounds for the finite-sample sup-norm risk of wavelet deconvolution density estimators $f_n$ for the density $f$ of $X$, where $f:\mathbb{R}\to\mathbb{R}$ is assumed to be bounded. We then derive lower bounds for the minimax sup-norm risk over Besov balls in this estimation problem and show that wavelet deconvolution density estimators attain these bounds. We further show that linear estimators adapt to the unknown smoothness of $f$ if the Fourier transform of $\phi$ decays exponentially and that a corresponding result holds true for the hard thresholding wavelet estimator if $\phi$ decays polynomially. We also analyze the case where $f$ is a "supersmooth"/analytic density. We finall...

  9. Associations between depression risk, bullying and current smoking among Chinese adolescents: Modulated by gender.

    Science.gov (United States)

    Guo, Lan; Hong, Lingyao; Gao, Xue; Zhou, Jinhua; Lu, Ciyong; Zhang, Wei-Hong

    2016-03-30

    This school-based study aimed to investigate the prevalence of being at risk for depression, bullying behavior, and current smoking among Chinese adolescents in order to explore gender differences in the vulnerability of adolescents with these behaviors to develop a smoking habit. A total of 35,893 high school students sampled from high schools in eighteen cities in China participated in the study from 2011 to 2012. Overall, the prevalence of current smoking was estimated at 6.4%. In total, 1.7% (618) of the participants admitted to bullying others, 5.8% (2071) reported being bullied, 3.5% (1269) were involved in both bullying others and being bullied, and 5.6% (2017) were at high risk for depression. Logistic regression analysis indicated that among girls, high depression risk, bullying others, being bullied, and both bullying others and being bullied were independently and positively associated with current smoking, while the final results among boys showed that bullying others and both bullying others and being bullied were independently associated with an increased risk of current smoking. School-based prevention programs are highly recommended, and we should focus on high-risk students, particularly girls at high risk of depression or involved in school bullying, and boys involved in school bullying.

  10. Estimating Skin Cancer Risk: Evaluating Mobile Computer-Adaptive Testing.

    Science.gov (United States)

    Djaja, Ngadiman; Janda, Monika; Olsen, Catherine M; Whiteman, David C; Chien, Tsair-Wei

    2016-01-22

    Response burden is a major detriment to questionnaire completion rates. Computer adaptive testing may offer advantages over non-adaptive testing, including reduction of numbers of items required for precise measurement. Our aim was to compare the efficiency of non-adaptive (NAT) and computer adaptive testing (CAT) facilitated by Partial Credit Model (PCM)-derived calibration to estimate skin cancer risk. We used a random sample from a population-based Australian cohort study of skin cancer risk (N=43,794). All 30 items of the skin cancer risk scale were calibrated with the Rasch PCM. A total of 1000 cases generated following a normal distribution (mean [SD] 0 [1]) were simulated using three Rasch models with three fixed-item (dichotomous, rating scale, and partial credit) scenarios, respectively. We calculated the comparative efficiency and precision of CAT and NAT (shortening of questionnaire length and the count difference number ratio less than 5% using independent t tests). We found that use of CAT led to smaller person standard error of the estimated measure than NAT, with substantially higher efficiency but no loss of precision, reducing response burden by 48%, 66%, and 66% for dichotomous, Rating Scale Model, and PCM models, respectively. CAT-based administrations of the skin cancer risk scale could substantially reduce participant burden without compromising measurement precision. A mobile computer adaptive test was developed to help people efficiently assess their skin cancer risk.

  11. Estimating radiation risk induced by CT screening for Korean population

    Science.gov (United States)

    Yang, Won Seok; Yang, Hye Jeong; Min, Byung In

    2017-02-01

    The purposes of this study are to estimate the radiation risks induced by chest/abdomen computed tomography (CT) screening for healthcare and to determine the cancer risk level of the Korean population compared to other populations. We used the ImPACT CT Patient Dosimetry Calculator to compute the organ effective dose induced by CT screening (chest, low-dose chest, abdomen/pelvis, and chest/abdomen/pelvis CT). A risk model based on the principles of the BEIR VII Report was applied in order to estimate the lifetime attributable risk (LAR) using the Korean Life Table 2010. In addition, several countries, including Hong Kong, the United States (U.S.) and the United Kingdom (U.K.), were selected for comparison. Herein, each population exposed to a radiation dose of 100 mSv was classified according to country, gender and age. For each CT screening, the total organ effective dose calculated by ImPACT was 6.2, 1.5, 5.2 and 11.4 mSv, respectively. The LAR of Korean females was similar to that of Hong Kong females but lower than those of U.S. and U.K. females, except for those in their twenties. The LAR of Korean males was the highest for all types of CT screening. However, the difference in risk level was negligible because of the quite low values.

  12. Radiation-Induced Second Cancer Risk Estimates From Radionuclide Therapy

    Science.gov (United States)

    Bednarz, Bryan; Besemer, Abigail

    2017-09-01

    The use of radionuclide therapy in the clinical setting is expected to increase significantly over the next decade. There is an important need to understand the radiation-induced second cancer risk associated with these procedures. In this study the radiation-induced cancer risk in five radionuclide therapy patients was investigated. These patients underwent serial SPECT imaging scans following injection as part of a clinical trial testing the efficacy of a 131Iodine-labeled radiopharmaceutical. Using these datasets the committed absorbed doses to multiple sensitive structures were calculated using RAPID, which is a novel Monte Carlo-based 3D dosimetry platform developed for personalized dosimetry. The excess relative risk (ERR) for radiation-induced cancer in these structures was then derived from these dose estimates following the recommendations set forth in the BEIR VII report. The radiation-induced leukemia ERR was highest among all sites considered reaching a maximum value of approximately 4.5. The radiation-induced cancer risk in the kidneys, liver and spleen ranged between 0.3 and 1.3. The lifetime attributable risks (LARs) were also calculated, which ranged from 30 to 1700 cancers per 100,000 persons and were highest for leukemia and the liver for both males and females followed by radiation-induced spleen and kidney cancer. The risks associated with radionuclide therapy are similar to the risk associated with external beam radiation therapy.

  13. A Review of Expertise and Judgment Processes for Risk Estimation

    Energy Technology Data Exchange (ETDEWEB)

    R. L. Boring

    2007-06-01

    A major challenge of risk and reliability analysis for human errors or hardware failures is the need to enlist expert opinion in areas for which adequate operational data are not available. Experts enlisted in this capacity provide probabilistic estimates of reliability, typically comprised of a measure of central tendency and uncertainty bounds. While formal guidelines for expert elicitation are readily available, they largely fail to provide a theoretical basis for expertise and judgment. This paper reviews expertise and judgment in the context of risk analysis; overviews judgment biases, the role of training, and multivariate judgments; and provides guidance on the appropriate use of atomistic and holistic judgment processes.

  14. Measurement of total risk of spontaneous abortion: the virtue of conditional risk estimation

    DEFF Research Database (Denmark)

    Modvig, J; Schmidt, L; Damsgaard, M T

    1990-01-01

    abortion risk include biochemical assays as well as life table technique, although the latter appears in two different forms. The consequences of using either of these are discussed. It is concluded that no study design so far is appropriate for measuring the total risk of spontaneous abortion from early...... conception to the end of the 27th week. It is proposed that pregnancy may be considered to consist of two or three specific periods and that different study designs should concentrate on measuring the conditional risk within each period. A careful estimate using this principle leads to an estimate of total...

  15. The Behavioral Risk Factor Survey and the Stanford Five-City Project Survey: a comparison of cardiovascular risk behavior estimates.

    Science.gov (United States)

    Jackson, C; Jatulis, D E; Fortmann, S P

    1992-01-01

    BACKGROUND. Nearly all state health departments collect Behavioral Risk Factor Survey (BRFS) data, and many report using these data in public health planning. Although the BRFS is widely used, little is known about its measurement properties. This study compares the cardiovascular risk behavior estimates of the BRFS with estimates derived from the physiological and interview data of the Stanford Five-City Project Survey (FCPS). METHOD. The BRFS is a random telephone sample of 1588 adults aged 25 to 64; the FCPS is a random household sample of 1512 adults aged 25 to 64. Both samples were drawn from the same four California communities. RESULTS. The surveys produced comparable estimates for measures of current smoking, number of cigarettes smoked per day, rate of ever being told one has high blood pressure, rate of prescription of blood pressure medications, compliance in taking medications, and mean total cholesterol. Significant differences were found for mean body mass index, rates of obesity, and, in particular, rate of controlled hypertension. CONCLUSIONS. These differences indicate that, for some risk variables, the BRFS has limited utility in assessing public health needs and setting public health objectives. A formal validation study is needed to test all the risk behavior estimates measured by this widely used instrument. PMID:1536358

  16. Estimating relative risks for common outcome using PROC NLP.

    Science.gov (United States)

    Yu, Binbing; Wang, Zhuoqiao

    2008-05-01

    In cross-sectional or cohort studies with binary outcomes, it is biologically interpretable and of interest to estimate the relative risk or prevalence ratio, especially when the response rates are not rare. Several methods have been used to estimate the relative risk, among which the log-binomial models yield the maximum likelihood estimate (MLE) of the parameters. Because of restrictions on the parameter space, the log-binomial models often run into convergence problems. Some remedies, e.g., the Poisson and Cox regressions, have been proposed. However, these methods may give out-of-bound predicted response probabilities. In this paper, a new computation method using the SAS Nonlinear Programming (NLP) procedure is proposed to find the MLEs. The proposed NLP method was compared to the COPY method, a modified method to fit the log-binomial model. Issues in the implementation are discussed. For illustration, both methods were applied to data on the prevalence of microalbuminuria (micro-protein leakage into urine) for kidney disease patients from the Diabetes Control and Complications Trial. The sample SAS macro for calculating relative risk is provided in the appendix.
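
    The record above notes that Poisson regression is one remedy when the log-binomial model fails to converge; the paper itself implements the MLE in SAS PROC NLP. As a simpler, hedged illustration of the Poisson remedy, the sketch below fits a modified Poisson model (log link with robust standard errors) on synthetic data to obtain a relative-risk estimate; all variable names and numbers are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic cohort with a common binary outcome (illustrative data only).
rng = np.random.default_rng(1)
n = 2000
exposed = rng.integers(0, 2, size=n)
age = rng.normal(50, 10, size=n)
# True relative risk for exposure of about 1.5 on a common outcome.
p = np.clip(0.25 * 1.5**exposed * np.exp(0.01 * (age - 50)), 0, 1)
outcome = rng.binomial(1, p)
df = pd.DataFrame({"outcome": outcome, "exposed": exposed, "age": age})

# Modified Poisson regression: log link with robust (sandwich) standard errors
# yields relative-risk estimates even when the log-binomial model fails.
model = smf.glm("outcome ~ exposed + age", data=df,
                family=sm.families.Poisson()).fit(cov_type="HC0")
rr = np.exp(model.params["exposed"])
lo, hi = np.exp(model.conf_int().loc["exposed"])
print(f"Estimated relative risk: {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```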

  17. Estimating twin concordance for bivariate competing risks twin data

    DEFF Research Database (Denmark)

    Scheike, Thomas Harder; Holst, Klaus Kähler; von Bornemann Hjelmborg, Jacob

    2014-01-01

    For twin time-to-event data, we consider different concordance probabilities, such as the casewise concordance that are routinely computed as a measure of the lifetime dependence/correlation for specific diseases. The concordance probability here is the probability that both twins have experienced...... over time, and covariates may be further influential on the marginal risk and dependence structure. We establish the estimators large sample properties and suggest various tests, for example, for inferring familial influence. The method is demonstrated and motivated by specific twin data on cancer...... the event of interest. Under the assumption that both twins are censored at the same time, we show how to estimate this probability in the presence of right censoring, and as a consequence, we can then estimate the casewise twin concordance. In addition, we can model the magnitude of within pair dependence...

  18. Estimating Risk and Return Combinations for New Derivatives Funds

    Directory of Open Access Journals (Sweden)

    Alexandre Bona

    2004-12-01

    Full Text Available Active funds are typically managed by placing bets against a well-defined passive benchmark. In this context, when examining the launch of a new actively managed fund with a target expected excess rate of return relative to the benchmark equal to µ, asset managers face the problem of estimating the risk σ of excess rates of return. This estimate is critical to examine whether the product is commercially feasible and to define risk limits for the manager, if the product is launched. This paper proceeds to examine the solution to this problem assuming a special form of the binomial model, in the context of the market timing structure advanced by Merton (1981). The paper shows that two variables are relevant for the solution of the proposed problem. The first, and the most relevant, is the skill level of the manager. A more skilled manager is able to operate a less risky product with the same target excess rate of return µ. The second relevant variable is the trade-off between risk and return determined by existing investment opportunities in the market. The smaller the increase in risk exposure required to obtain an increase in excess returns, the less risky the product will be. After solving the problem under specific assumptions, the paper proceeds to empirically test their validity using a representative sample of hedge funds in the Brazilian market. The empirical results strongly support the validity of the required assumptions.

  19. Current Research Status of KHNP for Site Risk

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Kyemin; Jeon, Ho-Jun; Bahng, Ki-In; Na, Jang-Hwan [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    In Korea, owing to its geographical characteristics, many Nuclear Power Plants (NPPs) have been constructed and operated at a single site; the number of plants per site is above the world average. For this reason, public concern for the safety of nuclear facilities increased after the Fukushima Daiichi accident. As a result, comprehensive risk assessment and management for sites with multi-unit NPPs have been strongly demanded. Currently, to address this, many research projects have been carried out by various Korean companies, research centers, and regulatory authorities. In this paper, the R&D plans of KHNP for multi-unit risk are summarized. First, the needs of multi-unit PSA are reviewed; the R&D activities and plans of KHNP are summarized in the last part. Multi-unit risk and multi-unit PSA are currently important and practical issues in both the nuclear industry and national energy policy. After the Fukushima accident, several countries stopped the construction and operation of NPPs, while countries that are maintaining NPPs are being strongly asked to assess the risk of multi-unit NPPs at the same site. Because of Korea's geographical characteristics, an above-average number of NPPs is constructed and operated at a single site, and the population density near each site is considered to be higher than that of other countries.

  20. Impact of UKPDS risk estimation added to a first subjective risk estimation on management of coronary disease risk in type 2 diabetes - An observational study

    NARCIS (Netherlands)

    Wind, Anne E.; Gorter, Kees J.; Van Den Donk, Maureen; Rutten, Guy E H M

    2016-01-01

    Aims To investigate the impact of the UKPDS risk engine on the management of CHD risk in T2DM patients. Methods Observational study among 139 GPs. Data from 933 consecutive patients treated with a maximum of two oral glucose-lowering drugs were collected at baseline and after twelve months. GPs estimated

  1. Seismic Risk Assessment and Loss Estimation for Tbilisi City

    Science.gov (United States)

    Tsereteli, Nino; Alania, Victor; Varazanashvili, Otar; Gugeshashvili, Tengiz; Arabidze, Vakhtang; Arevadze, Nika; Tsereteli, Emili; Gaphrindashvili, Giorgi; Gventcadze, Alexander; Goguadze, Nino; Vephkhvadze, Sophio

    2013-04-01

    The proper assessment of seismic risk is of crucial importance for the protection of society and for sustainable economic development of the city, as it is an essential part of seismic hazard reduction. Estimating seismic risk and losses is a complicated task. There is always a knowledge deficiency regarding the real seismic hazard, local site effects, the inventory of elements at risk and infrastructure vulnerability, especially in developing countries. Lately, great efforts were made in the frame of the EMME (Earthquake Model for the Middle East Region) project, whose work packages WP1, WP2, WP3 and WP4 addressed gaps related to seismic hazard assessment and vulnerability analysis. Finally, in the frame of work package WP5 "City Scenario", additional work in this direction and a detailed investigation of local site conditions and of the active fault (3D) beneath Tbilisi were carried out. For estimating economic losses, an algorithm was prepared taking into account the obtained inventory. The long-term usage of buildings is very complex; it relates to the reliability and durability of buildings. The long-term usage and durability of a building are described by the concept of depreciation. Depreciation of an entire building is calculated by summing the products of the individual construction units' depreciation rates and the corresponding value of these units within the building. This method of calculation is based on the assumption that depreciation is proportional to the building's (construction's) useful life. We used this methodology to create a matrix, which provides a way to evaluate the depreciation rates of buildings of different types and construction periods and to determine their corresponding value. Finally, losses were estimated for shaking with 10%, 5% and 2% exceedance probability in 50 years. Losses resulting from a scenario earthquake (an earthquake with the maximum possible magnitude) were also estimated.
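
    As a small illustration of the depreciation bookkeeping described above (total building depreciation as the value-weighted sum of the construction units' depreciation rates), here is a minimal sketch with hypothetical units, rates and value shares; none of the numbers come from the study.

```python
# Illustrative depreciation calculation: the building's overall depreciation is
# the sum over construction units of (unit depreciation rate x unit's share of
# the building value). All units, rates and shares are hypothetical.
units = {
    # unit: (depreciation rate, share of building value)
    "foundation": (0.20, 0.15),
    "load-bearing walls": (0.30, 0.40),
    "roof": (0.50, 0.15),
    "finishes and installations": (0.60, 0.30),
}

building_depreciation = sum(rate * share for rate, share in units.values())
print(f"Overall building depreciation: {building_depreciation:.0%}")
```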

  2. How Many Significant Figures are Useful for Public Risk Estimates?

    Science.gov (United States)

    Wilde, Paul D.; Duffy, Jim

    2013-09-01

    This paper considers the level of uncertainty in the calculation of public risks from launch or reentry and provides guidance on the number of significant digits that can be used with confidence when reporting the analysis results to decision-makers. The focus of this paper is the uncertainty in collective risk calculations that are used for launches of new and mature ELVs. This paper examines the computational models that are used to estimate total collective risk to the public for a launch, including the model input data and the model results, and characterizes the uncertainties due to both bias and variability. There have been two recent efforts to assess the uncertainty in state-of-the-art risk analysis models used in the US and their input data. One assessment focused on launch area risk from an Atlas V at Vandenberg Air Force Base (VAFB) and the other focused on downrange risk to Eurasia from a Falcon 9 launched from Cape Canaveral Air Force Station (CCAFS). The results of these studies quantified the uncertainties related to both the probability and the consequence of the launch debris hazards. This paper summarizes the results of both of these relatively comprehensive launch risk uncertainty analyses, which addressed both aleatory and epistemic uncertainties. The epistemic uncertainties of most concern were associated with probability of failure and the debris list. Other major sources of uncertainty evaluated were: the casualty area for people in shelters that are impacted by debris, impact distribution size, yield from exploding propellant and propellant tanks, probability of injury from a blast wave for people in shelters or outside, and population density. This paper also summarizes a relatively comprehensive over-flight risk uncertainty analysis performed by the FAA for the second stage of flight for a Falcon 9 from CCAFS. This paper is applicable to baseline collective risk analyses, such as those used to make a commercial license determination, and

  3. Cancer Risk Estimates from Space Flight Estimated Using Yields of Chromosome Damage in Astronaut's Blood Lymphocytes

    Science.gov (United States)

    George, Kerry A.; Rhone, J.; Chappell, L. J.; Cucinotta, F. A.

    2011-01-01

    To date, cytogenetic damage has been assessed in blood lymphocytes from more than 30 astronauts before and after they participated in long-duration space missions of three months or more on board the International Space Station. Chromosome damage was assessed using fluorescence in situ hybridization whole chromosome analysis techniques. For all individuals, the frequency of chromosome damage measured within a month of return from space was higher than their preflight yield, and biodosimetry estimates were within the range expected from physical dosimetry. Follow up analyses have been performed on most of the astronauts at intervals ranging from around 6 months to many years after flight, and the cytogenetic effects of repeat long-duration missions have so far been assessed in four individuals. Chromosomal aberrations in peripheral blood lymphocytes have been validated as biomarkers of cancer risk and cytogenetic damage can therefore be used to characterize excess health risk incurred by individual crewmembers after their respective missions. Traditional risk assessment models are based on epidemiological data obtained on Earth in cohorts exposed predominantly to acute doses of gamma-rays, and the extrapolation to the space environment is highly problematic, involving very large uncertainties. Cytogenetic damage could play a key role in reducing uncertainty in risk estimation because it is incurred directly in the space environment, using specimens from the astronauts themselves. Relative cancer risks were estimated from the biodosimetry data using the quantitative approach derived from the European Study Group on Cytogenetic Biomarkers and Health database. Astronauts were categorized into low, medium, or high tertiles according to their yield of chromosome damage. Age adjusted tertile rankings were used to estimate cancer risk and results were compared with values obtained using traditional modeling approaches. Individual tertile rankings increased after space

  4. Estimating twin concordance for bivariate competing risks twin data.

    Science.gov (United States)

    Scheike, Thomas H; Holst, Klaus K; Hjelmborg, Jacob B

    2014-03-30

    For twin time-to-event data, we consider different concordance probabilities, such as the casewise concordance, that are routinely computed as a measure of the lifetime dependence/correlation for specific diseases. The concordance probability here is the probability that both twins have experienced the event of interest. Under the assumption that both twins are censored at the same time, we show how to estimate this probability in the presence of right censoring, and as a consequence, we can then estimate the casewise twin concordance. In addition, we can model the magnitude of within-pair dependence over time, and covariates may be further influential on the marginal risk and dependence structure. We establish the estimators' large-sample properties and suggest various tests, for example, for inferring familial influence. The method is demonstrated and motivated by specific twin data on cancer events with the competing risk death. We thus aim to quantify the degree of dependence through the casewise concordance function and show a significant genetic component.

  5. Nonlinear Memory and Risk Estimation in Financial Records

    Science.gov (United States)

    Bunde, Armin; Bogachev, Mikhail I.

    It is well known that financial data sets are multifractal and governed by nonlinear correlations. Here we are interested in the daily returns of a financial asset and in the way the occurrence of large gains or losses is triggered by the nonlinear memory. To this end, we study the statistics of the return intervals between gains (or losses) above a certain threshold Q. In the case of i.i.d. random numbers the probability density function (pdf) of the return intervals decays exponentially and the return intervals are uncorrelated. Here we show that the nonlinear correlations lead to a power-law decay of the pdf and linear long-term correlations between the return intervals that are described by a power-law decay of the corresponding autocorrelation function. From the pdf of the return intervals one obtains the risk function W_Q(t; Δt), which is the probability that within the next Δt units of time at least one event above Q occurs, if the last event occurred t time units ago. We propose an analytical estimate of W_Q and show explicitly that the proposed method is superior to the conventional precursory pattern recognition technique widely used in signal analysis, which requires considerable fine-tuning and is difficult to implement. We also show that the estimation of the Value at Risk, which is a standard tool in finance, can be improved considerably compared with previous estimates.
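
    As a rough illustration of the return-interval statistics described above, the sketch below generates synthetic heavy-tailed returns, extracts the intervals between exceedances of a threshold Q, and evaluates an empirical version of the risk function W_Q(t; Δt), i.e. the probability of at least one exceedance within the next Δt days given that the last one occurred t days ago. The data and threshold are hypothetical; real records with nonlinear memory would show the power-law behaviour discussed in the record.

```python
import numpy as np

rng = np.random.default_rng(2)
returns = rng.standard_t(df=4, size=100_000)   # synthetic heavy-tailed daily returns

Q = np.quantile(returns, 0.99)                 # threshold for "large gains"
event_days = np.flatnonzero(returns > Q)
intervals = np.diff(event_days)                # return intervals between exceedances

def hazard_W(intervals: np.ndarray, t: int, dt: int) -> float:
    """Empirical W_Q(t; dt): probability of at least one event above Q within
    the next dt days, given that the last event occurred t days ago."""
    at_risk = intervals[intervals > t]
    if at_risk.size == 0:
        return float("nan")
    return float(np.mean(at_risk <= t + dt))

for t in (0, 50, 200):
    print(f"W_Q(t={t:3d}; dt=30) = {hazard_W(intervals, t, 30):.3f}")
```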

  6. Binomial Distribution Sample Confidence Intervals Estimation 6. Excess Risk

    Directory of Open Access Journals (Sweden)

    Sorana BOLBOACĂ

    2004-02-01

    Full Text Available We present the problem of confidence interval estimation for the excess risk (Y/n-X/m) fraction, a parameter which allows evaluation of the specificity of an association between predisposing or causal factors and disease in medical studies. The parameter is computed based on a 2x2 contingency table and qualitative variables. The aim of this paper is to introduce four new methods of computing confidence intervals for excess risk, called DAC, DAs, DAsC, DBinomial, and DBinomialC, and to compare their performance with the asymptotic method called here DWald. In order to assess the methods, the PHP programming language was used and a PHP program was created. The performance of each method for different sample sizes and different values of the binomial variables was assessed using a set of criteria. First, the upper and lower boundaries for given X, Y and a specified sample size were computed for the chosen methods. Second, the average and standard deviation of the experimental errors were assessed, together with the deviation relative to the imposed significance level α = 5%. The methods were assessed on random numbers for the binomial variables and for sample sizes in the range 4 to 1000. The experiments show that the DAC methods perform well in confidence interval estimation for excess risk.
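
    For illustration, the excess risk (Y/n - X/m) and its asymptotic confidence interval (the Wald-style method referred to above as DWald) can be computed directly from a 2x2 table. The sketch below uses hypothetical counts and is not a reimplementation of the paper's DAC/DAs/DAsC/DBinomial methods.

```python
from math import sqrt

def excess_risk_wald(y: int, n: int, x: int, m: int, z: float = 1.96):
    """Excess risk (Y/n - X/m) from a 2x2 table with an asymptotic Wald CI.

    y/n: events/total in the exposed group, x/m: events/total in the
    unexposed group. Counts are hypothetical, for illustration only.
    """
    p1, p2 = y / n, x / m
    er = p1 - p2
    se = sqrt(p1 * (1 - p1) / n + p2 * (1 - p2) / m)
    return er, er - z * se, er + z * se

er, lo, hi = excess_risk_wald(y=30, n=100, x=12, m=100)
print(f"Excess risk = {er:.3f} (95% Wald CI {lo:.3f} to {hi:.3f})")
```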

  7. Risk estimates of liver cancer due to aflatoxin exposure from peanuts and peanut products

    Energy Technology Data Exchange (ETDEWEB)

    Dichter, C.R.

    1984-06-01

    An assessment was undertaken of the risk of liver cancer in the USA associated with aflatoxin ingestion from peanuts. Both laboratory-animal data and epidemiological data collected from the scientific literature and several prominent mathematical extrapolation techniques were used. Risk estimates differed by a factor of greater than 1000 when the extrapolated results of three selected animal studies were analysed. Dose-response data for the male Fischer rat, the most sensitive mammalian species studied, produced an estimate of 158 cases of liver cancer per year in the USA at current levels of aflatoxin exposure. An estimate of 58 annual cases was predicted on the basis of epidemiological data of populations in Africa and Thailand.

  8. China's Current Real Estate Cycle and Potential Financial Risks

    Institute of Scientific and Technical Information of China (English)

    Xiaojing Zhang; Tao Sun

    2006-01-01

    The real estate cycle and financial stability are closely correlated. In light of global real estate bubbles, China's real estate cycle has attracted wide attention since 1998. The present paper analyzes three driving factors in the context of the current real estate cycle; namely, economic growth, macroeconomic environment and institutional establishment. Supported by econometric analysis using quarterly data from 1992-2004, the present paper indicates that real estate will develop steadily and that housing prices will consistently rise in the relatively long run. Based on quantitative analysis, it is concluded that the implications of the current real estate cycle for financial stability include risks of real estate credit exposure, government guarantees and maturity mismatch. Some corresponding policy implications are discussed, such as advancing banking reform, encouraging the rational behavior of local governments and strengthening the regulation of foreign capital flows in and out of China's real estate industry.

  9. Metabolic risk-factor clustering estimation in children: to draw a line across pediatric metabolic syndrome

    DEFF Research Database (Denmark)

    Brambilla, P; Lissau, I; Flodmark, C-E

    2007-01-01

    BACKGROUND: The diagnostic criteria of the metabolic syndrome (MS) have been applied in studies of obese adults to estimate the metabolic risk associated with obesity, even though no general consensus exists concerning its definition and clinical value. We reviewed the current literature on the MS...... and adolescents, analyzing the scientific evidence needed to detect a clustering of cardiovascular risk factors. Finally, we propose a new methodological approach for estimating metabolic risk-factor clustering in children and adolescents. RESULTS: Major concerns were the lack of information on the background...... derived from a child's family and personal history; the lack of consensus on insulin levels, lipid parameters, markers of inflammation or steato-hepatitis; the lack of an additive relevant effect of the MS definition to obesity per se. We propose the adoption of 10 evidence-based items from which...

  10. Estimating risk aversion, Risk-Neutral and Real-World Densities using Brazilian Real currency options

    Directory of Open Access Journals (Sweden)

    José Fajardo

    2012-12-01

    Full Text Available This paper uses the Liu et al. (2007) approach to estimate the option-implied Risk-Neutral Densities (RND), real-world densities (RWD), and relative risk aversion from the Brazilian Real/US Dollar exchange rate distribution. Our empirical application uses a sample of exchange-traded Brazilian Real currency options from 1999 to 2011. Our estimated value of the relative risk aversion is around 2.7, which is in line with other articles for the Brazilian economy. Our out-of-sample results showed that the RND has some ability to forecast the Brazilian Real exchange rate, but when we incorporate the risk aversion, the out-of-sample performance improves substantially.
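
    The paper follows the Liu et al. (2007) approach; as a simpler and widely used stand-in for illustrating how a risk-neutral density can be extracted from option prices, the sketch below applies the Breeden-Litzenberger relation (the RND is the discounted second derivative of the call price with respect to strike) to synthetic Black-Scholes prices, for which the true RND is lognormal. All parameters are hypothetical and this is not the paper's estimation method.

```python
import numpy as np
from scipy.stats import norm

# Synthetic call prices from the Black-Scholes formula (stand-in for market quotes).
S0, r, sigma, T = 2.0, 0.05, 0.25, 0.5      # hypothetical exchange-rate-like inputs
K = np.linspace(1.0, 3.5, 251)

d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
d2 = d1 - sigma * np.sqrt(T)
calls = S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

# Breeden-Litzenberger: q(K) = exp(rT) * d^2 C / dK^2, here via finite differences.
dK = K[1] - K[0]
rnd = np.exp(r * T) * np.gradient(np.gradient(calls, dK), dK)

print(f"RND integrates to ~{np.sum(rnd) * dK:.3f}")
print(f"RND mode near K = {K[np.argmax(rnd)]:.2f}")
```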

  11. Data Sources for the Model-based Small Area Estimates of Cancer Risk Factors and Screening Behaviors - Small Area Estimates

    Science.gov (United States)

    The model-based estimates of important cancer risk factors and screening behaviors are obtained by combining the responses to the Behavioral Risk Factor Surveillance System (BRFSS) and the National Health Interview Survey (NHIS).

  12. Global Building Inventory for Earthquake Loss Estimation and Risk Management

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David; Porter, Keith

    2010-01-01

    We develop a global database of building inventories using a taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis, for the U.S. Geological Survey’s Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) UN Habitat’s demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia and (5) other literature.

  13. Estimation of Hail Risk in the UK and Europe

    Science.gov (United States)

    Robinson, Eric; Parker, Melanie; Higgs, Stephanie

    2016-04-01

    Observations of hail events in Europe, and in the UK especially, are relatively limited. In order to determine hail risk it is therefore necessary to use information other than the historical record alone. One such methodology is to leverage reanalysis data, in this case ERA-Interim, along with a numerical model (WRF) to recreate the past state of the atmosphere. Relevant atmospheric properties can be extracted and used in a regression model to determine hail probability for each day contained within the reanalyses. The results presented here are based on a regression model using convective available potential energy, deep-level shear and weather type; combined, these parameters represent the probability of severe thunderstorm, and in turn hail, activity. Once the probability of hail occurring on each day is determined, it can be used as the basis of a stochastic catalogue for the estimation of hail risk.
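
    A minimal sketch of the kind of regression described above, assuming synthetic stand-ins for the reanalysis-derived predictors (CAPE and deep-layer shear) rather than actual ERA-Interim/WRF output; the fitted model returns a daily hail probability. All names and numbers are illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n_days = 5000

# Synthetic daily predictors standing in for reanalysis-derived values.
cape = rng.gamma(shape=2.0, scale=400.0, size=n_days)       # J/kg
shear = rng.gamma(shape=2.0, scale=6.0, size=n_days)        # m/s, deep-layer shear
logit = -8.0 + 0.002 * cape + 0.15 * shear
hail_day = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))    # synthetic hail labels

# Logistic regression mapping environmental parameters to daily hail probability.
X = np.column_stack([cape, shear])
model = LogisticRegression(max_iter=1000).fit(X, hail_day)

# Daily hail probabilities for two hypothetical environments.
new_days = np.array([[250.0, 5.0],      # weak instability, weak shear
                     [2500.0, 20.0]])   # strong instability, strong shear
print(model.predict_proba(new_days)[:, 1])
```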

  14. Current directions in screening-level ecological risk assessments

    Energy Technology Data Exchange (ETDEWEB)

    Carlsen, T M; Efroymson, R A

    2000-12-11

    Ecological risk assessment (ERA) is a tool used by many regulatory agencies to evaluate the impact to ecological receptors from changes in environmental conditions. Widespread use of ERAs began with the United States Environmental Protection Agency's Superfund program to assess the ecological impact from hazardous chemicals released to the environment. Many state hazardous chemical regulatory agencies have adopted the use of ERAs, and several state regulatory agencies are evaluating the use of ERAs to assess ecological impacts from releases of petroleum and gas-related products. Typical ERAs are toxicologically-based, use conservative assumptions with respect to ecological receptor exposure duration and frequency, often require complex modeling of transport and exposure and are very labor intensive. In an effort to streamline the ERA process, efforts are currently underway to develop default soil screening levels, to identify ecological screening criteria for excluding sites from formal risk assessment, and to create risk-based corrective action worksheets. This should help reduce the time spent on ERAs, at least for some sites. Work is also underway to incorporate bioavailability and spatial considerations into ERAs. By evaluating the spatial nature of contaminant releases with respect to the spatial context of the ecosystem under consideration, more realistic ERAs with respect to the actual impact to ecological receptors at the population, community or ecosystem scale should be possible. In addition, by considering the spatial context, it should be possible to develop mitigation and monitoring efforts to more appropriately address such sites within the context of an ecological framework.

  15. Risk Estimation with Epidemiologic Data When Response Attenuates at High-Exposure Levels

    National Research Council Canada - National Science Library

    Kyle Steenland; Ryan Seals; Mitch Klein; Jennifer Jinot; Henry D. Kahn

    2011-01-01

    Background: In occupational studies, which are commonly used for risk assessment for environmental settings, estimated exposure-response relationships often attenuate at high exposures. Relative risk (RR...

  16. Risk-based surveillance: Estimating the effect of unwarranted confounder adjustment

    DEFF Research Database (Denmark)

    Willeberg, Preben; Nielsen, Liza Rosenbaum; Salman, Mo

    2011-01-01

    We estimated the effects of confounder adjustment as a part of the underlying quantitative risk assessments on the performance of a hypothetical example of a risk-based surveillance system, in which a single risk factor would be used to identify high risk sampling units for testing. The differences...... between estimates of surveillance system performance with and without unwarranted confounder adjustment were shown to be of both numerical and economical significance. Analytical procedures applied to multiple risk factor datasets which yield confounder-adjusted risk estimates should be carefully...... considered for their appropriateness, if the risk estimates are to be used for informing risk-based surveillance systems....

  17. Risk management of seasonal influenza during pregnancy: current perspectives

    Science.gov (United States)

    Yudin, Mark H

    2014-01-01

    Influenza poses unique risks to pregnant women, who are particularly susceptible to morbidity and mortality. Historically, pregnant women have been overrepresented among patients with severe illness and complications from influenza, and have been more likely to require hospitalization and intensive care unit admission. An increased risk of adverse outcomes is also present for fetuses/neonates born to women affected by influenza during pregnancy. These risks to mothers and babies have been observed during both nonpandemic and pandemic influenza seasons. During the H1N1 influenza pandemic of 2009–2010, pregnant women were more likely to be hospitalized or admitted to intensive care units, and were at higher risk of death compared to nonpregnant adults. Vaccination remains the most effective intervention to prevent severe illness, and antiviral medications are an important adjunct to ameliorate disease when it occurs. Unfortunately, despite national guidelines recommending universal vaccination for women who are pregnant during influenza season, actual vaccination rates do not achieve desired targets among pregnant women. Pregnant women are also sometimes reluctant to use antiviral medications during pregnancy. Some of the barriers to use of vaccines and medications during pregnancy are a lack of knowledge of recommendations and of safety data. By improving knowledge and understanding of influenza and vaccination recommendations, vaccine acceptance rates among pregnant women can be improved. Currently, the appropriate use of vaccination and antiviral medications is the best line of defense against influenza and its sequelae among pregnant women, and strategies to increase acceptance are crucial. This article will review the importance of influenza in pregnancy, and discuss vaccination and antiviral medications for pregnant women. PMID:25114593

  18. Local Behavior of Sparse Analysis Regularization: Applications to Risk Estimation

    CERN Document Server

    Vaiter, Samuel; Peyré, Gabriel; Dossal, Charles; Fadili, Jalal

    2012-01-01

    This paper studies the recovery of an unknown signal $x_0$ from low dimensional noisy observations $y = \Phi x_0 + w$, where $\Phi$ is an ill-posed linear operator and $w$ accounts for some noise. We focus our attention on sparse analysis regularization. The recovery is performed by minimizing the sum of a quadratic data fidelity term and the $\ell_1$-norm of the correlations between the sought after signal and atoms in a given (generally overcomplete) dictionary. The $\ell_1$ prior is weighted by a regularization parameter $\lambda > 0$ that accounts for the noise level. In this paper, we prove that minimizers of this problem are piecewise-affine functions of the observations $y$ and the regularization parameter $\lambda$. As a byproduct, we exploit these properties to get an objectively guided choice of $\lambda$. More precisely, we propose an extension of the Generalized Stein Unbiased Risk Estimator (GSURE) and show that it is an unbiased estimator of an appropriately defined risk. This encompasses special ca...
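
    The paper extends the Generalized SURE to analysis regularization; as a self-contained illustration of the underlying unbiased-risk-estimation idea (not the GSURE of the paper), the sketch below computes the classic SURE for soft-thresholding of noisy observations with known noise level and uses it to pick the regularization parameter λ. Signal, noise level and grid are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
n, sigma = 1000, 1.0
x0 = np.zeros(n)
x0[:50] = rng.normal(0.0, 5.0, size=50)          # sparse ground-truth signal
y = x0 + sigma * rng.normal(size=n)              # noisy observations, y ~ N(x0, sigma^2 I)

def soft(y, lam):
    """Soft-thresholding estimator."""
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

def sure(y, lam, sigma):
    """Stein Unbiased Risk Estimate of E||soft(y, lam) - x0||^2."""
    n = y.size
    return (-n * sigma**2
            + np.sum(np.minimum(y**2, lam**2))
            + 2 * sigma**2 * np.sum(np.abs(y) > lam))

lams = np.linspace(0.1, 5.0, 100)
risks = np.array([sure(y, lam, sigma) for lam in lams])
best = lams[np.argmin(risks)]
true_err = np.sum((soft(y, best) - x0) ** 2)
print(f"lambda minimizing SURE: {best:.2f}")
print(f"SURE at that lambda: {risks.min():.1f}, true squared error: {true_err:.1f}")
```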

  19. Estimating Heat and Mass Transfer Processes in Green Roof Systems: Current Modeling Capabilities and Limitations (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Tabares Velasco, P. C.

    2011-04-01

    This presentation discusses estimating heat and mass transfer processes in green roof systems: current modeling capabilities and limitations. Green roofs are 'specialized roofing systems that support vegetation growth on rooftops.'

  20. The Management Standards Indicator Tool and the estimation of risk.

    Science.gov (United States)

    Bevan, A; Houdmont, J; Menear, N

    2010-10-01

    The Health & Safety Executive's (HSE) Indicator Tool offers a measure of exposure to psychosocial work conditions that may be linked to stress-related outcomes. The HSE recommends that Indicator Tool data should be used as a basis for discussions concerned with the identification of psychosocial work conditions that might warrant prioritization for intervention. However, operational constraints may render discussions difficult to convene and, when they do take place, the absence of information on harms associated with exposures can make it difficult to identify intervention priorities. To examine (i) the utility of the Indicator Tool for the identification of a manageable number of psychosocial work conditions as intervention candidates and (ii) whether administration of a measure of stress-related outcomes alongside the Indicator Tool can facilitate the identification of intervention priorities. One thousand and thirty-eight employees in the London region of Her Majesty's Prison Service completed the Indicator Tool and a measure of psychological well-being. Odds ratios were calculated to estimate the risk of impairment to well-being associated with exposure to psychosocial work conditions. The Indicator Tool identified 34 psychosocial work conditions as warranting improvement. Intervention priority was given to those working conditions that were both reported to be poor by ≥50% of respondents and associated with risk of impairment to well-being. This method allowed for the identification of four areas. Augmentation of the Indicator Tool with a measure of stress-related outcomes and the calculation of simple risk estimation statistics can assist the prioritization of intervention candidates.
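
    As a minimal illustration of the risk-estimation step described above, the sketch below computes an odds ratio and its 95% confidence interval from a 2x2 exposure-by-outcome table using the standard log (Woolf) method; the counts are hypothetical, not the Prison Service data.

```python
from math import exp, log, sqrt

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Odds ratio and 95% CI (log/Woolf method) from a 2x2 table:
    a = exposed & impaired well-being, b = exposed & not impaired,
    c = unexposed & impaired,          d = unexposed & not impaired.
    Counts are hypothetical, for illustration only."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

or_, lo, hi = odds_ratio_ci(a=120, b=280, c=60, d=340)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```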

  1. Time-to-Compromise Model for Cyber Risk Reduction Estimation

    Energy Technology Data Exchange (ETDEWEB)

    Miles A. McQueen; Wayne F. Boyer; Mark A. Flynn; George A. Beitel

    2005-09-01

    We propose a new model for estimating the time to compromise a system component that is visible to an attacker. The model provides an estimate of the expected value of the time-to-compromise as a function of known and visible vulnerabilities, and attacker skill level. The time-to-compromise random process model is a composite of three subprocesses associated with attacker actions aimed at the exploitation of vulnerabilities. In a case study, the model was used to aid in a risk reduction estimate between a baseline Supervisory Control and Data Acquisition (SCADA) system and the baseline system enhanced through a specific set of control system security remedial actions. For our case study, the total number of system vulnerabilities was reduced by 86% but the dominant attack path was through a component where the number of vulnerabilities was reduced by only 42% and the time-to-compromise of that component was increased by only 13% to 30% depending on attacker skill level.

  2. Micro-scale flood risk estimation in historic centres: a case study in Florence, Italy

    Science.gov (United States)

    Castelli, Fabio; Arrighi, Chiara; Brugioni, Marcello; Franceschini, Serena; Mazzanti, Bernardo

    2013-04-01

    The route to flood risk assessment is much more than hydraulic modelling of inundation, that is, hazard mapping. Flood risk is the product of flood hazard, vulnerability and exposure, all three to be estimated with a comparable level of accuracy. While hazard maps have already been implemented in many countries, quantitative damage and risk maps are still at a preliminary level. Currently, one of the main challenges in flood damage estimation resides in the scarce availability of socio-economic data characterizing the monetary value of the exposed assets. When these public open data are available, the variability of their level of detail drives the need to merge different sources and to select an appropriate scale of analysis. In this work a parsimonious quasi-2D hydraulic model is adopted, which has many advantages in terms of easy set-up. In order to represent the geometry of the study domain, a high-resolution and up-to-date Digital Surface Model (DSM) is used. The accuracy in flood depth estimation is evaluated by comparison with marble-plate records of a historic flood in the city of Florence (Italy). The accuracy is characterized in the downtown most flooded area by a bias of a very few centimetres and a determination coefficient of 0.73. The average risk is found to be about 14 €/m² per year, corresponding to about 8.3% of residents' income. The spatial distribution of estimated risk highlights a complex interaction between the flood pattern and the building characteristics. Proceeding through the risk estimation steps, a new micro-scale potential damage assessment method is proposed. This method is based on the georeferenced census system, considered an optimal compromise between spatial detail and open availability of socio-economic data. The census section system consists of geographically contiguous polygons that usually coincide with building blocks in dense urban areas. The results of flood risk assessment at the census section scale resolve most of

  3. Mammographic density and estimation of breast cancer risk in intermediate risk population.

    Science.gov (United States)

    Tesic, Vanja; Kolaric, Branko; Znaor, Ariana; Kuna, Sanja Kusacic; Brkljacic, Boris

    2013-01-01

    It is not clear to what extent mammographic density represents a risk factor for breast cancer among women with moderate risk for disease. We conducted a population-based study to estimate the independent effect of breast density on breast cancer risk and to evaluate the potential of breast density as a marker of risk in an intermediate risk population. From November 2006 to April 2009, data that included American College of Radiology Breast Imaging Reporting and Data System (BI-RADS) breast density categories and risk information were collected on 52,752 women aged 50-69 years without previously diagnosed breast cancer who underwent screening mammography examination. A total of 257 screen-detected breast cancers were identified. Logistic regression was used to assess the effect of breast density on breast carcinoma risk and to control for other risk factors. The risk increased with density, and the odds ratio for breast cancer among women with dense breasts (heterogeneously and extremely dense breasts) was 1.9 (95% confidence interval, 1.3-2.8) compared with women with almost entirely fat breasts, after adjustment for age, body mass index, age at menarche, age at menopause, age at first childbirth, number of live births, use of oral contraceptives, family history of breast cancer, prior breast procedures, and hormone replacement therapy use, which were all significantly related to breast density. Risk increased with breast density and decreased with number of live births. Our finding that mammographic density is an independent risk factor for breast cancer indicates the importance of breast density measurements for breast cancer risk assessment also in moderate risk populations. © 2012 Wiley Periodicals, Inc.

  4. Estimating time to pregnancy from current durations in a cross-sectional sample

    DEFF Research Database (Denmark)

    Keiding, Niels; Kvist, Kajsa; Hartvig, Helle;

    2002-01-01

    A new design for estimating the distribution of time to pregnancy is proposed and investigated. The design is based on recording current durations in a cross-sectional sample of women, leading to statistical problems similar to estimating renewal time distributions from backward recurrence times...

  6. Migraine and Risk of Stroke: Review of Current Evidence

    Directory of Open Access Journals (Sweden)

    Sadeghi

    2014-07-01

    Full Text Available Context Migraine is a kind of primary headache that affects 10% to 20% of people worldwide. Recent studies have shown that migraine can be involved in the incidence of stroke, especially ischemic stroke. Hence, the current study aimed to review the evidence relating migraine to the risk of stroke. Evidence Acquisition A literature search was done for related articles dated between 1993 and 2013 on PubMed, Science Direct, Embase, Web of Science and Scopus for both English and non-English language articles by entering “migraine”, “migraine with aura”, “headache” and “ischemic and hemorrhagic stroke” as keywords. Results In most evaluated studies, there was a positive association between migraine with aura (MA) and the incidence of stroke, especially ischemic stroke. Moreover, patients with a high frequency of migraine attacks had greater odds of having a stroke compared with those who had a low frequency of migraine attacks. Also, the association between migraine and stroke was more significant in subjects under 45 years old. Some migraine symptoms such as vomiting and nausea had a protective role in the development of ischemic stroke. Conclusions Migraine, especially MA, is a risk factor for stroke, especially ischemic stroke. However, due to conflicting results on the association between different types of migraine and stroke, more studies are needed in this field.

  7. Work-site musculoskeletal pain risk estimates by trained observers--a prospective cohort study.

    Science.gov (United States)

    Coenen, Pieter; Kingma, Idsart; Boot, Cécile R L; Douwes, Marjolein; Bongers, Paulien M; van Dieën, Jaap H

    2012-01-01

    Work-related musculoskeletal pain (MSP) risk assessments by trained observers are often used in ergonomic practice; however, their validity may be questionable. We investigated the predictive value of work-site MSP risk estimates in a prospective cohort study of 1745 workers. Trained observers estimated the risk of MSP (neck, shoulder or low-back pain) using a three-point scale (high, moderate and low risk) after observing a video of randomly selected workers representing a task group. Associations between the estimated risk of pain and reported pain during a three-year follow-up were assessed using logistic regression. Estimated risk of neck and shoulder pain significantly predicted reported pain (odds ratio, OR: 1.45 (95% confidence interval, CI: 1.01-2.08) and 1.64 (95% CI: 1.05-2.55), respectively), whereas estimated risk of low-back pain did not (OR: 1.27 (95% CI: 0.91-1.79)). The results show that observers were able to estimate the risk of shoulder and neck pain, whereas they found it difficult to estimate the risk of low-back pain. Practitioner Summary: Work-related musculoskeletal pain risk assessments by observers are often used in ergonomic practice. We showed that observers were able to estimate shoulder and neck pain risk, but had difficulty estimating the risk of low-back pain. Therefore, observers' risk estimates might provide a useful method for musculoskeletal pain risk assessments.

  8. Modeling And Simulation of Speed and flux Estimator Based on Current & voltage Model

    Directory of Open Access Journals (Sweden)

    Dinesh Chandra Jain

    2011-10-01

    Full Text Available This paper introduces an estimator based on the current and voltage models of an induction motor (IM) drive. The rotor speed estimation is based on the model reference adaptive system (MRAS) approach. The closed-loop control mechanism is based on the voltage and current models. The control and estimation algorithms utilize the synchronous coordinates as a frame of reference. A speed-sensorless induction motor (IM) drive with robust control characteristics is introduced, starting with a speed observation system that is insensitive to variations of the motor parameters.

  9. Risk estimation for matrix recovery with spectral regularization

    CERN Document Server

    Deledalle, Charles-Alban; Peyré, Gabriel; Fadili, Jalal; Dossal, Charles

    2012-01-01

    In this paper, we develop an approach to recursively estimate the quadratic risk for matrix recovery problems regularized with spectral functions. Toward this end, in the spirit of the SURE theory, a key step is to compute the (weak) derivative and divergence of a solution with respect to the observations. As such a solution is not available in closed form, but rather through a proximal splitting algorithm, we propose to recursively compute the divergence from the sequence of iterates. A second challenge that we unlocked is the computation of the (weak) derivative of the proximity operator of a spectral function. To show the potential applicability of our approach, we exemplify it on a matrix completion problem to objectively and automatically select the regularization parameter.
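
    A compact way to see the SURE idea in action is the sketch below: it selects the threshold of a singular value soft-thresholding (spectral) estimator by minimising a Monte Carlo SURE estimate of the quadratic risk, using a simple finite-difference divergence probe rather than the paper's recursive computation through the iterates of a proximal splitting algorithm; the data, noise level and threshold grid are arbitrary.

      import numpy as np

      def svt(Y, lam):
          """Singular value soft-thresholding: a simple spectral estimator."""
          U, s, Vt = np.linalg.svd(Y, full_matrices=False)
          return U @ np.diag(np.maximum(s - lam, 0.0)) @ Vt

      def sure_mc(Y, lam, sigma, eps=1e-4, n_probe=8, seed=None):
          """Monte Carlo SURE for the quadratic risk of svt(Y, lam) under i.i.d. Gaussian noise."""
          rng = np.random.default_rng(seed)
          n = Y.size
          X_hat = svt(Y, lam)
          div = 0.0
          for _ in range(n_probe):
              delta = rng.standard_normal(Y.shape)
              div += np.sum(delta * (svt(Y + eps * delta, lam) - X_hat)) / eps
          div /= n_probe
          return -n * sigma**2 + np.sum((X_hat - Y)**2) + 2 * sigma**2 * div

      # Pick the threshold minimising the SURE estimate of the risk.
      rng = np.random.default_rng(0)
      X = rng.standard_normal((50, 5)) @ rng.standard_normal((5, 40))   # low-rank ground truth
      sigma = 0.5
      Y = X + sigma * rng.standard_normal(X.shape)
      lams = np.linspace(0.5, 10, 20)
      best = min(lams, key=lambda l: sure_mc(Y, l, sigma, seed=1))
      print("selected lambda =", best)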

  10. Declining bioavailability and inappropriate estimation of risk of persistent compounds

    Energy Technology Data Exchange (ETDEWEB)

    Kelsey, J.W.; Alexander, M. [Cornell Univ., Ithaca, NY (United States)

    1997-03-01

    Earthworms (Eisenia foetida) assimilated decreasing amounts of atrazine, phenanthrene, and naphthalene that had been incubated for increasing periods of time in sterile soil. The amount of atrazine and phenanthrene removed from soil by mild extractants also decreased with time. The declines in bioavailability of the three compounds to earthworms and of naphthalene to bacteria were not reflected by analysis involving vigorous methods of solvent extraction; similar results for bioavailability of phenanthrene and 4-nitrophenol to bacteria were obtained in a previous study conducted at this laboratory. The authors suggest that regulations based on vigorous extractions for the analyses of persistent organic pollutants in soil do not appropriately estimate exposure or risk to susceptible populations.

  11. Refining the risk estimate for transfusion-transmission of occult hepatitis B virus.

    Science.gov (United States)

    Seed, C R; Kiely, P; Hoad, V C; Keller, A J

    2017-01-01

    We previously published a model to estimate the residual risk (RR) for occult hepatitis B infection (OBI) in the absence of universal anti-HBc testing. To incorporate new information on the epidemiology of OBI, we describe model refinements and estimate a more accurate HBV RR due to OBI in Australia. In our original model, the OBI risk, p(OBI), was defined by the rate of 'non-detection' by the HBV DNA screening test in use, p(NAT non-detection), and the average infectivity of blood components from OBI donors, p(transmission). We revised the model by integrating three refinements: excluding from the risk calculation donations with anti-HBs levels of >10 IU/l, excluding donations used solely for manufactured plasma products, and updating the estimate of p(transmission). Refining our OBI RR model resulted in a more than 10-fold reduction in the estimated RR to recipients from OBI in our donor population. Based on the use of a common data set, the mean OBI RR decreased from 1 in 374 354 donations (95% CI: 1 in 191 940-1 072 681) to 1 in 3 984 033 (95% CI: 1 in 1 146 188-65 268 257) for the refined model. Our model refinements provide a more realistic measure of the HBV RR in the donor population. Unlike the previous model, the new model demonstrates that the risk of HBV due to OBI in the Australian blood donor population is negligible, and further, potentially cost-ineffective, risk management strategies are not currently warranted. © 2016 International Society of Blood Transfusion.
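
    The overall structure of such a residual-risk calculation can be sketched as a product of per-donation probabilities, as below; the function shape follows the description above, but every numeric value is a hypothetical placeholder rather than an Australian estimate.

      # Hedged sketch of a residual-risk calculation of this general shape; the
      # parameter values are placeholders, not those of the published model.
      def obi_residual_risk(p_obi_donation, p_nat_non_detection, p_transmission,
                            frac_excluded=0.0):
          """Risk per issued donation that an OBI donation escapes NAT detection,
          is not excluded (e.g. anti-HBs > 10 IU/l or plasma-only use), and transmits."""
          return (p_obi_donation * (1.0 - frac_excluded)
                  * p_nat_non_detection * p_transmission)

      rr = obi_residual_risk(p_obi_donation=1e-4,      # hypothetical OBI prevalence
                             p_nat_non_detection=0.5,  # hypothetical NAT miss rate
                             p_transmission=0.05,      # hypothetical infectivity
                             frac_excluded=0.7)        # hypothetical exclusions
      print(f"residual risk ~ 1 in {1/rr:,.0f} donations")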

  12. The current duration design for estimating the time to pregnancy distribution

    DEFF Research Database (Denmark)

    Gasbarra, Dario; Arjas, Elja; Vehtari, Aki

    2015-01-01

    This paper was inspired by the studies of Niels Keiding and co-authors on estimating the waiting time-to-pregnancy (TTP) distribution, and in particular on using the current duration design in that context. In this design, a cross-sectional sample of women is collected from those who are currently...... attempting to become pregnant, and then by recording from each the time she has been attempting. Our aim here is to study the identifiability and the estimation of the waiting time distribution on the basis of current duration data. The main difficulty in this stems from the fact that very short waiting...

  13. DC Link Current Estimation in Wind-Double Feed Induction Generator Power Conditioning System

    Directory of Open Access Journals (Sweden)

    MARIAN GAICEANU

    2010-12-01

    Full Text Available In this paper the implementation of a DC link current estimator in the power conditioning system of a variable-speed wind turbine is shown. The wind turbine is connected to a doubly fed induction generator (DFIG). The variable electrical energy parameters delivered by the DFIG are matched to the electrical grid parameters through a back-to-back power converter. The bidirectional AC-AC power converter covers a wide speed range from subsynchronous to supersynchronous speeds. The modern control of the back-to-back power converter involves the power balance concept; therefore its load power should be known at any instant. By using power balance control, the DC link voltage variation under load changes can be reduced. In this paper the load power is estimated indirectly from the DC link through a second-order DC link current estimator. The load current estimator is based on the DC link voltage and on the DC link input current of the rotor-side converter. This method presents certain advantages over the measurement-based method, which requires a low-pass filter: no time delay, a ripple-free feedforward current component, no additional hardware, and a faster control response. Through numerical simulation the performance of the proposed DC link output current estimator scheme is demonstrated.

  14. Risk Estimates and Risk Factors Related to Psychiatric Inpatient Suicide—An Overview

    Science.gov (United States)

    Madsen, Trine; Erlangsen, Annette; Nordentoft, Merete

    2017-01-01

    People with mental illness have an increased risk of suicide. The aim of this paper is to provide an overview of suicide risk estimates among psychiatric inpatients based on the body of evidence found in scientific peer-reviewed literature; primarily focusing on the relative risks, rates, time trends, and socio-demographic and clinical risk factors of suicide in psychiatric inpatients. Psychiatric inpatients have a very high risk of suicide relative to the background population, but it remains challenging for clinicians to identify those patients that are most likely to die from suicide during admission. Most studies are based on low power, thus compromising quality and generalisability. The few studies with sufficient statistical power mainly identified non-modifiable risk predictors such as male gender, diagnosis, or recent deliberate self-harm. Also, the predictive value of these predictors is low. It would be of great benefit if future studies would be based on large samples while focusing on modifiable predictors over the course of an admission, such as hopelessness, depressive symptoms, and family/social situations. This would improve our chances of developing better risk assessment tools. PMID:28257103

  15. Risk Estimates and Risk Factors Related to Psychiatric Inpatient Suicide-An Overview.

    Science.gov (United States)

    Madsen, Trine; Erlangsen, Annette; Nordentoft, Merete

    2017-03-02

    People with mental illness have an increased risk of suicide. The aim of this paper is to provide an overview of suicide risk estimates among psychiatric inpatients based on the body of evidence found in scientific peer-reviewed literature; primarily focusing on the relative risks, rates, time trends, and socio-demographic and clinical risk factors of suicide in psychiatric inpatients. Psychiatric inpatients have a very high risk of suicide relative to the background population, but it remains challenging for clinicians to identify those patients that are most likely to die from suicide during admission. Most studies are based on low power, thus compromising quality and generalisability. The few studies with sufficient statistical power mainly identified non-modifiable risk predictors such as male gender, diagnosis, or recent deliberate self-harm. Also, the predictive value of these predictors is low. It would be of great benefit if future studies would be based on large samples while focusing on modifiable predictors over the course of an admission, such as hopelessness, depressive symptoms, and family/social situations. This would improve our chances of developing better risk assessment tools.

  16. Risk Estimates and Risk Factors Related to Psychiatric Inpatient Suicide—An Overview

    Directory of Open Access Journals (Sweden)

    Trine Madsen

    2017-03-01

    Full Text Available People with mental illness have an increased risk of suicide. The aim of this paper is to provide an overview of suicide risk estimates among psychiatric inpatients based on the body of evidence found in scientific peer-reviewed literature; primarily focusing on the relative risks, rates, time trends, and socio-demographic and clinical risk factors of suicide in psychiatric inpatients. Psychiatric inpatients have a very high risk of suicide relative to the background population, but it remains challenging for clinicians to identify those patients that are most likely to die from suicide during admission. Most studies are based on low power, thus compromising quality and generalisability. The few studies with sufficient statistical power mainly identified non-modifiable risk predictors such as male gender, diagnosis, or recent deliberate self-harm. Also, the predictive value of these predictors is low. It would be of great benefit if future studies would be based on large samples while focusing on modifiable predictors over the course of an admission, such as hopelessness, depressive symptoms, and family/social situations. This would improve our chances of developing better risk assessment tools.

  17. Highly intense lightning over the oceans: Estimated peak currents from global GLD360 observations

    OpenAIRE

    İnan, Umran Savaş; Said, R. K.; Cohen, M. B

    2013-01-01

    We present the first global distribution of the average estimated peak currents in negative lightning flashes using 1 year of continuous data from the Vaisala global lightning data set GLD360. The data set, composed of 353 million flashes, was compared with the National Lightning Detection Network™ for peak current accuracy, location accuracy, and detection efficiency. The validation results demonstrated a mean (geometric mean) peak current magnitude error of 21% (6%), a median lo...

  18. Estimation of Risk Factors - Useful Tools in Assessing Calves Welfare

    Directory of Open Access Journals (Sweden)

    Ioana Andronie

    2013-05-01

    Full Text Available The study was aimed at identifying risk factors that may be used in the welfare assessment of calves reared in intensive farming systems. These factors may be useful to farmers in planning breeding measures in order to increase animal welfare levels in relation to the legislative requirements. The estimation considered the housing conditions of calves aged 0-6 months grouped in two lots, A (n: 50) and B (n: 60), depending on their accommodation system. We monitored the calves' decubitus on the housing surface, body hygiene as well as that of the resting area, and thermal comfort. The assessment was made by direct observation and numerical estimation, based on the Welfare Quality® 2009 protocol (Assessment protocol for cattle) as well as by means of a calf safety and welfare evaluation chart drawn up according to the European and national legislation on minimum calf safety and protection standards. The data collected and processed have shown that not all housing conditions completely meet calves' physiological requirements. Thus the appropriate-housing criterion in the present study was met by lot B at 85% and to a much smaller degree by lot A (76%). The assessment carried out by means of the safety chart indicated that only the minimum criteria for calf rearing were met, which does not translate into a high level of welfare.

  19. Gambling disorder: estimated prevalence rates and risk factors in Macao.

    Science.gov (United States)

    Wu, Anise M S; Lai, Mark H C; Tong, Kwok-Kit

    2014-12-01

    An excessive, problematic gambling pattern has been regarded as a mental disorder in the Diagnostic and Statistical Manual for Mental Disorders (DSM) for more than 3 decades (American Psychiatric Association [APA], 1980). In this study, its latest prevalence in Macao (one of very few cities with legalized gambling in China and the Far East) was estimated with 2 major changes in the diagnostic criteria, suggested by the 5th edition of DSM (APA, 2013): (a) removing the "Illegal Act" criterion, and (b) lowering the threshold for diagnosis. A random, representative sample of 1,018 Macao residents was surveyed with a phone poll design in January 2013. After the 2 changes were adopted, the present study showed that the estimated prevalence rate of gambling disorder was 2.1% of the Macao adult population. Moreover, the present findings also provided empirical support to the application of these 2 recommended changes when assessing symptoms of gambling disorder among Chinese community adults. Personal risk factors of gambling disorder, namely being male, having low education, a preference for casino gambling, as well as high materialism, were identified.

  20. Combining ungrouped and grouped wildfire data to estimate fire risk

    KAUST Repository

    Hernandez-Magallanes, I.

    2013-10-11

    © 2013 John Wiley & Sons, Ltd. Frequently, models are required to combine information obtained from different data sources and on different scales. In this work, we are interested in estimating the risk of wildfire ignition in the USA for a particular time and location by merging two levels of data, namely, individual points and aggregate count of points into areas. The data for federal lands consist of the point location and time of each fire. Nonfederal fires are aggregated by county for a particular year. The probability model is based on the wildfire point process. Assuming a smooth intensity function, a locally weighted likelihood fit is used, which incorporates the group effect. A logit model is used under the assumption of the existence of a latent process, and fuel conditions are included as a covariate. The model assessment is based on a residual analysis, while the False Discovery Rate detects spatial patterns. A benefit of the proposed model is that there is no need of arbitrary aggregation of individual fires into counts. A map of predicted probability of ignition for the Midwest US in 1990 is included. The predicted ignition probabilities and the estimated total number of expected fires are required for the allocation of resources.

  1. Combining estimates of interest in prognostic modelling studies after multiple imputation: current practice and guidelines

    Directory of Open Access Journals (Sweden)

    Holder Roger L

    2009-07-01

    Full Text Available Abstract Background Multiple imputation (MI) provides an effective approach to handle missing covariate data within prognostic modelling studies, as it can properly account for the missing data uncertainty. The multiply imputed datasets are each analysed using standard prognostic modelling techniques to obtain the estimates of interest. The estimates from each imputed dataset are then combined into one overall estimate and variance, incorporating both the within and between imputation variability. Rubin's rules for combining these multiply imputed estimates are based on asymptotic theory. The resulting combined estimates may be more accurate if the posterior distribution of the population parameter of interest is better approximated by the normal distribution. However, the normality assumption may not be appropriate for all the parameters of interest when analysing prognostic modelling studies, such as predicted survival probabilities and model performance measures. Methods Guidelines for combining the estimates of interest when analysing prognostic modelling studies are provided. A literature review is performed to identify current practice for combining such estimates in prognostic modelling studies. Results Methods for combining all reported estimates after MI were not well reported in the current literature. Rubin's rules without applying any transformations were the standard approach used, when any method was stated. Conclusion The proposed simple guidelines for combining estimates after MI may lead to a wider and more appropriate use of MI in future prognostic modelling studies.
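
    For reference, Rubin's rules themselves are a short calculation: the pooled estimate is the mean of the per-imputation estimates, and the total variance adds the between-imputation variance (inflated by 1 + 1/m) to the average within-imputation variance. The sketch below implements that combination; the example numbers are made up.

      import numpy as np

      def rubins_rules(estimates, variances):
          """Combine m point estimates and their within-imputation variances:
          returns the pooled estimate, total variance, and Rubin's degrees of freedom."""
          q = np.asarray(estimates, dtype=float)
          u = np.asarray(variances, dtype=float)
          m = len(q)
          q_bar = q.mean()                      # pooled estimate
          w_bar = u.mean()                      # within-imputation variance
          b = q.var(ddof=1)                     # between-imputation variance
          t = w_bar + (1 + 1/m) * b             # total variance
          r = (1 + 1/m) * b / w_bar
          df = (m - 1) * (1 + 1/r) ** 2         # d.f. of the approximating t distribution
          return q_bar, t, df

      # e.g. log-odds ratios estimated on m = 5 imputed datasets (hypothetical numbers)
      est = [0.42, 0.39, 0.47, 0.44, 0.40]
      var = [0.010, 0.011, 0.009, 0.010, 0.012]
      print(rubins_rules(est, var))

    As the guidelines discussed above suggest, quantities such as predicted survival probabilities are often better pooled on a transformed scale (for example complementary log-log or logit) and back-transformed afterwards.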

  2. Risk-targeted selection of agricultural holdings for post-epidemic surveillance: estimation of efficiency gains.

    Directory of Open Access Journals (Sweden)

    Ian G Handel

    Full Text Available Current post-epidemic sero-surveillance uses random selection of animal holdings. A better strategy may be to estimate the benefit gained by sampling each farm and use this to target selection. In this study we estimate the probability of undiscovered infection for sheep farms in Devon after the 2001 foot-and-mouth disease outbreak, using the combination of a previously published model of daily infection risk and a simple model of the probability of discovery of infection during the outbreak. This allows comparison of the system sensitivity (the ability to detect infection in the area) of arbitrary random sampling and of risk-targeted selection across a full range of sampling budgets. We show that it is possible to achieve 95% system sensitivity by sampling, on average, 945 farms with random sampling and 184 farms with risk-targeted sampling. We also examine the effect of ordering samples by risk to expedite the return to a disease-free status. Risk-ordering the sampling process results in detection of positive farms, if present, 15.6 days sooner than with randomly ordered sampling, assuming 50 farms are tested per day.
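
    The gain from risk-targeting can be illustrated with a small simulation in the spirit of the comparison above: given per-farm probabilities of undiscovered infection, count how many farms must be sampled before system sensitivity reaches 95% under random versus risk-ordered selection. The farm risks below are drawn from an arbitrary Beta distribution, not the Devon estimates, and farms are treated as independent with a perfect farm-level test.

      import numpy as np

      rng = np.random.default_rng(0)
      # Hypothetical per-farm probabilities of undiscovered infection (illustrative only).
      p = rng.beta(0.5, 300, size=5000)

      def system_sensitivity(p_all, sampled_idx):
          """P(sample contains >= 1 undiscovered infected farm | area is infected),
          assuming independence between farms and a perfect farm-level test."""
          p_detect = 1 - np.prod(1 - p_all[sampled_idx])
          p_area_infected = 1 - np.prod(1 - p_all)
          return p_detect / p_area_infected

      def farms_needed(order, target=0.95):
          for n in range(1, len(order) + 1):
              if system_sensitivity(p, order[:n]) >= target:
                  return n
          return len(order)

      random_order = rng.permutation(len(p))
      risk_order = np.argsort(p)[::-1]            # highest-risk farms first
      print("random sampling :", farms_needed(random_order), "farms")
      print("risk-targeted   :", farms_needed(risk_order), "farms")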

  3. Food allergy and risk assessment: Current status and future directions

    Science.gov (United States)

    Remington, Benjamin C.

    2017-09-01

    Risk analysis is a three part, interactive process that consists of a scientific risk assessment, a risk management strategy and an exchange of information through risk communication. Quantitative risk assessment methodologies are now available and widely used for assessing risks regarding the unintentional consumption of major, regulated allergens but new or modified proteins can also pose a risk of de-novo sensitization. The risks due to de-novo sensitization to new food allergies are harder to quantify. There is a need for a systematic, comprehensive battery of tests and assessment strategy to identify and characterise de-novo sensitization to new proteins and the risks associated with them. A risk assessment must be attuned to answer the risk management questions and needs. Consequently, the hazard and risk assessment methods applied and the desired information are determined by the requested outcome for risk management purposes and decisions to be made. The COST Action network (ImpARAS, www.imparas.eu) has recently started to discuss these risk management criteria from first principles and will continue with the broader subject of improving strategies for allergen risk assessment throughout 2016-2018/9.

  4. Efficient Estimation for Semiparametric Varying Coefficient Partially Linear Regression Models with Current Status Data

    Institute of Scientific and Technical Information of China (English)

    Tao Hu; Heng-jian Cui; Xing-wei Tong

    2009-01-01

    This article considers a semiparametric varying-coefficient partially linear regression model with current status data. The semiparametric varying-coefficient partially linear regression model, which is a generalization of the partially linear regression model and the varying-coefficient regression model, allows one to explore the possibly nonlinear effect of a certain covariate on the response variable. A sieve maximum likelihood estimation method is proposed and the asymptotic properties of the proposed estimators are discussed. Under some mild conditions, the estimators are shown to be strongly consistent. The convergence rate of the estimator for the unknown smooth function is obtained and the estimator for the unknown parameter is shown to be asymptotically efficient and normally distributed. Simulation studies are conducted to examine the small-sample properties of the proposed estimates and a real dataset is used to illustrate our approach.

  5. Energy-Extended CES Aggregate Production: Current Aspects of Their Specification and Econometric Estimation

    Directory of Open Access Journals (Sweden)

    Paul E. Brockway

    2017-02-01

    Full Text Available Capital–labour–energy Constant Elasticity of Substitution (CES) production functions and their estimated parameters now form a key part of energy–economy models which inform energy and emissions policy. However, the collation of, and guidance on, the specification and estimation choices involved with such energy-extended CES functions is disparate. This risks poorly specified and estimated CES functions, with knock-on implications for downstream energy–economic models and climate policy. In response, as a first step, this paper assembles in one place the major considerations involved in the empirical estimation of these CES functions. Discussion of the choices and their implications leads to recommendations for CES empiricists. The extensive bibliography allows those interested to dig deeper into any aspect of the CES parameter estimation process.
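
    To make the estimation step concrete, the sketch below fits a basic two-input CES function by nonlinear least squares on synthetic data; an energy-extended (capital-labour-energy) specification adds a nesting structure and the normalisation issues the paper discusses, and none of the numbers here come from real data.

      import numpy as np
      from scipy.optimize import curve_fit

      def ces(X, gamma, delta, rho):
          """Two-input CES: Y = gamma * (delta*K^-rho + (1-delta)*L^-rho)^(-1/rho)."""
          K, L = X
          return gamma * (delta * K**(-rho) + (1 - delta) * L**(-rho)) ** (-1.0 / rho)

      rng = np.random.default_rng(1)
      K = rng.uniform(1, 10, 200)
      L = rng.uniform(1, 10, 200)
      Y = ces((K, L), 2.0, 0.4, 0.8) * np.exp(0.05 * rng.standard_normal(200))  # synthetic data

      params, _ = curve_fit(ces, (K, L), Y, p0=[1.0, 0.5, 0.5],
                            bounds=([0.01, 0.01, -0.9], [10, 0.99, 5]))
      gamma_hat, delta_hat, rho_hat = params
      print(f"elasticity of substitution sigma = {1 / (1 + rho_hat):.3f}")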

  6. Soil-ecological risks for soil degradation estimation

    Science.gov (United States)

    Trifonova, Tatiana; Shirkin, Leonid; Kust, German; Andreeva, Olga

    2016-04-01

    Soil degradation includes the processes by which soil properties and quality worsen, primarily from the point of view of productivity and of the quality of ecosystem services. Complete destruction of the soil cover and/or termination of the functioning of soil forms of organic life are considered extreme stages of soil degradation, and for fragile ecosystems they are normally considered within the desertification, land degradation and drought (DLDD) concept. A block model of the ecotoxic effects generating soil and ecosystem degradation has been developed as a result of long-term field and laboratory research on sod-podzolic soils contaminated with waste containing heavy metals. The model highlights soil degradation mechanisms caused by the direct and indirect impact of ecotoxicants on the "phytocenosis-soil" system and by their combination, which frequently causes a synergistic effect. The sequence of changes can be formalized as a theory of change (a succession of interrelated events). Several stages are distinguished here, from the leaching (release) of heavy metals from waste and their migration down the soil profile to decreased phytoproductivity and changes in phytocenosis composition. Decreased phytoproductivity reduces the amount of cellulose introduced into the soil. The described feedback mechanism acts as a factor of sod-podzolic soil self-purification and stability. It has been shown that, using a phytomass productivity index that integrally reflects the worsening of the complex of soil properties, it is possible to construct dose-response relations and to determine critical load levels for the phytocenosis and the corresponding soil-ecological risks. Soil-ecological risk in the "phytocenosis-soil" system means probable negative changes and the loss of some ecosystem functions during the transformation of the energy of dead organic matter into new biomass. Soil-ecological risks estimation is

  7. How much does HDL cholesterol add to risk estimation? A report from the SCORE Investigators.

    LENUS (Irish Health Repository)

    Cooney, Marie Therese

    2009-06-01

    Systematic COronary Risk Evaluation (SCORE), the risk estimation system recommended by the European guidelines on cardiovascular disease prevention, estimates 10-year risk of cardiovascular disease mortality based on age, sex, country of origin, systolic blood pressure, smoking status and either total cholesterol (TC) or the TC/high-density lipoprotein cholesterol (HDL-C) ratio. As, counterintuitively, these two systems perform very similarly, we have investigated whether incorporating HDL-C and TC as separate variables improves risk estimation.

  8. Distribution of Estimated 10-Year Risk of Recurrent Vascular Events and Residual Risk in a Secondary Prevention Population.

    Science.gov (United States)

    Kaasenbrood, Lotte; Boekholdt, S Matthijs; van der Graaf, Yolanda; Ray, Kausik K; Peters, Ron J G; Kastelein, John J P; Amarenco, Pierre; LaRosa, John C; Cramer, Maarten J M; Westerink, Jan; Kappelle, L Jaap; de Borst, Gert J; Visseren, Frank L J

    2016-11-08

    Among patients with clinically manifest vascular disease, the risk of recurrent vascular events is likely to vary. We assessed the distribution of estimated 10-year risk of recurrent vascular events in a secondary prevention population. We also estimated the potential risk reduction and residual risk that can be achieved if patients reach guideline-recommended risk factor targets. The SMART score (Second Manifestations of Arterial Disease) for 10-year risk of myocardial infarction, stroke, or vascular death was applied to 6904 patients with vascular disease. The risk score was externally validated in 18 436 patients with various manifestations of vascular disease from the TNT (Treating to New Targets), IDEAL (Incremental Decrease in End Points Through Aggressive Lipid Lowering), SPARCL (Stroke Prevention by Aggressive Reduction in Cholesterol Levels), and CAPRIE (Clopidogrel Versus Aspirin in Patients at Risk of Ischemic Events) trials. The residual risk at guideline-recommended targets was estimated by applying relative risk reductions from meta-analyses to the estimated risk for targets for systolic blood pressure, low-density lipoprotein cholesterol, smoking, physical activity, and use of antithrombotic agents. The external performance of the SMART risk score was reasonable, apart from overestimation of risk in patients with 10-year risk >40%. In patients with various manifestations of vascular disease, median 10-year risk of a recurrent major vascular event was 17% (interquartile range, 11%-28%), varying from 30% in 22% of the patients. If risk factors were at guideline-recommended targets, the residual 10-year risk would be 30% in 9% of the patients (median, 11%; interquartile range, 7%-17%). Among patients with vascular disease, there is very substantial variation in estimated 10-year risk of recurrent vascular events. If all modifiable risk factors were at guideline-recommended targets, half of the patients would have a 10-year risk 20% and even >30% 10

  9. Microcephaly Case Fatality Rate Associated with Zika Virus Infection in Brazil: Current Estimates.

    Science.gov (United States)

    Cunha, Antonio José Ledo Alves da; de Magalhães-Barbosa, Maria Clara; Lima-Setta, Fernanda; Medronho, Roberto de Andrade; Prata-Barbosa, Arnaldo

    2017-05-01

    Considering the currently confirmed cases of microcephaly and related deaths associated with Zika virus in Brazil, the estimated case fatality rate is 8.3% (95% confidence interval: 7.2-9.6). However, a third of the reported cases remain under investigation. If the confirmation rates of cases and deaths are the same in the future, the estimated case fatality rate will be as high as 10.5% (95% confidence interval: 9.5-11.7).
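
    The calculation behind a case fatality rate and its confidence interval is a one-line ratio plus an interval formula; the sketch below uses a Wilson score interval and hypothetical counts chosen only so the point estimate lands near the 8.3% reported above.

      from statistics import NormalDist

      def cfr_with_ci(deaths, cases, conf=0.95):
          """Case fatality rate with a Wilson score confidence interval."""
          z = NormalDist().inv_cdf(0.5 + conf / 2)
          p = deaths / cases
          denom = 1 + z**2 / cases
          centre = (p + z**2 / (2 * cases)) / denom
          half = z * ((p * (1 - p) + z**2 / (4 * cases)) / cases) ** 0.5 / denom
          return p, centre - half, centre + half

      print(cfr_with_ci(deaths=150, cases=1800))   # hypothetical counts, not the Brazilian data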

  10. Risk model for estimating the 1-year risk of deferred lesion intervention following deferred revascularization after fractional flow reserve assessment.

    Science.gov (United States)

    Depta, Jeremiah P; Patel, Jayendrakumar S; Novak, Eric; Gage, Brian F; Masrani, Shriti K; Raymer, David; Facey, Gabrielle; Patel, Yogesh; Zajarias, Alan; Lasala, John M; Amin, Amit P; Kurz, Howard I; Singh, Jasvindar; Bach, Richard G

    2015-02-21

    Although lesions for which revascularization is deferred following fractional flow reserve (FFR) assessment have a low risk of adverse cardiac events, variability in the risk of deferred lesion intervention (DLI) has not been previously evaluated. The aim of this study was to develop a prediction model to estimate the 1-year risk of DLI for coronary lesions where revascularization was not performed following FFR assessment. A prediction model for DLI was developed from a cohort of 721 patients with 882 coronary lesions where revascularization was deferred based on FFR between 10/2002 and 7/2010. Deferred lesion intervention was defined as any revascularization of a lesion previously deferred following FFR. The final DLI model was developed using stepwise Cox regression and validated using bootstrapping techniques. An algorithm was constructed to predict the 1-year risk of DLI. During a mean (±SD) follow-up period of 4.0 ± 2.3 years, 18% of lesions deferred after FFR underwent DLI; the 1-year incidence of DLI was 5.3%, while the predicted risk of DLI varied from 1 to 40%. The final Cox model included the FFR value, age, current or former smoking, history of coronary artery disease (CAD) or prior percutaneous coronary intervention, multi-vessel CAD, and serum creatinine. The c statistic for the DLI prediction model was 0.66 (95% confidence interval, CI: 0.61-0.70). Patients in whom revascularization is deferred based on FFR vary in their risk of DLI. A clinical prediction model consisting of five clinical variables and the FFR value can help predict the risk of DLI in the first year following FFR assessment. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2014. For permissions please email: journals.permissions@oup.com.
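
    A Cox-regression-based score of this kind is typically applied as risk = 1 − S0(1 year)^exp(linear predictor). The sketch below shows that mechanic with invented coefficients, covariate means and baseline survival; it is not the published DLI model.

      import math

      # Hypothetical log-hazard ratios and covariate means (placeholders only).
      BETA = {"ffr": -3.0, "age_per_10y": -0.10, "smoker": 0.40,
              "prior_cad_or_pci": 0.50, "multivessel_cad": 0.35, "creatinine_mg_dl": 0.25}
      MEAN = {"ffr": 0.87, "age_per_10y": 6.5, "smoker": 0.3,
              "prior_cad_or_pci": 0.5, "multivessel_cad": 0.4, "creatinine_mg_dl": 1.0}
      S0_1YR = 0.95          # hypothetical baseline 1-year event-free survival

      def one_year_dli_risk(x):
          lp = sum(BETA[k] * (x[k] - MEAN[k]) for k in BETA)   # centred linear predictor
          return 1.0 - S0_1YR ** math.exp(lp)

      patient = {"ffr": 0.82, "age_per_10y": 7.1, "smoker": 1,
                 "prior_cad_or_pci": 1, "multivessel_cad": 1, "creatinine_mg_dl": 1.2}
      print(f"predicted 1-year DLI risk: {one_year_dli_risk(patient):.1%}")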

  11. ESTIMATING RISK TO CALIFORNIA ENERGY INFRASTRUCTURE FROM PROJECTED CLIMATE CHANGE

    Energy Technology Data Exchange (ETDEWEB)

    Sathaye, Jayant; Dale, Larry; Larsen, Peter; Fitts, Gary; Koy, Kevin; Lewis, Sarah; Lucena, Andre

    2011-06-22

    This report outlines the results of a study of the impact of climate change on the energy infrastructure of California and the San Francisco Bay region, including impacts on power plant generation; transmission line and substation capacity during heat spells; wildfires near transmission lines; sea level encroachment upon power plants, substations, and natural gas facilities; and peak electrical demand. Some end-of-century impacts were projected: Expected warming will decrease gas-fired generator efficiency. The maximum statewide coincident loss is projected at 10.3 gigawatts (with current power plant infrastructure and population), an increase of 6.2 percent over current temperature-induced losses. By the end of the century, electricity demand for almost all summer days is expected to exceed the current ninetieth percentile per-capita peak load. As much as 21 percent growth is expected in ninetieth percentile peak demand (per capita, exclusive of population growth). When generator losses are included in the demand, the ninetieth percentile peaks may increase up to 25 percent. As the climate warms, California's peak supply capacity will need to grow faster than the population. Substation capacity is projected to decrease an average of 2.7 percent. A 5°C (9°F) air temperature increase (the average increase predicted for hot days in August) will diminish the capacity of a fully loaded transmission line by an average of 7.5 percent. The potential exposure of transmission lines to wildfire is expected to increase with time. We have identified some lines whose probability of exposure to fire is expected to increase by as much as 40 percent. Up to 25 coastal power plants and 86 substations are at risk of flooding (or partial flooding) due to sea level rise.

  12. Estimating Financial Risk Measures for Futures Positions:A Non-Parametric Approach

    OpenAIRE

    Cotter, John; dowd, kevin

    2011-01-01

    This paper presents non-parametric estimates of spectral risk measures applied to long and short positions in 5 prominent equity futures contracts. It also compares these to estimates of two popular alternative measures, the Value-at-Risk (VaR) and Expected Shortfall (ES). The spectral risk measures are conditioned on the coefficient of absolute risk aversion, and the latter two are conditioned on the confidence level. Our findings indicate that all risk measures increase dramatically and the...
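
    The three families of risk measures compared above can all be computed from the same sample of returns. The sketch below estimates historical-simulation VaR and Expected Shortfall at a given confidence level, and a spectral risk measure using the common exponential risk-aversion weighting function; the simulated return series and parameter choices are arbitrary, not the futures data of the paper.

      import numpy as np

      rng = np.random.default_rng(0)
      returns = 0.0005 + 0.01 * rng.standard_t(df=5, size=5000)   # hypothetical daily returns
      losses = np.sort(-returns)                                   # losses, ascending

      def var_es(losses_sorted, alpha=0.99):
          """Historical-simulation VaR and Expected Shortfall at confidence alpha."""
          var = np.quantile(losses_sorted, alpha)
          es = losses_sorted[losses_sorted >= var].mean()
          return var, es

      def spectral_risk(losses_sorted, k=25.0):
          """Spectral risk measure with exponential weighting phi(u) = k*exp(-k(1-u))/(1-exp(-k)),
          where k is the coefficient of absolute risk aversion, on the order statistics."""
          n = len(losses_sorted)
          u = (np.arange(1, n + 1) - 0.5) / n
          phi = k * np.exp(-k * (1 - u)) / (1 - np.exp(-k))
          w = phi / phi.sum()
          return np.sum(w * losses_sorted)

      print(var_es(losses), spectral_risk(losses, k=25))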

  13. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    Science.gov (United States)

    Skandamis, Panagiotis N.; Andritsos, Nikolaos; Psomas, Antonios; Paramythiotis, Spyridon

    2015-01-01

    The current status of the food safety supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To do that, a series of new Risk Metrics has been established as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, defined by quantitative risk assessments, and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the tolerable public health burden (it addresses the total 'failure' that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products have been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and of the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and the Microbial Responses Viewer), or introduced into user-friendly software

  14. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    Energy Technology Data Exchange (ETDEWEB)

    Skandamis, Panagiotis N., E-mail: pskan@aua.gr; Andritsos, Nikolaos, E-mail: pskan@aua.gr; Psomas, Antonios, E-mail: pskan@aua.gr; Paramythiotis, Spyridon, E-mail: pskan@aua.gr [Laboratory of Food Quality Control and Hygiene, Department of Food Science and Technology, Agricultural University of Athens, Iera Odos 75, 118 55, Athens (Greece)

    2015-01-22

    The current status of the food safety supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To do that, a series of new Risk Metrics has been established as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, defined by quantitative risk assessments, and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the tolerable public health burden (it addresses the total ‘failure’ that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products have been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and the development of stochastic and kinetic models, which are also available in the form of Web-based applications, e.g., COMBASE and Microbial Responses Viewer), or introduced into user

  15. Non-invasively measured cardiac magnetic field maps improve the estimation of the current distribution

    OpenAIRE

    Kosch, Olaf; Steinhoff, Uwe; Trahms, Lutz; Trontelj, Zvonko; Jazbinšek, Vojko

    2015-01-01

    Comprehensive body surface potential mapping (BSPM) and magnetic field mapping (MFM) measurements have been carried out in order to improve the estimation of the current distribution generated by the human heart. Electric and magnetic fields and also the planar gradient of the magnetic field during the QRS complex were imaged as a time series of field maps. A model of the current distribution should explain the features of both BSPM and MFM. Simulated maps generated by a single dipole or a st...

  16. Methodology for the Model-based Small Area Estimates of Cancer Risk Factors and Screening Behaviors - Small Area Estimates

    Science.gov (United States)

    This model-based approach uses data from both the Behavioral Risk Factor Surveillance System (BRFSS) and the National Health Interview Survey (NHIS) to produce estimates of the prevalence rates of cancer risk factors and screening behaviors at the state, health service area, and county levels.

  17. Three-dimensional ventricular activation imaging by means of equivalent current source modeling and estimation.

    Science.gov (United States)

    Liu, Z; Liu, C; He, B

    2006-01-01

    This paper presents a novel electrocardiographic inverse approach for imaging the 3-D ventricular activation sequence based on the modeling and estimation of the equivalent current density throughout the entire myocardial volume. The spatio-temporal coherence of the ventricular excitation process is utilized to derive the activation time from the estimated time course of the equivalent current density. At each time instant during the period of ventricular activation, the distributed equivalent current density is noninvasively estimated from body surface potential maps (BSPM) using a weighted minimum norm approach with a spatio-temporal regularization strategy based on the singular value decomposition of the BSPMs. The activation time at any given location within the ventricular myocardium is determined as the time point with the maximum local current density estimate. Computer simulation has been performed to evaluate the capability of this approach to image the 3-D ventricular activation sequence initiated from a single pacing site in a physiologically realistic cellular automaton heart model. The simulation results demonstrate that the simulated "true" activation sequence can be accurately reconstructed with an average correlation coefficient of 0.90, relative error of 0.19, and the origin of ventricular excitation can be localized with an average localization error of 5.5 mm for 12 different pacing sites distributed throughout the ventricles.
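
    The estimation step described above is, at its core, a regularised linear inverse problem solved independently at each time instant. The sketch below shows a generic weighted minimum norm estimate with Tikhonov regularisation on random stand-in data; the paper's actual approach additionally chooses the regularisation spatio-temporally via an SVD of the BSPMs and takes the activation time at each location as the time of maximum current density magnitude.

      import numpy as np

      rng = np.random.default_rng(0)
      n_electrodes, n_sources = 64, 300
      A = rng.standard_normal((n_electrodes, n_sources))      # stand-in lead-field (forward) matrix
      W = np.diag(1.0 / np.linalg.norm(A, axis=0))            # column-norm (depth) weighting
      b = rng.standard_normal(n_electrodes)                   # one body-surface potential map

      lam = 0.1                                               # regularisation parameter
      Aw = A @ W
      # Weighted minimum norm solution: j = W Aw^T (Aw Aw^T + lam I)^-1 b
      j_hat = W @ Aw.T @ np.linalg.solve(Aw @ Aw.T + lam * np.eye(n_electrodes), b)
      print(j_hat.shape)                                      # equivalent current density estimate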

  18. Estimation of friction parameters in gravity currents by data assimilation in a model hierarchy

    Directory of Open Access Journals (Sweden)

    A. Wirth

    2011-01-01

    Full Text Available This paper is the last in a series of three investigating the friction laws and their parametrisation in idealised gravity currents in a rotating frame. Results on the dynamics of a gravity current (Wirth, 2009) and on the estimation of friction laws by data assimilation (Wirth and Verron, 2008) are combined to estimate the friction parameters and discriminate between friction laws in non-hydrostatic numerical simulations of gravity current dynamics, using data assimilation and a reduced gravity shallow water model.

    I demonstrate that friction parameters and laws in gravity currents can be estimated using data assimilation. The results clearly show that friction follows a linear Rayleigh law for small Reynolds numbers and the estimated value agrees well with the analytical value obtained for non-accelerating Ekman layers. A significant and sudden departure towards a quadratic drag law at an Ekman-layer-based Reynolds number of around 800 is shown, in agreement with classical laboratory experiments. The drag coefficient obtained compares well to friction values over smooth surfaces. I show that data assimilation can be used to determine friction parameters and discriminate between friction laws and that it is a powerful tool for systematically connecting models within a model hierarchy.

  19. Estimation of friction parameters in gravity currents by data assimilation in a model hierarchy

    Directory of Open Access Journals (Sweden)

    A. Wirth

    2011-04-01

    Full Text Available This paper is the last in a series of three investigating the friction laws and their parametrisation in idealised gravity currents in a rotating frame. Results on the dynamics of a gravity current (Wirth, 2009) and on the estimation of friction laws by data assimilation (Wirth and Verron, 2008) are combined to estimate the friction parameters and discriminate between friction laws in non-hydrostatic numerical simulations of gravity current dynamics, using data assimilation and a reduced gravity shallow water model.

    I demonstrate that friction parameters and laws in gravity currents can be estimated using data assimilation. The results clearly show that friction follows a linear Rayleigh law for small Reynolds numbers and the estimated value agrees well with the analytical value obtained for non-accelerating Ekman layers. A significant and sudden departure towards a quadratic drag law at an Ekman-layer-based Reynolds number of around 800 is shown, in agreement with classical laboratory experiments. The drag coefficient obtained compares well to friction values over smooth surfaces. I show that data assimilation can be used to determine friction parameters and discriminate between friction laws and that it is a powerful tool for systematically connecting models within a model hierarchy.

  20. Estimation of current density distribution of PAFC by analysis of cell exhaust gas

    Energy Technology Data Exchange (ETDEWEB)

    Kato, S.; Seya, A. [Fuji Electric Co., Ltd., Ichihara-shi (Japan); Asano, A. [Fuji Electric Corporate, Ltd., Yokosuka-shi (Japan)

    1996-12-31

    Estimating the distributions of current densities, voltages, gas concentrations, etc., in phosphoric acid fuel cell (PAFC) stacks is very important for obtaining fuel cells of higher quality. In this work, we have developed a numerical simulation tool to map out these distributions in a PAFC stack. In particular, to study the current density distribution in the reaction area of the cell, we analyzed the gas composition at several positions inside a gas outlet manifold of the PAFC stack. By comparing these measured data with calculated data, the current density distribution in a cell plane calculated by the simulation was verified.

  1. Screening Mammography: Patient Perceptions and Preferences Regarding Communication of Estimated Breast Cancer Risk.

    Science.gov (United States)

    Amornsiripanitch, Nita; Mangano, Mark; Niell, Bethany L

    2017-05-01

    Many models exist to estimate a woman's risk of development of breast cancer. At screening mammography, many imaging centers collect data required for these models to identify women who may benefit from supplemental screening and referral for cancer risk assessment. The purpose of this study was to discern perceptions and preferences of screening mammography patients regarding communication of estimated breast cancer risk. An anonymous survey was distributed to screening and surveillance mammography patients between April and June 2015. Survey questions were designed to assess patient preferences regarding the receipt and complexity of risk estimate communication, including hypothetical scenarios with and without > 20% estimated risk of breast cancer. The McNemar test and the Wilcoxon signed rank test were used with p ≤ 0.05 considered statistically significant. The survey was distributed to 1061 screening and surveillance mammography patients, and 503 patients responded (response rate, 47%). Although 86% (431/503) of patients expressed interest in learning their estimated risk, only 8% (38/503) had undergone formal risk assessment. The preferred method (241 respondents [26%]) of communication of risk 20%, patients preferred oral communication and were 10-fold as likely to choose only oral communication (p 20%, patients preferred to learn their estimated risk in great detail (69% and 85%), although women were significantly more likely to choose greater detail for risk > 20% (p < 0.00001). Screening mammography patients expressed interest in learning their estimated risk of breast cancer regardless of their level of hypothetical risk.

  2. Assessment and care for non-medical risk factors in current antenatal health care

    NARCIS (Netherlands)

    A.A. Vos (Amber); Leeman, A. (Annemiek); W. Waelput (Wim); G.J. Bonsel (Gouke); E.A.P. Steegers (Eric); S. Denktaş (Semiha)

    2015-01-01

    Objective: this study aims to identify current practice in risk assessment, current antenatal policy and referral possibilities for non-medical risk factors (lifestyle and social risk factors), and to explore the satisfaction among obstetric caregivers in their collaboration with

  3. Cancer risk estimation caused by radiation exposure during endovascular procedure

    Science.gov (United States)

    Kang, Y. H.; Cho, J. H.; Yun, W. S.; Park, K. H.; Kim, H. G.; Kwon, S. M.

    2014-05-01

    The objective of this study was to identify the radiation exposure dose of patients, as well as of staff, caused by fluoroscopy during C-arm-assisted vascular surgical operations, and to estimate the carcinogenic risk due to such exposure. The study was conducted in 71 patients (53 men and 18 women) who had undergone vascular surgical intervention at the division of vascular surgery of the University Hospital from November 2011 to April 2012. A mobile C-arm device was used and the radiation exposure dose of each patient (dose-area product, DAP) was calculated. The effective dose of staff was measured by attaching optically stimulated luminescence dosimeters to the radiation protectors of staff who participated in the vascular surgical operation. From the study results, the DAP value of patients was 308.7 Gy cm2 on average, and the maximum value was 3085 Gy cm2. When converted to effective dose, the resulting mean was 6.2 mSv and the maximum effective dose was 61.7 millisievert (mSv). The effective dose of staff was 3.85 mSv; the radiation technician received 1.04 mSv and the nurse 1.31 mSv. The estimated all-cancer incidence for the operator corresponds to 2355 persons per 100,000 persons, meaning that about 1 in 42 persons is likely to develop cancer. In conclusion, vascular surgeons, as the supervisors of fluoroscopy, should keep radiation protection for the patient, staff, and all participants in the intervention in mind, and should understand the effects of radiation themselves, in order to prevent this invisible danger during the intervention and to minimize the harm.
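
    The dose arithmetic reported above amounts to converting a dose-area product (DAP) to an approximate effective dose with a conversion coefficient; the coefficient used below is a placeholder chosen only to be consistent with the reported numbers (6.2 mSv for 308.7 Gy cm2), not a value taken from the paper.

      # Hedged sketch: DAP-to-effective-dose conversion with an assumed coefficient.
      DAP_TO_EFFECTIVE_DOSE = 0.02   # mSv per Gy*cm^2 (assumed, procedure-dependent in practice)

      def effective_dose_msv(dap_gy_cm2: float) -> float:
          return dap_gy_cm2 * DAP_TO_EFFECTIVE_DOSE

      for dap in (308.7, 3085.0):    # mean and maximum DAP from the study
          print(f"DAP {dap:7.1f} Gy*cm^2 -> ~{effective_dose_msv(dap):5.1f} mSv")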

  4. Interactions of Lipid Genetic Risk Scores with Estimates of Metabolic Health in a Danish Population

    DEFF Research Database (Denmark)

    Justesen, Johanne M; Allin, Kristine H; Sandholt, Camilla H

    2015-01-01

    Background—There are several well-established lifestyle factors influencing dyslipidemia and, currently, 157 genetic susceptibility loci have been reported to be associated with serum lipid levels at genome-wide statistical significance. However, the interplay between lifestyle risk factors...... and these susceptibility loci has not been fully elucidated. We tested whether genetic risk scores (GRS) of lipid-associated single nucleotide polymorphisms associate with fasting serum lipid traits and whether the effects are modulated by lifestyle factors or estimates of metabolic health. Methods and Results—The single......-cholesterol, high-density lipoprotein-cholesterol, or triglyceride, 4 weighted GRS were constructed. In a cross-sectional design, we investigated whether the effect of these weighted GRSs on lipid levels was modulated by diet, alcohol consumption, physical activity, and smoking or the individual metabolic health...
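
    A weighted GRS of the kind described above is simply a weighted count of risk alleles per person; in practice the weights would be the published per-allele effect sizes for the lipid trait in question, whereas the values below are made up for illustration.

      import numpy as np

      weights = np.array([0.12, 0.08, 0.05, 0.20])   # hypothetical per-SNP effect sizes
      genotypes = np.array([                          # risk-allele counts (0/1/2) per person
          [2, 1, 0, 1],
          [1, 0, 2, 2],
          [0, 1, 1, 0],
      ])
      grs = genotypes @ weights                       # one weighted genetic risk score per person
      print(grs)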

  5. Capsaicinoids Modulating Cardiometabolic Syndrome Risk Factors: Current Perspectives

    Directory of Open Access Journals (Sweden)

    Vijaya Juturu

    2016-01-01

    Full Text Available Capsaicinoids are bioactive nutrients present within red hot peppers, reported to cut ad libitum food intake, to increase energy expenditure (thermogenesis and lipolysis), and to result in weight loss over time. They have also shown further benefits, such as reducing oxidative stress and inflammation, improving vascular health and endothelial function, lowering blood pressure, reducing endothelial cytokines, lowering cholesterol, reducing blood glucose, improving insulin sensitivity, and reducing inflammatory risk factors. Together, these beneficial effects help to modulate cardiometabolic syndrome risk factors. Early identification of cardiometabolic risk factors can help prevent obesity, hypertension, diabetes, and cardiovascular disease.

  6. ADDITIVE-MULTIPLICATIVE MODEL FOR RISK ESTIMATION IN THE PRODUCTION OF ROCKET AND SPACE TECHNICS

    Directory of Open Access Journals (Sweden)

    Orlov A. I.

    2014-10-01

    Full Text Available For the first time we have developed a general additive-multiplicative model of risk estimation (to estimate the probabilities of risk events). In the two-level system, risk estimates are combined additively at the lower level and multiplicatively at the top level. The additive-multiplicative model was used for risk estimation for (1) implementation of innovative projects at universities (with external partners), (2) the production of new innovative products, and (3) projects for the creation of rocket and space equipment.
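
    As a concrete reading of the two-level aggregation just described, the sketch below combines hypothetical partial risk scores additively within each group and multiplies the group totals. The group names and numbers are illustrative assumptions, not values from the paper.

```python
# A minimal sketch of a two-level additive-multiplicative aggregation as described
# in the record above: partial risk estimates are summed within each group (lower
# level) and the group totals are multiplied (top level). Group names and values
# are hypothetical.
from typing import Dict, List

def aggregate_risk(groups: Dict[str, List[float]]) -> float:
    """Combine item scores additively within groups, then multiplicatively across groups."""
    overall = 1.0
    for items in groups.values():
        overall *= sum(items)  # lower level additive, top level multiplicative
    return overall

example = {
    "technical": [0.10, 0.05, 0.02],      # hypothetical partial estimates
    "organisational": [0.08, 0.04],
    "financial": [0.12, 0.03],
}
print(f"aggregated risk index: {aggregate_risk(example):.4f}")
```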

  7. How Should Risk-Based Regulation Reflect Current Public Opinion?

    Science.gov (United States)

    Pollock, Christopher John

    2016-08-01

    Risk-based regulation of novel agricultural products, with public choice exercised via traceability and labelling, is a more effective approach than using regulatory processes to reflect public concerns, which may not always be supported by evidence.

  8. Current Credit Risk Management Practices in Chinese Banking Industry

    OpenAIRE

    Zheng, Hao

    2008-01-01

    ABSTRACT Bank loans can be characterized as the engine of the Chinese economy, as the economy is largely financed by bank loans. As a result, credit risk management in Chinese banks is not only an issue for those banks but is also essential to the stability of the whole economy. Inadequate credit risk management practices could create an unforeseeable disaster for China in the future, especially with a significant increase in credit expansion. In the last decade, the government has tried ha...

  9. Lipoprotein(a) as a cardiovascular risk factor: current status

    DEFF Research Database (Denmark)

    Nordestgaard, Børge G; Chapman, M John; Ray, Kausik

    2010-01-01

    The aims of the study were, first, to critically evaluate lipoprotein(a) [Lp(a)] as a cardiovascular risk factor and, second, to advise on screening for elevated plasma Lp(a), on desirable levels, and on therapeutic strategies.

  10. Estimation of 10-Year Risk of Coronary Heart Disease in Nepalese Patients with Type 2 Diabetes: Framingham Versus United Kingdom Prospective Diabetes Study.

    Science.gov (United States)

    Pokharel, Daya Ram; Khadka, Dipendra; Sigdel, Manoj; Yadav, Naval Kishor; Sapkota, Lokendra Bahadur; Kafle, Ramchandra; Nepal, Sarthak; Sapkota, Ravindra Mohan; Choudhary, Niraj

    2015-08-01

    Predicting future coronary heart disease (CHD) risk with the help of a validated risk prediction function helps clinicians identify diabetic patients at high risk and provide them with appropriate preventive medicine. The aim of this study is to estimate and compare 10-year CHD risks of Nepalese diabetic patients using the two most common risk prediction functions: the Framingham risk equation and the United Kingdom Prospective Diabetes Study (UKPDS) risk engine, which are yet to be validated for the Nepalese population. We conducted a hospital-based, cross-sectional study on 524 patients with type 2 diabetes. Baseline and biochemical variables of individual patients were recorded and CHD risks were estimated by the Framingham and UKPDS risk prediction functions. Estimated risks were categorized as low, medium, and high. The estimated CHD risks were compared using kappa statistics, Pearson's bivariate correlation, Bland-Altman plots, and multiple regression analysis. The mean 10-year CHD risks estimated by the Framingham and UKPDS risk functions were 17.7 ± 12.1 and 16.8 ± 15 (bias: 0.88, P > 0.05), respectively, and were always higher in males and older age groups (P < 0.001). The two risk functions showed moderate convergent validity in predicting CHD risks, but differed in stratifying them and explaining the patients' risk profile. The Framingham equation predicted higher risk for patients usually below 70 years and showed better association with their current risk profile than the UKPDS risk engine. Based on the predicted risk, Nepalese diabetic patients, particularly those with increased numbers of risk factors, bear a higher risk of future CHD. Since this study is cross-sectional and uses externally validated risk functions, Nepalese clinicians should use them with caution, and preferably in combination with other guidelines, while making important medical decisions in preventive therapy of CHD.
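
    The agreement statistics mentioned above (mean bias and Bland-Altman comparison of the two risk scores) can be computed with a few lines of NumPy. The sketch below uses synthetic risk values that merely mimic the reported means and spread; it is not the study's data or code.

```python
# Minimal sketch (not the study's code): agreement between two 10-year CHD risk
# estimates per patient, summarized by the mean bias and Bland-Altman 95% limits
# of agreement named in the abstract. The input arrays are synthetic values that
# only mimic the reported means and spread.
import numpy as np

def bland_altman(risk_a: np.ndarray, risk_b: np.ndarray):
    """Return the mean bias and the 95% limits of agreement between two scores."""
    diff = risk_a - risk_b
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, (bias - half_width, bias + half_width)

rng = np.random.default_rng(0)
framingham = rng.normal(17.7, 12.1, size=500).clip(0)        # illustrative only
ukpds = (framingham + rng.normal(0.9, 5.0, size=500)).clip(0)
bias, limits = bland_altman(framingham, ukpds)
print(f"bias = {bias:.2f}, limits of agreement = ({limits[0]:.1f}, {limits[1]:.1f})")
```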

  11. Estimation of 10-year risk of coronary heart disease in nepalese patients with type 2 diabetes: Framingham versus United Kingdom prospective diabetes study

    Directory of Open Access Journals (Sweden)

    Daya Ram Pokharel

    2015-01-01

    Full Text Available Background: Predicting future coronary heart disease (CHD) risk with the help of a validated risk prediction function helps clinicians identify diabetic patients at high risk and provide them with appropriate preventive medicine. Aim: The aim of this study is to estimate and compare 10-year CHD risks of Nepalese diabetic patients using two most common risk prediction functions: The Framingham risk equation and United Kingdom Prospective Diabetes Study (UKPDS) risk engine that are yet to be validated for Nepalese population. Patients and Methods: We conducted a hospital-based, cross-sectional study on 524 patients with type 2 diabetes. Baseline and biochemical variables of individual patients were recorded and CHD risks were estimated by the Framingham and UKPDS risk prediction functions. Estimated risks were categorized as low, medium, and high. The estimated CHD risks were compared using kappa statistics, Pearson's bivariate correlation, Bland-Altman plots, and multiple regression analysis. Results: The mean 10-year CHD risks estimated by the Framingham and UKPDS risk functions were 17.7 ± 12.1 and 16.8 ± 15 (bias: 0.88, P > 0.05), respectively, and were always higher in males and older age groups (P < 0.001). The two risk functions showed moderate convergent validity in predicting CHD risks, but differed in stratifying them and explaining the patients' risk profile. The Framingham equation predicted higher risk for patients usually below 70 years and showed better association with their current risk profile than the UKPDS risk engine. Conclusions: Based on the predicted risk, Nepalese diabetic patients, particularly those associated with increased numbers of risk factors, bear higher risk of future CHDs. Since this study is a cross-sectional one and uses externally validated risk functions, Nepalese clinicians should use them with caution, and preferably in combination with other guidelines, while making important medical decisions in

  12. A Method to Simultaneously Detect the Current Sensor Fault and Estimate the State of Energy for Batteries in Electric Vehicles

    Science.gov (United States)

    Xu, Jun; Wang, Jing; Li, Shiying; Cao, Binggang

    2016-01-01

    Recently, state of energy (SOE) has become one of the most fundamental parameters for battery management systems in electric vehicles. Current information is critical for SOE estimation and is usually obtained from a current sensor; if the current sensor fails, however, the SOE estimate may be subject to large errors. This paper therefore attempts to make the following contributions: current sensor fault detection and SOE estimation are realized simultaneously. Using a proportional integral observer (PIO) based method, the current sensor fault can be accurately estimated. By taking advantage of the accurately estimated current sensor fault, its influence can be eliminated and compensated, so the SOE estimation results are influenced little by the fault. In addition, a simulation and experimental workbench was established to verify the proposed method. The results indicate that the current sensor fault can be estimated accurately and that, simultaneously, the SOE can also be estimated accurately, with an estimation error that is influenced little by the fault. The maximum SOE estimation error is less than 2%, even though the large current error caused by the current sensor fault still exists.

  13. A Method to Simultaneously Detect the Current Sensor Fault and Estimate the State of Energy for Batteries in Electric Vehicles.

    Science.gov (United States)

    Xu, Jun; Wang, Jing; Li, Shiying; Cao, Binggang

    2016-08-19

    Recently, state of energy (SOE) has become one of the most fundamental parameters for battery management systems in electric vehicles. Current information is critical for SOE estimation and is usually obtained from a current sensor; if the current sensor fails, however, the SOE estimate may be subject to large errors. This paper therefore attempts to make the following contributions: current sensor fault detection and SOE estimation are realized simultaneously. Using a proportional integral observer (PIO) based method, the current sensor fault can be accurately estimated. By taking advantage of the accurately estimated current sensor fault, its influence can be eliminated and compensated, so the SOE estimation results are influenced little by the fault. In addition, a simulation and experimental workbench was established to verify the proposed method. The results indicate that the current sensor fault can be estimated accurately and that, simultaneously, the SOE can also be estimated accurately, with an estimation error that is influenced little by the fault. The maximum SOE estimation error is less than 2%, even though the large current error caused by the current sensor fault still exists.
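
    To make the compensation idea concrete, here is a minimal, hypothetical sketch (not the paper's PIO design): an energy-counting SOE update in which an already-estimated sensor fault is subtracted from the measured current before the update, so the SOE trajectory is not biased by the sensor offset. All battery parameters and the fault magnitude are assumptions.

```python
# A hypothetical sketch of the compensation idea (not the paper's PIO design):
# the estimated current-sensor fault is subtracted from the measured current
# before an energy-counting state-of-energy (SOE) update, so the SOE trajectory
# is not biased by the sensor offset. Battery parameters and the fault are assumed.
def update_soe(soe, current_a, voltage_v, dt_s, e_total_wh, fault_a=0.0):
    """One energy-counting SOE step with the estimated sensor fault removed."""
    true_current = current_a - fault_a                 # compensate the estimated fault
    energy_wh = true_current * voltage_v * dt_s / 3600.0
    return soe - energy_wh / e_total_wh

soe = 0.9
for _ in range(3600):                                  # one hour at 1 s steps
    measured_i = 10.0 + 2.0                            # 2 A offset = simulated sensor fault
    soe = update_soe(soe, measured_i, 3.7, 1.0, e_total_wh=200.0, fault_a=2.0)
print(f"SOE after one hour: {soe:.3f}")
```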

  14. A framework for estimating radiation-related cancer risks in Japan from the 2011 Fukushima nuclear accident.

    Science.gov (United States)

    Walsh, L; Zhang, W; Shore, R E; Auvinen, A; Laurier, D; Wakeford, R; Jacob, P; Gent, N; Anspaugh, L R; Schüz, J; Kesminiene, A; van Deventer, E; Tritscher, A; del Rosarion Pérez, M

    2014-11-01

    We present here a methodology for health risk assessment adopted by the World Health Organization that provides a framework for estimating risks from the Fukushima nuclear accident after the major Japanese earthquake and tsunami of March 11, 2011. Substantial attention has been given to the possible health risks associated with human exposure to radiation from damaged reactors at the Fukushima Daiichi nuclear power station. Cumulative doses were estimated and applied for each post-accident year of life, based on a reference level of exposure during the first year after the earthquake. A lifetime cumulative dose of twice the first year dose was estimated for the primary radionuclide contaminants (134Cs and 137Cs) and was based on Chernobyl data, relative abundances of cesium isotopes, and cleanup efforts. Risks for particularly radiosensitive cancer sites (leukemia, thyroid and breast cancer), as well as the combined risk for all solid cancers, were considered. The male and female cumulative risks of cancer incidence attributed to radiation doses from the accident, for those exposed at various ages, were estimated in terms of the lifetime attributable risk (LAR). Calculations of LAR were based on recent Japanese population statistics for cancer incidence and current radiation risk models from the Life Span Study of Japanese A-bomb survivors. Cancer risks over an initial period of 15 years after first exposure were also considered. LAR results were also given as a percentage of the lifetime baseline risk (i.e., the cancer risk in the absence of radiation exposure from the accident). The LAR results were based on either a reference first year dose (10 mGy) or a reference lifetime dose (20 mGy) so that risk assessment may be applied for relocated and non-relocated members of the public, as well as for adult male emergency workers. The results show that the major contribution to LAR from the reference lifetime dose comes from the first year dose. For a dose of 10 mGy in

  15. Convolution-based estimation of organ dose in tube current modulated CT

    Science.gov (United States)

    Tian, Xiaoyu; Segars, W. Paul; Dixon, Robert L.; Samei, Ehsan

    2016-05-01

    Estimating organ dose for clinical patients requires accurate modeling of the patient anatomy and the dose field of the CT exam. The modeling of patient anatomy can be achieved using a library of representative computational phantoms (Samei et al 2014 Pediatr. Radiol. 44 460-7). The modeling of the dose field can be challenging for CT exams performed with a tube current modulation (TCM) technique. The purpose of this work was to effectively model the dose field for TCM exams using a convolution-based method. A framework was further proposed for prospective and retrospective organ dose estimation in clinical practice. The study included 60 adult patients (age range: 18-70 years, weight range: 60-180 kg). Patient-specific computational phantoms were generated based on patient CT image datasets. A previously validated Monte Carlo simulation program was used to model a clinical CT scanner (SOMATOM Definition Flash, Siemens Healthcare, Forchheim, Germany). A practical strategy was developed to achieve real-time organ dose estimation for a given clinical patient. CTDIvol-normalized organ dose coefficients (h_Organ) under constant tube current were estimated and modeled as a function of patient size. Each clinical patient in the library was optimally matched to another computational phantom to obtain a representation of organ location/distribution. The patient organ distribution was convolved with a dose distribution profile to generate (CTDIvol)_organ,convolution values that quantified the regional dose field for each organ. The organ dose was estimated by multiplying (CTDIvol)_organ,convolution with the organ dose coefficients (h_Organ). To validate the accuracy of this dose estimation technique, the organ dose of the original clinical patient was estimated using Monte Carlo program with TCM profiles explicitly modeled. The

  16. Convolution-based estimation of organ dose in tube current modulated CT.

    Science.gov (United States)

    Tian, Xiaoyu; Segars, W Paul; Dixon, Robert L; Samei, Ehsan

    2016-05-21

    Estimating organ dose for clinical patients requires accurate modeling of the patient anatomy and the dose field of the CT exam. The modeling of patient anatomy can be achieved using a library of representative computational phantoms (Samei et al 2014 Pediatr. Radiol. 44 460-7). The modeling of the dose field can be challenging for CT exams performed with a tube current modulation (TCM) technique. The purpose of this work was to effectively model the dose field for TCM exams using a convolution-based method. A framework was further proposed for prospective and retrospective organ dose estimation in clinical practice. The study included 60 adult patients (age range: 18-70 years, weight range: 60-180 kg). Patient-specific computational phantoms were generated based on patient CT image datasets. A previously validated Monte Carlo simulation program was used to model a clinical CT scanner (SOMATOM Definition Flash, Siemens Healthcare, Forchheim, Germany). A practical strategy was developed to achieve real-time organ dose estimation for a given clinical patient. CTDIvol-normalized organ dose coefficients (h_Organ) under constant tube current were estimated and modeled as a function of patient size. Each clinical patient in the library was optimally matched to another computational phantom to obtain a representation of organ location/distribution. The patient organ distribution was convolved with a dose distribution profile to generate (CTDIvol)_organ,convolution values that quantified the regional dose field for each organ. The organ dose was estimated by multiplying (CTDIvol)_organ,convolution with the organ dose coefficients (h_Organ). To validate the accuracy of this dose estimation technique, the organ dose of the original clinical patient was estimated using Monte Carlo program with TCM profiles explicitly modeled. The discrepancy between the estimated organ dose and dose simulated using TCM Monte Carlo program was quantified. We further compared the
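
    A toy version of the convolution step described in these two records can be written in a few lines: a tube-current-modulated CTDIvol(z) profile is weighted by an organ's normalized z-distribution and then scaled by a size-specific conversion coefficient. The profile, organ location and coefficient below are hypothetical, and the simple weighting is a stand-in for the full convolution framework of the paper.

```python
# A toy version of the convolution step (not the authors' framework): a tube-
# current-modulated CTDIvol(z) profile is weighted by an organ's normalized
# z-distribution to give a regional CTDIvol, which is scaled by a size-specific
# coefficient h_organ. Profile, organ location and coefficient are hypothetical.
import numpy as np

def organ_dose(ctdivol_profile, organ_distribution, h_organ):
    """Organ dose = (organ-weighted regional CTDIvol) * h_organ."""
    weights = organ_distribution / organ_distribution.sum()
    regional_ctdivol = np.sum(ctdivol_profile * weights)
    return regional_ctdivol * h_organ

z = np.linspace(0.0, 40.0, 81)                      # scan range along z (cm)
ctdivol = 8.0 + 4.0 * np.sin(z / 6.0) ** 2          # hypothetical TCM profile (mGy)
liver = np.exp(-0.5 * ((z - 15.0) / 4.0) ** 2)      # hypothetical organ z-distribution
print(f"estimated organ dose: {organ_dose(ctdivol, liver, h_organ=1.3):.1f} mGy")
```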

  17. Fracture Risk Analysis in Postmenopausal Women with the Current Methods

    Directory of Open Access Journals (Sweden)

    Salih Gultekin

    2014-03-01

    Full Text Available Aim: This study was conducted to assess the risk of fracture in postmenopausal women using dual x-ray absorptiometry bone mineral density (DEXA-BMD) as a reference method and FRAX as a new clinical risk assessment tool. Material and Method: 168 postmenopausal women (> 50 years) evaluated with the DEXA-BMD and FRAX methods were included in the study. Femoral BMD (F-BMD), femoral T-score (F-Ts), lumbar spine BMD (L-BMD) and lumbar spine T-score (L-Ts) values of the patients were calculated. Fracture risk assessments were carried out using T-score values and FRAX 10-year hip fracture (HF) and major osteoporotic fracture (MOF) risk ratios. Data were analyzed statistically. Results: According to the F-Ts and L-Ts results, 44/168 (26.2%) and 65/168 (38.7%) of patients had osteoporosis, consistent with high fracture risk. In osteoporotic patients, the mean values for F-Ts, L-Ts, F-BMD and L-BMD were -2.8 ± 0.4, -3.2 ± 0.5, 0.530 ± 0.049 and 0.682 ± 0.066, respectively. According to FRAX, 16/168 (9.5%) of patients had high MOF risk and 51/168 (30.4%) had high HF risk. Positive correlations were found between F-Ts and L-Ts (moderate; rho = 0.424, p < 0.05) and between HF and MOF (strong; rho = 0.958, p < 0.001). There were strong negative correlations of HF and MOF with F-Ts (rho = -0.897 and rho = -0.844, respectively; p < 0.001) and moderate negative correlations of HF and MOF with L-Ts (rho = -0.535 and rho = -0.567, respectively; p < 0.05). Discussion: In postmenopausal women with osteoporosis, risk assessment with FRAX in addition to DXA-BMD measurements can be useful so that patients at high risk of fracture are not missed.

  18. A general binomial regression model to estimate standardized risk differences from binary response data.

    Science.gov (United States)

    Kovalchik, Stephanie A; Varadhan, Ravi; Fetterman, Barbara; Poitras, Nancy E; Wacholder, Sholom; Katki, Hormuzd A

    2013-02-28

    Estimates of absolute risks and risk differences are necessary for evaluating the clinical and population impact of biomedical research findings. We have developed a linear-expit regression model (LEXPIT) to incorporate linear and nonlinear risk effects to estimate absolute risk from studies of a binary outcome. The LEXPIT is a generalization of both the binomial linear and logistic regression models. The coefficients of the LEXPIT linear terms estimate adjusted risk differences, whereas the exponentiated nonlinear terms estimate residual odds ratios. The LEXPIT could be particularly useful for epidemiological studies of risk association, where adjustment for multiple confounding variables is common. We present a constrained maximum likelihood estimation algorithm that ensures the feasibility of risk estimates of the LEXPIT model and describe procedures for defining the feasible region of the parameter space, judging convergence, and evaluating boundary cases. Simulations demonstrate that the methodology is computationally robust and yields feasible, consistent estimators. We applied the LEXPIT model to estimate the absolute 5-year risk of cervical precancer or cancer associated with different Pap and human papillomavirus test results in 167,171 women undergoing screening at Kaiser Permanente Northern California. The LEXPIT model found an increased risk due to an abnormal Pap test in human papillomavirus-negative women that was not detected with logistic regression. Our R package blm provides free and easy-to-use software for fitting the LEXPIT model.
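
    The linear-expit form described above can be sketched directly: the event probability is a linear (risk-difference) term plus an expit (odds-ratio) term, and the parameters can in principle be fitted by constrained maximum likelihood. The snippet below is only an illustration of the model form and a naive likelihood on synthetic data; it is neither the authors' constrained fitting algorithm nor their R package blm.

```python
# A small illustration of the linear-expit (LEXPIT) form: the event probability is
# an additive term in the risk-difference covariates X plus an expit term in the
# odds-ratio covariates Z. Synthetic data only; this is neither the authors'
# constrained fitting algorithm nor their R package blm.
import numpy as np
from scipy.special import expit

def lexpit_prob(beta, gamma, X, Z):
    """P(Y=1 | X, Z) = X @ beta + expit(Z @ gamma)."""
    return X @ beta + expit(Z @ gamma)

def neg_log_lik(params, X, Z, y):
    beta, gamma = params[:X.shape[1]], params[X.shape[1]:]
    p = np.clip(lexpit_prob(beta, gamma, X, Z), 1e-9, 1 - 1e-9)  # keep probabilities feasible
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(1000, 1)).astype(float)     # one risk-difference covariate
Z = np.column_stack([np.ones(1000), rng.normal(size=1000)])
true_p = np.clip(0.02 * X[:, 0] + expit(Z @ np.array([-3.0, 0.5])), 0.0, 1.0)
y = rng.binomial(1, true_p)
print(f"negative log-likelihood at the true parameters: "
      f"{neg_log_lik(np.array([0.02, -3.0, 0.5]), X, Z, y):.1f}")
```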

  19. Latent-failure risk estimates for computer control

    Science.gov (United States)

    Dunn, William R.; Folsom, Rolfe A.; Green, Owen R.

    1991-01-01

    It is shown that critical computer controls employing unmonitored safety circuits are unsafe. Analysis supporting this result leads to two additional, important conclusions: (1) annual maintenance checks of safety circuit function do not, as widely believed, eliminate latent failure risk; (2) safety risk remains even if multiple, series-connected protection circuits are employed. Finally, it is shown analytically that latent failure risk is eliminated when continuous monitoring is employed.

  20. Improved estimates of the European winter wind storm climate and the risk of reinsurance loss

    Science.gov (United States)

    Della-Marta, P. M.; Liniger, M. A.; Appenzeller, C.; Bresch, D. N.; Koellner-Heck, P.; Muccione, V.

    2009-04-01

    Current estimates of the European wind storm climate and their associated losses are often hampered by either relatively short, coarse-resolution or inhomogeneous datasets. This study estimates the European wind storm climate using dynamical seasonal-to-decadal (s2d) climate forecasts from the European Centre for Medium-Range Weather Forecasts (ECMWF). The current s2d models have limited predictive skill for European storminess, making the ensemble forecasts ergodic samples on which to build pseudo-climates of 310 to 396 years in length. Extended winter (ONDJFMA) wind storm climatologies are created using a scalar extreme wind index considering only data above a high threshold. The method identifies between 2331 and 2471 wind storms using s2d data and 380 wind storms in ERA-40. Classical extreme value analysis (EVA) techniques are used to determine the wind storm climatologies. We suggest that the ERA-40 climatology, by virtue of its length, limiting form, and the fitting method, overestimates the return period (RP) of wind storms with RPs between 10-300 years and underestimates the return period of wind storms with RPs greater than 300 years. A 50 year event in ERA-40 is approximately a 33 year event using s2d. The largest influence on ERA-40 RP uncertainties is the sampling variability associated with only 45 seasons of storms. The climatologies are linked to the Swiss Reinsurance Company (Swiss Re) European wind storm loss model. New estimates of the risk of loss are compared with those from historical and stochastically generated wind storm fields used by Swiss Re. The resulting loss-frequency relationship matches well with the two independently modelled estimates and clearly demonstrates the added value of using alternative data and methods, as proposed in this study, to estimate the RP of high RP losses.
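
    The extreme value analysis step can be illustrated with a peaks-over-threshold sketch: fit a generalized Pareto distribution to exceedances of a storm-severity index and convert return periods into return levels. The synthetic index, threshold choice and 45-season length below are assumptions standing in for the ERA-40 or seasonal-forecast storm indices.

```python
# A peaks-over-threshold sketch of the extreme value analysis: fit a generalized
# Pareto distribution to exceedances of a storm-severity index and convert return
# periods to return levels. The synthetic index, threshold choice and 45-season
# length are assumptions standing in for the ERA-40 / seasonal-forecast data.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(42)
storm_index = rng.gumbel(loc=1.0, scale=0.5, size=45 * 8)   # ~8 events per season, 45 seasons
threshold = np.quantile(storm_index, 0.90)
excess = storm_index[storm_index > threshold] - threshold

shape, _, scale = genpareto.fit(excess, floc=0.0)
rate = len(excess) / 45.0                                   # exceedances per season

def return_level(T_seasons):
    """Storm-index level exceeded on average once every T_seasons seasons."""
    return threshold + genpareto.ppf(1.0 - 1.0 / (rate * T_seasons),
                                     shape, loc=0.0, scale=scale)

print(f"50-season return level: {return_level(50):.2f}")
```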

  1. Option-Based Estimation of the Price of Co-Skewness and Co-Kurtosis Risk

    DEFF Research Database (Denmark)

    Christoffersen, Peter; Fournier, Mathieu; Jacobs, Kris;

    -neutral second moments, and the price of co-kurtosis risk corresponds to the spread between the physical and the risk-neutral third moments. The option-based estimates of the prices of risk lead to reasonable values of the associated risk premia. An out-of-sample analysis of factor models with co-skewness and co......-kurtosis risk indicates that the new estimates of the price of risk improve the models' performance. Models with higher-order market moments also robustly outperform standard competitors such as the CAPM and the Fama-French model....

  2. Option-Based Estimation of the Price of Co-Skewness and Co-Kurtosis Risk

    DEFF Research Database (Denmark)

    Christoffersen, Peter; Fournier, Mathieu; Jacobs, Kris;

    -neutral second moments, and the price of co-kurtosis risk corresponds to the spread between the physical and the risk-neutral third moments. The option-based estimates of the prices of risk lead to reasonable values of the associated risk premia. An out-of-sample analysis of factor models with co-skewness and co......-kurtosis risk indicates that the new estimates of the price of risk improve the models' performance. Models with higher-order market moments also robustly outperform standard competitors such as the CAPM and the Fama-French model....

  3. Lipoprotein(a) as a cardiovascular risk factor : current status

    NARCIS (Netherlands)

    Nordestgaard, Børge G; Chapman, M John; Ray, Kausik; Borén, Jan; Andreotti, Felicita; Watts, Gerald F; Ginsberg, Henry; Amarenco, Pierre; Catapano, Alberico; Descamps, Olivier S; Fisher, Edward; Kovanen, Petri T; Kuivenhoven, Jan Albert; Lesnik, Philippe; Masana, Luis; Reiner, Zeljko; Taskinen, Marja-Riitta; Tokgözoglu, Lale; Tybjærg-Hansen, Anne

    2010-01-01

    AIMS: The aims of the study were, first, to critically evaluate lipoprotein(a) [Lp(a)] as a cardiovascular risk factor and, second, to advise on screening for elevated plasma Lp(a), on desirable levels, and on therapeutic strategies. METHODS AND RESULTS: The robust and specific association between e

  4. Using dark current data to estimate AVIRIS noise covariance and improve spectral analyses

    Science.gov (United States)

    Boardman, Joseph W.

    1995-01-01

    Starting in 1994, all AVIRIS data distributions include a new product useful for quantification and modeling of the noise in the reported radiance data. The 'postcal' file contains approximately 100 lines of dark current data collected at the end of each data acquisition run. In essence this is a regular spectral-image cube, with 614 samples, 100 lines and 224 channels, collected with a closed shutter. Since there is no incident radiance signal, the recorded DN measure only the DC signal level and the noise in the system. Similar dark current measurements, made at the end of each line, are used, with a 100 line moving average, to remove the DC signal offset. Therefore, the pixel-by-pixel fluctuations about the mean of this dark current image provide an excellent model for the additive noise that is present in AVIRIS reported radiance data. The 61,400 dark current spectra can be used to calculate the noise levels in each channel and the noise covariance matrix. Both of these noise parameters should be used to improve spectral processing techniques. Some processing techniques, such as spectral curve fitting, will benefit from a robust estimate of the channel-dependent noise levels. Other techniques, such as automated unmixing and classification, will be improved by the stable and scene-independent noise covariance estimate. Future imaging spectrometry systems should have a similar ability to record dark current data, permitting this noise characterization and modeling.
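
    The noise characterization described above reduces to simple statistics over the dark-current cube. The sketch below, with a random array standing in for a real 'postcal' file, shows how the per-channel noise levels and the channel-by-channel noise covariance matrix could be computed; it is an illustration, not AVIRIS processing code.

```python
# Illustration of the noise characterization: treat the dark-current cube as
# zero-signal measurements and use fluctuations about each channel's mean to get
# per-channel noise levels and the channel covariance matrix. The random cube
# stands in for a real AVIRIS 'postcal' file (100 lines x 614 samples x 224 channels).
import numpy as np

def noise_statistics(dark_cube):
    """Return (per-channel noise std, channel-by-channel covariance) from a dark cube."""
    spectra = dark_cube.reshape(-1, dark_cube.shape[-1]).astype(float)
    spectra -= spectra.mean(axis=0)          # remove the per-channel DC offset
    return spectra.std(axis=0, ddof=1), np.cov(spectra, rowvar=False)

rng = np.random.default_rng(0)
dark = rng.normal(loc=100.0, scale=2.0, size=(100, 614, 224))   # synthetic DN values
noise_std, noise_cov = noise_statistics(dark)
print(noise_std.shape, noise_cov.shape)      # (224,), (224, 224)
```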

  5. Estimation methods for bioaccumulation in risk assessment of organic chemicals.

    NARCIS (Netherlands)

    Jager, D.T.; Hamers, T.

    1997-01-01

    The methodology for estimating bioaccumulation of organic chemicals is evaluated. This study is limited to three types of organisms: fish, earthworms and plants (leaf crops, root crops and grass). We propose a simple mechanistic model for estimating BCFs which performs well against measured data. To

  6. Estimation methods for bioaccumulation in risk assessment of organic chemicals

    NARCIS (Netherlands)

    Jager DT; Hamers T; ECO

    1997-01-01

    The methodology for estimating bioaccumulation of organic chemicals is evaluated. This study is limited to three types of organisms: fish, earthworms and plants (leaf crops, root crops and grass). We propose a simple mechanistic model for estimating BCFs which performs well against measured data. To

  7. Body mass index and risk of lung cancer among never, former, and current smokers.

    Science.gov (United States)

    Smith, Llewellyn; Brinton, Louise A; Spitz, Margaret R; Lam, Tram Kim; Park, Yikyung; Hollenbeck, Albert R; Freedman, Neal D; Gierach, Gretchen L

    2012-05-16

    Although obesity has been directly linked to the development of many cancers, many epidemiological studies have found that body mass index (BMI)--a surrogate marker of obesity--is inversely associated with the risk of lung cancer. These studies are difficult to interpret because of potential confounding by cigarette smoking, a major risk factor for lung cancer that is associated with lower BMI. We prospectively examined the association between BMI and the risk of lung cancer among 448 732 men and women aged 50-71 years who were recruited during 1995-1996 for the National Institutes of Health-AARP Diet and Health Study. BMI was calculated based on the participant's self-reported height and weight on the baseline questionnaire. We identified 9437 incident lung carcinomas (including 415 in never smokers) during a mean follow-up of 9.7 years through 2006. Multivariable Cox proportional hazards regression models were used to estimate hazard ratios (HRs) and 95% confidence intervals (CIs) with adjustment for lung cancer risk factors, including smoking status. To address potential bias due to preexisting undiagnosed disease, we excluded potentially unhealthy participants in sensitivity analyses. All statistical tests were two-sided. The crude incidence rate of lung cancer over the study follow-up period was 233 per 100 000 person-years among men and 192 per 100 000 person-years among women. BMI was inversely associated with the risk of lung cancer among both men and women (BMI ≥35 vs 22.5-24.99 kg/m2: HR = 0.81, 95% CI = 0.70 to 0.94 and HR = 0.73, 95% CI = 0.61 to 0.87, respectively). The inverse association was restricted to current and former smokers and was stronger after adjustment for smoking. Among smokers, the inverse association persisted even after finely stratifying on smoking status, time since quitting smoking, and number of cigarettes smoked per day. Sensitivity analyses did not support the possibility that the inverse association was due to

  8. Evolving PBPK applications in regulatory risk assessment: current situation and future goals

    Science.gov (United States)

    The presentation covers current applications of PBPK modeling in regulatory risk assessment and discusses the conflict between assuring consistency with experimental data in the current situation and the desire for animal-free model development.

  9. Arsenic risk mapping in Bangladesh: a simulation technique of cokriging estimation from regional count data.

    Science.gov (United States)

    Hassan, M Manzurul; Atkins, Peter J

    2007-10-01

    Risk analysis with spatial interpolation methods from a regional database onto a continuous surface is of contemporary interest. Groundwater arsenic poisoning in Bangladesh and its impact on human health has been one of the "biggest environmental health disasters" in recent years. It is ironic that so many tubewells have been installed in recent times for pathogen-free drinking water but the water pumped is often contaminated with toxic levels of arsenic. This paper seeks to analyse the spatial pattern of arsenic risk by mapping composite "problem regions" in southwest Bangladesh. It also examines the cokriging interpolation method in analysing the suitability of isopleth maps for different risk areas. GIS-based data processing and spatial analysis were used for this research, along with state-of-the-art decision-making techniques. Apart from the GIS-based buffering and overlay mapping operations, a cokriging interpolation method was adopted because of its exact interpolation capacity. The paper presents an interpolation of regional estimates of arsenic data for spatial risk mapping that overcomes the areal bias problem for administrative boundaries. Moreover, the functionality of the cokriging method demonstrates the suitability of isopleth maps that are easy to read.

  10. Fire Risk Assessment and Its Economic Loss Estimation in Tehran Subway, Applying Event Tree Analysis

    Directory of Open Access Journals (Sweden)

    Sedigheh Atrkar Roshan

    2015-01-01

    Full Text Available The subway system is one of the critical infrastructures of a society. In the economic optimization of risk control measures, valuing the loss of life and other losses in monetary terms can influence the optimal investment in safety. The purpose is to contribute to the implementation of HSE in the transportation system. In this research, a fire risk assessment along with its economic loss estimation is implemented for the Direct Current (DC) trains and rectifier substations (RS) of the Tehran subway. The number of fatalities, the extent of damage to train equipment, etc., is then calculated in monetary units. Using Event Tree Analysis (ETA), after identification of initiating events through observation, interviews, and evaluation of documents, an event tree was constructed for each event and the probabilities of multiple scenarios were computed. The scenario with the highest probability of fire in the RS, involving increased heat in the RTU panels, generates a loss of at least 730 Million Rials. The minimum and maximum economic losses caused by fire on DC trains are 510 and 1230 Million Rials, respectively. Conclusion: Given the findings of this study, and the considerable financial and human-life risks along with all tangible and intangible losses, the relevant managers must weigh investments in safety against the reduction of the calculated economic risks resulting from fire accidents in the Tehran subway.
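
    The event-tree calculation behind such figures can be sketched in a few lines: each scenario's probability is the product of an initiating-event frequency and its branch probabilities, and the expected loss is the probability-weighted sum of scenario losses. The frequencies, branch probabilities and loss values below are placeholders, not the study's numbers.

```python
# A minimal event-tree calculation (not the study's model): each scenario's annual
# probability is the product of an initiating-event frequency and its branch
# probabilities, and the expected loss is the probability-weighted sum of scenario
# losses. All frequencies, probabilities and losses are hypothetical placeholders.
from math import prod

scenarios = [
    # (description, initiating frequency per year, branch probabilities, loss in Million Rials)
    ("RTU panel overheating, suppression fails", 0.05, [0.4, 0.3], 730.0),
    ("RTU panel overheating, suppression works", 0.05, [0.4, 0.7], 50.0),
    ("Cable fault on DC train, fire spreads",    0.02, [0.5, 0.2], 1230.0),
    ("Cable fault on DC train, fire contained",  0.02, [0.5, 0.8], 510.0),
]

expected_loss = 0.0
for name, freq, branches, loss in scenarios:
    p = freq * prod(branches)
    expected_loss += p * loss
    print(f"{name}: annual probability {p:.4f}")
print(f"expected annual loss: {expected_loss:.1f} Million Rials")
```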

  11. Estimation of Synaptic Conductances in Presence of Nonlinear Effects Caused by Subthreshold Ionic Currents

    DEFF Research Database (Denmark)

    Vich, Catalina; Berg, Rune W.; Guillamon, Antoni

    2017-01-01

    Subthreshold fluctuations in neuronal membrane potential traces contain nonlinear components, and employing nonlinear models might improve the statistical inference. We propose a new strategy to estimate synaptic conductances, which has been tested using in silico data and applied to in vivo...... recordings. The model is constructed to capture the nonlinearities caused by subthreshold activated currents, and the estimation procedure can discern between excitatory and inhibitory conductances using only one membrane potential trace. More precisely, we perform second order approximations of biophysical...... models to capture the subthreshold nonlinearities, resulting in quadratic integrate-and-fire models, and apply approximate maximum likelihood estimation where we only suppose that conductances are stationary in a 50–100 ms time window. The results show an improvement compared to existent procedures...

  12. Angular velocity estimation based on star vector with improved current statistical model Kalman filter.

    Science.gov (United States)

    Zhang, Hao; Niu, Yanxiong; Lu, Jiazhen; Zhang, He

    2016-11-20

    Angular velocity information is a requisite for a spacecraft guidance, navigation, and control system. In this paper, an approach for angular velocity estimation based merely on star vector measurement with an improved current statistical model Kalman filter is proposed. High-precision angular velocity estimation can be achieved under dynamic conditions. The amount of calculation is also reduced compared to a Kalman filter. Different trajectories are simulated to test this approach, and experiments with real starry sky observation are implemented for further confirmation. The estimation accuracy is proved to be better than 10^-4 rad/s under various conditions. Both the simulation and the experiment demonstrate that the described approach is effective and shows an excellent performance under both static and dynamic conditions.
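
    A stripped-down, single-axis illustration of the filtering idea is given below: noisy angular-rate values, as would be derived from successive star vectors, are smoothed by a Kalman filter with a simple near-constant-rate process model. This generic filter is not the paper's improved current statistical model; step size and noise levels are assumptions.

```python
# A single-axis, generic Kalman-filter sketch (not the paper's improved current
# statistical model): noisy angular-rate values, as would be derived from
# successive star vectors, are smoothed with a near-constant-rate process model.
# Step size and noise variances are assumptions.
import numpy as np

def kalman_rate(z, dt=0.1, q=1e-6, r=1e-4):
    """Filter noisy angular-rate measurements z (rad/s) with a random-walk rate model."""
    omega, P = z[0], 1.0
    estimates = []
    for zk in z:
        P = P + q * dt                    # predict: rate modeled as a slow random walk
        K = P / (P + r)                   # Kalman gain
        omega = omega + K * (zk - omega)  # update with the new measurement
        P = (1.0 - K) * P
        estimates.append(omega)
    return np.array(estimates)

rng = np.random.default_rng(3)
true_rate = 0.01                                    # rad/s
measured = true_rate + rng.normal(0.0, 0.01, 600)   # star-vector-derived, noisy
estimate = kalman_rate(measured)
print(f"final error: {abs(estimate[-1] - true_rate):.2e} rad/s")
```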

  13. CCSI Risk Estimation: An Application of Expert Elicitation

    Energy Technology Data Exchange (ETDEWEB)

    Engel, David W.; Dalton, Angela C.

    2012-10-01

    The Carbon Capture Simulation Initiative (CCSI) is a multi-laboratory simulation-driven effort to develop carbon capture technologies with the goal of accelerating commercialization and adoption in the near future. One of the key CCSI technical challenges is representing and quantifying the inherent uncertainty and risks associated with developing, testing, and deploying the technology in simulated and real operational settings. To address this challenge, the CCSI Element 7 team developed a holistic risk analysis and decision-making framework. The purpose of this report is to document the CCSI Element 7 structured systematic expert elicitation to identify additional risk factors. We review the significance of and established approaches to expert elicitation, describe the CCSI risk elicitation plan and implementation strategies, and conclude by discussing the next steps and highlighting the contribution of risk elicitation toward the achievement of the overarching CCSI objectives.

  14. Current and Future Impact Risks from Small Debris to Operational Satellites

    Science.gov (United States)

    Liou, Jer-Chyi; Kessler, Don

    2011-01-01

    The collision between Iridium 33 and Cosmos 2251 in 2009 signaled the potential onset of the collision cascade effect, commonly known as the "Kessler Syndrome", in the low Earth orbit (LEO) region. Recent numerical simulations have shown that the 10 cm and larger debris population in LEO will continue to increase even with a good implementation of the commonly-adopted mitigation measures. This increase is driven by collisions involving large and massive intacts, i.e., rocket bodies and spacecraft. Therefore, active debris removal (ADR) of large and massive intacts with high collision probabilities has been argued as a direct and effective means to remediate the environment in LEO. The major risk for operational satellites in the environment, however, comes from impacts with debris just above the threshold of the protection shields. In general, these are debris in the millimeter to centimeter size regime. Although impacts by these objects are insufficient to lead to catastrophic breakup of the entire vehicle, the damage is certainly severe enough to cause critical failure of the key instruments or the entire payload. The focus of this paper is to estimate the impact risks from 5 mm and 1 cm debris to active payloads in LEO (1) in the current environment and (2) in the future environment based on different projection scenarios, including ADR. The goal of the study is to quantify the benefits of ADR in reducing debris impact risks to operational satellites.

  15. Support vector machine based estimation of remaining useful life: current research status and future trends

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Hong Zhong; Wang, Hai Kun; Li, Yan Feng; Zhang, Longlong; Liu, Zhiliang [University of Electronic Science and Technology of China, Chengdu (China)

    2015-01-15

    Estimation of remaining useful life (RUL) helps to manage the life cycles of machines and to reduce maintenance cost. The support vector machine (SVM) is a promising algorithm for RUL estimation because it can easily handle small training sets and multi-dimensional data. Many SVM-based methods have been proposed to predict the RUL of key components. We reviewed the literature on SVM-based RUL estimation over the past decade. The references reviewed are classified into two categories: improved SVM algorithms and their applications to RUL estimation. The latter category can be further divided into two types: one, predicting the future condition state and then building a relationship between state and RUL; two, establishing a direct relationship between the current state and RUL. However, SVM is seldom used to track the degradation process and build an accurate relationship between the current health condition state and RUL. Based on this review, the paper points out that continually improving SVM and developing novel ideas for RUL prediction with SVM will be future work.
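
    The second mapping type (current health state directly to RUL) can be sketched as an ordinary support vector regression. The example below uses scikit-learn's SVR on synthetic degradation features; the feature construction, kernel and hyperparameters are illustrative assumptions, not recommendations from the review.

```python
# A sketch of the second mapping type (current health state directly to RUL) using
# scikit-learn's SVR on synthetic degradation features. Features, kernel and
# hyperparameters are illustrative assumptions, not recommendations from the review.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(7)
n = 300
health_state = rng.uniform(0.0, 1.0, size=(n, 2))            # synthetic condition features
rul = 100.0 * (1.0 - health_state[:, 0]) + rng.normal(0.0, 3.0, n)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=1.0))
model.fit(health_state[:250], rul[:250])                     # train on the first 250 samples
pred = model.predict(health_state[250:])
print(f"mean absolute error: {np.mean(np.abs(pred - rul[250:])):.1f} cycles")
```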

  16. Non-sedating antihistamine drugs and cardiac arrhythmias : biased risk estimates from spontaneous reporting systems?

    NARCIS (Netherlands)

    De Bruin, M L; van Puijenbroek, E P; Egberts, A C G; Hoes, A W; Leufkens, H G M

    2002-01-01

    AIMS: This study used spontaneous reports of adverse events to estimate the risk for developing cardiac arrhythmias due to the systemic use of non-sedating antihistamine drugs and compared the risk estimate before and after the regulatory action to recall the over-the-counter status of some of these

  17. Assessment of Methods for Estimating Risk to Birds from Ingestion of Contaminated Grit Particles (Final Report)

    Science.gov (United States)

    The U.S. EPA Ecological Risk Assessment Support Center (ERASC) announced the release of the final report entitled, Assessment of Methods for Estimating Risk to Birds from Ingestion of Contaminated Grit Particles. This report evaluates approaches for estimating the probabi...

  18. Jumps and Betas: A New Framework for Disentangling and Estimating Systematic Risks

    DEFF Research Database (Denmark)

    Todorov, Viktor; Bollerslev, Tim

    We provide a new theoretical framework for disentangling and estimating sensitivity towards systematic diffusive and jump risks in the context of factor pricing models. Our estimates of the sensitivities towards systematic risks, or betas, are based on the notion of increasingly finer sampled ret...

  19. Critical current densities estimated from AC susceptibilities in proximity-induced superconducting matrix of multifilamentary wire

    Science.gov (United States)

    Akune, Tadahiro; Sakamoto, Nobuyoshi

    2009-03-01

    In a multifilamentary wire proximity-currents between filaments show a close resemblance with the inter-grain current in a high-Tc superconductor. The critical current densities of the proximity-induced superconducting matrix Jcm can be estimated from measured twist-pitch dependence of magnetization and have been shown to follow the well-known scaling law of the pinning strength. The grained Bean model is applied on the multifilamentary wire to obtain Jcm, where the filaments are immersed in the proximity-induced superconducting matrix. Difference of the superconducting characteristics of the filament, the matrix and the filament content factor give a variety of deformation on the AC susceptibility curves. The computed AC susceptibility curves of multifilamentary wires using the grained Bean model are favorably compared with the experimental results. The values of Jcm estimated from the susceptibilities using the grained Bean model are comparable to those estimated from measured twist-pitch dependence of magnetization. The applicability of the grained Bean model on the multifilamentary wire is discussed in detail.

  20. Critical current densities estimated from AC susceptibilities in proximity-induced superconducting matrix of multifilamentary wire

    Energy Technology Data Exchange (ETDEWEB)

    Akune, Tadahiro; Sakamoto, Nobuyoshi, E-mail: akune@te.kyusan-u.ac.j [Department of Electrical Engineering and Information Technology, Kyushu Sangyo University, 2-3-1 Matsukadai, Fukuoka 813-8503 (Japan)

    2009-03-01

    In a multifilamentary wire proximity-currents between filaments show a close resemblance with the inter-grain current in a high-Tc superconductor. The critical current densities of the proximity-induced superconducting matrix Jcm can be estimated from measured twist-pitch dependence of magnetization and have been shown to follow the well-known scaling law of the pinning strength. The grained Bean model is applied on the multifilamentary wire to obtain Jcm, where the filaments are immersed in the proximity-induced superconducting matrix. Difference of the superconducting characteristics of the filament, the matrix and the filament content factor give a variety of deformation on the AC susceptibility curves. The computed AC susceptibility curves of multifilamentary wires using the grained Bean model are favorably compared with the experimental results. The values of Jcm estimated from the susceptibilities using the grained Bean model are comparable to those estimated from measured twist-pitch dependence of magnetization. The applicability of the grained Bean model on the multifilamentary wire is discussed in detail.

  1. HYDROLOGIC RISK – RESERVOIRS: AN OLD AND CURRENT IMPERATIVE

    Directory of Open Access Journals (Sweden)

    G. ROMANESCU

    2015-03-01

    Full Text Available Hydrologic risk phenomena have always affected human activity. For this reason, humankind has always sought ways of preventing and mitigating floods. Human activity has marked the environment. All human constructions change the normal pace of processes; it depends on what angle one uses to analyze the issue. Most of the time, reservoirs have complex functions, but large ones are destined mainly to preventing or mitigating hydrologic risks. The oldest reservoirs were built during Antiquity, yet the boom of such complex constructions was registered after 1950. At the world level, it is worth mentioning the hydrographic basins of the Columbia (USA), Chang Jiang (China), Angara (Russia), etc. The hydrographical basins in Romania comprising significant hydrotechnical works are the Siret, Argeș, Jiu, Olt, Ialomița, etc. This study underscores the positive role played by reservoirs in flood mitigation and in the positive modification of the landscape.

  2. High risk bladder cancer: current management and survival

    Directory of Open Access Journals (Sweden)

    Anna M. Leliveld

    2011-04-01

    Full Text Available PURPOSE: To evaluate the pattern of care in patients with high-risk non-muscle-invasive bladder cancer (NMIBC) in the Comprehensive Cancer Center North-Netherlands (CCCN) and to assess factors associated with the choice of treatment, recurrence and progression-free survival rates. MATERIALS AND METHODS: Retrospective analysis of 412 patients with newly diagnosed high-risk NMIBC. Clinical, demographic and follow-up data were obtained from the CCCN Cancer Registry and a detailed medical record review. Uni- and multivariate analyses were performed to identify factors related to choice of treatment and 5-year recurrence and progression-free survival. RESULTS: 74/412 (18%) patients with high-risk NMIBC underwent a transurethral resection (TUR) as single treatment. Adjuvant treatment after TUR was performed in 90.7% of the patients treated in teaching hospitals versus 71.8% in non-teaching hospitals (p < 0.001). Age > 80 years (OR 0.1; p = 0.001) and treatment in non-teaching hospitals (OR 0.25; p < 0.001) were associated with less adjuvant treatment after TUR. Tumor recurrence occurred in 191/392 (49%) and progression in 84/392 (21.4%) patients. The mean 5-year progression-free survival was 71.6% (95% CI 65.5-76.8). CONCLUSION: In this pattern-of-care study in high-risk NMIBC, 18% of the patients were treated with TUR as single treatment. Age and treatment in non-teaching hospitals were associated with less adjuvant treatment after TUR. None of the variables sex, age, comorbidity, hospital type, stage and year of treatment was associated with 5-year recurrence or progression rates.

  3. Risks in Project Finance Initiatives: Current Trends and Future Directions

    OpenAIRE

    Mangano, Giulio

    2014-01-01

    This thesis is an analysis of Public Private Partnership (PPP). PPP refers to the provision of public assets and service through the participation of the government, the private sector and the consumers. The purpose is to analyze the main risks involved in a PPP initiative and to understand how they affect its capital structure. To this aim, different datasets have been analyzed in order to trace consistent and coherent lessons. After that, this thesis aims at proposing PPP models for innovat...

  4. Policy Risk of Current Bank Bailouts in China

    Institute of Scientific and Technical Information of China (English)

    XianxinZhao

    2004-01-01

    China's banking risk is mainly driven by "moral hazard", the inherent deficiency of state ownership. The ongoing reform strategy for state-owned banks, adopted by the government, mainly aims at this target, but fails to take a correct path. Since the government still holds the controlling right of the banks, there is no evidence to show that recapitalization and initial public offering (IPO) will lead to sound practices for banking governance. Furthermore, in order to accelerate the recapitalization process, the reformers have injected a large amount of foreign exchange reserves into the state-owned banks, which consequently expands money supply and will lead to instability of future economic growth. Our conclusion is that there is a latent banking risk. China's banking reform should be in line with the external environment and the overall economic reform process, and the reformers should always keep in mind that sustainability of future economic growth is the ultimate means by which banking risk can be cushioned and absorbed.

  5. Population-based estimate of prostate cancer risk for carriers of the HOXB13 missense mutation G84E.

    Directory of Open Access Journals (Sweden)

    Robert J MacInnis

    Full Text Available The HOXB13 missense mutation G84E (rs138213197) is associated with increased risk of prostate cancer, but the current estimate of increased risk has a wide confidence interval (width of 95% confidence interval (CI) >200-fold), so the point estimate of 20-fold increased risk could be misleading. Population-based family studies can be more informative for estimating risks for rare variants, therefore, we screened for mutations in an Australian population-based series of early-onset prostate cancer cases (probands). We found that 19 of 1,384 (1.4%) probands carried the missense mutation, and of these, six (32%) had a family history of prostate cancer. We tested the 22 relatives of carriers diagnosed from 1998 to 2008 for whom we had a DNA sample, and found seven more carriers and one obligate carrier. The age-specific incidence for carriers was estimated to be, on average, 16.4 (95% CI 2.5-107.2) times that for the population over the time frame when the relatives were at risk prior to baseline. We then estimated the age- and birth year-specific cumulative risk of prostate cancer (penetrance) for carriers. For example, the penetrance for an unaffected male carrier born in 1950 was 19% (95% CI 5-46%) at age 60 years, 44% (95% CI 18-74%) at age 70 years and 60% (95% CI 30-85%) at age 80 years. Our study has provided a population-based estimate of the average risk of prostate cancer for HOXB13 missense mutation G84E carriers that can be used to guide clinical practice and research. This study has also shown that the majority of hereditary prostate cancers due to the HOXB13 missense mutation are 'sporadic' in the sense that unselected cases with the missense mutation do not typically report having a family history of prostate cancer.
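
    The penetrance figures quoted here combine a relative-risk (hazard-ratio) estimate with population age-specific incidence. A minimal sketch of that calculation is given below; the baseline incidence rates are hypothetical placeholders rather than the Australian registry values used in the study.

```python
# A minimal sketch of turning a hazard-ratio estimate and baseline age-specific
# incidence into a carrier's cumulative risk (penetrance). The baseline rates are
# hypothetical placeholders, not the Australian registry values used in the study.
import numpy as np

def cumulative_risk(hazard_ratio, age_band_rates, band_width_years=10):
    """1 - exp(-HR * baseline incidence accumulated over the age bands)."""
    cumulative_hazard = hazard_ratio * np.sum(age_band_rates) * band_width_years
    return 1.0 - np.exp(-cumulative_hazard)

# hypothetical baseline incidence per person-year for ages 40-49, 50-59, 60-69, 70-79
baseline = np.array([0.0001, 0.0008, 0.0030, 0.0060])
for hr in (1.0, 16.4):
    print(f"HR = {hr:>4}: cumulative risk to age 80 ~ {cumulative_risk(hr, baseline):.0%}")
```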

  6. Interpreting the epidemiological evidence linking obesity and cancer: A framework for population-attributable risk estimations in Europe.

    Science.gov (United States)

    Renehan, Andrew G; Soerjomataram, Isabelle; Leitzmann, Michael F

    2010-09-01

    Standard approaches to estimating population-attributable risk (PAR) include modelling estimates of exposure prevalence and relative risk. Here, we examine the associations between body mass index (BMI) and cancer risk and how effect modifications of these associations impact on PAR estimates. In 2008, sex- and population-specific risk estimates were determined for associations with BMI in a standardised meta-analysis for 20 cancer types. Since then, refinements of these estimates have emerged: (i) absence of menopausal hormonal therapy (MHT) is associated with elevated BMI associations in post-menopausal breast, endometrial and ovarian cancers; (ii) current smoking attenuates the BMI associations in oesophageal squamous cell carcinoma, lung and pancreatic cancers; (iii) prostate screening attenuates BMI associations when all prostate cancers are considered together; and (iv) BMI is differentially associated with different histological subtypes within the same cancer group. Using secondary analyses of the aforementioned meta-analysis, we show 2-3-fold shifts in PAR estimations for breast and endometrial cancers depending on the MHT usage in European countries. We also critically examine how to best handle exposures (in this example, BMI distributions) and relative risk estimates in PAR models, and argue in favour of a counterfactual approach based around BMI means. From these observations, we develop a research framework in which to optimally evaluate future trends in numbers of new cancers attributable to excess BMI. Overall, this framework gives conservative estimates for PAR - nonetheless, the numbers of avoidable cancers across Europe through avoidance of excess weight are substantial. Copyright © 2010 Elsevier Ltd. All rights reserved.
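
    The standard PAR calculation that this framework refines can be written down in one function. The sketch below uses Levin's formula with hypothetical prevalence and relative-risk values to show how an effect modifier (here, MHT use changing the BMI relative risk) shifts the PAR; the numbers are not the paper's estimates.

```python
# Levin's formula for the population-attributable risk of a single exposure
# category, used here to show how an effect modifier (e.g. MHT use changing the
# BMI relative risk) shifts the PAR. The prevalence and relative risks are
# hypothetical, not the paper's estimates.
def population_attributable_risk(prevalence, relative_risk):
    """PAR = p(RR - 1) / (1 + p(RR - 1))."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

for label, rr in (("pooled RR", 1.5), ("never-MHT RR", 2.0)):
    par = population_attributable_risk(prevalence=0.20, relative_risk=rr)
    print(f"{label:>12}: PAR = {par:.1%}")
```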

  7. Measurement error affects risk estimates for recruitment to the Hudson River stock of striped bass.

    Science.gov (United States)

    Dunning, Dennis J; Ross, Quentin E; Munch, Stephan B; Ginzburg, Lev R

    2002-06-07

    We examined the consequences of ignoring the distinction between measurement error and natural variability in an assessment of risk to the Hudson River stock of striped bass posed by entrainment at the Bowline Point, Indian Point, and Roseton power plants. Risk was defined as the probability that recruitment of age-1+ striped bass would decline by 80% or more, relative to the equilibrium value, at least once during the time periods examined (1, 5, 10, and 15 years). Measurement error, estimated using two abundance indices from independent beach seine surveys conducted on the Hudson River, accounted for 50% of the variability in one index and 56% of the variability in the other. If a measurement error of 50% was ignored and all of the variability in abundance was attributed to natural causes, the risk that recruitment of age-1+ striped bass would decline by 80% or more after 15 years was 0.308 at the current level of entrainment mortality (11%). However, the risk decreased almost tenfold (0.032) if a measurement error of 50% was considered. The change in risk attributable to decreasing the entrainment mortality rate from 11 to 0% was very small (0.009) and similar in magnitude to the change in risk associated with an action proposed in Amendment #5 to the Interstate Fishery Management Plan for Atlantic striped bass (0.006)--an increase in the instantaneous fishing mortality rate from 0.33 to 0.4. The proposed increase in fishing mortality was not considered an adverse environmental impact, which suggests that potentially costly efforts to reduce entrainment mortality on the Hudson River stock of striped bass are not warranted.

  8. Measurement Error Affects Risk Estimates for Recruitment to the Hudson River Stock of Striped Bass

    Directory of Open Access Journals (Sweden)

    Dennis J. Dunning

    2002-01-01

    Full Text Available We examined the consequences of ignoring the distinction between measurement error and natural variability in an assessment of risk to the Hudson River stock of striped bass posed by entrainment at the Bowline Point, Indian Point, and Roseton power plants. Risk was defined as the probability that recruitment of age-1+ striped bass would decline by 80% or more, relative to the equilibrium value, at least once during the time periods examined (1, 5, 10, and 15 years). Measurement error, estimated using two abundance indices from independent beach seine surveys conducted on the Hudson River, accounted for 50% of the variability in one index and 56% of the variability in the other. If a measurement error of 50% was ignored and all of the variability in abundance was attributed to natural causes, the risk that recruitment of age-1+ striped bass would decline by 80% or more after 15 years was 0.308 at the current level of entrainment mortality (11%). However, the risk decreased almost tenfold (0.032) if a measurement error of 50% was considered. The change in risk attributable to decreasing the entrainment mortality rate from 11 to 0% was very small (0.009) and similar in magnitude to the change in risk associated with an action proposed in Amendment #5 to the Interstate Fishery Management Plan for Atlantic striped bass (0.006)--an increase in the instantaneous fishing mortality rate from 0.33 to 0.4. The proposed increase in fishing mortality was not considered an adverse environmental impact, which suggests that potentially costly efforts to reduce entrainment mortality on the Hudson River stock of striped bass are not warranted.

  9. Estimating drought risk across Europe from reported drought impacts, hazard indicators and vulnerability factors

    Science.gov (United States)

    Blauhut, V.; Stahl, K.; Stagge, J. H.; Tallaksen, L. M.; De Stefano, L.; Vogt, J.

    2015-12-01

    Drought is one of the most costly natural hazards in Europe. Due to its complexity, drought risk, the combination of the natural hazard and societal vulnerability, is difficult to define and challenging to detect and predict, as the impacts of drought are very diverse, covering the breadth of socioeconomic and environmental systems. Pan-European maps of drought risk could inform the elaboration of guidelines and policies to address its documented severity and impact across borders. This work (1) tests the capability of commonly applied hazard indicators and vulnerability factors to predict annual drought impact occurrence for different sectors and macro regions in Europe and (2) combines information on past drought impacts, drought hazard indicators, and vulnerability factors into estimates of drought risk at the pan-European scale. This "hybrid approach" bridges the gap between traditional vulnerability assessment and probabilistic impact forecast in a statistical modelling framework. Multivariable logistic regression was applied to predict the likelihood of impact occurrence on an annual basis for particular impact categories and European macro regions. The results indicate sector- and macro-region-specific sensitivities of hazard indicators, with the Standardised Precipitation Evapotranspiration Index for a twelve-month aggregation period (SPEI-12) as the overall best hazard predictor. Vulnerability factors have only limited ability to predict drought impacts as single predictors, with information about land use and water resources as the best vulnerability-based predictors. (3) The application of the "hybrid approach" revealed strong regional (NUTS combo level) and sector-specific differences in drought risk across Europe. The majority of best predictor combinations rely on a combination of SPEI for shorter and longer aggregation periods, and a combination of information on land use and water resources. The added value of integrating regional vulnerability information
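    A minimal sketch of the statistical core of this hybrid approach is given below: a multivariable logistic regression predicting annual impact occurrence from a hazard indicator and a vulnerability factor. The variable names (spei12, land_use_vuln) and the synthetic data are assumptions for illustration, not the study's indicators or records.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n_years = 300

# Hypothetical predictors: SPEI-12 (negative = dry) and a land-use vulnerability score.
spei12 = rng.normal(0.0, 1.0, n_years)
land_use_vuln = rng.uniform(0.0, 1.0, n_years)

# Synthetic outcome "impact reported this year": drier and more vulnerable -> more likely.
logit = -0.5 - 1.5 * spei12 + 1.0 * land_use_vuln
impact = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = np.column_stack([spei12, land_use_vuln])
model = LogisticRegression().fit(X, impact)

# Annual likelihood of impact occurrence, analogous to the sector/region models.
p_impact = model.predict_proba(X)[:, 1]
print("coefficients:", model.coef_.round(2), " AUC:", round(roc_auc_score(impact, p_impact), 3))
```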

  10. Estimation of the yield of poplars in plantations of fast-growing species within current results

    Directory of Open Access Journals (Sweden)

    Martin Fajman

    2009-01-01

    Full Text Available Current results of allometric yield estimates for poplar short rotation coppice are presented. According to a literature review it is obvious that yield estimates based on measurable quantities of a growing stand depend not only on the selected tree species or its clone, but also on the site location. Allometric relations of the Jap-105 poplar clone (P. nigra x P. maximowiczii) were analyzed by regression methods aimed at creating a yield estimation methodology at a testing site in Domanínek. Altogether, twelve polynomial dependences of the particular measured quantities confirmed high conformity of the empirical data with the tested regression model (correlation index from 0.9033 to 0.9967). Forward stepwise regression selected the factors that best explain the estimates of total biomass dry matter (DM), i.e. d.b.h. and stem height. Furthermore, KESTEMONT's (1971) model was verified with satisfying conformity as well. To validate the presented yield estimation methods, the models will be checked in a large-scale field trial.

  11. Industrial fuel gas demonstration plant program. Current working estimate. Phases II and III

    Energy Technology Data Exchange (ETDEWEB)

    1979-12-01

    The United States Department of Energy (DOE) executed a contract with Memphis Light, Gas and Water Division (MLGW) which requires MLGW to perform process analysis, design, procurement, construction, testing, operation, and evaluation of a plant which will demonstrate the feasibility of converting high sulfur bituminous coal to industrial fuel gas with a heating value of 300 ± 30 Btu per standard cubic foot (SCF). The demonstration plant is based on the U-Gas process, and its product gas is to be used in commercial applications in Memphis, Tenn. The contract specifies that the work is to be conducted in three phases. The Phases are: Phase I - Program Development and Conceptual Design; Phase II - Demonstration Plant Final Design, Procurement and Construction; and Phase III - Demonstration Plant Operation. Under Task III of Phase I, a Cost Estimate for the Demonstration Plant was completed as well as estimates for other Phase II and III work. The output of this Estimate is presented in this volume. This Current Working Estimate for Phases II and III is based on the Process and Mechanical Designs presented in the Task II report (second issue) and the 12 volumes of the Task III report. In addition, the capital cost estimate summarized in the appendix has been used in the Economic Analysis (Task III) Report.

  12. Estimation of suspended sediment flux from acoustic Doppler current profiling along the Jinhae Bay entrance

    Institute of Scientific and Technical Information of China (English)

    WANG Yaping; CHU Yong Shik; LEE Hee Jun; HAN Choong Keun; OH Byung Chul

    2005-01-01

    A Nortek acoustic Doppler current profiler (NDP) was installed on a moving vessel to survey the entrance to the Jinhae Bay on August 22-23, 2001. The current velocity and acoustic backscattering signal were collected along two cross-sections; water samples were also collected during the measurement. The acoustic signals were normalized to compensate for the loss incurred by acoustic beam spreading in the seawater. The in situ calibration shows that a significant relationship is present between suspended sediment concentrations (SSC) and normalized acoustic signals. Two acoustic parameters have been determined to construct an acoustic-concentration model. Using this derived model, the SSC patterns along the surveyed cross-sections were obtained by the conversion of acoustic data. Using the current velocity and SSC data, the flux of suspended sediment was estimated. It indicates that the sediment transport into the bay through the entrance is on the order of 100 t per tidal cycle.
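    A rough sketch of the calibration and flux steps described above is given below, assuming the common linear relation between normalized backscatter (in dB) and log10 of SSC; the calibration coefficients, cell geometry and sample values are placeholders rather than the survey's numbers.

```python
import numpy as np

# In-situ calibration: fit log10(SSC) = a * backscatter_dB + b from the water samples.
backscatter_db = np.array([62.0, 68.0, 74.0, 80.0, 86.0])   # normalized backscatter (dB)
ssc_samples = np.array([5.0, 12.0, 30.0, 70.0, 160.0])      # SSC from water samples (mg/L)
a, b = np.polyfit(backscatter_db, np.log10(ssc_samples), 1)

def ssc_from_backscatter(db):
    """Convert normalized backscatter (dB) to SSC (mg/L) with the fitted calibration."""
    return 10.0 ** (a * db + b)

# Suspended-sediment flux through a cross-section: sum over cells of velocity * SSC * area.
u = np.array([[0.3, 0.4], [0.2, 0.3]])            # along-channel velocity (m/s), 2x2 cells
db_grid = np.array([[70.0, 72.0], [75.0, 78.0]])  # normalized backscatter in the same cells
cell_area = 2.0 * 0.5                             # m^2 per cell (cell width x bin height)

conc = ssc_from_backscatter(db_grid) * 1e-3       # mg/L -> kg/m^3
flux_kg_per_s = float(np.sum(u * conc * cell_area))
print(f"instantaneous suspended-sediment flux: {flux_kg_per_s:.3f} kg/s")
```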

  13. How are flood risk estimates affected by the choice of return-periods?

    Directory of Open Access Journals (Sweden)

    P. J. Ward

    2011-12-01

    Full Text Available Flood management is increasingly adopting a risk-based approach, whereby flood risk is the product of the probability and consequences of flooding. One of the most common approaches in flood risk assessment is to estimate the damage that would occur for floods of several exceedance probabilities (or return periods), to plot these on an exceedance probability-loss curve (risk curve), and to estimate risk as the area under the curve. However, there is little insight into how the selection of the return periods (which ones and how many) used to calculate risk actually affects the final risk calculation. To gain such insights, we developed and validated an inundation model capable of rapidly simulating inundation extent and depth, and dynamically coupled this to an existing damage model. The method was applied to a section of the River Meuse in the southeast of the Netherlands. Firstly, we estimated risk based on a risk curve using yearly return periods from 2 to 10 000 yr (€34 million p.a.). We found that the overall risk is greatly affected by the number of return periods used to construct the risk curve, with over-estimations of annual risk between 33% and 100% when only three return periods are used. In addition, binary assumptions on dike failure can have a large effect (a factor of two difference) on risk estimates. Also, the minimum and maximum return period considered in the curve affects the risk estimate considerably. The results suggest that more research is needed to develop relatively simple inundation models that can be used to produce large numbers of inundation maps, complementary to more complex 2-D–3-D hydrodynamic models. It also suggests that research into flood risk could benefit by paying more attention to the damage caused by relatively high-probability floods.
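    The risk-curve calculation discussed here amounts to integrating damage over exceedance probability. The sketch below computes that area with the trapezoidal rule for a dense and a coarse set of return periods; the return periods and damages are invented for illustration, and the point is only to show how the estimate shifts when fewer return periods are used.

```python
import numpy as np

def expected_annual_damage(return_periods, damages):
    """Risk as the area under the exceedance probability-loss curve (trapezoidal rule)."""
    p = 1.0 / np.asarray(return_periods, dtype=float)   # exceedance probabilities
    d = np.asarray(damages, dtype=float)
    order = np.argsort(p)                               # integrate over increasing probability
    p, d = p[order], d[order]
    return float(np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p)))

# Hypothetical damages (million EUR) for a dense and a coarse set of return periods (years).
rp_full = [2, 5, 10, 25, 50, 100, 250, 500, 1000, 10000]
dmg_full = [0, 5, 20, 80, 150, 260, 400, 520, 650, 900]
rp_coarse = [10, 100, 1000]
dmg_coarse = [20, 260, 650]

print("EAD with 10 return periods:", round(expected_annual_damage(rp_full, dmg_full), 1))
print("EAD with 3 return periods: ", round(expected_annual_damage(rp_coarse, dmg_coarse), 1))
```

    Integrating over probability rather than return period is what makes the result an expected annual damage; adding or dropping return periods changes the area under the curve, which is the sensitivity the paper quantifies.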

  14. Estimating the subjective risks of driving simulator accidents.

    Science.gov (United States)

    Dixit, Vinayak; Harrison, Glenn W; Rutström, E Elisabet

    2014-01-01

    We examine the subjective risks of driving behavior using a controlled virtual reality experiment. Use of a driving simulator allows us to observe choices over risky alternatives that are presented to the individual in a naturalistic manner, with many of the cues one would find in the field. At the same time, the simulator affords the type of control one expects from a laboratory environment. The subject was tasked with making a left-hand turn into incoming traffic, and the experimenter controlled the headways of oncoming traffic. Subjects were rewarded for making a successful turn, and lost income if they crashed. The experimental design provided opportunities for subjects to develop subjective beliefs about when it would be safe to turn, and it also elicited their attitudes towards risk. A simple structural model explains behavior and shows evidence of heterogeneity in both the subjective beliefs that subjects formed and their risk attitudes. We find that subjective beliefs change with experience in the task and the driver's skill. A significant difference was observed in the perceived probability of successfully turning between the inexperienced drivers who did and did not crash, even though there was no significant difference in risk attitudes between the two groups. We use experimental economics to design controlled, incentive-compatible tasks that provide an opportunity to evaluate the impact on driver safety of subjects' subjective beliefs about when it would be safe to turn, as well as their attitudes towards risk. This method could be used to help insurance companies determine risk premia associated with risk attitudes or beliefs about crashing, to better incentivize safe driving.

  15. IT-OSRA: applying ensemble simulations to estimate the oil spill risk associated to operational and accidental oil spills

    Science.gov (United States)

    Sepp Neves, Antonio Augusto; Pinardi, Nadia; Martins, Flavio

    2016-08-01

    Oil Spill Risk Assessments (OSRAs) are widely employed to support decision making regarding oil spill risks. This article adapts the ISO-compliant OSRA framework developed by Sepp Neves et al. (J Environ Manag 159:158-168, 2015) to estimate risks in a complex scenario where uncertainties exist about the meteo-oceanographic conditions and about where and how a spill could happen, and where the risk computation methodology is not yet well established (ensemble oil spill modeling). The improved method was applied to the Algarve coast, Portugal. Over 50,000 simulations were performed in 2 ensemble experiments to estimate the risks due to operational and accidental spill scenarios associated with maritime traffic. The level of risk was found to be important for both types of scenarios, with significant seasonal variations due to the variability of currents and waves. Higher-frequency variability in the meteo-oceanographic variables was also found to contribute to the level of risk. The ensemble results show that the distribution of oil concentrations found on the coast is not Gaussian, opening up new fields of research on how to deal with oil spill risks and related uncertainties.

  16. Eddy Current Inversion Models for Estimating Dimensions of Defects in Multilayered Structures

    Directory of Open Access Journals (Sweden)

    Bo Ye

    2014-01-01

    Full Text Available In eddy current nondestructive evaluation, one of the principal challenges is to determine the dimensions of defects in multilayered structures from the measured signals. It is a typical inverse problem which is generally considered to be nonlinear and ill-posed. In the paper, two effective approaches have been proposed to estimate the defect dimensions. The first one is a partial least squares (PLS) regression method. The second one is a kernel partial least squares (KPLS) regression method. Experimental research was carried out. In experiments, the eddy current signals responding to magnetic field changes are detected by a giant magnetoresistive (GMR) sensor and preprocessed for noise elimination using a wavelet packet analysis (WPA) method. Then, the proposed two approaches are used to construct the inversion models of defect dimension estimation. Finally, the estimation results are analyzed. The performance comparison between the proposed two approaches and the artificial neural network (ANN) method is presented. The comparison results demonstrate the feasibility and validity of the proposed two methods. Of the two, the KPLS regression method currently gives better prediction performance than the PLS regression method.
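    As a sketch of the first of the two inversion approaches, the snippet below fits an ordinary PLS regression with scikit-learn to map signal features to defect dimensions. The synthetic features stand in for the wavelet-packet-processed GMR measurements, which are not reproduced here; a kernel PLS variant would apply the same idea in a kernel-induced feature space.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)

# Synthetic stand-in for preprocessed eddy-current signal features (e.g. wavelet-packet
# energies) and the corresponding defect dimensions (length, depth) in mm.
n_samples, n_features = 200, 16
X = rng.normal(size=(n_samples, n_features))
true_W = rng.normal(size=(n_features, 2))
Y = X @ true_W + 0.1 * rng.normal(size=(n_samples, 2))

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.25, random_state=0)

pls = PLSRegression(n_components=4)
pls.fit(X_tr, Y_tr)
Y_hat = pls.predict(X_te)

rmse = np.sqrt(np.mean((Y_hat - Y_te) ** 2, axis=0))
print("RMSE per defect dimension (length, depth):", rmse.round(3))
```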

  17. Time evolving bed shear stress due to the passage of gravity currents estimated with ADVP velocity measurements

    Science.gov (United States)

    Zordan, Jessica; Schleiss, Anton J.; Franca, Mário J.

    2016-04-01

    Density or gravity currents are geophysical flows driven by density gradients between two contacting fluids. The physical trigger mechanism of these phenomena lies in the density differences, which may be caused by differences in temperature, dissolved substances or the concentration of suspended sediments. Saline density currents are capable of entraining bed sediments, inducing signatures in the bottom of sedimentary basins. Herein, saline density currents are reproduced in the laboratory over a movable bed. The experimental channel is of the lock-exchange type; it is 7.5 m long and 0.3 m wide, divided into two sections of comparable volumes by a sliding gate. An upstream reach serves as a head tank for the dense mixture; the current propagates through a downstream reach where the main measurements are made. Downstream of the channel, a tank exists to absorb the reflection of the current and thus avoid artifacts due to the limited length of the channel. High-performance thermoplastic polyurethane simulating fine sediments forms the movable bed. Measurements of 3D instantaneous velocities will be made with the non-intrusive ADVP (acoustic Doppler velocity profiler) technique. With the velocity measurements, the evolution in time of the channel-bed shear stress due to the passage of gravity currents is estimated. This is in turn related to the observed erosion and to parameters that determine the dynamics of the current, such as the initial density difference, lock length and channel slope. This work was funded by the ITN-Programme (Marie Curie Actions) of the European Union's Seventh Framework Programme FP7-PEOPLE-2013-ITN under REA grant agreement n_607394-SEDITRANS.
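    A common way to turn near-bed velocity measurements of this kind into a bed shear stress estimate is a log-law fit, with tau = rho * u*^2 and u* taken from the slope of velocity versus ln(z). The sketch below assumes that approach and uses made-up profile values; it is not the authors' processing chain.

```python
import numpy as np

KAPPA = 0.41        # von Karman constant
RHO = 1030.0        # density of the saline current (kg/m^3), assumed

# Hypothetical near-bed profile: heights above the bed (m) and mean streamwise velocity (m/s).
z = np.array([0.005, 0.010, 0.020, 0.040, 0.080])
u = np.array([0.18, 0.22, 0.26, 0.30, 0.34])

# Log law: u(z) = (u*/kappa) * ln(z / z0), i.e. linear in ln(z).
slope, intercept = np.polyfit(np.log(z), u, 1)
u_star = KAPPA * slope
z0 = np.exp(-intercept / slope)

tau_bed = RHO * u_star ** 2
print(f"u* = {u_star:.4f} m/s, z0 = {z0 * 1000:.2f} mm, bed shear stress = {tau_bed:.2f} Pa")
```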

  18. Risk-Targeted versus Current Seismic Design Maps for the Conterminous United States

    Science.gov (United States)

    Luco, Nicolas; Ellingwood, Bruce R.; Hamburger, Ronald O.; Hooper, John D.; Kimball, Jeffrey K.; Kircher, Charles A.

    2007-01-01

    The probabilistic portions of the seismic design maps in the NEHRP Provisions (FEMA, 2003/2000/1997), and in the International Building Code (ICC, 2006/2003/2000) and ASCE Standard 7-05 (ASCE, 2005a), provide ground motion values from the USGS that have a 2% probability of being exceeded in 50 years. Under the assumption that the capacity against collapse of structures designed for these "uniform-hazard" ground motions is equal to, without uncertainty, the corresponding mapped value at the location of the structure, the probability of its collapse in 50 years is also uniform. This is not the case, however, when it is recognized that there is, in fact, uncertainty in the structural capacity. In that case, site-to-site variability in the shape of ground motion hazard curves results in a lack of uniformity. This paper explains the basis for proposed adjustments to the uniform-hazard portions of the seismic design maps currently in the NEHRP Provisions that result in uniform estimated collapse probability. For seismic design of nuclear facilities, analogous but specialized adjustments have recently been defined in ASCE Standard 43-05 (ASCE, 2005b). In support of the 2009 update of the NEHRP Provisions currently being conducted by the Building Seismic Safety Council (BSSC), herein we provide examples of the adjusted ground motions for a selected target collapse probability (or target risk). Relative to the probabilistic MCE ground motions currently in the NEHRP Provisions, the risk-targeted ground motions for design are smaller (by as much as about 30%) in the New Madrid Seismic Zone, near Charleston, South Carolina, and in the coastal region of Oregon, with relatively little (<15%) change almost everywhere else in the conterminous U.S.

  19. The impact of population heterogeneity on risk estimation in genetic counseling

    Directory of Open Access Journals (Sweden)

    Shaffer Michele L

    2004-06-01

    Full Text Available Background: Genetic counseling has been an important tool for evaluating and communicating disease susceptibility for decades, and it has been applied to predict risks for a wide class of hereditary disorders. Most diseases are complex in nature and are affected by multiple genes and environmental conditions; it is highly likely that DNA tests alone do not define all the genetic factors responsible for a disease, so that persons classified into the same risk group by DNA testing actually could have different disease susceptibilities. Ignorance of population heterogeneity may lead to biased risk estimates, whereas additional information on population heterogeneity may improve the precision of such estimates. Methods: Although DNA tests are widely used, few studies have investigated the accuracy of the predicted risks. We examined the impact of population heterogeneity on predicted disease risks by simulation of three different heterogeneity scenarios and studied the precision and accuracy of the risks estimated from a logistic regression model that ignored population heterogeneity. Moreover, we also incorporated information about population heterogeneity into our original model and investigated the resulting improvement in the accuracy of risk estimation. Results: We found that heterogeneity in one or more categories could lead to biased estimates not only in the "contaminated" categories but also in other homogeneous categories. Incorporating information about population heterogeneity into the original model greatly improved the accuracy of risk estimation. Conclusions: Our findings imply that without thorough knowledge about the genetic basis of the disease, risks estimated from DNA tests may be misleading. Caution should be taken when evaluating the predicted risks obtained from genetic counseling. On the other hand, the improved accuracy of risk estimates after incorporating population heterogeneity information into the model did point out a

  20. Probabilistic methodology for estimating radiation-induced cancer risk

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, D.E. Jr.; Leggett, R.W.; Williams, L.R.

    1981-01-01

    The RICRAC computer code was developed at Oak Ridge National Laboratory to provide a versatile and convenient methodology for radiation risk assessment. The code allows as input essentially any dose pattern commonly encountered in risk assessments for either acute or chronic exposures, and it includes consideration of the age structure of the exposed population. Results produced by the analysis include the probability of one or more radiation-induced cancer deaths in a specified population, expected numbers of deaths, and expected years of life lost as a result of premature fatalities. These calculations include consideration of competing risks of death from all other causes. The program also generates a probability frequency distribution of the expected number of cancers in any specified cohort resulting from a given radiation dose. The methods may be applied to any specified population and dose scenario.
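    The probability outputs described above follow from standard binomial/Poisson reasoning: the chance of one or more radiation-induced deaths is one minus the probability of none. The sketch below uses hypothetical populations, doses and risk coefficients; it is not the RICRAC code.

```python
import numpy as np

# Hypothetical age-group populations, mean doses (Sv) and lifetime risk coefficients (per Sv).
population = np.array([2_000, 5_000, 3_000])
dose_sv = np.array([0.0010, 0.0008, 0.0005])
risk_per_sv = np.array([0.08, 0.05, 0.03])     # illustrative age-dependent coefficients

per_person_risk = dose_sv * risk_per_sv
expected_deaths = float(np.sum(population * per_person_risk))

# Probability of at least one radiation-induced cancer death in the exposed population:
# exact binomial product versus the Poisson approximation.
p_none = float(np.prod((1.0 - per_person_risk) ** population))
p_at_least_one = 1.0 - p_none
p_at_least_one_poisson = 1.0 - np.exp(-expected_deaths)

print(f"expected deaths: {expected_deaths:.2f}")
print(f"P(>=1 death): binomial {p_at_least_one:.3f}, Poisson {p_at_least_one_poisson:.3f}")
```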

  1. Estimation of Uncertainty in Risk Assessment of Hydrogen Applications

    DEFF Research Database (Denmark)

    Markert, Frank; Krymsky, V.; Kozine, Igor

    2011-01-01

    Hydrogen technologies such as hydrogen fuelled vehicles and refuelling stations are being tested in practice in a number of projects (e.g. HyFleet-Cute and Whistler project), giving valuable information on the reliability and maintenance requirements. In order to establish refuelling stations, the permitting authorities request qualitative and quantitative risk assessments (QRA) to show the safety and acceptability in terms of failure frequencies and respective consequences. For new technologies not all statistical data might be established or are available in good quality, causing assumptions ... probability and the NUSAP concept to quantify uncertainties of new, not fully qualified hydrogen technologies and implications for risk management.

  2. Estimation of population firing rates and current source densities from laminar electrode recordings.

    Science.gov (United States)

    Pettersen, Klas H; Hagen, Espen; Einevoll, Gaute T

    2008-06-01

    This model study investigates the validity of methods used to interpret linear (laminar) multielectrode recordings. In computer experiments extracellular potentials from a synaptically activated population of about 1,000 pyramidal neurons are calculated using biologically realistic compartmental neuron models combined with electrostatic forward modeling. The somas of the pyramidal neurons are located in a 0.4 mm high and wide columnar cylinder, mimicking a stimulus-evoked layer-5 population in a neocortical column. Current-source density (CSD) analysis of the low-frequency part of the potentials (the local field potential, LFP) is found to give estimates of the true underlying CSD. The high-frequency part (>750 Hz) of the potentials (multi-unit activity, MUA) is found to scale approximately as the population firing rate to the power 3/4 and to give excellent estimates of the underlying population firing rate for trial-averaged data. The MUA signal is found to decay much more sharply outside the columnar populations than the LFP.
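    For reference, the standard CSD estimator evaluated in studies like this one is the negative second spatial derivative of the laminar potential along the electrode axis, scaled by the extracellular conductivity. The snippet below applies that estimator to a synthetic LFP profile; the conductivity value and electrode spacing are assumptions, and no forward-modelled data are involved.

```python
import numpy as np

SIGMA = 0.3          # extracellular conductivity (S/m), a typical assumed value
H = 100e-6           # electrode spacing along the depth axis (m), assumed

# Synthetic laminar LFP phi(z, t): one row per contact, one column per time step (volts).
n_contacts, n_t = 16, 500
z = np.arange(n_contacts) * H
t = np.arange(n_t) / 10_000.0
phi = 50e-6 * np.exp(-((z[:, None] - 0.8e-3) ** 2) / (2 * (0.2e-3) ** 2)) * np.sin(2 * np.pi * 10 * t)

# Standard CSD estimate: C(z, t) = -sigma * d^2(phi)/dz^2, via a second-order finite difference.
csd = -SIGMA * (phi[2:, :] - 2.0 * phi[1:-1, :] + phi[:-2, :]) / H ** 2

print("CSD shape (interior contacts x time samples):", csd.shape)
```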

  3. Current industrial practice of managing risks in product development project portfolios

    DEFF Research Database (Denmark)

    Weng, R.; Oehmen, Josef; Ben-Daya, M.

    2013-01-01

    Managing portfolios of development and engineering projects currently presents significant challenges to companies. This is even more the case in the management of portfolio risks, where both industry and academia currently lack a clear conceptual understanding of what portfolio risks are and what influences them. The objective of this paper is two-fold: First, based on a literature review and industry focus group discussions, we introduce a new model for describing portfolio-level risks. It consists of three types of risks (escalated risks, common cause risks, and cascading risks) based on 9 types of interdependencies in PD project portfolios (Technology, Budget, Objectives and Requirements, Infrastructure and Equipment, Skillset and Human Resources, Process and Schedule, Supplier, Legal and Regulatory, and finally Market and Customer). Second, we investigate how risk management on the portfolio level...

  4. Insulin Sensitivity and Mortality Risk Estimation in Patients with Type ...

    African Journals Online (AJOL)

    2017-06-28

    Methods: Fasting plasma glucose, total cholesterol, high-density lipoprotein cholesterol ... Keywords: Insulin resistance, mortality risk, type 2 diabetes mellitus, urinary ... Patients with type 1 DM, gestational DM, and T2DM who were ... overproduction of free fatty acids (FFA) in the plasma.

  5. Current modeling practice may lead to falsely high benchmark dose estimates.

    Science.gov (United States)

    Ringblom, Joakim; Johanson, Gunnar; Öberg, Mattias

    2014-07-01

    Benchmark dose (BMD) modeling is increasingly used as the preferred approach to define the point-of-departure for health risk assessment of chemicals. As data are inherently variable, there is always a risk of selecting a model that defines a lower confidence bound of the BMD (BMDL) that, contrary to expectation, exceeds the true BMD. The aim of this study was to investigate how often and under what circumstances such anomalies occur under current modeling practice. Continuous data were generated from a realistic dose-effect curve by Monte Carlo simulations using four dose groups and a set of five different dose placement scenarios, group sizes between 5 and 50 animals and coefficients of variation of 5-15%. The BMD calculations were conducted using nested exponential models, as most BMD software use nested approaches. "Non-protective" BMDLs (higher than the true BMD) were frequently observed, in some scenarios reaching 80%. The phenomenon was mainly related to the selection of the non-sigmoidal exponential model (Effect = a·e^(b·dose)). In conclusion, non-sigmoid models should be used with caution as they may underestimate the risk, illustrating that awareness of the model selection process and sound identification of the point-of-departure is vital for health risk assessment.
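    For context, the non-sigmoidal exponential model quoted above can be fitted to continuous dose-response data and the BMD read off as the dose producing a specified benchmark response (BMR). The sketch below assumes synthetic data and a 10% BMR; it does not reproduce the nested model selection or the BMDL confidence-bound calculation performed by BMD software.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(3)

def exp_model(dose, a, b):
    """Non-sigmoidal exponential model: Effect = a * exp(b * dose)."""
    return a * np.exp(b * dose)

# Synthetic continuous dose-response data: 4 dose groups, 10 animals each, ~10% CV.
doses = np.repeat([0.0, 10.0, 50.0, 100.0], 10)
true_a, true_b = 100.0, 0.004
effects = exp_model(doses, true_a, true_b) * (1.0 + 0.10 * rng.normal(size=doses.size))

(a_hat, b_hat), _ = curve_fit(exp_model, doses, effects, p0=[100.0, 0.001])

# BMD for a benchmark response defined as a 10% change relative to background:
# a * exp(b * BMD) = a * (1 + BMR)  =>  BMD = ln(1 + BMR) / b
bmr = 0.10
bmd = np.log(1.0 + bmr) / b_hat
print(f"fitted a = {a_hat:.1f}, b = {b_hat:.5f}, BMD(10%) = {bmd:.1f} dose units")
```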

  6. On the estimation of population-specific synaptic currents from laminar multielectrode recordings.

    Science.gov (United States)

    Gratiy, Sergey L; Devor, Anna; Einevoll, Gaute T; Dale, Anders M

    2011-01-01

    Multielectrode array recordings of extracellular electrical field potentials along the depth axis of the cerebral cortex are gaining popularity as an approach for investigating the activity of cortical neuronal circuits. The low-frequency band of extracellular potential, i.e., the local field potential (LFP), is assumed to reflect synaptic activity and can be used to extract the laminar current source density (CSD) profile. However, physiological interpretation of the CSD profile is uncertain because it does not disambiguate synaptic inputs from passive return currents and does not identify population-specific contributions to the signal. These limitations prevent interpretation of the CSD in terms of synaptic functional connectivity in the columnar microcircuit. Here we present a novel anatomically informed model for decomposing the LFP signal into population-specific contributions and for estimating the corresponding activated synaptic projections. This involves a linear forward model, which predicts the population-specific laminar LFP in response to synaptic inputs applied at different positions along each population and a linear inverse model, which reconstructs laminar profiles of synaptic inputs from laminar LFP data based on the forward model. Assuming spatially smooth synaptic inputs within individual populations, the model decomposes the columnar LFP into population-specific contributions and estimates the corresponding laminar profiles of synaptic input as a function of time. It should be noted that constant synaptic currents at all positions along a neuronal population cannot be reconstructed, as this does not result in a change in extracellular potential. However, constraining the solution using a priori knowledge of the spatial distribution of synaptic connectivity provides the further advantage of estimating the strength of active synaptic projections from the columnar LFP profile thus fully specifying synaptic inputs.
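    The forward/inverse structure described here, a linear operator mapping laminar synaptic-input profiles to the columnar LFP that is inverted under prior constraints, can be sketched as a ridge-regularized least-squares problem. The forward matrix below is a random placeholder; in the actual method it would be built from the anatomically informed population model.

```python
import numpy as np

rng = np.random.default_rng(11)

n_channels, n_inputs = 24, 12     # LFP contacts vs. discretized synaptic-input positions

# Placeholder forward model L: column j holds the laminar LFP produced by a unit
# synaptic input at position j (in the real method this comes from the population model).
L = rng.normal(size=(n_channels, n_inputs))

# Synthetic "true" synaptic-input profile and the (noisy) LFP it would generate.
s_true = np.zeros(n_inputs)
s_true[3:6] = 1.0
phi = L @ s_true + 0.05 * rng.normal(size=n_channels)

# Regularized inverse: s_hat = argmin ||L s - phi||^2 + lam * ||s||^2
lam = 1e-1
s_hat = np.linalg.solve(L.T @ L + lam * np.eye(n_inputs), L.T @ phi)

print("reconstruction error:", round(float(np.linalg.norm(s_hat - s_true)), 4))
```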

  7. Estimation of seismic quality factor: Artificial neural networks and current approaches

    Science.gov (United States)

    Yıldırım, Eray; Saatçılar, Ruhi; Ergintav, Semih

    2017-01-01

    The aims of this study are to estimate soil attenuation using alternatives to traditional methods, to compare the results of these methods, and to examine soil properties using the estimated results. The performance of all methods (amplitude decay, spectral ratio, Wiener filter, and artificial neural network (ANN) methods) is examined on field data and on synthetic data with and without noise. High-resolution seismic reflection field data from Yeniköy (Arnavutköy, İstanbul) were used as field data, and 424 estimations of Q values were made for each method (1,696 in total). Statistical tests on synthetic and field data show that the Q estimates of the ANN, Wiener filter, and spectral ratio methods are quite close, whereas the amplitude decay method showed a higher estimation error. According to previous geological and geophysical studies in this area, the soil is water-saturated, quite weak, consisting of clay and sandy units, and, because of current and past landslides in the study area and its vicinity, researchers have reported heterogeneity in the soil. Under the same physical conditions, the Q value calculated from field data can be expected to be 7.9 and 13.6. ANN models with various structures, training algorithms, inputs, and numbers of neurons were investigated. A total of 480 ANN models were generated, consisting of 60 models for noise-free synthetic data, 360 models for synthetic data with different noise content, and 60 models to apply to the data collected in the field. The models were tested to determine the most appropriate structure and training algorithm. In the final ANN, the input vectors consisted of the difference of the width, energy, and distance of seismic traces, and the output was the Q value. The success rates of the ANN method with both noise-free and noisy synthetic data were higher than those of the other three methods. According to the statistical tests on Q values estimated from field data, the ANN method also showed more suitable results. The Q value can be estimated
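    As a reference point for the conventional techniques compared here, the spectral ratio method estimates Q from the slope of the log amplitude ratio of two traces versus frequency, ln(A2/A1) = -pi*f*dt/Q + const. The snippet applies that fit to synthetic spectra; the travel-time difference and spectral shapes are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
Q_TRUE = 30.0
dt_travel = 0.05                      # travel-time difference between the two traces (s)

freqs = np.linspace(5.0, 80.0, 40)    # frequency band used for the fit (Hz)
A1 = np.exp(-freqs / 25.0)            # reference amplitude spectrum (arbitrary shape)
A2 = A1 * np.exp(-np.pi * freqs * dt_travel / Q_TRUE)   # attenuated spectrum
A2 = A2 * np.exp(0.02 * rng.normal(size=freqs.size))    # mild multiplicative noise

# Spectral ratio: ln(A2/A1) = -(pi * dt / Q) * f + const, so Q follows from the slope.
slope, _ = np.polyfit(freqs, np.log(A2 / A1), 1)
Q_est = -np.pi * dt_travel / slope
print(f"estimated Q = {Q_est:.1f} (true value {Q_TRUE})")
```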

  8. Estimation of the Plant Time Constant of Current-Controlled Voltage Source Converters

    DEFF Research Database (Denmark)

    Vidal, Ana; Yepes, Alejandro G.; Malvar, Jano

    2014-01-01

    Precise knowledge of the plant time constant is essential to perform a thorough analysis of the current control loop in voltage source converters (VSCs). As the loop behavior can be significantly influenced by the VSC working conditions, the effects associated with converter losses should be included ... of the VSC interface filter measured at rated conditions. This paper extends that method so that both parameters of the plant time constant (resistance and inductance) are estimated. Such enhancement is achieved through the evaluation of the closed-loop transient responses of both axes of the synchronous
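    One way to picture what estimating the plant time constant means in practice is to fit the first-order RL response of the converter current to a voltage step. The sketch below assumes an idealized open-loop RL plant with synthetic data, whereas the paper works with closed-loop transient responses in the synchronous frame; the resistance, inductance and step amplitude are placeholders.

```python
import numpy as np
from scipy.optimize import curve_fit

def rl_step_response(t, R, L, V=100.0):
    """Current response of a series RL branch to a voltage step of amplitude V."""
    return (V / R) * (1.0 - np.exp(-R * t / L))

rng = np.random.default_rng(2)
R_true, L_true = 0.5, 5e-3                      # assumed plant values (ohm, henry)

t = np.linspace(0.0, 0.05, 500)                 # 50 ms record
i_meas = rl_step_response(t, R_true, L_true) + 0.5 * rng.normal(size=t.size)

(R_hat, L_hat), _ = curve_fit(rl_step_response, t, i_meas, p0=[1.0, 1e-2])
print(f"R = {R_hat:.3f} ohm, L = {L_hat * 1e3:.2f} mH, "
      f"time constant = {L_hat / R_hat * 1e3:.1f} ms")
```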

  9. Estimating the North Atlantic mean dynamic topography and geostrophic currents with GOCE

    DEFF Research Database (Denmark)

    Bingham, Rory J.; Knudsen, Per; Andersen, Ole Baltazar

    2011-01-01

    be derived from them. Because the high-degree commission errors of all of the GOCE models are lower than those from the best satellite-only GRACE solution, all of the derived GOCE MDTs are much less noisy than the GRACE MDT. They therefore require less severe filtering and, as a consequence, the strengths of the currents calculated from them are in better agreement with those from an in-situ drifter-based estimate. Where the comparison is possible, the reduction in MDT noise from the first to second releases is also clear. However, given that some filtering is still required, this translates into only a small...
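    Once a filtered MDT is available, surface geostrophic currents follow from its horizontal gradient, u = -(g/f) * d(eta)/dy and v = (g/f) * d(eta)/dx. The sketch below applies these relations to a synthetic MDT grid; the grid spacing and the field itself are placeholders, not GOCE or drifter data.

```python
import numpy as np

G = 9.81
OMEGA = 7.2921e-5
R_EARTH = 6.371e6

# Synthetic MDT eta(lat, lon) on a regular quarter-degree grid (metres).
lats = np.arange(30.0, 60.25, 0.25)
lons = np.arange(-60.0, -9.75, 0.25)
LON, LAT = np.meshgrid(lons, lats)
eta = 0.5 * np.exp(-((LAT - 40.0) ** 2) / 50.0) * np.sin(np.radians(LON) * 4.0)

# Grid spacing in metres (dy is constant, dx shrinks with latitude).
dy = np.deg2rad(0.25) * R_EARTH
dx = np.deg2rad(0.25) * R_EARTH * np.cos(np.radians(LAT))

f = 2.0 * OMEGA * np.sin(np.radians(LAT))            # Coriolis parameter

deta_dy, deta_dx = np.gradient(eta, axis=(0, 1))     # per grid cell
u = -(G / f) * deta_dy / dy                          # zonal geostrophic velocity (m/s)
v = (G / f) * deta_dx / dx                           # meridional geostrophic velocity (m/s)

print(f"max geostrophic speed in the synthetic field: {np.hypot(u, v).max():.2f} m/s")
```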

  10. Macro-economic assessment of flood risk in Italy under current and future climate

    Science.gov (United States)

    Carrera, Lorenzo; Koks, Elco; Mysiak, Jaroslav; Aerts, Jeroen; Standardi, Gabriele

    2014-05-01

    This paper explores an integrated methodology for assessing direct and indirect costs of fluvial flooding to estimate current and future fluvial flood risk in Italy. Our methodology combines a Geographic Information System spatial approach with a general economic equilibrium approach using a downscaled modified version of a Computable General Equilibrium model at NUTS2 scale. Given the level of uncertainty in the behavior of disaster-affected economies, the simulation considers a wide range of business recovery periods. We calculate expected annual losses for each NUTS2 region, and exceedance probability curves to determine probable maximum losses. Given a certain acceptable level of risk, we describe the conditions of flood protection and business recovery periods under which losses are contained within this limit. Because of the difference between direct costs, which are an overestimation of stock losses, and indirect costs, which represent the macro-economic effects, our results have different policy meanings. While the former is relevant for post-disaster recovery, the latter is more relevant for public policy issues, particularly for cost-benefit analysis and resilience assessment.

  11. Variance of discharge estimates sampled using acoustic Doppler current profilers from moving boats

    Science.gov (United States)

    Garcia, Carlos M.; Tarrab, Leticia; Oberg, Kevin; Szupiany, Ricardo; Cantero, Mariano I.

    2012-01-01

    This paper presents a model for quantifying the random errors (i.e., variance) of acoustic Doppler current profiler (ADCP) discharge measurements from moving boats for different sampling times. The model focuses on the random processes in the sampled flow field and has been developed using statistical methods currently available for uncertainty analysis of velocity time series. Analysis of field data collected using ADCP from moving boats from three natural rivers of varying sizes and flow conditions shows that, even though the estimate of the integral time scale of the actual turbulent flow field is larger than the sampling interval, the integral time scale of the sampled flow field is on the order of the sampling interval. Thus, an equation for computing the variance error in discharge measurements associated with different sampling times, assuming uncorrelated flow fields is appropriate. The approach is used to help define optimal sampling strategies by choosing the exposure time required for ADCPs to accurately measure flow discharge.

  12. A Bayesian Hierarchical Modeling Scheme for Estimating Erosion Rates Under Current Climate Conditions

    Science.gov (United States)

    Lowman, L.; Barros, A. P.

    2014-12-01

    Computational modeling of surface erosion processes is inherently difficult because of the four-dimensional nature of the problem and the multiple temporal and spatial scales that govern individual mechanisms. Landscapes are modified via surface and fluvial erosion and exhumation, each of which takes place over a range of time scales. Traditional field measurements of erosion/exhumation rates are scale dependent, often valid for a single point-wise location or averaging over large areal extents and periods with intense and mild erosion. We present a method of remotely estimating erosion rates using a Bayesian hierarchical model based upon the stream power erosion law (SPEL). A Bayesian approach allows for estimating erosion rates using the deterministic relationship given by the SPEL and data on channel slopes and precipitation at the basin and sub-basin scale. The spatial scale associated with this framework is the elevation class, where each class is characterized by distinct morphologic behavior observed through different modes in the distribution of basin outlet elevations. Interestingly, the distributions of first-order outlets are similar in shape and extent to the distribution of precipitation events (i.e. individual storms) over a 14-year period between 1998 and 2011. We demonstrate an application of the Bayesian hierarchical modeling framework for five basins and one intermontane basin located in the central Andes between 5°S and 20°S. Using remotely sensed data of current annual precipitation rates from the Tropical Rainfall Measuring Mission (TRMM) and topography from a high-resolution (3 arc-second) digital elevation map (DEM), our erosion rate estimates are consistent with decadal-scale estimates based on landslide mapping and sediment flux observations and 1-2 orders of magnitude larger than most millennial and million-year timescale estimates from thermochronology and cosmogenic nuclides.

  13. Risk stratification in multiple myeloma, part 2: the significance of genetic risk factors in the era of currently available therapies.

    Science.gov (United States)

    Biran, Noa; Jagannath, Sundar; Chari, Ajai

    2013-01-01

    Multiple myeloma (MM) is a heterogeneous disease, and a variety of risk factors at the time of initial diagnosis can be used to stratify patients. In the first part of this 2-part series, we reviewed the currently identified prognostic factors, characterized by disease burden, host factors, tumor biology, and depth of response to therapy. However, these risk factors cannot be interpreted independently of therapies. Novel therapies have the potential to worsen or improve outcomes compared with conventional therapy in high-risk patients, or actually overcome the high-risk status, thereby resulting in reclassification as standard risk. For example, thalidomide (Thalomid, Celgene) is associated with worse outcomes in patients with high-risk cytogenetic abnormalities, such as deletion of chromosomes 13 and 17p, whereas proteasome inhibitors appear to overcome t(4;14). The second part of this series reviews the significance of various genetic risks in the era of novel therapies for MM.

  14. How to estimate epidemic risk from incomplete contact diaries data?

    CERN Document Server

    Mastrandrea, Rossana

    2016-01-01

    Social interactions shape the patterns of spreading processes in a population. Techniques such as diaries or proximity sensors make it possible to collect data about encounters and to build networks of contacts between individuals. The contact networks obtained from these different techniques are, however, quantitatively different. Here, we first show how these discrepancies affect the prediction of the epidemic risk when these data are fed to numerical models of epidemic spread: low participation rate, under-reporting of contacts and overestimation of contact durations in contact diaries with respect to sensor data indeed determine important differences in the outcomes of the corresponding simulations, with for instance an enhanced sensitivity to initial conditions. Most importantly, we investigate if and how information gathered from contact diaries can be used in such simulations in order to yield an accurate description of the epidemic risk, assuming that data from sensors represent the ground truth. The contact netw...

  15. Estimation of Employee Turnover with Competing Risks Models

    Directory of Open Access Journals (Sweden)

    Grzenda Wioletta

    2015-12-01

    Full Text Available Employee turnover accompanies every business organization, regardless of the industry and size. Nowadays, many companies struggle with problems related to the lack of sufficient information about the nature of employee turnover processes. Therefore, comprehensive analysis of these processes is necessary. This article aims to examine the turnover of employees from a big manufacturing company using competing risks models with covariates and without covariates. This technique makes it possible to incorporate information about the type of employment contract termination. Moreover, the Cox proportional hazards model enables the researcher to analyse simultaneously multiple factors that affect employment duration. One of the major observations is that the level of employee remuneration most strongly differentiates the risk of job resignation.
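    A competing-risks view of turnover distinguishes the type of contract termination, for example resignation versus employer-initiated exit. A simple non-parametric way to see that structure is the cumulative incidence function, sketched below on synthetic durations; the event coding and data are illustrative, and an analysis with covariates would add cause-specific Cox models as in the article.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 500

# Synthetic employment durations (months) and event type:
# 0 = censored (still employed), 1 = resignation, 2 = employer-initiated termination.
t_resign = rng.exponential(40.0, n)
t_term = rng.exponential(90.0, n)
t_censor = rng.uniform(1.0, 120.0, n)
time = np.minimum.reduce([t_resign, t_term, t_censor])
event = np.select([time == t_resign, time == t_term], [1, 2], default=0)

def cumulative_incidence(time, event, cause, grid):
    """Aalen-Johansen estimator of the cumulative incidence function for one cause,
    allowing right censoring and a competing event type."""
    order = np.argsort(time)
    t, e = time[order], event[order]
    n_total = len(t)
    surv = 1.0                      # all-cause Kaplan-Meier survival just before each time
    cif_times, cif_values = [0.0], [0.0]
    i = 0
    while i < n_total:
        ti = t[i]
        at_risk = n_total - i
        ties = (t == ti)
        d_any = int(np.sum(ties & (e > 0)))       # events of any cause at ti
        d_cause = int(np.sum(ties & (e == cause)))
        cif_values.append(cif_values[-1] + surv * d_cause / at_risk)
        cif_times.append(ti)
        surv *= 1.0 - d_any / at_risk
        i += int(np.sum(ties))                    # skip all observations tied at ti
    idx = np.searchsorted(cif_times, np.asarray(grid, dtype=float), side="right") - 1
    return np.asarray(cif_values)[idx]

grid = np.arange(0, 121, 24)
print("CIF, resignation :", np.round(cumulative_incidence(time, event, 1, grid), 3))
print("CIF, termination :", np.round(cumulative_incidence(time, event, 2, grid), 3))
```

    With covariates, the same event coding feeds cause-specific hazard models, one per termination type.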

  16. Current Approaches to the Establishment of Credit Risk Specific Provisions

    Directory of Open Access Journals (Sweden)

    Ion Nitu

    2008-10-01

    Full Text Available The aim of the new Basel II and IFRS approaches is to make the operations of financial institutions more transparent and thus to create a better basis for the market participants and supervisory authorities to acquire information and make decisions. In the banking sector, a continuous debate is being led, related to the similarities and differences between IFRS approach on loan loss provisions and Basel II approach on calculating the capital requirements, judging against the classical method regarding loan provisions, currently used by the Romanian banks following the Central Bank’s regulations.Banks must take into consideration that IFRS and Basel II objectives are fundamentally different. While IFRS aims to ensure that the financial papers reflect adequately the losses recorded at each balance sheet date, the Basel II objective is to ensure that the bank has enough provisions or capital in order to face expected losses in the next 12 months and eventual unexpected losses.Consequently, there are clear differences between the objectives of the two models. Basel II works on statistical modeling of expected losses while IFRS, although allowing statistical models, requires a trigger event to have occurred before they can be used. IAS 39 specifically states that losses that are expected as a result of future events, no matter how likely, are not recognized. This is a clear and fundamental area of difference between the two frameworks.

  17. The effects of neutrons in Hiroshima. Implications for the risk estimates.

    Science.gov (United States)

    Kellerer, A M

    1999-01-01

    Risk estimates for radiation-induced late effects are relevant to various considerations in radiation protection. Most of these considerations relate to small doses for which no excess risk can be seen even in extensive epidemiological studies. Risk coefficients for radiation protection must, therefore, be based on uncertain extrapolation of observations obtained at moderate or high doses. The extrapolation can not be replaced, as yet, by new, more direct information on processes such as radiation-induced genetic instability or adaptive response. While the new findings indicate complexities that may be highly relevant to the effectiveness (or lack of effectiveness) of radiation at low doses, they remain insufficiently understood to permit a decision as to whether dose-effect relations are linear, curvilinear, or have a threshold in dose. In view of these uncertainties radiation-protection regulations are, today, based on the conservative assumption of a linear dose dependence without threshold. This approach assures a sufficient degree of protection, but it may become unreasonably over-conservative, when the cautious hypothesis is treated as proven fact, and when, in addition, the assumed initial slope of the dose relation is not critically evaluated. A reliable evaluation needs to be based on the follow-up of the atom-bomb survivors, and several major aspects of current interest are discussed here. a) Mortality from solid tumours in Hiroshima shows a statistically significant excess at a colon dose of 50 mGy; however, it is likely that this is the result of a bias in assigning causes of death. b) The solid cancer mortality data of the atom-bomb survivors are consistent with linearity in dose, but they can be shown to be equally consistent with a considerable degree of curvature. c) Even with the present dosimetry system, DS86, a substantial part of the effect at small doses in Hiroshima could be due to neutrons. If this is the case, the risk estimates for gamma

  18. Intermediate-Depth Currents Estimated From Float Measurements in the Gulf of Mexico

    Science.gov (United States)

    Weatherly, G. L.; Wienders, N.; Romano, A.

    2005-05-01

    Data from 17 PALACE floats set in the Gulf of Mexico sampling the intermediate-depth (~ 900 db) flow from April 1998 to February 2002 indicate a mean cyclonic circulation along the northern, western and southwestern edges of the Gulf of Mexico. This flow intensified into a ~ 0.10 m/s current in the western and southern Bay of Campeche and was deflected around a topographic feature, called here the Campeche Bay Bump, in the southern Bay of Campeche. Associated with this intensified flow was a small cyclonic gyre in the southwestern Bay of Campeche. Floats launched in the eastern Gulf of Mexico tended to stay there and those launched in the western Gulf of Mexico tended to stay in the western Gulf of Mexico suggesting restricted connection at depth between the eastern and western Gulf of Mexico. The current estimates made neglecting non-900 db depth drifts before first-surface fix and drifts after last-surface fix were 10% larger than those which took into account these drifts. Most of this (8%) was due to neglect of the surface drift before first and after last fix. Except for stronger flow below the Loop Current and Loop Current warm-core rings, no other pattern was seen between the intermediate depth flow and the surface flow.

  19. Dust concentrations and respiratory risks in coalminers: key risk estimates from the British Pneumoconiosis Field Research

    Energy Technology Data Exchange (ETDEWEB)

    Soutar, C.A.; Hurley, J.F.; Miller, B.G.; Cowie, H.A.; Buchanan, D. [Inst. of Occupational Medicine, Edinburgh (United Kingdom)

    2004-06-01

    To help inform the setting of dust control standards in coal mines, this brief review summarises the most recent and reliable exposure-response relations, for damaging respiratory effects, derived from the Pneumoconiosis Field Research (PFR). Collecting data over 38 years in the British coal industry, this was a programme of prospective research on the respiratory health of coal miners, characterised by regular health surveys and detailed measurements of dust and silica concentrations in the workplace. Exposure-response relations are presented for coal workers' simple pneumoconiosis category II, progressive massive fibrosis, defined deficits of lung function (FEV1), and category II silicosis. This simplified overview provides a guide to the most recent and most reliable estimates from the PFR of dust-related risks of substantial pulmonary disease, and to the magnitude of the effects. Control of dust sufficient to prevent category II simple pneumoconiosis should prevent most cases of progressive massive fibrosis and most dust related large lung function deficits. Where the dust contains high proportions of silica, control to low levels is essential, and even quite brief excursions of silica to high levels must be avoided.

  20. Multiple imputation for handling missing outcome data when estimating the relative risk.

    Science.gov (United States)

    Sullivan, Thomas R; Lee, Katherine J; Ryan, Philip; Salter, Amy B

    2017-09-06

    Multiple imputation is a popular approach to handling missing data in medical research, yet little is known about its applicability for estimating the relative risk. Standard methods for imputing incomplete binary outcomes involve logistic regression or an assumption of multivariate normality, whereas relative risks are typically estimated using log binomial models. It is unclear whether misspecification of the imputation model in this setting could lead to biased parameter estimates. Using simulated data, we evaluated the performance of multiple imputation for handling missing data prior to estimating adjusted relative risks from a correctly specified multivariable log binomial model. We considered an arbitrary pattern of missing data in both outcome and exposure variables, with missing data induced under missing at random mechanisms. Focusing on standard model-based methods of multiple imputation, missing data were imputed using multivariate normal imputation or fully conditional specification with a logistic imputation model for the outcome. Multivariate normal imputation performed poorly in the simulation study, consistently producing estimates of the relative risk that were biased towards the null. Despite outperforming multivariate normal imputation, fully conditional specification also produced somewhat biased estimates, with greater bias observed for higher outcome prevalences and larger relative risks. Deleting imputed outcomes from analysis datasets did not improve the performance of fully conditional specification. Both multivariate normal imputation and fully conditional specification produced biased estimates of the relative risk, presumably since both use a misspecified imputation model. Based on simulation results, we recommend researchers use fully conditional specification rather than multivariate normal imputation and retain imputed outcomes in the analysis when estimating relative risks. However fully conditional specification is not without its

  1. Optical Estimation of Depth and Current in a Ebb Tidal Delta Environment

    Science.gov (United States)

    Holman, R. A.; Stanley, J.

    2012-12-01

    A key limitation to our ability to make nearshore environmental predictions is the difficulty of obtaining up-to-date bathymetry measurements at a reasonable cost and frequency. Due to the high cost and complex logistics of in-situ methods, research into remote sensing approaches has been steady and has finally yielded fairly robust methods like the cBathy algorithm for optical Argus data that show good performance on simple barred beach profiles and near immunity to noise and signal problems. In May, 2012, data were collected in a more complex ebb tidal delta environment during the RIVET field experiment at New River Inlet, NC. The presence of strong reversing tidal currents led to significant errors in cBathy depths that were phase-locked to the tide. In this paper we will test methods for the robust estimation of both depths and vector currents in a tidal delta domain. In contrast to previous Fourier methods, wavenumber estimation in cBathy can be done on small enough scales to resolve interesting nearshore features.
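    The depth inversion behind cBathy-type methods rests on the linear dispersion relation; with an ambient current U the observed frequency is Doppler shifted, omega = sqrt(g*k*tanh(k*h)) + k*U, which is why unmodelled reversing tidal currents bias the recovered depths. The sketch below solves this relation for depth with and without an assumed current; all numbers are illustrative.

```python
import numpy as np
from scipy.optimize import brentq

G = 9.81

def intrinsic_omega(k, h):
    """Intrinsic (current-free) angular frequency from the linear dispersion relation."""
    return np.sqrt(G * k * np.tanh(k * h))

def depth_from_observation(omega_obs, k, u_along=0.0):
    """Invert the (Doppler-shifted) dispersion relation for water depth h."""
    return brentq(lambda h: intrinsic_omega(k, h) + k * u_along - omega_obs, 1e-3, 100.0)

# Hypothetical observation: a 45 m wavelength wave over 3 m depth on a 0.8 m/s opposing current.
h_true, u_true = 3.0, -0.8
k_obs = 2.0 * np.pi / 45.0                    # observed wavenumber (rad/m)
omega_obs = intrinsic_omega(k_obs, h_true) + k_obs * u_true

h_no_current = depth_from_observation(omega_obs, k_obs, u_along=0.0)
h_with_current = depth_from_observation(omega_obs, k_obs, u_along=u_true)
print(f"depth ignoring the current: {h_no_current:.2f} m; accounting for it: {h_with_current:.2f} m")
```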

  2. Estimated exposure to phthalates in cosmetics and risk assessment.

    Science.gov (United States)

    Koo, Hyun Jung; Lee, Byung Mu

    2004-12-01

    Some phthalates such as di(2-ethylhexyl) phthalate (DEHP) and dibutyl phthalate (DBP) and their metabolites are suspected of producing teratogenic or endocrine-disrupting effects. To predict possible human exposure to phthalates in cosmetics, the levels of DEHP, diethyl phthalate (DEP), DBP, and butylbenzyl phthalate (BBP) were determined by high-performance liquid chromatography (HPLC) in 102 branded hair sprays, perfumes, deodorants, and nail polishes. DBP was detected in 19 of the 21 nail polishes and in 11 of the 42 perfumes, and DEP was detected in 24 of the 42 perfumes and 2 of the 8 deodorants. Median exposure levels to phthalates in cosmetics by dermal absorption were estimated to be 0.0006 µg/kg body weight (bw)/d for DEHP, 0.6 µg/kg bw/d for DEP, and 0.103 µg/kg bw/d for DBP. Furthermore, if phthalates in cosmetics were assumed to be absorbed exclusively via 100% inhalation, the median daily exposure levels to phthalates in cosmetics were estimated to be 0.026 µg/kg bw/d for DEHP, 81.471 µg/kg bw/d for DEP, and 22.917 µg/kg bw/d for DBP, which are far lower than the regulation levels set by the Scientific Committee on Toxicity, Ecotoxicity, and the Environment (CSTEE) (37 µg/kg bw/d, DEHP), Agency for Toxic Substances and Disease Registry (ATSDR) (7000 µg/kg bw/d, DEP), and International Programme on Chemical Safety (IPCS) (66 µg/kg bw/d, DBP), respectively. Based on these data, hazard indices (HI, daily exposure level/regulation level) were calculated to be 0.0007 for DEHP, 0.012 for DEP, and 0.347 for DBP. These data suggest that estimated exposure to phthalates in the cosmetics mentioned are relatively small. However, total exposure levels from several sources may be greater and require further investigation.

  3. Reading fluency estimates of current intellectual function: demographic factors and effects of type of stimuli.

    Science.gov (United States)

    Simos, Panagiotis G; Sideridis, Georgios D; Kasselimis, Dimitrios; Mouzaki, Angeliki

    2013-03-01

    The study explores the potential clinical value of reading fluency measures in complementing demographic variables as indices of current intellectual capacity. IQ estimates (based on the PPVT-R, WASI Vocabulary and Block Design subtests) were obtained from a representative, non-clinical sample of 386 Greek adults aged 48–87 years along with two measures of reading efficiency (one involving relatively high-frequency words—WRE—and the second comprised of phonotactically matched pseudowords—PsWRE). Both reading measures (number of items read correctly in 45 s) accounted for significant portions of variability in demographically adjusted verbal and performance IQ indices. Reading measures provided IQ estimates which were significantly closer to those predicted by demographic variables alone in up to 22% of individuals with fewer than 7 (across all ages) or 13 years of formal education (in the 70–87 year age range). PsWRE scores slightly outperformed WRE scores in predicting a person’s estimated verbal or performance IQ. Results are discussed in the context of previous findings using reading accuracy measures for low-frequency words with exceptional spellings in less transparent orthographic systems such as English.

  4. Experimental Estimating Deflection of a Simple Beam Bridge Model Using Grating Eddy Current Sensors

    Directory of Open Access Journals (Sweden)

    Hui Zhao

    2012-07-01

    Full Text Available A novel three-point method using a grating eddy current absolute position sensor (GECS) for bridge deflection estimation is proposed in this paper. Real spatial positions of the measuring points along the span axis are directly used as relative reference points of each other rather than using any other auxiliary static reference points for measuring devices in a conventional method. Every three adjacent measuring points are defined as a measuring unit and a straight connecting bar with a GECS fixed on the center section of it links the two endpoints. In each measuring unit, the displacement of the mid-measuring point relative to the connecting bar measured by the GECS is defined as the relative deflection. Absolute deflections of each measuring point can be calculated from the relative deflections of all the measuring units directly without any correcting approaches. Principles of the three-point method and displacement measurement of the GECS are introduced in detail. Both static and dynamic experiments have been carried out on a simple beam bridge model, which demonstrate that the three-point deflection estimation method using the GECS is effective and offers a reliable way for bridge deflection estimation, especially for long-term monitoring.
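    Under the definition given above, each unit's relative deflection is the displacement of its mid-point from the chord joining the unit's end points, so the absolute deflections follow from a small tridiagonal linear system once the end supports are taken as zero references. The sketch below reconstructs a deflection line from assumed relative readings; it is one interpretation of the three-point geometry, not the authors' algorithm.

```python
import numpy as np

def absolute_deflections(relative):
    """Recover absolute deflections at the measuring points (end supports fixed at zero)
    from three-point relative readings r_i = d_i - (d_{i-1} + d_{i+1}) / 2."""
    m = len(relative)
    A = np.eye(m)
    for i in range(m):
        if i > 0:
            A[i, i - 1] = -0.5
        if i < m - 1:
            A[i, i + 1] = -0.5
    return np.linalg.solve(A, np.asarray(relative, dtype=float))

# Check against a known parabolic deflection line: 2 supports plus 9 measuring points.
x = np.linspace(0.0, 1.0, 11)
d_true = -4.0 * x * (1.0 - x)                 # deflection line (mm), 1 mm sag at mid-span

# Simulated GECS readings: mid-point displacement relative to the connecting bar.
r = d_true[1:-1] - 0.5 * (d_true[:-2] + d_true[2:])

d_rec = absolute_deflections(r)
print("max reconstruction error (mm):", float(np.max(np.abs(d_rec - d_true[1:-1]))))
```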

  5. Estimates of the risks associated with dam failure

    Energy Technology Data Exchange (ETDEWEB)

    Ayyaswamy, P.; Hauss, B.; Hseih, T.; Moscati, A.; Hicks, T.E.; Okrent, D.

    1974-03-01

    The probabilities and potential consequences of dam failure in California, primarily due to large earthquakes, were estimated, taking as examples eleven dams having a relatively large population downstream. Mortalities in the event of dam failure range from 11,000 to 260,000, while damage to property may be as high as $720 million. It was assumed that an intensity IX or X earthquake (on the Modified Mercalli Scale) would be sufficient to completely fail earthen dams. Predictions of dam failure were based on the recurrence times of such earthquakes. For the dams studied, the recurrence intervals for an intensity IX earthquake varied between 20 and 800 years; for an intensity X, between 50 and 30,000 years. For the Lake Chabot and San Pablo dams (with recurrence times of 20 and 30 years, respectively, for an intensity X earthquake) the associated consequences are: 34,000 (Lake Chabot) and 30,000 (San Pablo) people killed; damage of $140 million and $77 million. Evacuation was found to ameliorate the consequences only slightly in most cases because of the short time available. Calculations are based on demography, and assume 10-foot floodwaters will drown all in their path and destroy all one-unit homes in the flood area. Damage estimates reflect losses incurred by structural damage to buildings and do not include loss of income. Hence the economic impact is probably understated.

  6. Statistical Methods for Estimating the Cumulative Risk of Screening Mammography Outcomes

    NARCIS (Netherlands)

    Hubbard, R.A.; Ripping, T.M.; Chubak, J.; Broeders, M.J.; Miglioretti, D.L.

    2016-01-01

    BACKGROUND: This study illustrates alternative statistical methods for estimating cumulative risk of screening mammography outcomes in longitudinal studies. METHODS: Data from the US Breast Cancer Surveillance Consortium (BCSC) and the Nijmegen Breast Cancer Screening Program in the Netherlands were

  7. Occupational exposure to solar ultraviolet radiation of Polish outdoor workers: risk estimation method and criterion.

    Science.gov (United States)

    Wolska, Agnieszka

    2013-01-01

    This paper presents occupational skin exposure to solar ultraviolet radiation (UVR) of 122 Polish outdoor workers in spring and summer. In 65% of the cases, exposure was significant and exceeded 10 standard erythema doses (SED) during a work shift. The results provided grounds for (a) modifying hazard assessment based on the skin exposure factor proposed by the International Commission on Non-Ionizing Radiation Protection (ICNIRP) and (b) developing a criterion of risk estimation. The modified method uses the UV index (UVI) instead of the geographical latitude and season factor. A skin exposure factor (Wes) of one is the criterion of risk estimation: risk is low if the estimated value of Wes does not exceed one. If it does, suitable preventive measures are necessary, and a corrected skin exposure factor (Wes*) is calculated to bring its value down to no more than one. Risk estimated with this method was high in 67% of the cases.

  8. R2 TRI facilities with 1999-2011 risk related estimates throughout the census blockgroup

    Data.gov (United States)

    U.S. Environmental Protection Agency — This dataset delineates the distribution of estimated risk from the TRI facilities for 1999-2011 throughout the census block groups of the region using Office of...

  9. Estimation of Newborn Risk for Child or Adolescent Obesity: Lessons from Longitudinal Birth Cohorts

    Science.gov (United States)

    Morandi, Anita; Meyre, David; Lobbens, Stéphane; Kleinman, Ken; Kaakinen, Marika; Rifas-Shiman, Sheryl L.; Vatin, Vincent; Gaget, Stefan; Pouta, Anneli; Hartikainen, Anna-Liisa; Laitinen, Jaana; Ruokonen, Aimo; Das, Shikta; Khan, Anokhi Ali; Elliott, Paul; Maffeis, Claudio; Gillman, Matthew W.

    2012-01-01

    Objectives Prevention of obesity should start as early as possible after birth. We aimed to build clinically useful equations estimating the risk of later obesity in newborns, as a first step towards focused early prevention against the global obesity epidemic. Methods We analyzed the lifetime Northern Finland Birth Cohort 1986 (NFBC1986) (N = 4,032) to draw predictive equations for childhood and adolescent obesity from traditional risk factors (parental BMI, birth weight, maternal gestational weight gain, behaviour and social indicators), and a genetic score built from 39 BMI/obesity-associated polymorphisms. We performed validation analyses in a retrospective cohort of 1,503 Italian children and in a prospective cohort of 1,032 U.S. children. Results In the NFBC1986, the cumulative accuracy of traditional risk factors predicting childhood obesity, adolescent obesity, and childhood obesity persistent into adolescence was good: AUROC = 0.78 [0.74–0.82], 0.75 [0.71–0.79] and 0.85 [0.80–0.90] respectively (all p values statistically significant). The equation for childhood obesity remained acceptably accurate when applied to the Italian and the U.S. cohorts (AUROC = 0.70 [0.63–0.77] and 0.73 [0.67–0.80] respectively), and the two additional equations for childhood obesity newly drawn from the Italian and the U.S. datasets showed good accuracy in their respective cohorts (AUROC = 0.74 [0.69–0.79] and 0.79 [0.73–0.84]; all p values statistically significant). The equations were converted into simple Excel risk calculators for potential clinical use. Conclusion This study provides the first example of handy tools for predicting childhood obesity in newborns by means of easily recorded information, while it shows that currently known genetic variants have very little usefulness for such prediction. PMID:23209618

  10. How well can adolescents really judge risk? Simple, self reported risk factors out-predict teens' self estimates of personal risk

    Directory of Open Access Journals (Sweden)

    Alexander Persoskie

    2013-01-01

    Full Text Available Recent investigations of adolescents' beliefs about risk have led to surprisingly optimistic conclusions: Teens' self estimates of their likelihood of experiencing various life events not only correlate sensibly with relevant risk factors (Fischhoff et al., 2000), but they also significantly predict later experiencing the events (Bruine de Bruin et al., 2007). Using the same dataset examined in previous investigations, the present study extended these analyses by comparing the predictive value of self estimates of risk to that of traditional risk factors for each outcome. The analyses focused on the prediction of pregnancy, criminal arrest, and school enrollment. Three findings emerged. First, traditional risk factor information tended to out-predict self assessments of risk, even when the risk factors included crude, potentially unreliable measures (e.g., a simple tally of self-reported criminal history) and when the risk factors were aggregated in a nonoptimal way (i.e., unit weighting). Second, despite the previously reported correlations between self estimates and outcomes, perceived invulnerability was a problem among the youth: Over half of the teens who became pregnant, half of those who were not enrolled in school, and nearly a third of those who were arrested had, one year earlier, indicated a 0% chance of experiencing these outcomes. Finally, adding self estimates of risk to the other risk factor information produced only small gains in predictive accuracy. These analyses point to the need for greater education about the situations and behaviors that lead to negative outcomes.

  11. F-35 Sustainment: Need for Affordable Strategy, Greater Attention to Risks, and Improved Cost Estimates

    Science.gov (United States)

    2014-09-01

    F-35 Sustainment: Need for Affordable Strategy, Greater Attention to Risks, and Improved Cost Estimates. Report to the House of Representatives, September 2014.

  12. Rotor speed estimation of induction machines by monitoring the stator voltages and currents

    Energy Technology Data Exchange (ETDEWEB)

    Ho, S.Y.S.; Langman, R.A. [Tasmania Univ., Hobart, TAS (Australia)

    1995-12-31

    Accurate measurement of induction motor speed is routinely obtained by using a transducer coupled to the shaft. In many industrial situations this is not acceptable, as there may be no room for a suitable transducer, or the motor environment may be too harsh. It is in theory possible to calculate the speed by monitoring the terminal voltages and currents (plus knowing the angular synchronous speed) and then applying these to the differential equations of the motor. Two rotor speed algorithms were investigated. Unsatisfactory results were obtained with an algorithm based on the machine equations in a stationary reference frame, because at some stage the algorithm divides zero by zero. To avoid this problem the time-varying stator voltages and currents were further transformed into the synchronous reference frame, so that they become dc quantities. This algorithm, which obtains the tangent of the phase angle for the determination of the rotor speed, is discussed and tested. The analysis presented in this paper points out that the speed of an induction motor may be estimated to within about +/- 0.1 percent from measurements of the stator voltage and current. (author). 5 figs., 5 refs.
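
    The reference-frame step mentioned above is standard; the sketch below (not the authors' full speed algorithm) shows how measured three-phase stator quantities can be mapped into the synchronously rotating dq frame, where steady-state signals become dc and a phase angle can then be extracted.

      import numpy as np

      def abc_to_dq(f_a, f_b, f_c, theta_sync):
          """Amplitude-invariant Clarke transform followed by a Park rotation by the
          synchronous angle theta_sync (rad); works for stator voltages or currents."""
          f_alpha = (2.0 / 3.0) * (f_a - 0.5 * f_b - 0.5 * f_c)
          f_beta = (1.0 / np.sqrt(3.0)) * (f_b - f_c)
          f_d = f_alpha * np.cos(theta_sync) + f_beta * np.sin(theta_sync)
          f_q = -f_alpha * np.sin(theta_sync) + f_beta * np.cos(theta_sync)
          return f_d, f_q

      # A balanced 50 Hz current viewed in its own synchronous frame is constant:
      t = np.linspace(0.0, 0.04, 5)
      theta = 2 * np.pi * 50 * t
      i_a, i_b, i_c = np.cos(theta), np.cos(theta - 2*np.pi/3), np.cos(theta + 2*np.pi/3)
      print(abc_to_dq(i_a, i_b, i_c, theta))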

  13. Estimating successive cancer risks in Lynch Syndrome families using a progressive three-state model.

    Science.gov (United States)

    Choi, Yun-Hee; Briollais, Laurent; Green, Jane; Parfrey, Patrick; Kopciuk, Karen

    2014-02-20

    Lynch Syndrome (LS) families harbor mutated mismatch repair genes, which predispose them to specific types of cancer. Because individuals within LS families can experience multiple cancers over their lifetime, we developed a progressive three-state model to estimate the disease risk from a healthy (state 0) to a first cancer (state 1) and then to a second cancer (state 2). Ascertainment correction of the likelihood was made to adjust for complex sampling designs with carrier probabilities for family members with missing genotype information estimated using their family's observed genotype and phenotype information in a one-step expectation-maximization algorithm. A sandwich variance estimator was employed to overcome possible model misspecification. The main objective of this paper is to estimate the disease risk (penetrance) for age at a second cancer after someone has experienced a first cancer that is also associated with a mutated gene. Simulation study results indicate that our approach generally provides unbiased risk estimates and low root mean squared errors across different family study designs, proportions of missing genotypes, and risk heterogeneities. An application to 12 large LS families from Newfoundland demonstrates that the risk for a second cancer was substantial and that the age at a first colorectal cancer significantly impacted the age at any LS subsequent cancer. This study provides new insights for developing more effective management of mutation carriers in LS families by providing more accurate multiple cancer risk estimates.
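
    To make the progressive 0 -> 1 -> 2 structure concrete, here is a minimal sketch with made-up constant transition hazards (not the authors' ascertainment-corrected likelihood) of how the cumulative probability of a second cancer by a given age follows from the two transition intensities.

      import numpy as np

      def prob_second_cancer_by(t, lam01, lam12, n_grid=4000):
          """P(reach state 2 by time t) in a progressive healthy -> first cancer ->
          second cancer model with constant hazards lam01 and lam12 (per year)."""
          s = np.linspace(0.0, t, n_grid)              # possible times of the first cancer
          density_first = lam01 * np.exp(-lam01 * s)   # density of the 0 -> 1 transition
          p_second = 1.0 - np.exp(-lam12 * (t - s))    # second cancer within the remaining time
          return float(np.sum(density_first * p_second) * (s[1] - s[0]))

      # Hypothetical hazards: first cancer ~1%/yr, second cancer ~3%/yr after the first
      print(prob_second_cancer_by(70.0, lam01=0.01, lam12=0.03))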

  14. Empirical Analysis of Value-at-Risk Estimation Methods Using Extreme Value Theory

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    This paper investigates methods of value-at-risk (VaR) estimation using extreme value theory (EVT). It compares two different estimation methods, the "two-step subsample bootstrap" based on moment estimation and maximum likelihood estimation (MLE), according to their theoretical bases and computation procedures. Then, the estimation results are analyzed together with those of the normal method and the empirical method. The empirical research on foreign exchange data shows that the EVT methods perform well in estimating VaR under extreme conditions, and that the "two-step subsample bootstrap" method is preferable to MLE.
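
    For illustration only, a compact EVT-based VaR estimator (a Hill/Weissman tail extrapolation rather than the paper's two-step subsample bootstrap); the synthetic loss sample and the choice of k are arbitrary.

      import numpy as np

      def hill_var(losses, k, alpha=0.99):
          """Hill estimator of the tail index from the k largest losses, followed by
          a Weissman-type quantile extrapolation for VaR at confidence level alpha."""
          x = np.sort(np.asarray(losses, dtype=float))[::-1]   # descending order
          gamma = np.mean(np.log(x[:k]) - np.log(x[k]))        # Hill estimate (= 1/tail index)
          n = len(x)
          return x[k] * (k / (n * (1.0 - alpha))) ** gamma

      rng = np.random.default_rng(0)
      sample = 1.0 + rng.pareto(3.0, size=5000)                # heavy-tailed synthetic losses
      print(hill_var(sample, k=200, alpha=0.99))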

  15. Nonparametric Estimation of Cumulative Incidence Functions for Competing Risks Data with Missing Cause of Failure

    DEFF Research Database (Denmark)

    Effraimidis, Georgios; Dahl, Christian Møller

    In this paper, we develop a fully nonparametric approach for the estimation of the cumulative incidence function with Missing At Random right-censored competing risks data. We obtain results on the pointwise asymptotic normality as well as the uniform convergence rate of the proposed nonparametric estimator. A simulation study that serves two purposes is provided. First, it illustrates in detail how to implement our proposed nonparametric estimator. Secondly, it facilitates a comparison of the nonparametric estimator to a parametric counterpart based on the estimator of Lu and Liang (2008)...

  16. Current Management of Low Risk Differentiated Thyroid Cancer and Papillary Microcarcinoma.

    Science.gov (United States)

    Tarasova, V D; Tuttle, R M

    2017-01-10

    Each year, the proportion of thyroid cancer patients presenting with low risk disease is increasing. Moreover, the definition of low risk thyroid cancer is expanding and several histological subtypes beyond papillary microcarcinomas are now classified as low risk disease. This shift in the landscape of thyroid cancer presentation is forcing clinicians to critically re-evaluate whether or not traditional management paradigms that were effective in treating intermediate and high risk disease are applicable to these low risk patients. Here we review the definition of low risk disease, examine the various histological subtypes that are considered low risk in the 2015 American Thyroid Association guidelines for the management of thyroid nodules and thyroid cancer, and review our current approach to the management of these low risk tumours.

  17. Searching For Causes Of The Current Financial Crisis: On Risk Underassessment And Ignorance

    Directory of Open Access Journals (Sweden)

    Alexandra HOROBET

    2010-05-01

    Full Text Available The paper presents the author’s views on one underlying cause of the current financial crisis: the erroneous assessment of risk by market participants. We argue that besides other convincing explanations of the causes behind the recent financial turmoil, such as loose government and central bank policies, the process of market liberalization, investors’ appetite for risk and excessive financial modeling, we need to recognize the fact that underassessment of risk by market participants and policy makers is a major cause of the current crisis, as it was of previous ones. This underestimation of risk comes largely as a result of two realities: first, risk magnitude was not entirely known by market participants, due to an interlinking of securities, structures, and derivatives built around the subprime mortgages; second, market participants ignored these risks to a large extent, in a world where access to financial markets was easier than ever before.

  18. Bayesian Framework for Water Quality Model Uncertainty Estimation and Risk Management

    Science.gov (United States)

    A formal Bayesian methodology is presented for integrated model calibration and risk-based water quality management using Bayesian Monte Carlo simulation and maximum likelihood estimation (BMCML). The primary focus is on lucid integration of model calibration with risk-based wat...
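
    As a generic illustration of the Bayesian Monte Carlo ingredient (not the BMCML framework itself), the sketch below weights prior parameter samples of a toy water-quality model by a Gaussian likelihood of hypothetical observations and summarizes the resulting posterior; every number and name is an assumption.

      import numpy as np

      rng = np.random.default_rng(42)
      obs = np.array([6.1, 5.8, 6.4])                    # hypothetical dissolved-oxygen data (mg/L)
      sigma = 0.4                                        # assumed observation error (mg/L)

      def toy_model(theta):
          """Stand-in water-quality model: predicts a constant concentration theta."""
          return np.full(obs.shape, theta)

      thetas = rng.uniform(4.0, 8.0, size=20_000)        # samples from a flat prior
      loglik = np.array([-0.5 * np.sum(((obs - toy_model(t)) / sigma) ** 2) for t in thetas])
      w = np.exp(loglik - loglik.max())
      w /= w.sum()                                       # normalized importance weights
      post_mean = float(np.sum(w * thetas))
      post_sd = float(np.sqrt(np.sum(w * (thetas - post_mean) ** 2)))
      print(f"posterior mean ~ {post_mean:.2f} mg/L, sd ~ {post_sd:.2f}")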

  19. Stroke risk estimation across nine European countries in the MORGAM project

    DEFF Research Database (Denmark)

    Borglykke, Anders; Andreasen, Anne H; Kuulasmaa, Kari

    2010-01-01

    Previous tools for stroke risk assessment have either been developed for specific populations or lack data on non-fatal events or uniform data collection. The purpose of this study was to develop a stepwise model for the estimation of 10 year risk of stroke in nine different countries across Europe....

  20. Work-site musculoskeletal pain risk estimates by trained observers - a prospective cohort study

    NARCIS (Netherlands)

    Coenen, P.; Kingma, I.; Boot, C.R.L.; Douwes, M.; Bongers, P.M.; Dieën, J.H. van

    2012-01-01

    Work-related musculoskeletal pain (MSP) risk assessments by trained observers are often used in ergonomic practice; however, the validity may be questionable. We investigated the predictive value of work-site MSP risk estimates in a prospective cohort study of 1745 workers. Trained observers estimat

  1. Estimated occupational risk from bioaerosols generated during land application of Class B biosolids

    Science.gov (United States)

    It has been speculated that bioaerosols generated during land application of biosolids pose a serious occupational risk, but few scientific studies have been performed to assess levels of aerosolization of microorganisms from biosolids and to estimate the occupational risks of infection. This study ...

  3. An empirical approach to estimate soil erosion risk in Spain.

    Science.gov (United States)

    Martín-Fernández, Luis; Martínez-Núñez, Margarita

    2011-08-01

    Soil erosion is one of the most important factors in land degradation and influences desertification worldwide. In 2001, the Spanish Ministry of the Environment launched the 'National Inventory of Soil Erosion (INES) 2002-2012' to study the process of soil erosion in Spain. The aim of the current article is to assess the usefulness of this National Inventory as an instrument of control, measurement and monitoring of soil erosion in Spain. The methodology and main features of this National Inventory are described in detail. The results achieved as of the end of May 2010 are presented, together with an explanation of the utility of the Inventory as a tool for planning forest hydrologic restoration, soil protection, erosion control, and protection against desertification. Finally, the authors make a comparative analysis of similar initiatives for assessing soil erosion in other countries at the national and European levels.

  4. Cardiovascular risk factors and estimated 10-year risk of fatal cardiovascular events using various equations in Greeks with metabolic syndrome.

    Science.gov (United States)

    Chimonas, Theodoros; Athyros, Vassilios G; Ganotakis, Emmanouel; Nicolaou, Vassilios; Panagiotakos, Demosthenes B; Mikhailidis, Dimitri P; Elisaf, Moses

    2010-01-01

    We investigated cardiovascular disease (CVD) risk factors in 1501 Greeks (613 men and 888 women, aged 40-65 years) referred to outpatients with metabolic syndrome (MetS) and without diabetes mellitus or CVD. The 10-year risk of fatal CVD events was calculated using the European Society of Cardiology Systematic Coronary Risk Estimation (ESC SCORE), HellenicSCORE, and Framingham equations. Raised blood pressure (BP) and hypertriglyceridemia were more common in men (89.6% vs 84.2% and 86.8% vs 74.2%, respectively; P < .001). Low high-density lipoprotein cholesterol (HDL-C) and abdominal obesity were more common in women (58.2% vs 66.2% and 85.8% vs 97.1%, respectively; P < .001). The 10-year risk of fatal CVD events using HellenicSCORE was higher in men (6.3% +/- 4.3% vs 2.7% +/- 2.1%; P < .001). The ESC SCORE and Framingham equations yielded similar results. The risk equations gave similar assessments in a European Mediterranean population, except that HellenicSCORE identified more MetS women as requiring risk modification. This might justify local risk engine evaluation in event-based studies. (ClinicalTrials.gov ID: NCT00416741).

  5. Estimation of risk of developing bladder cancer among workers exposed to coal tar pitch volatiles in the primary aluminum industry.

    Science.gov (United States)

    Tremblay, C; Armstrong, B; Thériault, G; Brodeur, J

    1995-03-01

    To confirm the relationship between exposure to coal tar pitch volatiles and bladder cancer among primary aluminum production workers, we carried out a case-control study among blue-collar workers who had worked more than 1 year between 1950 and 1979 in a major plant using mostly the Soderberg process in the Province of Québec. Cases of bladder cancer (ICD code 188) diagnosed between 1970 and 1979 (n = 69) were mostly included in a previously reported study. To these were added cases diagnosed between 1980 and 1988 (n = 69). Each case was matched to three controls on date of birth, date of hiring, and length of service at the company. Smoking habits were assessed from the medical records at the company. Benzene-soluble matter (BSM) and benzo(a)pyrene (BaP) were used as indicators of environmental exposure to coal tar pitch volatiles in the workplace. The estimated risk for current smokers was 2.63 (95% C.I. 1.29-5.37). Estimates of risk by occupational exposure were adjusted for smoking. Men who had worked in the Soderberg potrooms were at higher risk of developing the disease, the risk increasing with the time spent in these departments. Similarly, a strong association between risk and cumulative exposure to BSM or to BaP was observed. The risks associated with cumulative exposure to BSM (mg/m3-years) and to BaP (microgram/m3-years) were described with mathematical models. Using a linear model (1 + bx) and lagging exposure 10 years before the diagnosis, BaP cumulative exposure was a better indicator of risk than BSM cumulative exposure. The risk for each year of exposure to BaP at a concentration of 1 microgram/m3 increased by 1.7% (0.8%-3.2%). Using the same model for BSM, a worker exposed to the current threshold limit value of 0.2 mg/m3 for 40 years would sustain a risk of 2.22 (1.56-3.48). Comparison of risks according to different periods of diagnosis (1970-1979 vs. 1980-1988) did not reveal any significant temporal changes in risk estimates.
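
    A small sketch of the linear relative-risk form quoted above, RR = 1 + b*x with exposure lagged 10 years; the slope is taken from the 1.7% per microgram/m3-year figure for BaP, and the exposure history itself is hypothetical.

      def relative_risk_bap(exposure_history, b=0.017, lag_years=10):
          """exposure_history: iterable of (years_before_diagnosis, BaP concentration in ug/m3),
          one entry per year worked; only exposure accrued at least `lag_years` before
          diagnosis counts towards the lagged cumulative exposure."""
          lagged_cumulative = sum(conc for years_before, conc in exposure_history
                                  if years_before >= lag_years)
          return 1.0 + b * lagged_cumulative

      # 40 years at 1 ug/m3, all accrued more than 10 years before diagnosis
      history = [(years_before, 1.0) for years_before in range(11, 51)]
      print(relative_risk_bap(history))    # ~1.7, consistent with the 1.7%/year estimate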

  6. Current source density estimation and interpolation based on the spherical harmonic Fourier expansion.

    Science.gov (United States)

    Pascual-Marqui, R D; Gonzalez-Andino, S L; Valdes-Sosa, P A; Biscay-Lirio, R

    1988-12-01

    A method for the spatial analysis of EEG and EP data, based on the spherical harmonic Fourier expansion (SHE) of scalp potential measurements, is described. This model provides efficient and accurate formulas for: (1) the computation of the surface Laplacian and (2) the interpolation of electrical potentials, current source densities, test statistics and other derived variables. Physiologically based simulation experiments show that the SHE method gives better estimates of the surface Laplacian than the commonly used finite difference method. Cross-validation studies for the objective comparison of different interpolation methods demonstrate the superiority of the SHE over the commonly used methods based on the weighted (inverse distance) average of the nearest three and four neighbor values.

  7. Current-source density analysis of slow brain potentials during time estimation.

    Science.gov (United States)

    Gibbons, Henning; Rammsayer, Thomas H

    2004-11-01

    Two event-related potential studies were conducted to investigate differential brain correlates of temporal processing of intervals below and above 3-4 s. In the first experiment, 24 participants were presented with auditorily marked target durations of 2, 4, and 6 s that had to be reproduced. Timing accuracy was similar for all three target durations. As revealed by current-source density analysis, slow-wave components during both presentation and reproduction were independent of target duration. Experiment 2 examined potential modulating effects of type of interval (filled and empty) and presentation mode (randomized and blocked presentation of target durations). Behavioral and slow-wave findings were consistent with those of Experiment 1. Thus, the present findings support the notion of a general timing mechanism irrespective of interval duration as proposed by scalar timing theory and pacemaker-counter models of time estimation.

  8. Parameter Estimation of Inverter and Motor Model at Standstill using Measured Currents Only

    DEFF Research Database (Denmark)

    Rasmussen, Henrik; Knudsen, Morten; Tønnes, M.

    1996-01-01

    Methods for estimating the parameters in the electrical equivalent diagram of the induction motor, based on specially designed experiments, are given. In all experiments two of the three phases are given the same potential, i.e., no net torque is generated and the motor is at standstill. The inputs to the system are the reference values for the stator voltages, given as duty cycles for the pulse-width-modulated power device. The system output is the measured stator currents. Three experiments are described, giving respectively 1) the stator resistance and inverter parameters, 2) the stator transient inductance, and 3) the referred rotor resistance and magnetizing inductance. The method developed in the last two experiments is independent of the inverter nonlinearity. New methods for system identification concerning saturation of the magnetic flux are given, and a reference value for the flux level...

  9. Model-aided Navigation with Sea Current Estimation for an Autonomous Underwater Vehicle

    Directory of Open Access Journals (Sweden)

    Alain Martinez

    2015-07-01

    Full Text Available This paper presents a strategy to improve the navigation solution of the HRC-AUV by deploying a model-aided inertial navigation system (MA-INS). Based on a simpler three-DOF linear dynamic model (DM) of the vehicle, and implemented through a Kalman filter (KF), the performance of the proposed MA-INS is compared to state-of-the-art solutions based on non-linear models. The model allows the online estimation of the sea current parameters before and during the navigation mission. Qualitative and quantitative evaluations as well as a statistical significance test are performed using both simulated and real data, demonstrating the usefulness of the proposed model-aided navigation.
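
    A heavily simplified, one-dimensional sketch of the idea of estimating the sea current online (not the HRC-AUV implementation): the filter dead-reckons with the water-relative velocity supplied by the dynamic model, treated here as a known input, and attributes the residual drift seen in noisy position fixes to a nearly constant current state. All parameter values are invented.

      import numpy as np

      dt, n_steps = 1.0, 300
      F = np.array([[1.0, dt], [0.0, 1.0]])   # state: [position, sea current]
      B = np.array([[dt], [0.0]])             # known water-relative velocity enters position
      H = np.array([[1.0, 0.0]])              # only position fixes are measured
      Q = np.diag([1e-3, 1e-5])               # process noise (current assumed nearly constant)
      R = np.array([[4.0]])                   # position-fix noise variance (m^2)
      x, P = np.zeros((2, 1)), np.eye(2) * 10.0

      rng = np.random.default_rng(0)
      true_current, u, pos = 0.3, 1.0, 0.0    # m/s and m, hypothetical values
      for _ in range(n_steps):
          pos += (u + true_current) * dt                        # simulated ground truth
          x = F @ x + B * u                                     # predict with model velocity u
          P = F @ P @ F.T + Q
          z = np.array([[pos + rng.normal(0.0, 2.0)]])          # noisy position fix
          K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
          x = x + K @ (z - H @ x)
          P = (np.eye(2) - K @ H) @ P
      print("estimated sea current: %.2f m/s" % x[1, 0])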

  11. Estimation of the Plant Time Constant of Current-Controlled Voltage Source Converters

    DEFF Research Database (Denmark)

    Vidal, Ana; Yepes, Alejandro G.; Malvar, Jano;

    2014-01-01

    Precise knowledge of the plant time constant is essential to perform a thorough analysis of the current control loop in voltage source converters (VSCs). As the loop behavior can be significantly influenced by the VSC working conditions, the effects associated to converter losses should be included in the model, through an equivalent series resistance. In a recent work, an algorithm to identify this parameter was developed, considering the inductance value as known and practically constant. Nevertheless, the plant inductance can also present important uncertainties with respect to the inductance of the VSC interface filter measured at rated conditions. This paper extends that method so that both parameters of the plant time constant (resistance and inductance) are estimated. Such enhancement is achieved through the evaluation of the closed-loop transient responses of both axes of the synchronous...

  12. Wave-induced stress and estimation of its driven effect on currents

    Institute of Scientific and Technical Information of China (English)

    SUN Fu; GAO Shan; WANG Wei; QIAN Chengchun

    2004-01-01

    A genuine geostrophic small amplitude wave solution is deduced for the first time from the general form of linear fluid dynamic equations with the f-plane approximation, where the horizontal component of the angular velocity of the earth's rotation is taken into account. The Coriolis-induced stress obtained from this solution consists of lateral and reverse components, while its first order approximation reduces to the result of Hasselmann or Xu Zhigang. Accordingly, combining the Coriolis-induced wave stress with the virtual wave stress proposed by Longuet-Higgins, the ratio of total wave-induced stress to wind stress on the sea surface is estimated, through which the importance of the wave-induced stress is emphasized in the study of the currents in the seas around China, especially in the Bohai Sea and the Yellow Sea.

  13. Risk Estimation with Epidemiologic Data When Response Attenuates at High-Exposure Levels

    OpenAIRE

    Steenland, Kyle; Seals, Ryan; Klein, Mitch; Jinot, Jennifer; Kahn, Henry D.

    2011-01-01

    Background In occupational studies, which are commonly used for risk assessment for environmental settings, estimated exposure–response relationships often attenuate at high exposures. Relative risk (RR) models with transformed (e.g., log- or square root–transformed) exposures can provide a good fit to such data, but resulting exposure–response curves that are supralinear in the low-dose region may overestimate low-dose risks. Conversely, a model of untransformed (linear) exposure may underes...

  14. Is Risk Aversion Really Correlated with Wealth? How estimated probabilities introduce spurious correlation

    OpenAIRE

    Lybbert, Travis J.; Just, David R

    2006-01-01

    Economists attribute many common behaviors to risk aversion and frequently focus on how wealth moderates risk preferences. This paper highlights a problem associated with empirical tests of the relationship between wealth and risk aversion that can arise when the probabilities individuals face are unobservable to researchers. The common remedy for unobservable probabilities involves the estimation of probabilities in a profit or production function that includes farmer, farm and agro-climatic variable...

  15. Heavy vehicle state estimation and rollover risk evaluation using Kalman Filter and Sliding Mode Observer

    OpenAIRE

    DAKHLALLAH, Jamil; Imine, Hocine; Sellami, Yamine; BELLOT, D

    2007-01-01

    Safe driving depends on the prevention of risk situations; one of the most important risks is the rollover of a heavy vehicle. Preventing this accident requires knowledge of the rollover coefficient, which depends on the vehicle dynamic state and other vehicle parameters. Thus, we estimate the vehicle dynamic state using the Extended and Unscented Kalman Filter and the Sliding Mode Observer. Thereafter, we calculate the probability of a rollover risk using the previous result and Monte-Car...

  16. Comparing self-perceived and estimated fracture risk by FRAX® of women with osteoporosis.

    Science.gov (United States)

    Baji, Petra; Gulácsi, László; Horváth, Csaba; Brodszky, Valentin; Rencz, Fanni; Péntek, Márta

    2017-12-01

    In this study, we compared subjective fracture risks of Hungarian women with osteoporosis to FRAX®-based estimates. Patients with a previous fracture, parental hip fracture, low femoral T-score, higher age, and higher BMI were more likely to underestimate their risks. Patients also failed to associate risk factors with an increased risk of fractures. The main objectives were to explore associations between self-perceived 10-year fracture risks of women with osteoporosis (OP) and their risks calculated by the FRAX® algorithm and to identify determinants of the underestimation of risk. We carried out a cross-sectional study in 11 OP centers in Hungary and collected data on the risk factors considered by the FRAX® calculator. Patients estimated their subjective 10-year probability of any major osteoporotic and hip fracture numerically, in percentages and also on a visual analog scale (VAS). We compared subjective and FRAX® estimates and applied logistic regression to analyze the determinants of the underestimation of risk. Associations between risk factors and subjective risk were explored using linear probability models. Nine hundred seventy-two OP patients were included in the analysis. Major OP and hip fracture risk by FRAX® were on average 20.1 and 10.5%, while subjective estimates were significantly higher, 30.0 and 24.7%, respectively. Correlations between FRAX® and subjective measures were very weak (r = 0.12-0.16). Underestimation of major OP fracture risk was associated with having had a single previous fracture (OR = 2.0), parental hip fracture (OR = 3.4), femoral T-score ≤-2.5 (OR = 4.2), higher age, body mass index, and better general health state. We did not find significant associations between subjective risk estimates and most of the risk factors except for previous fractures. Hungarian OP patients fail to recognize most of the risk factors of fractures. Thus, education of patients about these risk factors would be beneficial especially

  17. Estimating the impacts of fishing on dependent predators: a case study in the California Current.

    Science.gov (United States)

    Field, J C; MacCall, A D; Bradley, R W; Sydeman, W J

    2010-12-01

    Juvenile rockfish (Sebastes spp.) are important prey to seabirds in the California Current System, particularly during the breeding season. Both seabird breeding success and the abundance of pelagic juvenile rockfish show high interannual variability. This covariation is largely a response to variable ocean conditions; however, fishing on adult rockfish may have had consequences for seabird productivity (e.g., the number of chicks fledged per breeding pair) by reducing the availability of juvenile rockfish to provisioning seabird parents. We tested the hypothesis that fishing has decreased juvenile rockfish availability and thereby limited seabird productivity over the past 30 years. We quantified relationships between observed juvenile rockfish relative abundance and seabird productivity, used fisheries stock assessment approaches to estimate the relative abundance of juvenile rockfish in the absence of fishing, and compared the differences in seabird productivity that would have resulted without rockfish fisheries. We examined the abundance of juvenile rockfish and the corresponding productivity of three seabird species breeding on Southeast Farallon Island (near San Francisco, California, USA) from the early 1980s to the present. Results show that while the relative abundance of juvenile rockfish has declined to approximately 50% of the estimated unfished biomass, seabirds achieved 75-95% of the estimated un-impacted levels of productivity, depending upon the species of bird and various model assumptions. These results primarily reflect seabirds with "conservative" life histories (one egg laid per year) and may be different for species with more flexible life history strategies (greater reproductive effort). Our results are consistent with the premise that the impacts of local rockfish fisheries on seabird productivity are less than impacts that have occurred to the prey resources themselves due to ocean climate and the ability of seabirds to buffer against

  18. Air travel and radiation risks - review of current knowledge; Flugreisen und Strahlenrisiken - eine aktuelle Uebersicht

    Energy Technology Data Exchange (ETDEWEB)

    Zeeb, H. [Bielefeld Univ. (Germany). Fakultaet fuer Gesundheitswissenschaften; Blettner, M. [Mainz Univ. (Germany). Inst. fuer Medizinische Biometrie, Epidemiologie und Informatik

    2004-07-01

    Aircrew and passengers are exposed to cosmic radiation, in particular when travelling routes close to the poles and at high altitudes. The paper reviews current radiation measurement and estimation approaches as well as the actual level of cosmic radiation that personnel and travellers receive, and summarizes the available epidemiological evidence on health effects of cosmic radiation. On average, German aircrew is exposed to less than 5 mSv per annum, and even frequent travellers only rarely reach values above 1 mSv/year. Cohort studies among aircrew have found very little evidence for an increased incidence or mortality of radiation-associated cancers. Only malignant melanoma rates have consistently been found to be increased among male aircrew. Socioeconomic and reproductive aspects are likely to contribute to the slightly elevated breast cancer risk of female aircrew. Cytogenetic studies have not yielded consistent results. Based on these data, overall risk increases for cancer among occupationally exposed aircrew appear unlikely. This also applies to air travellers, who are usually exposed to much lower radiation levels. Occasional air travel during pregnancy does not pose a significant radiation risk, but further considerations apply in this situation. The currently available studies are limited with regard to methodological issues and case numbers, so a continuation of cohort studies in several European countries is being planned. (orig.) [German abstract, translated] Both aircrew and air travellers are exposed to cosmic radiation, in particular when flying on routes close to the poles and at high altitudes. This paper gives an up-to-date overview of measurement and estimation methods and of the extent of cosmic radiation exposure, and summarizes the currently known epidemiological evidence on health aspects of cosmic radiation exposure. The average annual radiation exposure of occupationally exposed aircrew lies in

  19. Space Radiation Heart Disease Risk Estimates for Lunar and Mars Missions

    Science.gov (United States)

    Cucinotta, Francis A.; Chappell, Lori; Kim, Myung-Hee

    2010-01-01

    The NASA Space Radiation Program performs research on the risks of late effects from space radiation for cancer, neurological disorders, cataracts, and heart disease. For mortality risks, an aggregate over all risks should be considered as well as a projection of the life loss per radiation-induced death. We report on a triple detriment life-table approach to combine cancer and heart disease risks. Epidemiology results show extensive heterogeneity between populations for distinct components of the overall heart disease risks, including hypertension, ischaemic heart disease, stroke, and cerebrovascular diseases. We report on an update to our previous heart disease estimates for heart disease (ICD9 390-429) and stroke (ICD9 430-438), and other sub-groups, using recent meta-analysis results for various cohorts exposed to low-LET radiation. Results for multiplicative and additive risk transfer models are considered using baseline rates for US males and females. Uncertainty analysis indicated heart mortality risks as low as zero, assuming a threshold dose for deterministic effects, and projections approaching one-third of the overall cancer risk. Median life-loss per death estimates were significantly less than those for solid cancer and leukemias. Critical research questions to improve risk estimates for heart disease are distinctions in mechanisms at high doses (>2 Gy) and low to moderate doses (<2 Gy), and data and basic understanding of radiation dose-rate and quality effects, and individual sensitivity.

  20. Is cardiovascular risk in women with PCOS a real risk? Current insights.

    Science.gov (United States)

    Papadakis, Georgios; Kandaraki, Eleni; Papalou, Olga; Vryonidou, Andromachi; Diamanti-Kandarakis, Evanthia

    2017-01-31

    Polycystic ovary syndrome (PCOS) is the most common endocrine disorder in women of reproductive age. PCOS incorporates not only symptoms related to the reproductive system but also a clustering of systemic metabolic abnormalities that are linked with increased risk for cardiovascular disease (CVD). More specifically, metabolic aberrations such as impaired glucose and lipid metabolism, accompanied by increased low-grade inflammation as well as elevated coagulation factors, appear to contribute to the increased cardiovascular risk. Even though many studies have indicated a rise in surrogate biomarkers of CVD in women with PCOS, it is still doubtful to what extent and magnitude this elevation can be translated into real cardiovascular events. Furthermore, the cardiovascular risk factors appear to vary significantly across the different phenotypes of the syndrome. Women with PCOS have the potential for early atherosclerosis, and myocardial and endothelial dysfunction. Whether or not women with PCOS are at real cardiovascular risk compared with controls remains on the borderline between a theoretical and a real threat at any age, but particularly in the postmenopausal state. Interestingly, although the presence of CVD risk factors is well documented in women with PCOS, their combination in the different phenotypes may play a role, eventually resulting in a spectrum of clinical manifestations of CVD with variable degrees of severity. The present manuscript aims to review the interaction between PCOS and the combination of several cardiovascular risk factors.

  1. How the risk of liver cancer changes after alcohol cessation: A review and meta-analysis of the current literature

    Directory of Open Access Journals (Sweden)

    Heckley Gawain A

    2011-10-01

    Full Text Available Abstract Background It is well established that drinking alcohol raises the risk of liver cancer (hepatocellular carcinoma). However, it has not been sufficiently established whether drinking cessation subsequently reduces the risk of liver cancer and, if it does, how long it takes for this heightened risk to fall to that of never drinkers. This question is important for effective policy design and evaluation, to establish causality and for motivational treatments. Methods A systematic review and meta-analysis using the currently available evidence and a specific form of Generalised Least Squares is performed to assess how the risk of liver cancer changes with time for former drinkers. Results Four studies were found to have quantified the effect of drinking cessation on the risk of liver cancer. The meta-analysis suggests that the risk of liver cancer does indeed fall after cessation, by 6-7% a year, but there remains a large uncertainty around this estimate, both statistically and in its interpretation. As an illustration, it is estimated that a time period of 23 years is required after drinking cessation, with a correspondingly large 95% confidence interval of 14 to 70 years, for the risk of liver cancer to be equal to that of never drinkers. Conclusion This is a relatively under-researched area and this is reflected in the uncertainty of the findings. It is our view that it is not possible to extrapolate the results found here to the general population. Too few studies have addressed this question, and of the studies that have, all have significant limitations. The key issue among the relevant studies is that current drinkers, abstainers and former drinkers do not appear to be composed of, or effectively adjusted to be, similar populations, making inferences about risk changes impossible. This is a very difficult area to study effectively, but it is an important topic. More work is required to reduce both statistical

  2. A Critical Examination of Current On-Orbit Satellite Collision Risk Analysis Under Constraints of Public Data

    Science.gov (United States)

    Whitworth, Brandon; Moon, Mark; Pace, William; Baker, Robert

    2010-09-01

    The collision of Cosmos 2251 and Iridium 33 on 10 February 2009 made real the dangers of space operations without accurate situational awareness. A critical examination of the state of the art in collision risk assessment for on-orbit assets quickly reveals that it was inadequate to provide satellite operators with the opportunity to prevent the Cosmos-Iridium collision. Satellite operators need reliable information in a timely manner in order to take appropriate action. The shortfalls of publicly available orbit information place all spacecraft and missions at risk. The accuracy limitations of the General Perturbations (GP) catalog and orbit model (SGP-4) limit the effectiveness of current open source efforts. Beyond the accuracy limits, the relatively low frequency of updates for debris included in the catalog increases the uncertainty in time-space for inactive space objects such as Cosmos 2251. Current state-of-the-art collision risk assessment includes advanced techniques such as expanding the GP model with covariance information, which allows uncertainty in the model to be accounted for in the on-orbit risk calculations. Covariance information can be estimated from consecutively published element sets for the same orbital object. A challenge to covariance estimation is that maneuvers or long periods of time between updates can skew the computed data. Once reliable covariance information is known and an efficient algorithm can be applied to find all of the close approaches between all cataloged objects, it is possible to estimate the collision risk for each close encounter with the tri-variate normal distribution. Unknown covariance will need to be handled in an appropriate way for a complete solution. Covariance information alone cannot solve the problem, due to the relatively slow rate of update for all objects by the Space Surveillance Network (SSN), and there is no centralized source for planned and executed orbit changes for powered spacecraft.
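
    To make the tri-variate normal idea concrete, here is a toy Monte Carlo estimate of collision probability at the time of closest approach; the miss vector, combined covariance, and hard-body radius are all invented, and a rare-event probability this small really calls for very many samples or an analytic 2-D Pc integral rather than naive sampling.

      import numpy as np

      def collision_probability(miss_vector_km, combined_cov_km2, hard_body_radius_km,
                                n_samples=1_000_000, seed=1):
          """Monte Carlo estimate of P(relative position at closest approach falls
          within the combined hard-body sphere), assuming tri-variate normal errors."""
          rng = np.random.default_rng(seed)
          samples = rng.multivariate_normal(miss_vector_km, combined_cov_km2, size=n_samples)
          return float(np.mean(np.linalg.norm(samples, axis=1) < hard_body_radius_km))

      cov = np.diag([0.5**2, 2.0**2, 0.3**2])          # km^2, illustrative covariance
      print(collision_probability([0.1, 0.4, 0.05], cov, hard_body_radius_km=0.02))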

  3. Estimation Risk Modeling in Optimal Portfolio Selection: An Empirical Study from Emerging Markets

    Directory of Open Access Journals (Sweden)

    Sarayut Nathaphan

    2010-01-01

    Full Text Available An efficient portfolio is a portfolio that yields the maximum expected return for a given level of risk, or has the minimum level of risk for a given expected return. However, optimal portfolios do not always seem to be as efficient as intended. Especially during a financial crisis period, an optimal portfolio is not an optimal investment, as it does not yield the maximum return for a specific level of risk, and vice versa. One possible explanation for the unimpressive performance of a seemingly efficient portfolio is incorrectness in the parameter estimates, called “estimation risk in parameter estimates”. Six different estimating strategies are employed to explore ex-post portfolio performance when estimation risk is incorporated. These strategies are traditional Mean-Variance (EV), the Adjusted Beta (AB) approach, the Resampled Efficient Frontier (REF), the Capital Asset Pricing Model (CAPM), the Single Index Model (SIM), and the Single Index Model incorporating a shrinkage Bayesian factor, namely the Bayesian Single Index Model (BSIM). Among the six alternative strategies, shrinkage estimators incorporating the single index model outperform other traditional portfolio selection strategies. Allowing for asset mispricing and applying a Bayesian shrinkage adjusted factor to each asset's alpha, a single factor, namely the excess market return, is adequate in alleviating estimation uncertainty.

  4. Estimation model of life insurance claims risk for cancer patients by using Bayesian method

    Science.gov (United States)

    Sukono; Suyudi, M.; Islamiyati, F.; Supian, S.

    2017-01-01

    This paper discusses the estimation model of the risk of life insurance claims for cancer patients using a Bayesian method. To estimate the risk of a claim, the insurance participant data are grouped into two: the number of policies issued and the number of claims incurred. Model estimation is done using a Bayesian approach. Further, the estimated model was used to estimate the risk value of life insurance claims for each age group and for each sex. The estimation results indicate that the risk premium for insured males aged less than 30 years is 0.85; for ages 30 to 40 years is 3.58; for ages 41 to 50 years is 1.71; for ages 51 to 60 years is 2.96; and for those aged over 60 years is 7.82. Meanwhile, for insured women aged less than 30 years it is 0.56; for ages 30 to 40 years is 3.21; for ages 41 to 50 years is 0.65; for ages 51 to 60 years is 3.12; and for those aged over 60 years is 9.99. This study is useful in determining the risk premium in homogeneous groups based on gender and age.
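
    A minimal conjugate-prior sketch in the same spirit (not the authors' exact model): with policies issued and claims incurred per group, a Beta prior on the per-policy claim probability gives a closed-form posterior mean for each age-and-sex group. The group counts below are invented.

      def posterior_claim_rate(claims, policies, prior_a=0.5, prior_b=0.5):
          """Posterior mean of the per-policy claim probability under a
          Beta(prior_a, prior_b) prior and a binomial likelihood."""
          return (prior_a + claims) / (prior_a + prior_b + policies)

      # Hypothetical groups: (sex, age band, policies issued, claims incurred)
      groups = [("M", "<30", 1200, 3), ("M", "30-40", 950, 11),
                ("F", "41-50", 800, 4), ("F", ">60", 400, 13)]
      for sex, band, policies, claims in groups:
          rate = posterior_claim_rate(claims, policies)
          print(f"{sex} {band}: posterior claim rate ~ {rate:.4f}")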

  5. ESTIMATED SIL LEVELS AND RISK COMPARISONS FOR RELIEF VALVES AS A FUNCTION OF TIME-IN-SERVICE

    Energy Technology Data Exchange (ETDEWEB)

    Harris, S.

    2012-03-26

    Risk-based inspection methods enable estimation of the probability of spring-operated relief valves failing on demand at the United States Department of Energy's Savannah River Site (SRS) in Aiken, South Carolina. The paper illustrates an approach based on application of the Frechet and Weibull distributions to SRS and Center for Chemical Process Safety (CCPS) Process Equipment Reliability Database (PERD) proof test results. The methodology enables the estimation of ANSI/ISA-84.00.01 Safety Integrity Levels (SILs) as well as the potential change in SIL level due to modification of the maintenance schedule. Current SRS practices are reviewed and recommendations are made for extending inspection intervals. The paper compares risk-based inspection with specific SILs as maintenance intervals are adjusted. Groups of valves are identified in which maintenance times can be extended as well as different groups in which an increased safety margin may be needed.
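
    A schematic of how a fitted failure-time distribution translates into a SIL band as the proof-test interval is stretched; the Weibull parameters are hypothetical rather than the fitted SRS/PERD values, and the bands follow the usual low-demand PFDavg ranges.

      import math

      def pfd_avg(beta, eta_years, interval_years, n=1000):
          """Average probability of failure on demand over one proof-test interval
          for a Weibull(beta, eta) time-to-failure, assuming perfect proof tests."""
          ts = [(i + 0.5) * interval_years / n for i in range(n)]
          return sum(1.0 - math.exp(-((t / eta_years) ** beta)) for t in ts) / n

      def sil_band(pfd):
          bounds = [(1e-4, "SIL 4"), (1e-3, "SIL 3"), (1e-2, "SIL 2"), (1e-1, "SIL 1")]
          for upper, band in bounds:
              if pfd < upper:
                  return band
          return "below SIL 1"

      for interval in (2, 5, 10):                       # candidate test intervals, years
          p = pfd_avg(beta=1.8, eta_years=80.0, interval_years=interval)
          print(f"{interval}-year interval: PFDavg ~ {p:.1e} -> {sil_band(p)}")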

  6. Estimation of population dose and risk to holding assistants from veterinary X-ray examination in Japan

    Energy Technology Data Exchange (ETDEWEB)

    Hashizume, Tadashi; Suganuma, Tunenori; Shida, Takuo (Azabu Univ., Kanagawa (Japan). School of Veterinary Medicine)

    1989-09-01

    For the estimation of the population doses and risks of stochastic effects to assistants who hold animals during veterinary X-ray examination, a random survey of hospitals and clinics was carried out concerning age distribution of such assistants by groups of facilities. The average organ and tissue dose per examination was evaluated from the experimental data using mean technical factors such as X-ray tube voltage, tube current and field size based on the results of a nationwide survey. The population doses to the assistants were calculated to be about 14 nSv per person per year for the genetically significant dose, 3.5 nSv per person per year for per caput mean marrow dose, 3.3 nSv for the leukemia significant dose and 4.5 nSv for the malignant significant dose, respectively. The total risk of stochastic effects to the Japanese population from holding assistants was estimated using population data and it was estimated to be less than one person per year, but the cancer risks to a number of the assistants were estimated to be more than 4 x 10^-5. (author).

  7. Jumps and Betas: A New Framework for Disentangling and Estimating Systematic Risks

    DEFF Research Database (Denmark)

    Todorov, Viktor; Bollerslev, Tim

    We provide a new theoretical framework for disentangling and estimating sensitivity towards systematic diffusive and jump risks in the context of factor pricing models. Our estimates of the sensitivities towards systematic risks, or betas, are based on the notion of increasingly finer sampled returns over fixed time intervals. In addition to establishing consistency of our estimators, we also derive Central Limit Theorems characterizing their asymptotic distributions. In an empirical application of the new procedures using high-frequency data for forty individual stocks and an aggregate market portfolio, we find the estimated diffusive and jump betas with respect to the market to be quite different for many of the stocks. Our findings have direct and important implications for empirical asset pricing finance and practical portfolio and risk management decisions.

  8. Assessment of mycotoxin risk on corn in the Philippines under current and future climate change conditions.

    Science.gov (United States)

    Salvacion, Arnold R; Pangga, Ireneo B; Cumagun, Christian Joseph R

    2015-01-01

    This study attempts to assess the risk of mycotoxin (aflatoxin and fumonisin) contamination of corn in the Philippines under current and projected climate change conditions using a fuzzy logic methodology based on the published ranges of temperature and rainfall conditions that favor mycotoxin development. Based on the analysis, projected climate change will reduce the risk of aflatoxin contamination in the country due to increased rainfall. In the case of fumonisin contamination, most parts of the country are at very high risk both under current conditions and under the projected climate change conditions.
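
    A toy fuzzy-logic fragment in the spirit of the methodology described; the membership breakpoints below are invented, not the published temperature and rainfall ranges. Triangular memberships for favourable temperature and rainfall are combined with a min operator into a 0-1 contamination-risk score.

      def triangular(x, lo, peak, hi):
          """Triangular fuzzy membership function on [lo, hi] peaking at `peak`."""
          if x <= lo or x >= hi:
              return 0.0
          return (x - lo) / (peak - lo) if x <= peak else (hi - x) / (hi - peak)

      def aflatoxin_risk_score(temp_c, rain_mm_per_month):
          favourable_temp = triangular(temp_c, 22.0, 30.0, 38.0)            # hypothetical
          favourable_rain = triangular(rain_mm_per_month, 0.0, 50.0, 150.0) # drier favours aflatoxin
          return min(favourable_temp, favourable_rain)                      # fuzzy AND

      print(aflatoxin_risk_score(29.0, 40.0))   # warm, fairly dry month: high score
      print(aflatoxin_risk_score(27.0, 220.0))  # very wet month: score 0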

  9. Classification of motor imagery by means of cortical current density estimation and Von Neumann entropy.

    Science.gov (United States)

    Kamousi, Baharan; Amini, Ali Nasiri; He, Bin

    2007-06-01

    The goal of the present study is to employ the source imaging methods such as cortical current density estimation for the classification of left- and right-hand motor imagery tasks, which may be used for brain-computer interface (BCI) applications. The scalp recorded EEG was first preprocessed by surface Laplacian filtering, time-frequency filtering, noise normalization and independent component analysis. Then the cortical imaging technique was used to solve the EEG inverse problem. Cortical current density distributions of left and right trials were classified from each other by exploiting the concept of Von Neumann entropy. The proposed method was tested on three human subjects (180 trials each) and a maximum accuracy of 91.5% and an average accuracy of 88% were obtained. The present results confirm the hypothesis that source analysis methods may improve accuracy for classification of motor imagery tasks. The present promising results using source analysis for classification of motor imagery enhances our ability of performing source analysis from single trial EEG data recorded on the scalp, and may have applications to improved BCI systems.

  10. A new method for estimating the critical current density of a superconductor from its hysteresis loop

    Energy Technology Data Exchange (ETDEWEB)

    Lal, Ratan, E-mail: rlal_npl_3543@yahoo.i [Superconductivity Division, National Physical Laboratory, Council of Scientific and Industrial Research, Dr. K.S. Krishnan Road, New Delhi 110012 (India)

    2010-02-15

    The critical current density J_c of some superconducting samples, calculated on the basis of Bean's model, shows negative curvature at low magnetic field, with a downward bending near H = 0. To avoid this problem Kim's expression for the critical current density, J_c = k/(H_0 + H), where J_c has positive curvature for all H, has been employed by connecting the positive constants k and H_0 with the features of the hysteresis loop of a superconductor. A relation between the full penetration field H_p and the magnetic field H_min, at which the magnetization is minimum, is obtained from Kim's theory. Taking the value of J_c at H = H_p according to the actual loop width, as in Bean's theory, and at H = 0 according to an enhanced loop width due to the local internal field, values of k and H_0 are obtained in terms of the magnetization values M^+(-H_min), M^-(H_min), M^+(H_p) and M^-(H_p). The resulting method of estimating J_c from the hysteresis loop turns out to be as simple as Bean's method.

  11. A simple procedure for estimating pseudo risk ratios from exposure to non-carcinogenic chemical mixtures.

    Science.gov (United States)

    Scinicariello, Franco; Portier, Christopher

    2016-03-01

    Non-cancer risk assessment traditionally assumes a threshold of effect, below which there is a negligible risk of an adverse effect. The Agency for Toxic Substances and Disease Registry derives health-based guidance values known as Minimal Risk Levels (MRLs) as estimates of the toxicity threshold for non-carcinogens. Although the definition of an MRL, as well as EPA reference dose values (RfD and RfC), is a level that corresponds to "negligible risk," they represent daily exposure doses or concentrations, not risks. We present a new approach to calculate the risk at exposure to specific doses for chemical mixtures, the assumption in this approach is to assign de minimis risk at the MRL. The assigned risk enables the estimation of parameters in an exponential model, providing a complete dose-response curve for each compound from the chosen point of departure to zero. We estimated parameters for 27 chemicals. The value of k, which determines the shape of the dose-response curve, was moderately insensitive to the choice of the risk at the MRL. The approach presented here allows for the calculation of a risk from a single substance or the combined risk from multiple chemical exposures in a community. The methodology is applicable from point of departure data derived from quantal data, such as data from benchmark dose analyses or from data that can be transformed into probabilities, such as lowest-observed-adverse-effect level. The individual risks are used to calculate risk ratios that can facilitate comparison and cost-benefit analyses of environmental contamination control strategies.
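
    A minimal sketch of the calibration step under stated assumptions (a one-parameter exponential curve and a de minimis risk of 1e-6 at the MRL; the paper itself estimates exponential parameters with a shape term from point-of-departure data): each chemical's risk at its actual dose follows from the calibrated curve, and summing risks across a mixture and dividing by the summed de minimis baseline gives one possible pseudo risk ratio.

      import math

      DE_MINIMIS = 1e-6          # assumed risk assigned at the MRL (an assumption, not ATSDR policy)

      def risk_at_dose(dose, mrl):
          """One-parameter exponential dose-response R(d) = 1 - exp(-lam * d),
          with lam calibrated so that R(MRL) equals the de minimis risk."""
          lam = -math.log(1.0 - DE_MINIMIS) / mrl
          return 1.0 - math.exp(-lam * dose)

      # Hypothetical mixture: (measured exposure dose, MRL), same units per chemical
      mixture = [(0.02, 0.01), (0.30, 1.00), (0.05, 0.10)]
      total_risk = sum(risk_at_dose(d, m) for d, m in mixture)
      pseudo_risk_ratio = total_risk / (DE_MINIMIS * len(mixture))
      print(f"combined risk ~ {total_risk:.2e}, pseudo risk ratio ~ {pseudo_risk_ratio:.1f}")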

  12. 78 FR 36787 - Rechanneling the Current Cardiac Risk Paradigm: Arrhythmia Risk Assessment During Drug...

    Science.gov (United States)

    2013-06-19

    ... the benefits and limitations of the current guidelines, and the importance of a uniform assay schema ... a proarrhythmia screening method as an alternative to clinical Thorough QT studies ... Date and Time: The public workshop, which will...

  13. Estimating the risk of rabies transmission to humans in the U.S.: a delphi analysis

    Directory of Open Access Journals (Sweden)

    Meltzer Martin I

    2010-05-01

    Full Text Available Abstract Background: In the United States, the risk of rabies transmission to humans in most situations of possible exposure is unknown. Controlled studies on rabies are clearly not possible. Thus, the limited data on risk have led to the frequent administration of rabies post-exposure prophylaxis (PEP), often in inappropriate circumstances. Methods: We used the Delphi method to obtain an expert group consensus estimate of the risk of rabies transmission to humans in seven scenarios of potential rabies exposure. We also surveyed and discussed the merits of recommending rabies PEP for each scenario. Results: The median risk of rabies transmission without rabies PEP for a bite exposure by a skunk, bat, cat, and dog was estimated to be 0.05, 0.001, 0.001, and 0.00001, respectively. Rabies PEP was unanimously recommended in these scenarios. However, rabies PEP was overwhelmingly not recommended for non-bite exposures (e.g. dog licking hand but unavailable for subsequent testing), estimated to have less than 1 in 1,000,000 (0.000001) risk of transmission. Conclusions: Our results suggest that there are many common situations in which the risk of rabies transmission is so low that rabies PEP should not be recommended. These risk estimates also provide a key parameter for cost-effectiveness models of human rabies prevention and can be used to educate health professionals about situation-specific administration of rabies PEP.

  14. Risk Estimation Modeling and Feasibility Testing for a Mobile eHealth Intervention for Binge Drinking Among Young People: The D-ARIANNA (Digital-Alcohol RIsk Alertness Notifying Network for Adolescents and young adults) Project.

    Science.gov (United States)

    Carrà, Giuseppe; Crocamo, Cristina; Schivalocchi, Alessandro; Bartoli, Francesco; Carretta, Daniele; Brambilla, Giulia; Clerici, Massimo

    2015-01-01

    Binge drinking is common among young people but relevant risk factors are often not recognized. eHealth apps, attractive for young people, may be useful to enhance awareness of this problem. We aimed at developing a current risk estimation model for binge drinking, incorporated into an eHealth app--D-ARIANNA (Digital-Alcohol RIsk Alertness Notifying Network for Adolescents and young adults)--for young people. A longitudinal approach with phase 1 (risk estimation), phase 2 (design), and phase 3 (feasibility) was followed. Risk/protective factors identified from the literature were used to develop a current risk estimation model for binge drinking. Relevant odds ratios were subsequently pooled through meta-analytic techniques with a random-effects model, deriving weighted estimates to be introduced in a final model. A set of questions, matching the identified risk factors, were nested in a questionnaire and assessed for wording, content, and acceptability in focus groups involving 110 adolescents and young adults. Ten risk factors (5 modifiable) and 2 protective factors showed significant associations with binge drinking and were included in the model. Their weighted coefficients ranged between -0.71 (school proficiency) and 1.90 (cannabis use). The model, nested in an eHealth app questionnaire, provides an overall current risk score in percent, accompanied by appropriate images. The factors that contribute most are shown in summary messages. Minor changes were made after the focus group reviews. Most of the subjects (74%) regarded the eHealth app as helpful for assessing binge drinking risk. We were able to produce an evidence-based eHealth app for young people that evaluates current risk for binge drinking. Its effectiveness will be tested in a large trial.
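
    A hedged sketch of a weighted current-risk score of the kind described above: answered items contribute pooled log-odds weights and the total is mapped to a percentage with a logistic transform (an assumed scoring rule; the published model may differ). Only the two quoted coefficients come from the abstract; the remaining weights and the intercept are hypothetical.

    ```python
    import math

    # Hedged sketch of a weighted current-risk score: each present factor adds
    # its pooled log-odds weight, and the total is mapped to a percentage via a
    # logistic transform (assumed form). Only the two quoted coefficients
    # (school proficiency -0.71, cannabis use 1.90) come from the abstract.

    WEIGHTS = {
        "cannabis_use": 1.90,
        "school_proficiency": -0.71,
        "peer_binge_drinking": 0.9,    # hypothetical
        "sports_participation": -0.4,  # hypothetical
    }
    INTERCEPT = -1.5                   # hypothetical baseline log-odds

    def current_risk_percent(answers):
        """answers: dict mapping factor name -> bool (factor present)."""
        logit = INTERCEPT + sum(w for f, w in WEIGHTS.items() if answers.get(f))
        return 100.0 / (1.0 + math.exp(-logit))

    print(round(current_risk_percent({"cannabis_use": True,
                                      "school_proficiency": True}), 1))
    ```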

  15. Mathematical Models for Estimating the Risks of Bovine Spongiform Encephalopathy (BSE).

    Science.gov (United States)

    Al-Zoughool, Mustafa; Cottrell, David; Elsaadany, Susie; Murray, Noel; Oraby, Tamer; Smith, Robert; Krewski, Daniel

    2015-01-01

    When the bovine spongiform encephalopathy (BSE) epidemic first emerged in the United Kingdom in the mid-1980s, the etiology of animal prion diseases was largely unknown. Risk management efforts to control the disease were also subject to uncertainties regarding the extent of BSE infections and the future course of the epidemic. As understanding of BSE increased, mathematical models were developed to estimate the risk of BSE infection and to predict reductions in risk in response to BSE control measures. Risk models of BSE-transmission dynamics determined disease persistence in cattle herds and the relative infectivity of cattle prior to the onset of clinical disease. These BSE models helped in understanding key epidemiological features of BSE transmission and dynamics, such as the incubation period distribution and age-dependent susceptibility to infection with the BSE agent. This review summarizes different mathematical models and methods that have been used to estimate the risk of BSE, and discusses how such risk projection models have informed risk assessment and management of BSE. This review also provides some general insights on how mathematical models of the type discussed here may be used to estimate the risks of emerging zoonotic diseases when biological data on transmission of the etiological agent are limited.

  16. Current controversies in turner syndrome: Genetic testing, assisted reproduction, and cardiovascular risks

    Directory of Open Access Journals (Sweden)

    Amanda Ackermann

    2014-09-01

    Full Text Available Patients with Turner syndrome (TS) require close medical follow-up and management for cardiac abnormalities, growth and reproductive issues. This review summarizes current controversies in this condition, including: (1) the optimal genetic testing for Turner syndrome patients, particularly with respect to identification of Y chromosome material that may increase the patient's risk of gonadoblastoma and dysgerminoma, (2) which patients should be referred for bilateral gonadectomy and the recommended timing of such referral, (3) options for assisted reproduction in these patients and associated risks, (4) the increased risk of mortality associated with pregnancy in this population, and (5) how best to assess and monitor cardiovascular risks.

  17. Relative risk for HIV in India - An estimate using conditional auto-regressive models with Bayesian approach.

    Science.gov (United States)

    Kandhasamy, Chandrasekaran; Ghosh, Kaushik

    2017-02-01

    Indian states are currently classified into HIV-risk categories based on the observed prevalence counts, percentage of infected attendees in antenatal clinics, and percentage of infected high-risk individuals. This method, however, does not account for the spatial dependence among the states nor does it provide any measure of statistical uncertainty. We provide an alternative model-based approach to address these issues. Our method uses Poisson log-normal models having various conditional autoregressive structures with neighborhood-based and distance-based weight matrices and incorporates all available covariate information. We used R and WinBUGS software to fit these models to the 2011 HIV data. Based on the Deviance Information Criterion, the convolution model using the distance-based weight matrix and covariate information on female sex workers, literacy rate and intravenous drug users is found to have the best fit. The relative risk of HIV for the various states is estimated using the best model and the states are then classified into the risk categories based on these estimated values. An HIV risk map of India is constructed based on these results. The choice of the final model suggests that an HIV control strategy which focuses on the female sex workers, intravenous drug users and literacy rate would be most effective.
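
    A hedged sketch of the distance-based spatial weight matrix that the conditional autoregressive structures above rely on, with the Poisson log-normal mean structure noted in comments; the inverse-distance form and the centroid coordinates are illustrative assumptions, and the actual models were fitted in R and WinBUGS.

    ```python
    import numpy as np

    # Hedged sketch of a distance-based spatial weight matrix of the kind used
    # in CAR structures: w_ij decays with distance between area centroids and
    # rows are standardized to sum to one. The inverse-distance form and the
    # centroid coordinates are illustrative assumptions.

    def distance_weights(coords):
        coords = np.asarray(coords, dtype=float)
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        with np.errstate(divide="ignore"):
            w = np.where(d > 0, 1.0 / d, 0.0)      # inverse distance, zero diagonal
        return w / w.sum(axis=1, keepdims=True)    # row standardization

    # Under a Poisson log-normal CAR model, the expected count in area i is
    #   E[y_i] = E_i * exp(x_i'beta + phi_i),
    # with E_i the expected count, x_i the covariates (e.g. FSW, IDU, literacy)
    # and phi_i the spatial random effect; the relative risk is exp(x_i'beta + phi_i).

    centroids = [(77.2, 28.6), (72.9, 19.1), (80.3, 13.1)]  # hypothetical (lon, lat)
    print(distance_weights(centroids).round(3))
    ```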

  18. Inrush Current Simulation of Two Windings Power Transformer using Machine Parameters Estimated by Design Procedure of Winding Structure

    Science.gov (United States)

    Tokunaga, Yoshitaka; Kubota, Kunihiro

    This paper presents techniques for estimating the machine parameters of a two-winding power transformer using the design procedure of the winding structure. It is usually very difficult to obtain machine parameters for transformers already installed in customers' facilities; with these estimation techniques, the machine parameters can be calculated from the nameplate data alone. Subsequently, EMTP-ATP simulation of the inrush current was carried out using the machine parameters estimated by the winding-structure design procedure, and the simulated waveforms reproduced the measured ones.

  19. Inrush Current Simulation of Power Transformer using Machine Parameters Estimated by Design Procedure of Winding Structure and Genetic Algorithm

    Science.gov (United States)

    Tokunaga, Yoshitaka

    This paper presents techniques for estimating the machine parameters of a power transformer using the transformer design procedure and a real-coded genetic algorithm. It is usually very difficult to obtain machine parameters for transformers already installed in customers' facilities; with these estimation techniques, the machine parameters can be calculated from the nameplate data alone. Subsequently, EMTP-ATP simulation of the inrush current was carried out using the machine parameters estimated by the techniques developed in this study, and the simulated waveforms reproduced the measured ones.
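
    The design procedures above are not reproduced here; as a hedged sketch of the usual starting point when only nameplate data are available, the snippet below converts rated power, voltage and percent impedance into base and ohmic impedances. All ratings are hypothetical.

    ```python
    # Hedged sketch of the usual first step when only nameplate data are known:
    # derive the base impedance from rated power and voltage and convert a
    # percent impedance into ohms. The winding-structure and genetic-algorithm
    # estimation in the papers above goes well beyond this; numbers are hypothetical.

    def base_impedance(s_rated_va, v_rated_v):
        """Z_base = V^2 / S for the chosen winding side."""
        return v_rated_v ** 2 / s_rated_va

    def impedance_ohms(percent_z, s_rated_va, v_rated_v):
        return (percent_z / 100.0) * base_impedance(s_rated_va, v_rated_v)

    # Hypothetical 1 MVA, 6.6 kV transformer with 5.5 % impedance.
    print(round(base_impedance(1.0e6, 6.6e3), 3),
          round(impedance_ohms(5.5, 1.0e6, 6.6e3), 3))
    ```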

  20. Are risk estimates biased in follow-up studies of psychosocial factors with low base-line participation?

    DEFF Research Database (Denmark)

    Kaerlev, Linda; Kolstad, Henrik; Hansen, Åse Marie;

    2011-01-01

    Low participation in population-based follow-up studies addressing psychosocial risk factors may cause biased estimation of health risk but the issue has seldom been examined. We compared risk estimates for selected health outcomes among respondents and the entire source population.

  1. Repeat testing is essential when estimating chronic kidney disease prevalence and associated cardiovascular risk.

    Science.gov (United States)

    Brook, M O; Bottomley, M J; Mevada, C; Svistunova, A; Bielinska, A-M; James, T; Kalachik, A; Harden, P N

    2012-03-01

    Investigations into chronic kidney disease (CKD) and cardiovascular disease in the CKD population may be misleading as they are often based on a single test of kidney function. To determine whether repeat testing at 3 months to confirm a diagnosis of CKD impacts on the estimated prevalence of CKD and the estimated 10-year general cardiovascular risk of the CKD population. Blood and urine samples from presumed healthy volunteers were analysed for evidence of CKD on recruitment and again 3 months later. Estimated 10-year cardiovascular risk was calculated using criteria determined by the Framingham study. Preliminary study: 512 volunteers were screened for CKD. Of the initial results, 206 indicated CKD or eGFR within one standard deviation of abnormal, and 142 (69%) of these were retested. Validation study: 528 volunteers were recruited and invited to return for repeat testing. A total of 214 (40.5%) participants provided repeat samples. A single test indicating CKD had a positive predictive value of 0.5 (preliminary) and 0.39 (validation) for repeat abnormalities 3 months later. Participants with CKD confirmed on repeat testing had a significant increase in estimated 10-year cardiovascular risk over the population as a whole (preliminary: 16.5 vs. 11.9%, P risk. Repeat testing for CKD after 3 months significantly reduces the estimated prevalence of disease and identifies a population with true CKD and a cardiovascular risk significantly in excess of the general population.

  2. Vertebral Strength and Estimated Fracture Risk Across the BMI Spectrum in Women.

    Science.gov (United States)

    Bachmann, Katherine N; Bruno, Alexander G; Bredella, Miriam A; Schorr, Melanie; Lawson, Elizabeth A; Gill, Corey M; Singhal, Vibha; Meenaghan, Erinne; Gerweck, Anu V; Eddy, Kamryn T; Ebrahimi, Seda; Koman, Stuart L; Greenblatt, James M; Keane, Robert J; Weigel, Thomas; Dechant, Esther; Misra, Madhusmita; Klibanski, Anne; Bouxsein, Mary L; Miller, Karen K

    2016-02-01

    Somewhat paradoxically, fracture risk, which depends on applied loads and bone strength, is elevated in both anorexia nervosa and obesity at certain skeletal sites. Factor-of-risk (Φ), the ratio of applied load to bone strength, is a biomechanically based method to estimate fracture risk; theoretically, higher Φ reflects increased fracture risk. We estimated vertebral strength (linear combination of integral volumetric bone mineral density [Int.vBMD] and cross-sectional area from quantitative computed tomography [QCT]), vertebral compressive loads, and Φ at L4 in 176 women (65 anorexia nervosa, 45 lean controls, and 66 obese). Using biomechanical models, applied loads were estimated for: 1) standing; 2) arms flexed 90°, holding 5 kg in each hand (holding); 3) 45° trunk flexion, 5 kg in each hand (lifting); 4) 20° trunk right lateral bend, 10 kg in right hand (bending). We also investigated associations of Int.vBMD and vertebral strength with lean mass (from dual-energy X-ray absorptiometry [DXA]) and visceral adipose tissue (VAT, from QCT). Women with anorexia nervosa had lower, whereas obese women had similar, Int.vBMD and estimated vertebral strength compared with controls. Vertebral loads were highest in obesity and lowest in anorexia nervosa for standing, holding, and lifting (p estimated vertebral strength were associated positively with lean mass (R = 0.28 to 0.45, p ≤ 0.0001) in all groups combined and negatively with VAT (R = -[0.36 to 0.38], p estimated vertebral fracture risk (Φ) for holding and bending because of inferior vertebral strength. Despite similar vertebral strength as controls, obese women had higher vertebral fracture risk for standing, holding, and lifting because of higher applied loads from higher body weight. Examining the load-to-strength ratio helps explain increased fracture risk in both low-weight and obese women.

  3. Estimated injury risk for specific injuries and body regions in frontal motor vehicle crashes.

    Science.gov (United States)

    Weaver, Ashley A; Talton, Jennifer W; Barnard, Ryan T; Schoell, Samantha L; Swett, Katrina R; Stitzel, Joel D

    2015-01-01

    Injury risk curves estimate motor vehicle crash (MVC) occupant injury risk from vehicle, crash, and/or occupant factors. Many vehicles are equipped with event data recorders (EDRs) that collect data including the crash speed and restraint status during a MVC. This study's goal was to use regulation-required data elements for EDRs to compute occupant injury risk for (1) specific injuries and (2) specific body regions in frontal MVCs from weighted NASS-CDS data. Logistic regression analysis of NASS-CDS single-impact frontal MVCs involving front seat occupants with frontal airbag deployment was used to produce 23 risk curves for specific injuries and 17 risk curves for Abbreviated Injury Scale (AIS) 2+ to 5+ body region injuries. Risk curves were produced for the following body regions: head and thorax (AIS 2+, 3+, 4+, 5+), face (AIS 2+), abdomen, spine, upper extremity, and lower extremity (AIS 2+, 3+). Injury risk with 95% confidence intervals was estimated for 15-105 km/h longitudinal delta-Vs and belt status was adjusted for as a covariate. Overall, belted occupants had lower estimated risks compared to unbelted occupants and the risk of injury increased as longitudinal delta-V increased. Belt status was a significant predictor for 13 specific injuries and all body region injuries with the exception of AIS 2+ and 3+ spine injuries. Specific injuries and body region injuries that occurred more frequently in NASS-CDS also tended to carry higher risks when evaluated at a 56 km/h longitudinal delta-V. In the belted population, injury risks that ranked in the top 33% included 4 upper extremity fractures (ulna, radius, clavicle, carpus/metacarpus), 2 lower extremity fractures (fibula, metatarsal/tarsal), and a knee sprain (2.4-4.6% risk). Unbelted injury risks ranked in the top 33% included 4 lower extremity fractures (femur, fibula, metatarsal/tarsal, patella), 2 head injuries with less than one hour or unspecified prior unconsciousness, and a lung contusion (4
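
    A hedged sketch of a logistic risk curve of the general form fitted above, with longitudinal delta-V and belt status as predictors; the coefficients are hypothetical placeholders, not the NASS-CDS fits.

    ```python
    import math

    # Hedged sketch of a logistic injury-risk curve of the form fitted above:
    # P(injury) = 1 / (1 + exp(-(b0 + b1*deltaV + b2*belted))). The coefficients
    # are hypothetical placeholders, not the paper's fitted values.

    def injury_risk(delta_v_kmh, belted, b0=-6.0, b1=0.08, b2=-1.0):
        z = b0 + b1 * delta_v_kmh + b2 * (1 if belted else 0)
        return 1.0 / (1.0 + math.exp(-z))

    for dv in (15, 56, 105):  # km/h, spanning the 15-105 km/h range in the study
        print(dv,
              round(injury_risk(dv, belted=True), 3),
              round(injury_risk(dv, belted=False), 3))
    ```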

  4. A Conceptual Analysis of Current Trends in the Evolution of Risk Management Process

    Directory of Open Access Journals (Sweden)

    Monika Wieczorek-Kosmala

    2013-11-01

    Full Text Available Purpose of the article: The traditional idea of risk management is continually evolving as it enjoys growing popularity in corporations. The paper reviews the risk management procedure within the traditional concept and then identifies and discusses the main trends currently observed within the organisation and implementation of this procedure. Scientific aim: The paper aims at identifying and describing the currently observed trends in the evolution of the risk management process. To achieve this, it undertakes a comparative analysis of solutions within the traditional risk management concept and the ideas underpinning the current process of risk management standardisation. It also reviews the validity of the classification of risk treatment techniques. Methodology/methods: The paper represents a conceptual analysis of the current state of affairs and uses the method of comparative analysis and deduction based on the literature review and a reading of standardisation documents. As a viewpoint paper, it represents the author's own ideas and findings. Findings: Two main trends in the evolution of risk management should be identified. The first is related to the strategic dimension of risk management, as this procedure is often promoted as an integrated concept; it springs from the regulations of standardisation procedures which aim at unifying the terminology and set of activities from the practitioners' perspective. The second direction of evolution is observed within the development of risk financing techniques, owing to innovations within traditional risk retention and transfer solutions as a result of the continuous convergence of insurance and capital markets. Conclusions: The risk management process is constantly evolving toward the strategic dimension as risk perception changes, concerning both the downside and the upside of risk. However, the standards follow a similar sequence of

  5. Estimating systematic risk for the best investment decisions on manufacturing company in Indonesia

    Directory of Open Access Journals (Sweden)

    Zarah Puspitaningtyas

    2017-03-01

    Full Text Available Estimation of systematic risk is one of the important aspects of making sound investment decisions. Predicting systematic risk reveals the risk investors will face, because systematic risk is a measure of investment risk. Besides returns, investors always consider the risk of an investment, because investors are rational individuals who weigh the trade-off between return and risk. At a certain level of return, investors will tend to choose the investment with the lowest risk level; conversely, at a certain level of risk, they tend to choose the investment with the highest return. The purpose of this paper is to analyze the influence of financial information on the systematic risk of the stock of manufacturing companies listed on the Indonesia Stock Exchange over a period of five years, from January 2011 to December 2015. The financial information is measured by four accounting variables: financial leverage, liquidity, profitability, and firm size. Analysis of the data using multiple linear regression shows that, at the 0.05 level, only firm size significantly influences systematic risk, while financial leverage, liquidity, and profitability have no effect. These results are inconsistent with several previous studies; the inconsistency may be due to problems in measuring the accounting variables, the study period, and the use of different research samples.

  6. 78 FR 24691 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Science.gov (United States)

    2013-04-26

    ..., and 211 RIN 0910-AG36 Current Good Manufacturing Practice and Hazard Analysis and Risk- Based... the proposed rule, ``Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive... rule entitled ``Current Good Manufacturing Practice and Hazard Analysis and Risk-Based...

  7. 78 FR 11611 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Science.gov (United States)

    2013-02-19

    ..., and 211 RIN 0910-AG36 Current Good Manufacturing Practice and Hazard Analysis and Risk- Based... ``Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food... rule entitled ``Current Good Manufacturing Practice and Hazard Analysis and Risk-Based...

  8. 78 FR 69604 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Science.gov (United States)

    2013-11-20

    ..., and 211 RIN 0910-AG36 Current Good Manufacturing Practice and Hazard Analysis and Risk- Based... 3646), entitled ``Current Good Manufacturing Practice and Hazard Analysis and Risk- Based Preventive... rule entitled ``Current Good Manufacturing Practice and Hazard Analysis and Risk-Based...

  9. Estimating the success of enzyme bioprospecting through metagenomics: current status and future trends.

    Science.gov (United States)

    Ferrer, Manuel; Martínez-Martínez, Mónica; Bargiela, Rafael; Streit, Wolfgang R; Golyshina, Olga V; Golyshin, Peter N

    2016-01-01

    Recent reports have suggested that the establishment of industrially relevant enzyme collections from environmental genomes has become a routine procedure. Across the studies assessed, a mean of approximately 44 active clones were obtained from libraries averaging approximately 53,000 clones tested using naïve screening protocols. This number can be increased significantly, and in shorter times, when novel metagenome enzyme sequences obtained by direct sequencing are selected and subjected to high-throughput expression for subsequent production and characterization. Pre-screening of clone libraries by naïve screens, followed by pyrosequencing of the inserts, allowed for a 106-fold increase in the success rate of identifying genes encoding enzymes of interest. However, a much longer time, usually on the order of years, is needed from the time of enzyme identification to the establishment of an industrial process. If the hit frequency for identifying enzymes that perform at high turnover rates under real application conditions could be increased while still covering a high natural diversity, the very expensive and time-consuming enzyme optimization phase would likely be shortened significantly. At this point, it is important to review the current knowledge about the success of fine-tuned naïve- and sequence-based screening protocols for enzyme selection and to describe the environments worldwide that have already been subjected to enzyme screening programmes through metagenomic tools. Here, we provide such estimations and outline the current challenges and future actions needed before environmental enzymes can be successfully introduced into the market.

  10. Cable Overheating Risk Warning Method Based on Impedance Parameter Estimation in Distribution Network

    Science.gov (United States)

    Yu, Zhang; Xiaohui, Song; Jianfang, Li; Fei, Gao

    2017-05-01

    Cable overheating reduces the cable insulation level, accelerates insulation ageing, and can even cause short-circuit faults, so identifying and warning of cable overheating risk is necessary for distribution network operators. A cable overheating risk warning method based on impedance parameter estimation is proposed in this paper to improve the safety and reliability of distribution network operation. Firstly, a cable impedance estimation model is established using the least squares method on data from the distribution SCADA system, to improve the accuracy of the impedance parameter estimates. Secondly, the threshold value of cable impedance is calculated from historical data, and the forecast value of cable impedance from future forecasting data in the distribution SCADA system. Thirdly, a library of cable overheating risk warning rules is established; the cable impedance forecast value is calculated and its rate of change analysed, and the overheating risk of the cable line is then flagged against the rules library according to the relationship between impedance variation and line temperature rise. The overheating risk warning method is simulated in the paper. The simulation results show that the method can accurately identify the impedance and forecast the temperature rise of cable lines in the distribution network. The resulting overheating risk warnings can provide a decision basis for operation, maintenance, and repair.
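
    A hedged sketch of the least-squares step only: with paired SCADA samples of the complex voltage drop and current on a cable section, the series impedance minimizing the squared residuals has a closed form. The single-branch model and synthetic data are assumptions; the paper's method adds forecasting and rule-based warning on top of this.

    ```python
    import numpy as np

    # Hedged sketch of the least-squares step: with SCADA samples of the complex
    # voltage drop dV_k and current I_k across a cable section, the series
    # impedance Z = R + jX minimizing sum |dV_k - Z*I_k|^2 is
    #   Z = sum(dV * conj(I)) / sum(|I|^2).
    # Single-branch model with synthetic measurements.

    def estimate_impedance(dv, i):
        dv, i = np.asarray(dv, complex), np.asarray(i, complex)
        return np.vdot(i, dv) / np.vdot(i, i)    # least-squares complex impedance

    rng = np.random.default_rng(0)
    z_true = 0.12 + 0.08j                        # ohms (hypothetical)
    i = rng.uniform(50, 200, 500) * np.exp(1j * rng.uniform(-0.3, 0.3, 500))
    dv = z_true * i + 0.05 * (rng.standard_normal(500) + 1j * rng.standard_normal(500))
    print(estimate_impedance(dv, i))             # should be close to 0.12+0.08j
    ```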

  11. Assessing the effect of estimation error on risk-adjusted CUSUM chart performance.

    Science.gov (United States)

    2015-12-01

    Mark A. Jones, Stefan H. Steiner. Assessing the effect of estimation error on risk-adjusted CUSUM chart performance. Int J Qual Health Care (2012) 24(2): 176–181 doi: 10.1093/intqhc/mzr082. The authors would like to correct an error identified in the above paper. Table 5 included incorrect information. The correct table has been reprinted below. Furthermore, in the discussion on p. 180 of this paper, one of the incorrect numbers in Table 5 was quoted. This section is reproduced below with the correct numbers. In the case of homogeneous patients where adverse event risk was assumed to be constant at 6.6% the estimated level of estimation error: SD (ARL0) = 85.9 was less than the equivalent risk-adjusted scenario where SD (ARL0) = 89.2 but only by around 4%.

  12. A METHOD FOR THE ESTIMATION OF TSUNAMI RISK ALONG RUSSIA’s FAR EAST

    Directory of Open Access Journals (Sweden)

    G.V. Shevchenko

    2013-10-01

    Full Text Available A simplified method was developed for estimating the tsunami risk along a coast for possible events with recurrence periods of 50 and 100 years. The method is based on readily available seismic data and the calculation of magnitudes of events with specified return periods. A classical Gumbel statistical method was used to estimate the magnitudes of low-probability events. The tsunami numerical modeling study used the average earthquake coordinates in the Kuril-Kamchatka high-seismicity area. The verification and testing of the method were carried out using events from the North, Middle and South Kuril Islands - the areas of Russia's Far East most at risk from tsunamis. The study also used the regional Kuril-Kamchatka catalogue of earthquakes from 1900 to 2008, which includes earthquakes with magnitudes of at least M = 6. The results of the study indicate that the proposed methodology provides reasonable estimates of tsunami risk.
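
    A hedged sketch of the classical Gumbel step: fit location and scale to annual maximum magnitudes (method of moments, for brevity) and read off the 50- and 100-year return-period magnitudes. The catalogue below is synthetic, not the Kuril-Kamchatka data.

    ```python
    import numpy as np

    # Hedged sketch of the classical Gumbel step: fit location (mu) and scale
    # (beta) to annual maximum magnitudes (method of moments for brevity) and
    # compute the magnitude with return period T via
    #   M_T = mu - beta * ln(-ln(1 - 1/T)).

    def gumbel_fit(annual_maxima):
        x = np.asarray(annual_maxima, float)
        beta = x.std(ddof=1) * np.sqrt(6.0) / np.pi
        mu = x.mean() - 0.5772 * beta            # Euler-Mascheroni correction
        return mu, beta

    def return_level(mu, beta, t_years):
        return mu - beta * np.log(-np.log(1.0 - 1.0 / t_years))

    maxima = np.random.default_rng(1).normal(6.8, 0.5, 60).round(1)  # synthetic catalogue
    mu, beta = gumbel_fit(maxima)
    print(round(return_level(mu, beta, 50), 2), round(return_level(mu, beta, 100), 2))
    ```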

  13. Measurement of natural radionuclides in Malaysian bottled mineral water and consequent health risk estimation

    Energy Technology Data Exchange (ETDEWEB)

    Priharti, W.; Samat, S. B.; Yasir, M. S. [School of Applied Physics, Faculty of Science and Technology, Universiti Kebangsaan Malaysia, 43600 UKM Bangi, Selangor (Malaysia)

    2015-09-25

    The radionuclides 226Ra, 232Th and 40K were measured in ten mineral water samples; from the radioactivity obtained, the ingestion doses for infants, children and adults were calculated and the cancer risk for adults was estimated. Results showed that the calculated ingestion doses for the three age categories are much lower than the average worldwide ingestion exposure of 0.29 mSv/y and the estimated cancer risk is much lower than the cancer risk of 8.40 × 10-3 (estimated from the total natural radiation dose of 2.40 mSv/y). The present study concludes that the bottled mineral water produced in Malaysia is safe for daily human consumption.

  14. Measurement of natural radionuclides in Malaysian bottled mineral water and consequent health risk estimation

    Science.gov (United States)

    Priharti, W.; Samat, S. B.; Yasir, M. S.

    2015-09-01

    The radionuclides 226Ra, 232Th and 40K were measured in ten mineral water samples; from the radioactivity obtained, the ingestion doses for infants, children and adults were calculated and the cancer risk for adults was estimated. Results showed that the calculated ingestion doses for the three age categories are much lower than the average worldwide ingestion exposure of 0.29 mSv/y and the estimated cancer risk is much lower than the cancer risk of 8.40 × 10-3 (estimated from the total natural radiation dose of 2.40 mSv/y). The present study concludes that the bottled mineral water produced in Malaysia is safe for daily human consumption.
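
    A hedged sketch of the dose and risk arithmetic summarized above. The ingestion dose coefficients and the risk coefficient per sievert are indicative ICRP-style adult values used here as assumptions, and the concentrations and water intake are hypothetical.

    ```python
    # Hedged sketch of the arithmetic described above:
    #   annual dose (Sv/y) = concentration (Bq/L) x intake (L/y) x DCF (Sv/Bq)
    #   lifetime cancer risk ~= annual dose x exposure years x risk coefficient.
    # DCFs and the 5.5e-2 per Sv coefficient are indicative adult ingestion
    # values used as assumptions; concentrations and intake are hypothetical.

    DCF_ADULT = {"Ra-226": 2.8e-7, "Th-232": 2.3e-7, "K-40": 6.2e-9}  # Sv per Bq (assumed)

    def annual_ingestion_dose(conc_bq_per_l, intake_l_per_y=730.0):
        return sum(conc_bq_per_l[n] * intake_l_per_y * DCF_ADULT[n] for n in conc_bq_per_l)

    def lifetime_cancer_risk(annual_dose_sv, years=70, risk_per_sv=5.5e-2):
        return annual_dose_sv * years * risk_per_sv

    dose = annual_ingestion_dose({"Ra-226": 0.02, "Th-232": 0.01, "K-40": 1.5})  # Bq/L
    print(f"{dose*1e3:.4f} mSv/y, lifetime risk {lifetime_cancer_risk(dose):.2e}")
    ```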

  15. Estimated Risk Level of Unified Stereotactic Body Radiation Therapy Dose Tolerance Limits for Spinal Cord.

    Science.gov (United States)

    Grimm, Jimm; Sahgal, Arjun; Soltys, Scott G; Luxton, Gary; Patel, Ashish; Herbert, Scott; Xue, Jinyu; Ma, Lijun; Yorke, Ellen; Adler, John R; Gibbs, Iris C

    2016-04-01

    A literature review of more than 200 stereotactic body radiation therapy spine articles from the past 20 years found only a single article that provided dose-volume data and outcomes for each spinal cord of a clinical dataset: the Gibbs 2007 article (Gibbs et al., 2007), which essentially contains the first 100 stereotactic body radiation therapy (SBRT) spine treatments from Stanford University Medical Center. The dataset is modeled and compared in detail to the rest of the literature review, which found 59 dose tolerance limits for the spinal cord in 1-5 fractions. We partitioned these limits into a unified format of high-risk and low-risk dose tolerance limits. To estimate the corresponding risk level of each limit we used the Gibbs 2007 clinical spinal cord dose-volume data for 102 spinal metastases in 74 patients treated by spinal radiosurgery. In all, 50 of the patients were previously irradiated to a median dose of 40 Gy in 2-3 Gy fractions and 3 patients developed treatment-related myelopathy. These dose-volume data were digitized into the dose-volume histogram (DVH) Evaluator software tool, where parameters of the probit dose-response model were fitted using the maximum likelihood approach (Jackson et al., 1995). Based on this limited dataset, for de novo cases the unified low-risk dose tolerance limits yielded an estimated risk of spinal cord injury of ≤1% in 1-5 fractions, and the high-risk limits yielded an estimated risk of ≤3%. The QUANTEC Dmax limits of 13 Gy in a single fraction and 20 Gy in 3 fractions had less than 1% risk estimated from this dataset, so we consider these among the low-risk limits. In the previously irradiated cohort, the estimated risk levels for 10 and 14 Gy maximum cord dose limits in 5 fractions are 0.4% and 0.6%, respectively. Longer follow-up and more patients are required to improve the risk estimates and provide more complete validation.
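
    A hedged sketch of a probit dose-response fit by maximum likelihood, in the spirit of the modelling above; the synthetic doses and events below are not the Gibbs 2007 data, and the DVH Evaluator tool itself is not reproduced.

    ```python
    import numpy as np
    from scipy.stats import norm
    from scipy.optimize import minimize

    # Hedged sketch of a probit dose-response fit by maximum likelihood:
    # P(myelopathy | D) = Phi(a + b*D), with D the maximum cord dose.
    # Synthetic toy data (~100 cords with a handful of events), not Gibbs 2007.

    rng = np.random.default_rng(2)
    dose = rng.uniform(6.0, 16.0, 102)                 # Gy, hypothetical Dmax values
    event = (rng.random(102) < norm.cdf(-6.0 + 0.3 * dose)).astype(float)

    def neg_log_lik(params):
        a, b = params
        p = np.clip(norm.cdf(a + b * dose), 1e-9, 1 - 1e-9)
        return -np.sum(event * np.log(p) + (1 - event) * np.log(1 - p))

    fit = minimize(neg_log_lik, x0=[-4.0, 0.2], method="Nelder-Mead")
    a_hat, b_hat = fit.x
    print(a_hat, b_hat, norm.cdf(a_hat + b_hat * 13.0))  # estimated risk at 13 Gy
    ```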

  16. Comprehensive estimation model of MERGE: adaptation to current state of world economy

    Directory of Open Access Journals (Sweden)

    Boris Vadimovich Digas

    2013-09-01

    Full Text Available The optimizing interdisciplinary MERGE model, intended mainly for quantitative estimation of the outcomes of various environmental protection strategies, is one of the tools used for studying climate change problems. The components of the model are an economic and power module, a climatic module and a damage assessment module. The main attention in this work is paid to adapting the MERGE model to the current state of the world economy, to analysing possible trajectories of economic development for Russia, and to studying the consequences of the country's participation in greenhouse-gas emission abatement initiatives under various assumptions on the dynamics of regional economic and power indicators. As the source of model scenarios for the development of the Russian economy, the forecast of long-term socioeconomic development of the country for the period up to 2030, prepared by the Ministry of Economic Development of the Russian Federation, is used; namely, the conservative, innovative and forced scenarios, defined by different models of state policy for ensuring macroeconomic balance, are considered.

  17. Identification of effective screening strategies for cardiovascular disease prevention in a developing country: using cardiovascular risk-estimation and risk-reduction tools for policy recommendations.

    Science.gov (United States)

    Selvarajah, Sharmini; Haniff, Jamaiyah; Kaur, Gurpreet; Guat Hiong, Tee; Bujang, Adam; Chee Cheong, Kee; Bots, Michiel L

    2013-02-25

    Recent increases in cardiovascular risk-factor prevalences have led to new national policy recommendations of universal screening for primary prevention of cardiovascular disease in Malaysia. This study assessed whether the current national policy recommendation of universal screening was optimal, by comparing the effectiveness and impact of various cardiovascular screening strategies. Data from a national population based survey of 24 270 participants aged 30 to 74 was used. Five screening strategies were modelled for the overall population and by gender; universal and targeted screening (four age cut-off points). Screening strategies were assessed based on the ability to detect high cardiovascular risk populations (effectiveness), incremental effectiveness, impact on cardiovascular event prevention and cost of screening. 26.7% (95% confidence limits 25.7, 27.7) were at high cardiovascular risk, men 34.7% (33.6, 35.8) and women 18.9% (17.8, 20). Universal screening identified all those at high-risk and resulted in one high-risk individual detected for every 3.7 people screened, with an estimated cost of USD60. However, universal screening resulted in screening an additional 7169 persons, with an incremental cost of USD115,033 for detection of one additional high-risk individual in comparison to targeted screening of those aged ≥35 years. The cost, incremental cost and impact of detection of high-risk individuals were more for women than men for all screening strategies. The impact of screening women aged ≥45 years was similar to universal screening in men. Targeted gender- and age-specific screening strategies would ensure more optimal utilisation of scarce resources compared to the current policy recommendations of universal screening.

  18. Estimation of the Current Peak Value Distribution of All Lightning to the Ground by Electro-Geometric Model

    Science.gov (United States)

    Sakata, Tadashi; Yamamoto, Kazuo; Sekioka, Shozo; Yokoyama, Shigeru

    When we examine lightning frequency and lightning shielding effectiveness using the EGM (electro-geometric model), we need the current distribution of all lightning strokes to the ground. The lightning current distribution measured on structures differs from this distribution, but it has conventionally been used in the EGM. We assumed a lightning striking-distance coefficient related to the height of structures so as to obtain results consistent with the observed lightning frequency to structures, and estimated the current distribution of all lightning to the ground from the data listed in the IEC 62305 series using the EGM. The estimated distribution, adjusted for the detection efficiency of the LLS (lightning location system), corresponded closely to the distribution observed by the LLS.

  19. Earthquake Risk Management Strategy Plan Using Nonparametric Estimation of Hazard Rate

    Directory of Open Access Journals (Sweden)

    Aflaton Amiri

    2008-01-01

    Full Text Available Earthquake risk is defined as the product of hazard and vulnerability. The main aims of earthquake risk management are to make, and apply, plans for reducing human losses and protecting property from earthquake hazards. Natural-risk managers study how to identify and manage the risk from an earthquake for highly populated urban areas and want to put strategic plans in place for this purpose; they need input from these kinds of studies. This study seeks to predict earthquake events as an input for the preparation of earthquake risk management strategy plans. A Bayesian approach to earthquake hazard rate estimation is studied, and the magnitudes of historical earthquakes are used to predict the probability of occurrence of major earthquakes.

  20. Research on the method of information system risk state estimation based on clustering particle filter

    Directory of Open Access Journals (Sweden)

    Cui Jia

    2017-05-01

    Full Text Available With the purpose of reinforcing the correlation analysis of risk assessment threat factors, a dynamic safety risk assessment method based on particle filtering is proposed, which takes threat analysis as its core. Based on risk assessment standards, the method selects threat indicators, applies a particle filtering algorithm to calculate the influence weights of the threat indicators, and determines information system risk levels by combining this with state estimation theory. In order to improve the computational efficiency of the particle filtering algorithm, the k-means clustering algorithm is introduced: all particles are clustered and each cluster centroid is used as the representative in subsequent operations, which reduces the amount of calculation. Empirical experience indicates that the method can reasonably capture the relations of mutual dependence and influence among risk elements. Under circumstances of limited information, it provides a scientific basis for formulating a risk management control strategy.

  1. Research on the method of information system risk state estimation based on clustering particle filter

    Science.gov (United States)

    Cui, Jia; Hong, Bei; Jiang, Xuepeng; Chen, Qinghua

    2017-05-01

    With the purpose of reinforcing the correlation analysis of risk assessment threat factors, a dynamic safety risk assessment method based on particle filtering is proposed, which takes threat analysis as its core. Based on risk assessment standards, the method selects threat indicators, applies a particle filtering algorithm to calculate the influence weights of the threat indicators, and determines information system risk levels by combining this with state estimation theory. In order to improve the computational efficiency of the particle filtering algorithm, the k-means clustering algorithm is introduced: all particles are clustered and each cluster centroid is used as the representative in subsequent operations, which reduces the amount of calculation. Empirical experience indicates that the method can reasonably capture the relations of mutual dependence and influence among risk elements. Under circumstances of limited information, it provides a scientific basis for formulating a risk management control strategy.
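
    A hedged, minimal sketch of the clustering idea described above: after the particle-filter update, particles are grouped by k-means and each weighted centroid stands in for its cluster. The scalar state and Gaussian observation model are placeholders, not the paper's threat-indicator model.

    ```python
    import numpy as np

    # Hedged minimal sketch of the clustering step: after the particle-filter
    # update, particles are grouped by k-means and each cluster is represented
    # by its weighted centroid, so later steps run on K representatives instead
    # of N particles. State/observation models are placeholder Gaussians.

    rng = np.random.default_rng(3)

    def kmeans_1d(x, k, iters=10):
        centers = x[rng.choice(len(x), k, replace=False)]
        for _ in range(iters):
            labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
            centers = np.array([x[labels == j].mean() if np.any(labels == j)
                                else centers[j] for j in range(k)])
        return centers, labels

    # One filtering step on a scalar "risk state".
    particles = rng.normal(0.5, 0.2, 1000)                   # predicted particles
    obs = 0.62                                               # observed threat index
    weights = np.exp(-0.5 * ((obs - particles) / 0.1) ** 2)
    weights /= weights.sum()
    centers, labels = kmeans_1d(particles, k=20)
    cluster_w = np.array([weights[labels == j].sum() for j in range(20)])
    print(round(float(np.sum(centers * cluster_w)), 3))      # weighted-centroid risk estimate
    ```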

  2. Estimating U.S. Methane Emissions from the Natural Gas Supply Chain. Approaches, Uncertainties, Current Estimates, and Future Studies

    Energy Technology Data Exchange (ETDEWEB)

    Heath, Garvin [Joint Inst. for Strategic Energy Analysis, Golden, CO (United States); Warner, Ethan [Joint Inst. for Strategic Energy Analysis, Golden, CO (United States); Steinberg, Daniel [Joint Inst. for Strategic Energy Analysis, Golden, CO (United States); Brandt, Adam [Stanford Univ., CA (United States)

    2015-08-01

    A growing number of studies have raised questions regarding uncertainties in our understanding of methane (CH4) emissions from fugitives and venting along the natural gas (NG) supply chain. In particular, a number of measurement studies have suggested that actual levels of CH4 emissions may be higher than estimated by EPA's U.S. GHG Emission Inventory. We reviewed the literature to identify these studies and the approaches, uncertainties, and current estimates they report.

  3. Estimation of newborn risk for child or adolescent obesity: lessons from longitudinal birth cohorts.

    Directory of Open Access Journals (Sweden)

    Anita Morandi

    Full Text Available OBJECTIVES: Prevention of obesity should start as early as possible after birth. We aimed to build clinically useful equations estimating the risk of later obesity in newborns, as a first step towards focused early prevention against the global obesity epidemic. METHODS: We analyzed the lifetime Northern Finland Birth Cohort 1986 (NFBC1986; N = 4,032) to draw predictive equations for childhood and adolescent obesity from traditional risk factors (parental BMI, birth weight, maternal gestational weight gain, behaviour and social indicators), and a genetic score built from 39 BMI/obesity-associated polymorphisms. We performed validation analyses in a retrospective cohort of 1,503 Italian children and in a prospective cohort of 1,032 U.S. children. RESULTS: In the NFBC1986, the cumulative accuracy of traditional risk factors predicting childhood obesity, adolescent obesity, and childhood obesity persistent into adolescence was good: AUROC = 0.78 [0.74-0.82], 0.75 [0.71-0.79] and 0.85 [0.80-0.90], respectively (all p < 0.001). Adding the genetic score produced discrimination improvements of ≤1%. The NFBC1986 equation for childhood obesity remained acceptably accurate when applied to the Italian and the U.S. cohorts (AUROC = 0.70 [0.63-0.77] and 0.73 [0.67-0.80], respectively), and the two additional equations for childhood obesity newly drawn from the Italian and the U.S. datasets showed good accuracy in their respective cohorts (AUROC = 0.74 [0.69-0.79] and 0.79 [0.73-0.84]) (all p < 0.001). The three equations for childhood obesity were converted into simple Excel risk calculators for potential clinical use. CONCLUSION: This study provides the first example of handy tools for predicting childhood obesity in newborns by means of easily recorded information, while it shows that currently known genetic variants have very little usefulness for such prediction.

  4. A methodology for estimating risks associated with landslides of contaminated soil into rivers.

    Science.gov (United States)

    Göransson, Gunnel; Norrman, Jenny; Larson, Magnus; Alén, Claes; Rosén, Lars

    2014-02-15

    Urban areas adjacent to surface water are exposed to soil movements such as erosion and slope failures (landslides). A landslide is a potential mechanism for mobilisation and spreading of pollutants. This mechanism is in general not included in environmental risk assessments for contaminated sites, and the consequences associated with contamination in the soil are typically not considered in landslide risk assessments. This study suggests a methodology to estimate the environmental risks associated with landslides in contaminated sites adjacent to rivers. The methodology is probabilistic and allows for datasets with large uncertainties and the use of expert judgements, providing quantitative estimates of probabilities for defined failures. The approach is illustrated by a case study along the river Göta Älv, Sweden, where failures are defined and probabilities for those failures are estimated. Failures are defined from a pollution perspective and in terms of exceeding environmental quality standards (EQSs) and acceptable contaminant loads. Models are then suggested to estimate probabilities of these failures. A landslide analysis is carried out to assess landslide probabilities based on data from a recent landslide risk classification study along the river Göta Älv. The suggested methodology is meant to be a supplement to either landslide risk assessment (LRA) or environmental risk assessment (ERA), providing quantitative estimates of the risks associated with landslide in contaminated sites. The proposed methodology can also act as a basis for communication and discussion, thereby contributing to intersectoral management solutions. From the case study it was found that the defined failures are governed primarily by the probability of a landslide occurring. The overall probabilities for failure are low; however, if a landslide occurs the probabilities of exceeding EQS are high and the probability of having at least a 10% increase in the contamination load

  5. RiD: A New Approach to Estimate the Insolvency Risk

    Directory of Open Access Journals (Sweden)

    Marco Aurélio dos Santos Sanfins

    2014-10-01

    Full Text Available Given the recent international crises and the increasing number of defaults, several researchers have attempted to develop metrics that calculate the probability of insolvency with higher accuracy. The approaches commonly used, however, consider neither the credit risk nor the severity of the distance between receivables and obligations across different periods. In this paper we mathematically present an approach that allows us to estimate the insolvency risk by considering not only future receivables and obligations, but also the severity of the distance between them and the quality of the respective receivables. Using Monte Carlo simulations and hypothetical examples, we show that our metric is able to estimate the insolvency risk with high accuracy. Moreover, our results suggest that in the absence of a smooth distribution between receivables and obligations, there is a non-null insolvency risk even when the present value of the receivables is larger than the present value of the obligations.
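
    A hedged sketch in the spirit of the approach above (the RiD metric itself is not reproduced): scheduled receivables are thinned by a credit-quality default probability, and a path counts as insolvent if its cumulative balance ever turns negative. All figures are hypothetical.

    ```python
    import numpy as np

    # Hedged Monte Carlo sketch of an insolvency-probability estimate: per
    # period, realized receivables are the scheduled amounts thinned by a
    # default probability reflecting receivable quality; a path is "insolvent"
    # if its cumulative cash balance ever turns negative. Figures are hypothetical.

    rng = np.random.default_rng(4)

    def insolvency_probability(receivables, obligations, default_prob, n_paths=100_000):
        receivables = np.asarray(receivables, float)   # scheduled inflow per period
        obligations = np.asarray(obligations, float)   # scheduled outflow per period
        collected = receivables * rng.binomial(1, 1.0 - default_prob,
                                               size=(n_paths, len(receivables)))
        balance = np.cumsum(collected - obligations, axis=1)
        return np.mean((balance < 0).any(axis=1))

    print(insolvency_probability([120, 80, 100], [90, 95, 90], default_prob=0.15))
    ```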

  6. Carrying-over toxicokinetic model uncertainty into cancer risk estimates. The TCDD example

    Energy Technology Data Exchange (ETDEWEB)

    Edler, L. [Division of Biostatistics, German Cancer Research Center, Heidelberg (Germany); Heinzl, H.; Mittlboeck, M. [Medical Univ. of Vienna (Austria). Dept. of Medical Computer Sciences

    2004-09-15

    Estimation of human cancer risks depends on the assessment of exposure to the investigated hazardous compound as well as on its toxicokinetics and toxicodynamics in the body. Modeling these processes constitutes a basic prerequisite for any quantitative risk assessment, including assessment of the uncertainty of risk estimates. Obviously, the modeling process itself is part of the risk assessment task, and it affects the development of valid risk estimates. Due to the wealth of information available on exposure and effects in humans and animals, 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) provides an excellent example for elaborating methods that allow a quantitative analysis of the uncertainty of TCDD risk estimates, and that show how toxicokinetic model uncertainty carries over to uncertainty in the risk estimates and in the dose-response relationship. Cancer is usually considered a slowly evolving disease: an increase in TCDD dose may not result in an increase of the observable cancer response until some latency period has elapsed. This fact needs careful consideration when a dose-response relationship is to be established. Toxicokinetic models are capable of reconstructing TCDD exposure concentrations over a lifetime, such that time-dependent TCDD dose metrics like the area under the concentration-time curve (AUC) can be constructed for each individual cohort member. Two potentially crucial model assumptions for estimating the exposure of a person are the assumption of lifetime constancy of the total lipid volume (TLV) of the human body and the assumption of a simple linear kinetic of TCDD elimination. In 1995 a modified Michaelis-Menten kinetic (also known as the Carrier kinetic) was suggested to link the TCDD elimination rate to the available TCDD amount in the body. That is, TCDD elimination would be faster, of nearly the same rate, or slower under this kinetic than under a simple linear kinetic when the individual is highly, moderately, or slightly
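
    A hedged sketch contrasting the two elimination assumptions discussed above for a body burden with constant intake; Euler integration and hypothetical parameters, with the AUC of the body burden as the kind of dose metric fed into the risk model.

    ```python
    import numpy as np

    # Hedged sketch contrasting the two elimination assumptions for a body
    # burden A(t) with constant intake I:
    #   linear:            dA/dt = I - k * A
    #   Michaelis-Menten:  dA/dt = I - Vmax * A / (Km + A)   (Carrier-type)
    # Euler integration, hypothetical parameters; the AUC of A(t) is the kind
    # of dose metric used downstream.

    def simulate(elimination, intake=0.3, years=30, dt=0.01):
        t = np.arange(0.0, years, dt)
        a = np.zeros_like(t)
        for i in range(1, len(t)):
            a[i] = a[i - 1] + dt * (intake - elimination(a[i - 1]))
        return t, a

    k, vmax, km = 0.1, 0.6, 4.0                       # hypothetical yearly-rate parameters
    t, a_lin = simulate(lambda a: k * a)
    _, a_mm = simulate(lambda a: vmax * a / (km + a))
    print(np.trapz(a_lin, t), np.trapz(a_mm, t))      # AUC under each kinetic assumption
    ```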

  7. FN-curves: preliminary estimation of severe accident risks after Fukushima

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Vanderley de; Soares, Wellington Antonio; Costa, Antonio Carlos Lopes da, E-mail: vasconv@cdtn.br, E-mail: soaresw@cdtn.br, E-mail: aclc@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2015-07-01

    Doubts about whether the risks related to severe accidents in nuclear reactors are indeed very low were raised after the nuclear accident at Fukushima Daiichi in 2011. Risk estimation for severe accidents in nuclear power plants involves both probability and consequence assessment of such events. Among the ways to display risks, risk curves are tools that express the frequency of exceeding a certain magnitude of consequence. Societal risk is often represented graphically in an FN-curve, a type of risk curve which displays the probability of having N or more fatalities per year, as a function of N, on a double logarithmic scale. The FN-curve, originally introduced for the assessment of risks in the nuclear industry through the U.S. NRC Reactor Safety Study WASH-1400 (1975), is used in various countries to express and limit the risks of hazardous activities. That first study estimated an expected core damage rate of 5 × 10^-5 per reactor-year and suggested an upper bound of 3 × 10^-4 per reactor-year. A more recent report issued by the Electric Power Research Institute - EPRI (2008) estimates a figure of the order of 2 × 10^-5 per reactor-year. The Fukushima nuclear accident apparently implies that the observed core damage frequency is higher than that predicted by these probabilistic safety assessments. Therefore, this paper presents a preliminary analysis of the FN-curves related to severe nuclear reactor accidents, taking into account a combination of available data on past accidents, probability modelling to estimate frequencies, and expert judgments. (author)
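
    A hedged sketch of how an FN-curve is assembled from a scenario list: for each consequence level N, F(N) is the summed annual frequency of scenarios with N or more fatalities, plotted on a double-logarithmic scale. The scenario frequencies below are illustrative only, not estimates for any actual plant.

    ```python
    # Hedged sketch of FN-curve construction: F(N) is the summed annual
    # frequency of all scenarios with N or more fatalities. Scenario list is
    # illustrative only.

    scenarios = [  # (annual frequency, fatalities) -- hypothetical
        (2e-5, 10), (5e-6, 100), (1e-6, 1000), (2e-7, 10000),
    ]

    def fn_curve(scenarios, n_values):
        return [(n, sum(f for f, fat in scenarios if fat >= n)) for n in n_values]

    for n, f_exceed in fn_curve(scenarios, [10, 100, 1000, 10000]):
        print(f"N >= {n:>6d}: F = {f_exceed:.1e} per year")
    ```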

  8. Estimation of the value-at-risk parameter: Econometric analysis and the extreme value theory approach

    Directory of Open Access Journals (Sweden)

    Mladenović Zorica

    2006-01-01

    Full Text Available In this paper different aspects of value-at-risk estimation are considered. Daily returns of the CISCO, INTEL and NASDAQ stock indices are analyzed for the period September 1996 - September 2006. Methods that incorporate the time-varying variability and heavy tails of the empirical distributions of returns are implemented. The main finding of the paper is that standard econometric methods underestimate the value-at-risk parameter if the heavy tails of the empirical distribution are not explicitly taken into account.
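
    A hedged sketch of the comparison made above: a 99% one-day VaR from the empirical return distribution versus a Gaussian approximation, which tends to understate VaR for heavy-tailed returns. The returns are synthetic Student-t draws, not the analyzed series, and the paper's time-varying-volatility and extreme value machinery is not reproduced.

    ```python
    import numpy as np
    from scipy.stats import norm

    # Hedged sketch: historical-simulation VaR (empirical quantile) versus a
    # Gaussian approximation on synthetic heavy-tailed returns. The Gaussian
    # figure is typically lower, illustrating the underestimation discussed above.

    rng = np.random.default_rng(5)
    returns = 0.02 * rng.standard_t(df=3, size=2500)          # synthetic daily returns

    alpha = 0.99
    var_hist = -np.quantile(returns, 1 - alpha)                # empirical quantile VaR
    var_norm = -(returns.mean() + returns.std(ddof=1) * norm.ppf(1 - alpha))
    print(f"historical VaR {var_hist:.4f}  vs  normal VaR {var_norm:.4f}")
    ```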

  9. Thromboembolic risk in 16 274 atrial fibrillation patients undergoing direct current cardioversion with and without oral anticoagulant therapy

    DEFF Research Database (Denmark)

    Hansen, Morten Lock; Jepsen, Rikke Malene H G; Olesen, Jonas Bjerring;

    2015-01-01

    AIMS: To study the risk of thromboembolism in a nationwide cohort of atrial fibrillation patients undergoing direct current (DC) cardioversion with or without oral anticoagulant coverage. METHODS AND RESULTS: A retrospective study of 16 274 patients in Denmark discharged from hospital after a first-time DC cardioversion for atrial fibrillation between 2000 and 2008. Use of oral anticoagulant therapy within 90 days prior and 360 days after DC cardioversion was obtained from the Danish Register of Medicinal Product Statistics. The risk of thromboembolism was estimated by calculating incidence rates and by multivariable-adjusted Cox proportional-hazard models. During the initial 30 days following discharge, the thromboembolic incidence rate was 10.33 per 100 patient-years for the no prior oral anticoagulant therapy group [n = 5084 (31.2%)], as compared with 4.00 per 100 patient-years for the prior oral

  10. Urban micro-scale flood risk estimation with parsimonious hydraulic modelling and census data

    Directory of Open Access Journals (Sweden)

    C. Arrighi

    2013-05-01

    Full Text Available The adoption of 2007/60/EC Directive requires European countries to implement flood hazard and flood risk maps by the end of 2013. Flood risk is the product of flood hazard, vulnerability and exposure, all three to be estimated with comparable level of accuracy. The route to flood risk assessment is consequently much more than hydraulic modelling of inundation, that is hazard mapping. While hazard maps have already been implemented in many countries, quantitative damage and risk maps are still at a preliminary level. A parsimonious quasi-2-D hydraulic model is here adopted, having many advantages in terms of easy set-up. It is here evaluated as being accurate in flood depth estimation in urban areas with a high-resolution and up-to-date Digital Surface Model (DSM. The accuracy, estimated by comparison with marble-plate records of a historic flood in the city of Florence, is characterized in the downtown's most flooded area by a bias of a very few centimetres and a determination coefficient of 0.73. The average risk is found to be about 14 € m−2 yr−1, corresponding to about 8.3% of residents' income. The spatial distribution of estimated risk highlights a complex interaction between the flood pattern and the building characteristics. As a final example application, the estimated risk values have been used to compare different retrofitting measures. Proceeding through the risk estimation steps, a new micro-scale potential damage assessment method is proposed. This is based on the georeferenced census system as the optimal compromise between spatial detail and open availability of socio-economic data. The results of flood risk assessment at the census section scale resolve most of the risk spatial variability, and they can be easily aggregated to whatever upper scale is needed given that they are geographically defined as contiguous polygons. Damage is calculated through stage–damage curves, starting from census data on building type and
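
    A hedged sketch of the final risk-estimation step: expected annual damage as the integral of damage over annual exceedance probability, approximated with a trapezoid rule over a few modelled return periods. Depths would come from the hydraulic model and damages from stage-damage curves; all numbers below are hypothetical, not the Florence results.

    ```python
    import numpy as np

    # Hedged sketch of expected annual damage (EAD): integrate damage over
    # annual exceedance probability, approximated by a trapezoid rule over a
    # few modelled return periods. All figures are hypothetical.

    return_periods = np.array([30.0, 100.0, 200.0, 500.0])    # years
    damage_eur_per_m2 = np.array([5.0, 60.0, 120.0, 220.0])   # from stage-damage curves

    p_exceed = 1.0 / return_periods
    order = np.argsort(p_exceed)                               # integrate over increasing p
    ead = np.trapz(damage_eur_per_m2[order], p_exceed[order])  # EUR per m2 per year
    print(round(float(ead), 2))
    ```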

  11. Estimated risk of radiation-induced cancer from paediatric chest CT: two-year cohort study

    Energy Technology Data Exchange (ETDEWEB)

    Niemann, Tilo [Cantonal Hospital Baden, Department of Radiology, Baden (Switzerland); University Lille Nord de France, Department of Thoracic Imaging, Hospital Calmette, Lille (France); Colas, Lucie; Santangelo, Teresa; Faivre, Jean Baptiste; Remy, Jacques; Remy-Jardin, Martine [University Lille Nord de France, Department of Thoracic Imaging, Hospital Calmette, Lille (France); Roser, Hans W.; Bremerich, Jens [University of Basel Hospital, Clinic of Radiology and Nuclear Medicine, Medical Physics, Basel (Switzerland)

    2015-03-01

    The increasing absolute number of paediatric CT scans raises concern about the safety and efficacy and the effects of consecutive diagnostic ionising radiation. To demonstrate a method to evaluate the lifetime attributable risk of cancer incidence/mortality due to a single low-dose helical chest CT in a two-year patient cohort. A two-year cohort of 522 paediatric helical chest CT scans acquired using a dedicated low-dose protocol were analysed retrospectively. Patient-specific estimations of radiation doses were modelled using three different mathematical phantoms. Per-organ attributable cancer risk was then estimated using epidemiological models. Additional comparison was provided for naturally occurring risks. Total lifetime attributable risk of cancer incidence remains low for all age and sex categories, being highest in female neonates (0.34%). Summation of all cancer sites analysed raised the relative lifetime attributable risk of organ cancer incidence up to 3.6% in female neonates and 2.1% in male neonates. Using dedicated scan protocols, total lifetime attributable risk of cancer incidence and mortality for chest CT is estimated low for paediatric chest CT, being highest for female neonates. (orig.)

  12. Estimating the sizes of populations at high risk for HIV: a comparison study.

    Directory of Open Access Journals (Sweden)

    Liwei Jing

    Full Text Available OBJECTIVES: Behavioral interventions are effective strategies for HIV/AIDS prevention and control. However, implementation of such strategies relies heavily on the accurate estimation of the high-risk population size. The multiplier method and generalized network scale-up method were recommended to estimate the population size of those at high risk for HIV by UNAIDS/WHO in 2003 and 2010, respectively. This study aims to assess and compare the two methods for estimating the size of populations at high risk for HIV, and to provide practical guidelines and suggestions for implementing the two methods. METHODS: Studies of the multiplier method used to estimate the population prevalence of men who have sex with men in China published between July 1, 2003 and July 1, 2013 were reviewed. The generalized network scale-up method was applied to estimate the population prevalence of men who have sex with men in the urban district of Taiyuan, China. RESULTS: The median estimate across studies using the multiplier method to estimate the population prevalence of men who have sex with men in China was 4-8 times lower than the national-level estimate. Meanwhile, the estimate obtained with the generalized network scale-up method fell within the range of the national-level estimate. CONCLUSIONS: When high-quality existing data are not readily available, the multiplier method frequently yields underestimated results. We thus suggest that the generalized network scale-up method is preferred when sampling frames for the general population and accurate demographic information are available.
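For readers unfamiliar with the two estimators being compared, the following is a minimal sketch of their basic forms; the function names and every number are hypothetical and do not reproduce the study's data.

```python
# Minimal sketch of the two population-size estimators discussed above.

def multiplier_estimate(benchmark_count, proportion_on_benchmark):
    """Multiplier method: people on a benchmark listing (e.g. a clinic register)
    divided by the share of the hidden population that reports being on it."""
    return benchmark_count / proportion_on_benchmark

def network_scale_up_estimate(reported_hidden_contacts, network_sizes, total_population):
    """Basic network scale-up estimator: pooled ratio of alters in the hidden
    group to total personal network size, scaled to the whole population."""
    return sum(reported_hidden_contacts) / sum(network_sizes) * total_population

print(multiplier_estimate(benchmark_count=1200, proportion_on_benchmark=0.15))    # ~8000
print(network_scale_up_estimate([2, 0, 1, 3], [250, 310, 180, 400], 3_000_000))   # ~15789
```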

  13. An analysis of risk markers in husband to wife violence: the current state of knowledge.

    Science.gov (United States)

    Hotaling, G T; Sugarman, D B

    1986-01-01

    The present review involves the evaluation of 97 potential risk markers of husband to wife violence. Using 52 case-comparison studies as the source of data, markers were divided into four categories: consistent risk, inconsistent risk, consistent nonrisk, and risk markers with insufficient data. Based on this classification, it appears that a number of widely held hypotheses about husband to wife violence have little empirical support. Only witnessing violence in the wife's family of origin was consistently associated with being victimized by violence. Furthermore, it seems that characteristics associated with either the husband-offender or the couple have greater utility for assessing the risk of husband to wife violence than characteristics of the wife-victim. Findings are discussed in terms of the methodological and theoretical implications of current research on this form of adult domestic violence.

  14. Identifying patient preferences for communicating risk estimates: A descriptive pilot study

    Directory of Open Access Journals (Sweden)

    O'Connor Annette M

    2001-08-01

    Full Text Available Abstract Background Patients increasingly seek more active involvement in health care decisions, but little is known about how to communicate complex risk information to patients. The objective of this study was to elicit patient preferences for the presentation and framing of complex risk information. Method To accomplish this, eight focus group discussions and 15 one-on-one interviews were conducted, where women were presented with risk data in a variety of different graphical formats, metrics, and time horizons. Risk data were based on a hypothetical woman's risk for coronary heart disease, hip fracture, and breast cancer, with and without hormone replacement therapy. Participants' preferences were assessed using Likert scales, ranking, and abstractions of focus group discussions. Results Forty peri- and postmenopausal women were recruited through hospital fliers (n = 25 and a community health fair (n = 15. Mean age was 51 years, 50% were non-Caucasian, and all had completed high school. Bar graphs were preferred by 83% of participants over line graphs, thermometer graphs, 100 representative faces, and survival curves. Lifetime risk estimates were preferred over 10- or 20-year horizons, and absolute risks were preferred over relative risks and number needed to treat. Conclusion Although there are many different formats for presenting and framing risk information, simple bar charts depicting absolute lifetime risk were rated and ranked highest overall for patient preferences for format.

  15. A case-control study estimating accident risk for alcohol, medicines and illegal drugs.

    Directory of Open Access Journals (Sweden)

    Kim Paula Colette Kuypers

    Full Text Available BACKGROUND: The aim of the present study was to assess the risk of having a traffic accident after using alcohol, single drugs, or a combination, and to determine the concentrations at which this risk is significantly increased. METHODS: A population-based case-control study was carried out, collecting whole blood samples of both cases and controls, in which a number of drugs were detected. The risk of having an accident when under the influence of drugs was estimated using logistic regression adjusting for gender, age and time period of accident (cases)/sampling (controls). The main outcome measures were odds ratios (OR) for accident risk associated with single and multiple drug use. In total, 337 cases (negative: 176; positive: 161) and 2726 controls (negative: 2425; positive: 301) were included in the study. RESULTS: Main findings were that (1) alcohol in general (all concentrations together) caused an elevated crash risk; (2) cannabis in general also caused an increase in accident risk: at a cut-off of 2 ng/mL THC the risk of having an accident was four times the risk associated with the lowest THC concentrations; (3) when ranking the adjusted ORs from lowest to highest risk, alcohol alone or in combination with other drugs was related to a very elevated crash risk, with the highest risk for stimulants combined with sedatives. CONCLUSION: The study demonstrated a concentration-dependent crash risk for THC positive drivers. Alcohol and alcohol-drug combinations are by far the most prevalent substances in drivers and subsequently pose the largest risk in traffic, both in terms of risk and scope.
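The adjusted odds ratios described above come from a standard logistic regression of case status on substance measures plus the named confounders (gender, age, time period). Below is a hedged sketch of that kind of model with entirely synthetic data; the variable names, distributions and effect sizes are illustrative only and do not reproduce the study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical case-control data: 'case' = 1 for crash-involved drivers,
# 'thc' and 'alcohol' = detected concentrations, plus the confounders named above.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "case":    rng.integers(0, 2, n),
    "thc":     rng.exponential(1.0, n),   # ng/mL, illustrative
    "alcohol": rng.exponential(0.2, n),   # g/L, illustrative
    "age":     rng.integers(18, 75, n),
    "male":    rng.integers(0, 2, n),
    "night":   rng.integers(0, 2, n),     # stand-in for time period of accident/sampling
})

# Logistic regression adjusted for gender, age and time period; exponentiated
# coefficients are the adjusted odds ratios for accident involvement.
model = smf.logit("case ~ thc + alcohol + age + male + night", data=df).fit(disp=False)
print(np.exp(model.params))      # adjusted ORs
print(np.exp(model.conf_int()))  # 95% confidence intervals
```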

  16. Risk Estimates and Risk Factors Related to Psychiatric Inpatient Suicide-An Overview

    DEFF Research Database (Denmark)

    Madsen, Trine; Erlangsen, Annette; Nordentoft, Merete

    2017-01-01

    admission. Most studies are based on low power, thus compromising quality and generalisability. The few studies with sufficient statistical power mainly identified non-modifiable risk predictors such as male gender, diagnosis, or recent deliberate self-harm. Also, the predictive value of these predictors...

  17. Estimation of the optimal statistical quality control sampling time intervals using a residual risk measure.

    Directory of Open Access Journals (Sweden)

    Aristides T Hatjimihail

    Full Text Available BACKGROUND: An open problem in clinical chemistry is the estimation of the optimal sampling time intervals for the application of statistical quality control (QC) procedures that are based on the measurement of control materials. This is a probabilistic risk assessment problem that requires reliability analysis of the analytical system, and the estimation of the risk caused by the measurement error. METHODOLOGY/PRINCIPAL FINDINGS: Assuming that the states of the analytical system are the reliability state, the maintenance state, the critical-failure modes and their combinations, we can define risk functions based on the mean time of the states, their measurement error and the medically acceptable measurement error. Consequently, a residual risk measure rr can be defined for each sampling time interval. The rr depends on the state probability vectors of the analytical system, the state transition probability matrices before and after each application of the QC procedure and the state mean time matrices. Optimal sampling time intervals can then be defined as those minimizing a QC-related cost measure while keeping the rr acceptable. I developed an algorithm that estimates the rr for any QC sampling time interval of a QC procedure applied to analytical systems with an arbitrary number of critical-failure modes, assuming any failure time and measurement error probability density function for each mode. Furthermore, given the acceptable rr, it can estimate the optimal QC sampling time intervals. CONCLUSIONS/SIGNIFICANCE: It is possible to rationally estimate the optimal QC sampling time intervals of an analytical system to sustain an acceptable residual risk with the minimum QC-related cost. For the optimization, the reliability analysis of the analytical system and the risk analysis of the measurement error are needed.

  18. Using expert judgment to estimate marine ecosystem vulnerability in the California Current.

    Science.gov (United States)

    Teck, Sarah J; Halpern, Benjamin S; Kappel, Carrie V; Micheli, Fiorenza; Selkoe, Kimberly A; Crain, Caitlin M; Martone, Rebecca; Shearer, Christine; Arvai, Joe; Fischhoff, Baruch; Murray, Grant; Neslo, Rabin; Cooke, Roger

    2010-07-01

    As resource management and conservation efforts move toward multi-sector, ecosystem-based approaches, we need methods for comparing the varying responses of ecosystems to the impacts of human activities in order to prioritize management efforts, allocate limited resources, and understand cumulative effects. Given the number and variety of human activities affecting ecosystems, relatively few empirical studies are adequately comprehensive to inform these decisions. Consequently, management often turns to expert judgment for information. Drawing on methods from decision science, we offer a method for eliciting expert judgment to (1) quantitatively estimate the relative vulnerability of ecosystems to stressors, (2) help prioritize the management of stressors across multiple ecosystems, (3) evaluate how experts give weight to different criteria to characterize vulnerability of ecosystems to anthropogenic stressors, and (4) identify key knowledge gaps. We applied this method to the California Current region in order to evaluate the relative vulnerability of 19 marine ecosystems to 53 stressors associated with human activities, based on surveys from 107 experts. When judging the relative vulnerability of ecosystems to stressors, we found that experts primarily considered two criteria: the ecosystem's resistance to the stressor and the number of species or trophic levels affected. Four intertidal ecosystems (mudflat, beach, salt marsh, and rocky intertidal) were judged most vulnerable to the suite of human activities evaluated here. The highest vulnerability rankings for coastal ecosystems were invasive species, ocean acidification, sea temperature change, sea level rise, and habitat alteration from coastal engineering, while offshore ecosystems were assessed to be most vulnerable to ocean acidification, demersal destructive fishing, and shipwrecks. These results provide a quantitative, transparent, and repeatable assessment of relative vulnerability across ecosystems to

  19. The impact of individualised cardiovascular disease (CVD) risk estimates and lifestyle advice on physical activity in individuals at high risk of CVD: a pilot 2 × 2 factorial understanding risk trial

    Directory of Open Access Journals (Sweden)

    Griffin Simon J

    2008-07-01

    Full Text Available Abstract Background There is currently much interest in encouraging individuals to increase physical activity in order to reduce CVD risk. This study has been designed to determine if personalised CVD risk appreciation can increase physical activity in adults at high risk of CVD. Methods/Design In a 2 × 2 factorial design participants are allocated at random to a personalised 10-year CVD risk estimate or numerical CVD risk factor values (systolic blood pressure, LDL cholesterol and fasting glucose) and, simultaneously, to receive or not receive a brief lifestyle advice intervention targeting physical activity, diet and smoking cessation. We aim to recruit 200 participants from Oxfordshire primary care practices. Eligibility criteria include adults age 40–70 years, estimated 10-year CVD risk ≥20%, ability to read and write English, no known CVD and no physical disability or other condition reducing the ability to walk. Primary outcome is physical activity measured by ActiGraph accelerometer, with biochemical, anthropometrical and psychological measures as additional outcomes. Primary analysis is the between-group difference in physical activity at one month, powered to detect a difference of 30,000 total counts per day between the groups. Additional analyses will seek to further elucidate the relationship between the provision of risk information and intention to change behaviour, and to determine the impact of both interventions on clinical and anthropometrical measures including fasting and 2 hour plasma glucose, fructosamine, serum cotinine, plasma vitamin C, body fat percentage and blood pressure. Discussion This is a pilot trial seeking to demonstrate, in a real-world setting, proof of principle that provision of personalised risk information can contribute to behaviour changes aimed at reducing CVD risk. This study will increase our understanding of the links between the provision of risk information and behaviour change and if

  20. Estimated risk from exposure to radon decay products in US homes

    Energy Technology Data Exchange (ETDEWEB)

    Nero, A.V. Jr.

    1986-05-01

    Recent analyses now permit direct estimation of the risks of lung cancer from radon decay products in US homes. Analysis of data from indoor monitoring in single-family homes yields a tentative frequency distribution of annual-average ²²²Rn concentrations averaging 55 Bq m⁻³ and having 2% of homes exceeding 300 Bq m⁻³. Application of the results of occupational epidemiological studies, either directly or using recent advances in lung dosimetry, to indoor exposures suggests that the average indoor concentration entails a lifetime risk of lung cancer of 0.3% or about 10% of the total risk of lung cancer. The risk to individuals occupying the homes with 300 Bq m⁻³ or more for their lifetimes is estimated to exceed 2%, with risks from the homes with thousands of Bq m⁻³ correspondingly higher, even exceeding the total risk of premature death due to cigarette smoking. The potential for such average and high-level risks in ordinary homes forces development of a new perspective on environmental exposures.
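As a purely illustrative exercise, the two summary figures quoted above (a mean of 55 Bq m⁻³ with 2% of homes above 300 Bq m⁻³) can be translated into the parameters of a lognormal concentration distribution, the form commonly assumed for indoor radon. The sketch below solves for those parameters numerically; it is a consistency check on the quoted numbers, not the analysis performed in the report.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import lognorm, norm

# Fit a lognormal to the figures quoted above: arithmetic mean 55 Bq/m3,
# with 2% of homes exceeding 300 Bq/m3.
mean_bq, threshold, frac_above = 55.0, 300.0, 0.02

def mismatch(sigma):
    mu = np.log(mean_bq) - 0.5 * sigma**2          # pins the arithmetic mean
    return norm.sf((np.log(threshold) - mu) / sigma) - frac_above

sigma = brentq(mismatch, 0.3, 2.5)                 # solve for the log-scale spread
mu = np.log(mean_bq) - 0.5 * sigma**2
dist = lognorm(s=sigma, scale=np.exp(mu))

print(f"geometric mean ~ {np.exp(mu):.0f} Bq/m3, geometric SD ~ {np.exp(sigma):.1f}")
print(f"check: P(C > 300 Bq/m3) = {dist.sf(threshold):.3f}")
```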

  1. Caries risk estimation in children regarding values of saliva buffer system components and carboanhydrase activity

    Directory of Open Access Journals (Sweden)

    Šurdilović Dušan

    2008-01-01

    Full Text Available Background/Aim. One of the preconditions for efficacious systematic reduction of caries prevalence and prophylaxis is the determination of risks of this disease appearance. The aim of this study was to prove the significance of salivary carboanhydrase activity determination in estimation of caries risk in children. Methods. The study included 123 children of average age of 13.4±0.3 years and permanent dentition. The children were divided into two groups according to caries risk (low and high caries risk groups). Two samples of saliva - unstimulated and stimulated - were taken from each child. Salivary carboanhydrase activity, as well as pH value, bicarbonate and phosphate buffer levels, were estimated in both groups of saliva samples. Results. The investigation showed significantly higher carboanhydrase activity (p < 0.001) in both saliva samples in the low caries risk group compared to the high caries risk one. In children with low caries risk, both unstimulated and stimulated saliva showed significantly higher bicarbonate and phosphate buffer concentrations (p < 0.001), as well as pH values. Conclusion. The lower caries incidence could be expected in children with high carboanhydrase activity and higher salivary buffer system parameter levels. The presented results suggest that salivary carboanhydrase activity represents an important marker of individual susceptibility for caries appearance in children.

  2. Estimates for the Finite-time Ruin Probability with Insurance and Financial Risks

    Institute of Scientific and Technical Information of China (English)

    Min ZHOU; Kai-yong WANG; Yue-bao WANG

    2012-01-01

    The paper gives estimates for the finite-time ruin probability with insurance and financial risks. When the distribution of the insurance risk belongs to the class L(γ) for some γ > 0 or to the subexponential distribution class, we obtain some asymptotic equivalent relationships for the finite-time ruin probability, respectively. When the distribution of the insurance risk belongs to the dominated varying-tailed distribution class, we obtain an asymptotic upper bound and lower bound for the finite-time ruin probability, where for the asymptotic upper bound we completely remove the restriction of mutual independence on insurance risks, and for the lower bound we only need the insurance risks to have a weak positive association structure. The obtained results extend and improve some existing results.

  3. Dynamic Estimation of Volatility Risk Premia and Investor Risk Aversion from Option-Implied and Realized Volatilities

    DEFF Research Database (Denmark)

    Bollerslev, Tim; Gibson, Michael; Zhou, Hao

    This paper proposes a method for constructing a volatility risk premium, or investor risk aversion, index. The method is intuitive and simple to implement, relying on the sample moments of the recently popularized model-free realized and option-implied volatility measures. A small-scale Monte Carlo experiment confirms that the procedure works well in practice. Implementing the procedure with actual S&P500 option-implied volatilities and high-frequency five-minute-based realized volatilities indicates significant temporal dependencies in the estimated stochastic volatility risk premium, which we in turn relate to a set of macro-finance state variables. We also find that the extracted volatility risk premium helps predict future stock market returns.
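The basic ingredient described above is the gap between option-implied variance and realized variance built from high-frequency returns. A minimal sketch of that construction is shown below; the function names are hypothetical, the rolling one-month window is an assumption, and the commented usage only indicates how such inputs might be wired together, not the authors' exact procedure.

```python
import numpy as np
import pandas as pd

def realized_variance(intraday_prices: pd.Series) -> float:
    """Daily realized variance: sum of squared intraday (e.g. 5-minute) log returns."""
    log_ret = np.log(intraday_prices).diff().dropna()
    return float((log_ret ** 2).sum())

def volatility_risk_premium(implied_var_annual: float,
                            daily_realized_var: pd.Series,
                            trading_days: int = 252) -> pd.Series:
    """Simple premium proxy: option-implied variance minus annualized
    realized variance averaged over a trailing one-month window."""
    realized_annual = daily_realized_var.rolling(21).mean() * trading_days
    return implied_var_annual - realized_annual

# Hypothetical usage: 'prices_by_day' maps dates to intraday price series and
# 'iv2' is a squared option-implied volatility quote (e.g. a VIX-style index / 100).
# rv = pd.Series({d: realized_variance(p) for d, p in prices_by_day.items()})
# vrp = volatility_risk_premium(iv2, rv)
```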

  4. Correction: No Child Left Alone: Moral Judgments about Parents Affect Estimates of Risk to Children

    Directory of Open Access Journals (Sweden)

    Ashley J. Thomas

    2016-10-01

    Full Text Available This article details a correction to the article: Thomas, A. J., Stanford, P. K., & Sarnecka, B. W. (2016). No Child Left Alone: Moral Judgments about Parents Affect Estimates of Risk to Children. Collabra, 2(1), 10. DOI: http://doi.org/10.1525/collabra.33

  5. Uniform estimate for maximum of randomly weighted sums with applications to insurance risk theory

    Institute of Scientific and Technical Information of China (English)

    WANG Dingcheng; SU Chun; ZENG Yong

    2005-01-01

    This paper obtains a uniform estimate for the maximum of sums of independent and heavy-tailed random variables with nonnegative random weights, which can be arbitrarily dependent on each other. Applications to ruin probabilities in a discrete-time risk model with dependent stochastic returns are then considered.

  6. A simultaneous approach to the estimation of risk aversion and the subjective time discount rate

    NARCIS (Netherlands)

    A.S. Booij; B.M.S. van Praag

    2009-01-01

    In this paper we analyze a sample of 1832 individuals who responded to six randomly generated lottery questions that differ with respect to chance, prize and the timing of the draw. Using a model that explicitly allows for consumption smoothing, we obtain an estimate of relative risk aversion of 82.

  7. A-BOMB SURVIVOR SITE-SPECIFIC RADIOGENIC CANCER RISKS ESTIMATES

    Science.gov (United States)

    A draft manuscript is being prepared that describes ways to improve estimates of risk from radiation that have been derived from A-bomb survivors. The work has been published in the journal Radiation Research volume 169, pages 87-98.

  8. Artificial force fields for multi-agent simulations of maritime traffic and risk estimation

    NARCIS (Netherlands)

    Xiao, F.; Ligteringen, H.; Van Gulijk, C.; Ale, B.J.M.

    2012-01-01

    A probabilistic risk model is designed to estimate probabilities of collisions for shipping accidents in busy waterways. We propose a method based on multi-agent simulation that uses an artificial force field to model ship maneuvers. The artificial force field is calibrated by AIS data (Automatic Identification System).

  9. Estimating Systematic Risk With Long-Term Growth Forecasts and Analyst Following

    OpenAIRE

    Richard J. Dowen

    1992-01-01

    According to the capital asset pricing model, a stock's required rate of return is determined by systematic risk, otherwise known as beta. Usually betas are estimated using historic data. It is shown here that future betas have a stronger relationship to analysts' long-term growth forecasts than to historic betas.
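For reference, the historic beta mentioned above is simply the slope of a regression of the stock's returns on the market's returns. The sketch below shows that calculation with made-up return series; it illustrates the estimator only, not the paper's data or forecast-based alternative.

```python
import numpy as np

def estimate_beta(stock_returns: np.ndarray, market_returns: np.ndarray) -> float:
    """Historical beta: slope of the regression of stock returns on market
    returns, i.e. cov(stock, market) / var(market)."""
    cov = np.cov(stock_returns, market_returns, ddof=1)
    return cov[0, 1] / cov[1, 1]

# Illustrative monthly excess returns (hypothetical numbers)
market = np.array([0.012, -0.034, 0.021, 0.008, -0.015, 0.027])
stock  = np.array([0.018, -0.050, 0.030, 0.010, -0.022, 0.035])
print(f"estimated beta: {estimate_beta(stock, market):.2f}")
```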

  10. Estimating adolescent risk for hearing loss based on data from a large school-based survey

    NARCIS (Netherlands)

    I. Vogel (Ineke); H. Verschuure (Hans); C.P.B. van der Ploeg (Catharina); J. Brug (Hans); H. Raat (Hein)

    2010-01-01

    textabstractObjectives. We estimated whether and to what extent a group of adolescents were at risk of developing permanent hearing loss as a result of voluntary exposure to high-volume music, and we assessed whether such exposure was associated with hearing-related symptoms. Methods. In 2007, 1512

  11. Estimating adolescent risk for hearing loss based on data from a large school-based survey

    NARCIS (Netherlands)

    Vogel, L.; Verschuure, H.; Ploeg, C.P.B. van der; Brug, J.; Raat, H.

    2010-01-01

    Objectives. We estimated whether and to what extent a group of adolescents were at risk of developing permanent hearing loss as a result of voluntary exposure to high-volume music, and we assessed whether such exposure was associated with hearing-related symptoms. Methods. In 2007, 1512 adolescents

  12. Risk Assessment, Partition and Economic Loss Estimation of Rice Production in China

    Directory of Open Access Journals (Sweden)

    Qiqi Chen

    2015-01-01

    Full Text Available Agricultural risk, especially the risk assessment, partition and economic loss estimation for specific main crops such as maize, wheat and rice, is widely promoted in China as a means of improving effective productivity. The main objective of this article is to perform a detailed analysis of the stability and comparative advantage of rice production in 30 provinces on the basis of relative rice production data from 2000 to 2012 in China. The non-parametric information diffusion model based on entropy theory was used to assess rice production risk. Accordingly, we classified the risk levels with hierarchical cluster analysis. Then, we calculated the economic loss of rice production by scenario analysis. The results show that, firstly, the national disaster risk of rice production is at a relatively low level. Secondly, there are significant differences in the stability, comparative advantage and risk probability of rice production among the 30 provinces. Thirdly, Shanxi province belongs to the high risk zone, 12 provinces belong to the middle risk zone and 17 provinces to the low risk zone. Finally, there is a proportional relationship between the economic loss (yield loss) and the disaster area (yield loss rate) of rice production. From these results, some significant policy suggestions can be drawn.

  13. Methods for Management of Innovation Activity Risks in the Current Conditions

    Directory of Open Access Journals (Sweden)

    Pysmak Viktoriia O.

    2016-03-01

    Full Text Available The article considers the theoretical foundations of managing innovation activity risks. The relevance of the selected research topic, in connection with both the volatile external environment and crisis developments in the country's economy, has been substantiated. A semantic analysis of the concept of «risk» has been carried out, showing that risk situations have both positive and negative sides. A scheme for enterprise management in the light of implementing innovation activity has been elaborated. An improved classification of innovation activity has been provided, creating the opportunity to focus on identifying risks when a specific direction of innovation activity is selected. The main stages of identification and analysis of innovation activity risks have been identified. Methods for managing innovation activity risks in the current conditions have been developed. The concept of a «system for management of innovation activity risks» has been formulated, and its major features for the enterprise have been outlined.

  14. Risk factors for venous thrombosis - current understanding from an epidemiological point of view

    NARCIS (Netherlands)

    Lijfering, Willem M.; Rosendaal, Frits R.; Cannegieter, Suzanne C.

    2010-01-01

    Epidemiological research throughout the last 50 years has provided the long list of risk factors for venous thrombosis that are known today. Although this has advanced our current understanding about the aetiology of thrombosis, it does not give us all the answers: many people have several of thes

  15. A Risk Metric Assessment of Scenario-Based Market Risk Measures for Volatility and Risk Estimation: Evidence from Emerging Markets

    Directory of Open Access Journals (Sweden)

    Sitima Innocent

    2015-03-01

    Full Text Available The study evaluated the sensitivity of Value-at-Risk (VaR) and Expected Shortfall (ES) with respect to portfolio allocation in emerging markets with an index portfolio of a developed market. This study utilised different models for VaR and ES techniques, using various scenario-based models such as covariance methods, historical simulation and GARCH (1,1), to assess the predictive ability of these models in both relatively stable and extreme market conditions. The results showed that Expected Shortfall has less risk tolerance than VaR based on the same scenario-based market risk measures.
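For orientation, the two risk measures being compared can be computed in a few lines once a return series is available. The sketch below shows a historical-simulation estimate and a variance-covariance (normal) estimate for a single return series; the simulated heavy-tailed returns are placeholders, not the emerging-market data used in the study.

```python
import numpy as np
from scipy.stats import norm

def var_es_historical(returns, alpha=0.99):
    """Historical simulation: VaR is the empirical loss quantile,
    ES is the mean loss beyond that quantile (losses reported as positive)."""
    losses = -np.asarray(returns)
    var = np.quantile(losses, alpha)
    es = losses[losses >= var].mean()
    return var, es

def var_es_normal(returns, alpha=0.99):
    """Variance-covariance (normal) approximation for a single series."""
    mu, sigma = np.mean(returns), np.std(returns, ddof=1)
    z = norm.ppf(alpha)
    var = z * sigma - mu
    es = sigma * norm.pdf(z) / (1 - alpha) - mu
    return var, es

# Hypothetical daily portfolio returns (heavy-tailed to mimic stressed markets)
rng = np.random.default_rng(1)
r = rng.standard_t(df=5, size=2000) * 0.01
print("historical:", var_es_historical(r))
print("normal:    ", var_es_normal(r))
```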

  16. Hip fracture risk estimation based on principal component analysis of QCT atlas: a preliminary study

    Science.gov (United States)

    Li, Wenjun; Kornak, John; Harris, Tamara; Lu, Ying; Cheng, Xiaoguang; Lang, Thomas

    2009-02-01

    We aim to capture and apply 3-dimensional bone fragility features for fracture risk estimation. Using inter-subject image registration, we constructed a hip QCT atlas comprising 37 patients with hip fractures and 38 age-matched controls. In the hip atlas space, we performed principal component analysis to identify the principal components (eigen images) that showed association with hip fracture. To develop and test a hip fracture risk model based on the principal components, we randomly divided the 75 QCT scans into two groups, one serving as the training set and the other as the test set. We applied this model to estimate a fracture risk index for each test subject, and used the fracture risk indices to discriminate the fracture patients and controls. To evaluate the fracture discrimination efficacy, we performed ROC analysis and calculated the AUC (area under curve). When using the first group as the training group and the second as the test group, the AUC was 0.880, compared to conventional fracture risk estimation methods based on bone densitometry, which had AUC values ranging between 0.782 and 0.871. When using the second group as the training group, the AUC was 0.839, compared to densitometric methods with AUC values ranging between 0.767 and 0.807. Our results demonstrate that principal components derived from hip QCT atlas are associated with hip fracture. Use of such features may provide new quantitative measures of interest to osteoporosis.
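The pipeline described above — principal component analysis on spatially registered scans, a risk index built from component scores, and ROC analysis on a held-out set — can be sketched generically with scikit-learn. The arrays below are random stand-ins for the registered QCT atlas, and the logistic model is just one way to turn component scores into a fracture risk index; it is not the authors' exact model, so the printed AUC is meaningless here.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical stand-in for the registered atlas: each row is a flattened,
# spatially normalized hip scan; labels mark fracture cases vs controls.
rng = np.random.default_rng(0)
X = rng.normal(size=(75, 5000))
y = rng.integers(0, 2, size=75)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, stratify=y, random_state=0)

# PCA gives the "eigen images"; a linear model turns component scores into a
# fracture-risk index, and AUC measures discrimination on the held-out scans.
model = make_pipeline(StandardScaler(), PCA(n_components=10),
                      LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)
risk_index = model.predict_proba(X_test)[:, 1]
print("AUC:", roc_auc_score(y_test, risk_index))
```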

  17. Quantitative assessment of the microbial risk of leafy greens from farm to consumption: preliminary framework, data, and risk estimates.

    Science.gov (United States)

    Danyluk, Michelle D; Schaffner, Donald W

    2011-05-01

    This project was undertaken to relate what is known about the behavior of Escherichia coli O157:H7 under laboratory conditions and integrate this information with what is known regarding the 2006 E. coli O157:H7 spinach outbreak in the context of a quantitative microbial risk assessment. The risk model explicitly assumes that all contamination arises from exposure in the field. Extracted data, models, and user inputs were entered into an Excel spreadsheet, and the modeling software @RISK was used to perform Monte Carlo simulations. The model predicts that cut leafy greens that are temperature abused will support the growth of E. coli O157:H7, and populations of the organism may increase by as much as 1 log CFU/day under optimal temperature conditions. When the risk model used a starting level of -1 log CFU/g, with 0.1% of incoming servings contaminated, the predicted numbers of cells per serving were within the range of best available estimates of pathogen levels during the outbreak. The model predicts that levels in the field of -1 log CFU/g and 0.1% prevalence could have resulted in an outbreak approximately the size of the 2006 E. coli O157:H7 outbreak. This quantitative microbial risk assessment model represents a preliminary framework that identifies available data and provides initial risk estimates for pathogenic E. coli in leafy greens. Data gaps include retail storage times, correlations between storage time and temperature, determining the importance of E. coli O157:H7 in leafy greens lag time models, and validation of the importance of cross-contamination during the washing process.
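A stripped-down Monte Carlo of the field-contamination-plus-growth logic is sketched below. The prevalence (0.1%) and starting level (-1 log CFU/g) come from the abstract; the serving size, storage-time and growth-rate distributions are hypothetical placeholders rather than the distributions used in the published @RISK model.

```python
import numpy as np

rng = np.random.default_rng(42)
n_servings = 1_000_000

# Assumptions taken from the abstract: -1 log CFU/g field contamination on
# 0.1% of servings, growth of up to ~1 log CFU/g per day under temperature abuse.
prevalence = 0.001
initial_log_cfu_per_g = -1.0
serving_size_g = 85.0                                 # hypothetical serving size

contaminated = rng.random(n_servings) < prevalence
days_stored = rng.uniform(0.0, 3.0, n_servings)       # hypothetical storage time
growth_rate = rng.uniform(0.0, 1.0, n_servings)       # log CFU/g per day; 0 = no abuse

final_log_cfu_per_g = initial_log_cfu_per_g + days_stored * growth_rate
cells_per_serving = np.where(contaminated,
                             10.0 ** final_log_cfu_per_g * serving_size_g, 0.0)

exposed = cells_per_serving[contaminated]
print(f"contaminated servings: {contaminated.sum()}")
print(f"median cells/serving among contaminated: {np.median(exposed):.1f}")
print(f"95th percentile: {np.percentile(exposed, 95):.1f}")
```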

  18. Estimation of current cumulative incidence of leukaemia-free patients and current leukaemia-free survival in chronic myeloid leukaemia in the era of modern pharmacotherapy

    Directory of Open Access Journals (Sweden)

    Trněný Marek

    2011-10-01

    Full Text Available Abstract Background The current situation in the treatment of chronic myeloid leukaemia (CML presents a new challenge for attempts to measure the therapeutic results, as the CML patients can experience multiple leukaemia-free periods during the course of their treatment. Traditional measures of treatment efficacy such as leukaemia-free survival and cumulative incidence are unable to cope with multiple events in time, e.g. disease remissions or progressions, and as such are inappropriate for the efficacy assessment of the recent CML treatment. Methods Standard nonparametric statistical methods are used for estimating two principal characteristics of the current CML treatment: the probability of being alive and leukaemia-free in time after CML therapy initiation, denoted as the current cumulative incidence of leukaemia-free patients; and the probability that a patient is alive and in any leukaemia-free period in time after achieving the first leukaemia-free period on the CML treatment, denoted as the current leukaemia-free survival. The validity of the proposed methods is further documented in the data of the Czech CML patients consecutively recorded between July 2003 and July 2009 as well as in simulated data. Results The results have shown a difference between the estimates of the current cumulative incidence function and the common cumulative incidence of leukaemia-free patients, as well as between the estimates of the current leukaemia-free survival and the common leukaemia-free survival. Regarding the currently available follow-up period, both differences have reached the maximum (12.8% and 20.8%, respectively at 3 years after the start of follow-up, i.e. after the CML therapy initiation in the former case and after the first achievement of the disease remission in the latter. Conclusions Two quantities for the evaluation of the efficacy of current CML therapy that may be estimated with standard nonparametric methods have been proposed in

  19. Value at risk estimation with entropy-based wavelet analysis in exchange markets

    Science.gov (United States)

    He, Kaijian; Wang, Lijun; Zou, Yingchao; Lai, Kin Keung

    2014-08-01

    In recent years, exchange markets have become increasingly integrated. Fluctuations and risks across different exchange markets exhibit co-moving and complex dynamics. In this paper we propose entropy-based multivariate wavelet approaches to analyze the multiscale characteristics in the multidimensional domain and further improve the reliability of Value at Risk estimation. Wavelet analysis is used to construct the entropy-based multiscale portfolio Value at Risk estimation algorithm, accounting for the multiscale dynamic correlation. The entropy measure is proposed as the more effective criterion, under an error-minimization principle, for selecting the best basis when determining the wavelet family and the decomposition level to use. The empirical studies conducted in this paper provide positive evidence of the superior performance of the proposed approach, using the closely related Chinese Renminbi and European Euro exchange markets.

  20. Estimating the abortion risk difference in Neospora caninum seropositive dairy cattle in Brazil

    Directory of Open Access Journals (Sweden)

    Rafael Romero Nicolino

    2015-09-01

    Full Text Available Neosporosis in cattle herds is associated with large economic losses, with abortion being the only clinical sign perceptible to the producer. Losses are estimated at over one billion dollars worldwide. This study aimed to estimate the abortion risk difference in seropositive animals using specific data for dairy herds in Brazil. Differences in the risk of abortion between seropositive and seronegative animals were calculated through a meta-analysis of previous data from several Brazilian states, and an increase of 10.04% (0.091 to 0.118) in the specific risk was identified. This finding indicates that more than 474,000 abortions caused by neosporosis may be occurring only in dairy cattle herds in Brazil, causing a major economic loss in the milk production chain. The use of this specific measure for Brazilian herds opens the possibility of developing cost-benefit analysis for neosporosis in Brazil using data that are more reliable.
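The pooled risk difference reported above is the type of quantity produced by an inverse-variance meta-analysis of per-study risk differences. Below is a minimal sketch of a simplified fixed-effect version; the study counts are fabricated for illustration and the published analysis may have used a different pooling model.

```python
import numpy as np

def fixed_effect_rd(events_pos, n_pos, events_neg, n_neg):
    """Inverse-variance pooled risk difference between seropositive and
    seronegative groups across studies (simplified fixed-effect model)."""
    events_pos, n_pos = np.asarray(events_pos, float), np.asarray(n_pos, float)
    events_neg, n_neg = np.asarray(events_neg, float), np.asarray(n_neg, float)
    p1, p0 = events_pos / n_pos, events_neg / n_neg
    rd = p1 - p0
    var = p1 * (1 - p1) / n_pos + p0 * (1 - p0) / n_neg
    w = 1.0 / var
    pooled = np.sum(w * rd) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical abortion counts from three studies (seropositive vs seronegative cows)
rd, ci = fixed_effect_rd([30, 45, 22], [150, 200, 120], [12, 20, 9], [160, 210, 130])
print(f"pooled risk difference: {rd:.3f}, 95% CI {ci[0]:.3f} to {ci[1]:.3f}")
```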

  1. Estimating past and current attendance at winter sports areas...a pilot study

    Science.gov (United States)

    Richard L. Bury; James W. Hall

    1963-01-01

    Routine business records of towlift tickets or restaurant receipts provided estimates of total attendance over a 2-month period within 8 percent of true attendance, and attendance on an average day within 18 to 24 percent of true attendance. The chances were that estimates would fall within these limits 2 out of 3 times. Guides for field use can be worked out after...

  2. Counselees' Expressed Level of Understanding of the Risk Estimate and Surveillance Recommendation are Not Associated with Breast Cancer Surveillance Adherence

    NARCIS (Netherlands)

    Albada, Akke; van Dulmen, Sandra; Dijkstra, Henrietta; Wieffer, Ivette; Witkamp, Arjen; Ausems, Margreet G E M

    We studied counselees' expressed understanding of the risk estimate and surveillance recommendation in the final consultation for breast cancer genetic counseling in relation with their risk perception, worry and cancer surveillance adherence 1 year post-counseling. Consecutive counselees were

  3. Counselees' Expressed Level of Understanding of the Risk Estimate and Surveillance Recommendation are Not Associated with Breast Cancer Surveillance Adherence

    NARCIS (Netherlands)

    Albada, A.; Dulmen, S. van; Dijkstra, H.; Wieffer, I.; Witkamp, A.; Ausems, M.G.

    2016-01-01

    We studied counselees' expressed understanding of the risk estimate and surveillance recommendation in the final consultation for breast cancer genetic counseling in relation with their risk perception, worry and cancer surveillance adherence 1 year post-counseling. Consecutive counselees were

  4. Counselees’ expressed level of understanding of the risk estimate and surveillance recommendation are not associated with breast cancer surveillance adherence.

    NARCIS (Netherlands)

    Albada, A.; Dulmen, S. van; Dijkstra, H.; Wieffer, I.; Witkamp, A.; Ausems, M.G.E.M.

    2016-01-01

    We studied counselees’ expressed understanding of the risk estimate and surveillance recommendation in the final consultation for breast cancer genetic counseling in relation with their risk perception, worry and cancer surveillance adherence 1 year post-counseling. Consecutive counselees were

  5. Estimated GFR associates with cardiovascular risk factors independently of measured GFR.

    Science.gov (United States)

    Mathisen, Ulla Dorte; Melsom, Toralf; Ingebretsen, Ole C; Jenssen, Trond; Njølstad, Inger; Solbu, Marit D; Toft, Ingrid; Eriksen, Bjørn O

    2011-05-01

    Estimation of the GFR (eGFR) using creatinine- or cystatin C-based equations is imperfect, especially when the true GFR is normal or near-normal. Modest reductions in eGFR from the normal range variably predict cardiovascular morbidity. If eGFR associates not only with measured GFR (mGFR) but also with cardiovascular risk factors, the effects of these non-GFR-related factors might bias the association between eGFR and outcome. To investigate these potential non-GFR-related associations between eGFR and cardiovascular risk factors, we measured GFR by iohexol clearance in a sample from the general population (age 50 to 62 years) without known cardiovascular disease, diabetes, or kidney disease. Even after adjustment for mGFR, eGFR associated with traditional cardiovascular risk factors in multiple regression analyses. More risk factors influenced cystatin C-based eGFR than creatinine-based eGFR, adjusted for mGFR, and some of the risk factors exhibited nonlinear effects in generalized additive models. Thus, estimates of cardiovascular risk associated with small changes in eGFR must be interpreted with caution.

  6. Current Roles and Future Applications of Cardiac CT: Risk Stratification of Coronary Artery Disease

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Yeonyee Elizabeth [Department of Cardiology, Cardiovascular Center, Seoul National University Bundang Hospital, Seongnam 463-707 (Korea, Republic of); Lim, Tae-Hwan [Department of Radiology and Research Institute of Radiology, University of Ulsan College of Medicine, Asan Medical Center, Seoul 138-736 (Korea, Republic of)

    2014-07-01

    Cardiac computed tomography (CT) has emerged as a noninvasive modality for the assessment of coronary artery disease (CAD) and has been rapidly integrated into clinical care. CT has shifted risk stratification from the traditional assessment of clinical risk factors to image-based identification of patient risk. Cardiac CT, including the coronary artery calcium score and coronary CT angiography, can provide prognostic information and is expected to improve risk stratification of CAD. Conventional cardiac CT as currently used provides accurate anatomic information but not the functional significance of CAD, and it may not be sufficient to guide treatments such as revascularization. Recently, myocardial CT perfusion imaging, the intracoronary luminal attenuation gradient, and CT-derived computed fractional flow reserve were developed to combine anatomical and functional data. Although at present the diagnostic and prognostic value of these novel technologies needs to be evaluated further, it is expected that all-in-one cardiac CT can guide treatment and improve patient outcomes in the near future.

  7. Evolving Paradigm of Radiotherapy for High-Risk Prostate Cancer: Current Consensus and Continuing Controversies

    Directory of Open Access Journals (Sweden)

    Aditya Juloori

    2016-01-01

    Full Text Available High-risk prostate cancer is an aggressive form of the disease with an increased risk of distant metastasis and subsequent mortality. Multiple randomized trials have established that the combination of radiation therapy and long-term androgen deprivation therapy improves overall survival compared to either treatment alone. Standard of care for men with high-risk prostate cancer in the modern setting is dose-escalated radiotherapy along with 2-3 years of androgen deprivation therapy (ADT). There are research efforts directed towards assessing the efficacy of shorter ADT duration. Current research has been focused on assessing hypofractionated and stereotactic body radiation therapy (SBRT) techniques. Ongoing randomized trials will help assess the utility of pelvic lymph node irradiation. Research is also focused on multimodality therapy with addition of a brachytherapy boost to external beam radiation to help improve outcomes in men with high-risk prostate cancer.

  8. Estimating cancer risk in relation to tritium exposure from routine operation of a nuclear-generating station in Pickering, Ontario.

    Science.gov (United States)

    Wanigaratne, S; Holowaty, E; Jiang, H; Norwood, T A; Pietrusiak, M A; Brown, P

    2013-09-01

    Evidence suggests that current levels of tritium emissions from CANDU reactors in Canada are not related to adverse health effects. However, these studies lack tritium-specific dose data and have small numbers of cases. The purpose of our study was to determine whether tritium emitted from a nuclear-generating station during routine operation is associated with risk of cancer in Pickering, Ontario. A retrospective cohort was formed through linkage of Pickering and north Oshawa residents (1985) to incident cancer cases (1985-2005). We examined all sites combined, leukemia, lung, thyroid and childhood cancers (6-19 years) for males and females as well as female breast cancer. Tritium estimates were based on an atmospheric dispersion model, incorporating characteristics of annual tritium emissions and meteorology. Tritium concentration estimates were assigned to each cohort member based on exact location of residence. Person-years analysis was used to determine whether observed cancer cases were higher than expected. Cox proportional hazards regression was used to determine whether tritium was associated with radiation-sensitive cancers in Pickering. Person-years analysis showed female childhood cancer cases to be significantly higher than expected (standardized incidence ratio [SIR] = 1.99, 95% confidence interval [CI]: 1.08-3.38). The issue of multiple comparisons is the most likely explanation for this finding. Cox models revealed that female lung cancer was significantly higher in Pickering versus north Oshawa (HR = 2.34, 95% CI: 1.23-4.46) and that tritium was not associated with increased risk. The improved methodology used in this study adds to our understanding of cancer risks associated with low-dose tritium exposure. Tritium estimates were not associated with increased risk of radiation-sensitive cancers in Pickering.

  9. Quantitative microbial risk assessment combined with hydrodynamic modelling to estimate the public health risk associated with bathing after rainfall events.

    Science.gov (United States)

    Eregno, Fasil Ejigu; Tryland, Ingun; Tjomsland, Torulv; Myrmel, Mette; Robertson, Lucy; Heistad, Arve

    2016-04-01

    This study investigated the public health risk from exposure to infectious microorganisms at Sandvika recreational beaches, Norway and dose-response relationships by combining hydrodynamic modelling with Quantitative Microbial Risk Assessment (QMRA). Meteorological and hydrological data were collected to produce a calibrated hydrodynamic model using Escherichia coli as an indicator of faecal contamination. Based on average concentrations of reference pathogens (norovirus, Campylobacter, Salmonella, Giardia and Cryptosporidium) relative to E. coli in Norwegian sewage from previous studies, the hydrodynamic model was used for simulating the concentrations of pathogens at the local beaches during and after a heavy rainfall event, using three different decay rates. The simulated concentrations were used as input for QMRA and the public health risk was estimated as probability of infection from a single exposure of bathers during the three consecutive days after the rainfall event. The level of risk on the first day after the rainfall event was acceptable for the bacterial and parasitic reference pathogens, but high for the viral reference pathogen at all beaches, and severe at Kalvøya-small and Kalvøya-big beaches, supporting the advice of avoiding swimming in the day(s) after heavy rainfall. The study demonstrates the potential of combining discharge-based hydrodynamic modelling with QMRA in the context of bathing water as a tool to evaluate public health risk and support beach management decisions. Copyright © 2016 Elsevier B.V. All rights reserved.
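The QMRA step sketched above — turning a simulated pathogen dose per bathing event into a probability of infection — typically relies on a dose-response model such as the exponential or approximate beta-Poisson form. The code below shows those two standard forms only; the parameter values and doses are placeholders, not the ones used for the reference pathogens in this study.

```python
import numpy as np

def p_infection_exponential(dose, r):
    """Exponential dose-response model: P(inf) = 1 - exp(-r * dose)."""
    return 1.0 - np.exp(-r * np.asarray(dose))

def p_infection_beta_poisson(dose, alpha, n50):
    """Approximate beta-Poisson model:
    P(inf) = 1 - (1 + dose * (2**(1/alpha) - 1) / N50) ** (-alpha)."""
    dose = np.asarray(dose)
    return 1.0 - (1.0 + dose * (2.0 ** (1.0 / alpha) - 1.0) / n50) ** (-alpha)

# Hypothetical single-exposure doses (organisms ingested per bathing event),
# e.g. simulated concentration per litre times ingested volume in litres.
doses = np.array([0.5, 5.0, 50.0])

# Parameter values below are placeholders, not the study's pathogen-specific ones.
print("exponential (r=0.02):          ", p_infection_exponential(doses, r=0.02))
print("beta-Poisson (a=0.25, N50=500):", p_infection_beta_poisson(doses, alpha=0.25, n50=500))
```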

  10. Cancer risk estimation in Digital Breast Tomosynthesis using GEANT4 Monte Carlo simulations and voxel phantoms.

    Science.gov (United States)

    Ferreira, P; Baptista, M; Di Maria, S; Vaz, P

    2016-05-01

    The aim of this work was to estimate the risk of radiation-induced cancer following the Portuguese breast screening recommendations for Digital Mammography (DM) when applied to Digital Breast Tomosynthesis (DBT) and to evaluate how the risk of inducing cancer could influence the energy used in breast diagnostic exams. The organ doses were calculated by Monte Carlo simulations using a female voxel phantom and considering the acquisition of 25 projection images. Single organ cancer incidence risks were calculated in order to assess the total effective radiation-induced cancer risk. The screening strategy techniques considered were: DBT in Cranio-Caudal (CC) view and two-view DM (CC and Mediolateral Oblique (MLO)). The risk of cancer incidence following the Portuguese screening guidelines (screening every two years in the age range of 50-80 years) was calculated by assuming a single CC DBT acquisition view as a standalone screening strategy and compared with two-view DM. The difference in the total effective risk between DBT and DM is quite low. Nevertheless, in DBT an increased risk for the lung is observed with respect to DM. The lung is also the organ that is mainly affected when non-optimal beam energy (in terms of image quality and absorbed dose) is used instead of an optimal one. The use of non-optimal energies could increase the risk of lung cancer incidence by a factor of about 2.

  11. Bias of using odds ratio estimates in multinomial logistic regressions to estimate relative risk or prevalence ratio and alternatives

    Directory of Open Access Journals (Sweden)

    Suzi Alves Camey

    2014-01-01

    Full Text Available Recent studies have emphasized that there is no justification for using the odds ratio (OR) as an approximation of the relative risk (RR) or prevalence ratio (PR). Erroneous interpretations of the OR as RR or PR must be avoided, as several studies have shown that the OR is not a good approximation for these measures when the outcome is common (> 10%). For multinomial outcomes it is usual to use the multinomial logistic regression. In this context, there are no studies showing the impact of the approximation of the OR in the estimates of RR or PR. This study aimed to present and discuss alternative methods to multinomial logistic regression based upon robust Poisson regression and the log-binomial model. The approaches were compared by simulating various possible scenarios. The results showed that the proposed models have more precise and accurate estimates for the RR or PR than the multinomial logistic regression, as in the case of the binary outcome. Thus also for multinomial outcomes the OR must not be used as an approximation of the RR or PR, since this may lead to incorrect conclusions.
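To illustrate the underlying principle for the simpler binary case, the sketch below fits a Poisson regression with a robust (sandwich) covariance so that the exponentiated coefficient estimates a prevalence ratio rather than an odds ratio. The data are synthetic, the HC0 covariance is one common choice standing in for the robust variance estimator discussed in the paper, and this is not the authors' multinomial implementation.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical cohort with a common binary outcome (well above 10%) and one exposure.
rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({"exposed": rng.integers(0, 2, n), "age": rng.integers(20, 80, n)})
p = 0.2 + 0.15 * df["exposed"]                       # true prevalence ratio ~ 1.75
df["outcome"] = (rng.random(n) < p).astype(int)

# Poisson regression with robust standard errors estimates the prevalence
# ratio directly; exponentiated coefficients are PRs, not ORs.
fit = smf.glm("outcome ~ exposed + age", data=df,
              family=sm.families.Poisson()).fit(cov_type="HC0")
print(np.exp(fit.params["exposed"]))                 # estimated PR
print(np.exp(fit.conf_int().loc["exposed"]))         # 95% CI
```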

  12. A method for estimating tokamak poloidal field coil currents which incorporates engineering constraints

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, W.A.

    1990-05-01

    This thesis describes the development of a design tool for the poloidal field magnet system of a tokamak. Specifically, an existing program for determining the poloidal field coil currents has been modified to: support the general case of asymmetric equilibria and coil sets, determine the coil currents subject to constraints on the maximum values of those currents, and determine the coil currents subject to limits on the forces those coils may carry. The equations representing the current limits and coil force limits are derived and an algorithm based on Newton's method is developed to determine a set of coil currents which satisfies those limits. The resulting program allows the designer to quickly determine whether or not a given coil set is capable of supporting a given equilibrium. 25 refs.
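As an illustration of the kind of constrained fitting problem this thesis abstract describes, the sketch below solves a bounded linear least-squares problem for coil currents given a flux response matrix: the current limits enter as simple bounds. The force limits, which are quadratic in the currents, would require an additional Newton-type iteration and are not shown. The matrix, target fluxes and limits are random placeholders, not data from the program described.

```python
import numpy as np
from scipy.optimize import lsq_linear

# Hypothetical response matrix G: G[i, j] = poloidal flux produced at control
# point i by unit current in coil j (in practice from a Green's function table).
rng = np.random.default_rng(3)
n_points, n_coils = 40, 8
G = rng.normal(size=(n_points, n_coils))
psi_target = rng.normal(size=n_points)     # flux values describing the desired equilibrium

# Coil current limits (illustrative); force limits would add nonlinear
# current-product constraints handled by a Newton-type iteration.
i_max = 5.0 * np.ones(n_coils)

result = lsq_linear(G, psi_target, bounds=(-i_max, i_max))
print("coil currents:", np.round(result.x, 3))
print("residual flux error:", np.linalg.norm(G @ result.x - psi_target))
```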

  13. Estimating the country risk premium in emerging markets: the case of the Republic of Macedonia

    Directory of Open Access Journals (Sweden)

    Aleksandar Naumoski

    2012-12-01

    Full Text Available Estimation of the cost of capital is difficult in developed markets and even more difficult in emerging markets. Investments in emerging markets are riskier than in developed markets, but returns are also higher. The key question here is whether the return on investments in emerging markets should be rewarded by compensation in excess of that provided by an equivalent investment in a developed market. Contemporary literature provides alternative ways of calculating the cost of capital invested in emerging markets. In general, it is widely accepted that country risk matters when investing in emerging markets and that it is a key component in the estimation of the cost of capital for those investments. Country risk is non-diversifiable, which will be argued in this paper first, after which an alternative approach will be provided for quantifying country risk in the risk premium measure, which is an integral component of the models for estimating the cost of capital.

  14. The 12-lead electrocardiogram and risk of sudden death: current utility and future prospects.

    Science.gov (United States)

    Narayanan, Kumar; Chugh, Sumeet S

    2015-10-01

    More than 100 years after it was first invented, the 12-lead electrocardiogram (ECG) continues to occupy an important place in the diagnostic armamentarium of the practicing clinician. With the recognition of relatively rare but important clinical entities such as Wolff-Parkinson-White and the long QT syndrome, this clinical tool was firmly established as a test for assessing risk of sudden cardiac death (SCD). However, over the past two decades the role of the ECG in risk prediction for common forms of SCD, for example in patients with coronary artery disease, has been the focus of considerable investigation. Especially in light of the limitations of current risk stratification approaches, there is a renewed focus on this broadly available and relatively inexpensive test. Various abnormalities of depolarization and repolarization on the ECG have been linked to SCD risk; however, more focused work is needed before they can be deployed in the clinical arena. The present review summarizes the current knowledge on various ECG risk markers for prediction of SCD and discusses some future directions in this field.

  15. Current drinking and health-risk behaviors among male high school students in central Thailand

    Directory of Open Access Journals (Sweden)

    Pichainarong Natchaporn

    2011-04-01

    Full Text Available Abstract Background Alcohol drinking is frequently related to behavioral problems, which lead to a number of negative consequences. This study aimed to evaluate the characteristics of male high school students who drink, the drinking patterns among them, and the associations between current drinking and other health-risk behaviors, focusing on personal safety, violence-related behaviors, suicide and sexual behaviors. Method A cross-sectional study was conducted to explore current alcohol drinking and health-risk behaviors among male high school students in central Thailand. Five thousand one hundred and eighty-four male students were classified into 2 groups according to drinking in the previous 30 days (yes = 631, no = 4,553). Data were collected by a self-administered, anonymous questionnaire consisting of 3 parts: socio-demographic factors, health-risk behaviors and alcohol drinking behavior during the past year, from December 2007 to February 2008. Results The percentage of current drinkers was 12.17%. Most of them were 15-17 years old (50.21%). Socio-demographic factors such as age, educational level, residence, cohabitants, grade point average (GPA), having a part-time job and having family members with alcohol/drug problems were significantly associated with alcohol drinking. Conclusions The risk of health-risk behaviors, including driving vehicles after drinking, violence-related behaviors, sad feelings and attempted suicide, and risky sexual behaviors, was higher among drinking students, leading to significant health problems. Effective intervention strategies (such as campaigns communicating the adverse health effects and social consequences to the risk groups, and encouraging parental and community efforts to prevent drinking among adolescents) should be implemented to prevent underage drinking and its adverse consequences.

  16. Underestimating the Alcohol Content of a Glass of Wine: The Implications for Estimates of Mortality Risk

    Science.gov (United States)

    Britton, Annie; O’Neill, Darragh; Bell, Steven

    2016-01-01

    Aims Increases in glass sizes and wine strength over the last 25 years in the UK are likely to have led to an underestimation of alcohol intake in population studies. We explore whether this probable misclassification affects the association between average alcohol intake and risk of mortality from all causes, cardiovascular disease and cancer. Methods Self-reported alcohol consumption in 1997–1999 among 7010 men and women in the Whitehall II cohort of British civil servants was linked to the risk of mortality until mid-2015. A conversion factor of 8 g of alcohol per wine glass (1 unit) was compared with a conversion of 16 g per wine glass (2 units). Results When applying a higher alcohol content conversion for wine consumption, the proportion of heavy/very heavy drinkers increased from 28% to 41% for men and 15% to 28% for women. There was a significantly increased risk of death from all causes and from cancer for very heavy drinking compared with moderate drinking, both before and after the change in wine conversion; however, the hazard ratios were reduced when the higher wine conversion was used. Conclusions In this population-based study, assuming higher alcohol content in wine glasses changed the estimates of mortality risk. We propose that investigator-led cohorts need to revisit conversion factors based on more accurate estimates of alcohol content in wine glasses. Prospectively, researchers need to collect more detailed information on alcohol including serving sizes and strength. Short summary The alcohol content in a wine glass is likely to be underestimated in population surveys as wine strength and serving size have increased in recent years. We demonstrate that in a large cohort study, this underestimation affects estimates of mortality risk. Investigator-led cohorts need to revisit conversion factors based on more accurate estimates of alcohol content in wine glasses. PMID:27261472
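The reclassification effect described above is purely arithmetic: the same self-reported glasses of wine imply twice the alcohol when a glass is counted as 16 g rather than 8 g. A toy recalculation is sketched below, using the UK convention of 8 g per unit; the drinker profiles are hypothetical and not drawn from the Whitehall II data.

```python
# Recompute weekly alcohol intake under the two wine-glass conversions
# discussed above (8 g vs 16 g of alcohol per glass); counts are hypothetical.

GRAMS_PER_UNIT = 8.0

def weekly_grams(wine_glasses, other_units, grams_per_glass):
    """Weekly grams of alcohol from wine glasses plus other drinks (in units)."""
    return wine_glasses * grams_per_glass + other_units * GRAMS_PER_UNIT

drinkers = [
    {"wine_glasses": 10, "other_units": 4},
    {"wine_glasses": 20, "other_units": 0},
]

for d in drinkers:
    low  = weekly_grams(d["wine_glasses"], d["other_units"], grams_per_glass=8.0)
    high = weekly_grams(d["wine_glasses"], d["other_units"], grams_per_glass=16.0)
    print(f"{low / GRAMS_PER_UNIT:.0f} vs {high / GRAMS_PER_UNIT:.0f} units/week "
          f"({low:.0f} g vs {high:.0f} g of alcohol)")
```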

  17. Fatalities in high altitude mountaineering: a review of quantitative risk estimates.

    Science.gov (United States)

    Weinbruch, Stephan; Nordby, Karl-Christian

    2013-12-01

    Quantitative estimates for mortality in high altitude mountaineering are reviewed. Special emphasis is placed on the heterogeneity of the risk estimates and on confounding. Crude estimates for mortality are on the order of 1/1000 to 40/1000 persons above base camp, for both expedition members and high altitude porters. High altitude porters have mostly a lower risk than expedition members (risk ratio for all Nepalese peaks requiring an expedition permit: 0.73; 95 % confidence interval 0.59-0.89). The summit bid is generally the most dangerous part of an expedition for members, whereas most high altitude porters die during route preparation. On 8000 m peaks, the mortality during descent from summit varies between 4/1000 and 134/1000 summiteers (members plus porters). The risk estimates are confounded by human and environmental factors. Information on confounding by gender and age is contradictory and requires further work. There are indications for safety segregation of men and women, with women being more risk averse than men. Citizenship appears to be a significant confounder. Prior high altitude mountaineering experience in Nepal has no protective effect. Commercial expeditions in the Nepalese Himalayas have a lower mortality than traditional expeditions, though after controlling for confounding, the difference is not statistically significant. The overall mortality is increasing with increasing peak altitude for expedition members but not for high altitude porters. In the Nepalese Himalayas and in Alaska, a significant decrease of mortality with calendar year was observed. A few suggestions for further work are made at the end of the article.

  18. Errors of Mean Dynamic Topography and Geostrophic Current Estimates in China's Marginal Seas from GOCE and Satellite Altimetry

    DEFF Research Database (Denmark)

    Jin, Shuanggen; Feng, Guiping; Andersen, Ole Baltazar

    2014-01-01

Errors of mean dynamic topography (MDT) and geostrophic current estimates from satellite gravimetry and altimetry are investigated and evaluated in China's marginal seas. The cumulative error in MDT from GOCE is reduced from 22.75 to 9.89 cm when compared to the Gravity Recovery and Climate Experiment (GRACE) gravity field model ITG-Grace2010 results...

  19. Method of Measuring Common-Mode Current Conversion Coefficient for Estimating Variation in Radiated Emission from Printed Circuit Board Components

    Directory of Open Access Journals (Sweden)

    C. Ho

    2014-06-01

    Full Text Available This work describes the measurement of the common-mode current conversion coefficient for a microstrip line with solid and slotted ground planes by using a VNA with a BCI probe. The radiated emissions estimated by the common-mode current conversion coefficient are further compared with those obtained by the FAC measurements. Furthermore, the proposed method was used to estimate radiated emissions from a microstrip bandpass filter. For all of the case studies, results of electromagnetic (EM simulation demonstrate the validity of the measurement results by the proposed method. Highly promising for use in EMI measurement application, the proposed method can estimate the radiated emissions by miniaturized microstrip components on a PCB when pre-tested for compliance with EMI regulations.

  20. Estimating Risk of Natural Gas Portfolios by Using GARCH-EVT-Copula Model

    Science.gov (United States)

    Tang, Jiechen; Zhou, Chao; Yuan, Xinyu; Sriboonchitta, Songsak

    2015-01-01

    This paper concentrates on estimating the risk of Title Transfer Facility (TTF) Hub natural gas portfolios by using the GARCH-EVT-copula model. We first use the univariate ARMA-GARCH model to model each natural gas return series. Second, the extreme value distribution (EVT) is fitted to the tails of the residuals to model marginal residual distributions. Third, multivariate Gaussian copula and Student t-copula are employed to describe the natural gas portfolio risk dependence structure. Finally, we simulate N portfolios and estimate value at risk (VaR) and conditional value at risk (CVaR). Our empirical results show that, for an equally weighted portfolio of five natural gases, the VaR and CVaR values obtained from the Student t-copula are larger than those obtained from the Gaussian copula. Moreover, when minimizing the portfolio risk, the optimal natural gas portfolio weights are found to be similar across the multivariate Gaussian copula and Student t-copula and different confidence levels. PMID:26351652
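
    The central comparison in this record (tail risk under a Gaussian versus a Student t copula) can be illustrated with a small Monte Carlo sketch in Python. This is not the authors' ARMA-GARCH-EVT pipeline: it assumes already-standardized residuals, replaces the fitted GPD tails with a generic heavy-tailed marginal, and uses an invented correlation matrix, so every input below is a placeholder.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)

        # Illustrative inputs (placeholders, not TTF data): five gas return series,
        # an equally weighted portfolio, and an assumed pairwise correlation of 0.5.
        n_assets, n_sims, nu = 5, 100_000, 4
        weights = np.full(n_assets, 1.0 / n_assets)
        corr = np.full((n_assets, n_assets), 0.5) + 0.5 * np.eye(n_assets)
        chol = np.linalg.cholesky(corr)

        def simulate_uniforms(copula):
            """Draw dependent uniforms from a Gaussian or Student t copula."""
            z = rng.standard_normal((n_sims, n_assets)) @ chol.T
            if copula == "gaussian":
                return stats.norm.cdf(z)
            # t copula: divide correlated normals by a common chi-square factor
            w = rng.chisquare(nu, size=(n_sims, 1)) / nu
            return stats.t.cdf(z / np.sqrt(w), df=nu)

        def var_cvar(losses, alpha=0.99):
            var = np.quantile(losses, alpha)
            return var, losses[losses >= var].mean()

        for copula in ("gaussian", "t"):
            u = simulate_uniforms(copula)
            # Map uniforms to losses through a heavy-tailed marginal (a stand-in
            # for the ARMA-GARCH plus GPD-tail marginals used in the paper).
            losses = stats.t.ppf(u, df=5) @ weights
            var, cvar = var_cvar(losses)
            print(f"{copula:8s}  VaR(99%)={var:.3f}  CVaR(99%)={cvar:.3f}")

    With these placeholder inputs the t copula yields noticeably larger 99% VaR and CVaR than the Gaussian copula, which is the qualitative pattern reported in the abstract.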

  2. Estimating Risk of Natural Gas Portfolios by Using GARCH-EVT-Copula Model

    Directory of Open Access Journals (Sweden)

    Jiechen Tang

    2015-01-01

Full Text Available This paper concentrates on estimating the risk of Title Transfer Facility (TTF) Hub natural gas portfolios by using the GARCH-EVT-copula model. We first use the univariate ARMA-GARCH model to model each natural gas return series. Second, the extreme value distribution (EVT) is fitted to the tails of the residuals to model marginal residual distributions. Third, multivariate Gaussian copula and Student t-copula are employed to describe the natural gas portfolio risk dependence structure. Finally, we simulate N portfolios and estimate value at risk (VaR) and conditional value at risk (CVaR). Our empirical results show that, for an equally weighted portfolio of five natural gases, the VaR and CVaR values obtained from the Student t-copula are larger than those obtained from the Gaussian copula. Moreover, when minimizing the portfolio risk, the optimal natural gas portfolio weights are found to be similar across the multivariate Gaussian copula and Student t-copula and different confidence levels.

  3. Exploring the cost-utility of stratified primary care management for low back pain compared with current best practice within risk-defined subgroups.

    Science.gov (United States)

    Whitehurst, David G T; Bryan, Stirling; Lewis, Martyn; Hill, Jonathan; Hay, Elaine M

    2012-11-01

Stratified management for low back pain according to patients' prognosis and matched care pathways has been shown to be an effective treatment approach in primary care. The aim of this within-trial study was to determine the economic implications of providing such an intervention, compared with non-stratified current best practice, within specific risk-defined subgroups (low-risk, medium-risk and high-risk). Within a cost-utility framework, the base-case analysis estimated the incremental healthcare cost per additional quality-adjusted life year (QALY), using the EQ-5D to generate QALYs, for each risk-defined subgroup. Uncertainty was explored with cost-utility planes and acceptability curves. Sensitivity analyses were performed to consider alternative costing methodologies, including the assessment of societal loss relating to work absence and the incorporation of generic (i.e., non-back pain) healthcare utilisation. The stratified management approach was a cost-effective treatment strategy compared with current best practice within each risk-defined subgroup, exhibiting dominance (greater benefit and lower costs) for medium-risk patients and acceptable incremental cost-to-utility ratios for low-risk and high-risk patients. The likelihood that stratified care provides a cost-effective use of resources exceeds 90% at willingness-to-pay thresholds of £4000 (≈ €4500; $6500) per additional QALY for the medium-risk and high-risk groups. Patients receiving stratified care also reported fewer back pain-related days off work in all three subgroups. Compared with current best practice, stratified primary care management for low back pain provides a highly cost-effective use of resources across all risk-defined subgroups.
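
    As a rough illustration of the cost-utility framework (not the trial's analysis or data), the Python sketch below computes incremental cost and QALYs, checks for dominance, and bootstraps the probability that stratified care is cost-effective at a £4000-per-QALY willingness-to-pay threshold, as on an acceptability curve. All per-patient costs and QALYs are synthetic placeholders.

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic per-patient costs (GBP) and QALYs; placeholders, not trial data.
        cost_strat = rng.normal(300, 80, 200)
        cost_usual = rng.normal(340, 90, 200)
        qaly_strat = rng.normal(0.70, 0.10, 200)
        qaly_usual = rng.normal(0.66, 0.10, 200)

        d_cost = cost_strat.mean() - cost_usual.mean()
        d_qaly = qaly_strat.mean() - qaly_usual.mean()
        print("incremental cost:", round(d_cost, 1), "incremental QALY:", round(d_qaly, 4))
        if d_cost < 0 and d_qaly > 0:
            print("stratified care dominates (cheaper and more effective)")
        else:
            print("ICER:", round(d_cost / d_qaly, 1), "per QALY")

        # Bootstrap the probability of cost-effectiveness at a willingness-to-pay
        # threshold (net monetary benefit > 0), as on an acceptability curve.
        wtp, n_boot, wins = 4000, 5000, 0
        for _ in range(n_boot):
            i = rng.integers(0, 200, 200)
            j = rng.integers(0, 200, 200)
            nmb = wtp * (qaly_strat[i].mean() - qaly_usual[j].mean()) \
                  - (cost_strat[i].mean() - cost_usual[j].mean())
            wins += nmb > 0
        print(f"P(cost-effective at £{wtp}/QALY) ≈ {wins / n_boot:.2f}")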

  4. Health risk estimates for groundwater and soil contamination in the Slovak Republic: a convenient tool for identification and mapping of risk areas.

    Science.gov (United States)

    Fajčíková, K; Cvečková, V; Stewart, A; Rapant, S

    2014-10-01

    We undertook a quantitative estimation of health risks to residents living in the Slovak Republic and exposed to contaminated groundwater (ingestion by adult population) and/or soils (ingestion by adult and child population). Potential risk areas were mapped to give a visual presentation at basic administrative units of the country (municipalities, districts, regions) for easy discussion with policy and decision-makers. The health risk estimates were calculated by US EPA methods, applying threshold values for chronic risk and non-threshold values for cancer risk. The potential health risk was evaluated for As, Ba, Cd, Cu, F, Hg, Mn, NO3 (-), Pb, Sb, Se and Zn for groundwater and As, B, Ba, Be, Cd, Cu, F, Hg, Mn, Mo, Ni, Pb, Sb, Se and Zn for soils. An increased health risk was identified mainly in historical mining areas highly contaminated by geogenic-anthropogenic sources (ore deposit occurrence, mining, metallurgy). Arsenic and antimony were the most significant elements in relation to health risks from groundwater and soil contamination in the Slovak Republic contributing a significant part of total chronic risk levels. Health risk estimation for soil contamination has highlighted the significance of exposure through soil ingestion in children. Increased cancer risks from groundwater and soil contamination by arsenic were noted in several municipalities and districts throughout the country in areas with significantly high arsenic levels in the environment. This approach to health risk estimations and visualization represents a fast, clear and convenient tool for delineation of risk areas at national and local levels.

  5. Using a relative health indicator (RHI) metric to estimate health risk reductions in drinking water.

    Science.gov (United States)

    Alfredo, Katherine A; Seidel, Chad; Ghosh, Amlan; Roberson, J Alan

    2017-03-01

    When a new drinking water regulation is being developed, the USEPA conducts a health risk reduction and cost analysis to, in part, estimate quantifiable and non-quantifiable cost and benefits of the various regulatory alternatives. Numerous methodologies are available for cumulative risk assessment ranging from primarily qualitative to primarily quantitative. This research developed a summary metric of relative cumulative health impacts resulting from drinking water, the relative health indicator (RHI). An intermediate level of quantification and modeling was chosen, one which retains the concept of an aggregated metric of public health impact and hence allows for comparisons to be made across "cups of water," but avoids the need for development and use of complex models that are beyond the existing state of the science. Using the USEPA Six-Year Review data and available national occurrence surveys of drinking water contaminants, the metric is used to test risk reduction as it pertains to the implementation of the arsenic and uranium maximum contaminant levels and quantify "meaningful" risk reduction. Uranium represented the threshold risk reduction against which national non-compliance risk reduction was compared for arsenic, nitrate, and radium. Arsenic non-compliance is most significant and efforts focused on bringing those non-compliant utilities into compliance with the 10 μg/L maximum contaminant level would meet the threshold for meaningful risk reduction.

  6. Risk-Cost Estimation of On-Site Wastewater Treatment System Failures Using Extreme Value Analysis.

    Science.gov (United States)

    Kohler, Laura E; Silverstein, JoAnn; Rajagopalan, Balaji

    2017-05-01

      Owner resistance to increasing regulation of on-site wastewater treatment systems (OWTS), including obligatory inspections and upgrades, moratoriums and cease-and-desist orders in communities around the U.S. demonstrate the challenges associated with managing risks of inadequate performance of owner-operated wastewater treatment systems. As a result, determining appropriate and enforceable performance measures in an industry with little history of these requirements is challenging. To better support such measures, we develop a statistical method to predict lifetime failure risks, expressed as costs, in order to identify operational factors associated with costly repairs and replacement. A binomial logistic regression is used to fit data from public records of reported OWTS failures, in Boulder County, Colorado, which has 14 300 OWTS to determine the probability that an OWTS will be in a low- or high-risk category for lifetime repair and replacement costs. High-performing or low risk OWTS with repairs and replacements below the threshold of $9000 over a 40-year life are associated with more frequent inspections and upgrades following home additions. OWTS with a high risk of exceeding the repair cost threshold of $18 000 are further analyzed in a variation of extreme value analysis (EVA), Points Over Threshold (POT) where the distribution of risk-cost exceedance values are represented by a generalized Pareto distribution. The resulting threshold cost exceedance estimates for OWTS in the high-risk category over a 40-year expected life ranged from $18 000 to $44 000.
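
    A minimal sketch of the Points Over Threshold step, assuming synthetic lifetime repair costs in place of the Boulder County records: exceedances over the $18,000 threshold are fitted with a generalized Pareto distribution and upper-tail cost quantiles are read off.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)

        # Synthetic lifetime repair/replacement costs for high-risk systems (placeholders).
        costs = rng.lognormal(mean=9.7, sigma=0.6, size=800)

        threshold = 18_000.0
        exceedances = costs[costs > threshold] - threshold

        # Fit a generalized Pareto distribution to the exceedances (location fixed at 0).
        shape, loc, scale = stats.genpareto.fit(exceedances, floc=0)
        print(f"GPD shape={shape:.3f}, scale={scale:.0f}, n_exceedances={exceedances.size}")

        # Cost exceeded with 10% / 1% probability, given that the threshold is exceeded.
        for p in (0.90, 0.99):
            q = threshold + stats.genpareto.ppf(p, shape, loc=0, scale=scale)
            print(f"{int(p * 100)}th percentile of threshold-exceeding costs ≈ ${q:,.0f}")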

  7. Screening effects on thyroid cancer risk estimates for populations affected by the Chernobyl accident

    Energy Technology Data Exchange (ETDEWEB)

    Jacob, P.; Kaiser, J. C.; Vavilov, S.E.; Bogdanova, T.; Tronko, N. D.

    2004-07-01

Simulation calculations are performed in order to explore the ecological bias in studies performed with settlement-specific data in the aftermath of the Chernobyl accident. Based on methods that were developed by Lubin for exploring the ecologic bias due to smoking in indoor radon studies of lung cancer, the influence of the introduction of ultrasound devices and enhanced medical surveillance on the detection and reporting of thyroid cancer cases was investigated. Calculations were performed by simulating thyroid doses of one million children in a total of 744 settlements, assuming a linear dependence of the risk on dose and various screening scenarios. The simulated dose distributions are similar to those used in previous ecologic studies of the thyroid cancer risk in Ukraine after the Chernobyl accident. The ecologic bias was defined as the ratio of risk coefficients derived from an ecological study to the corresponding risk factor in the underlying risk model, and was estimated for each of the screening scenarios. Analytical equations were derived that allow the exact numerical computation of the bias, which is determined by covariance terms between the increased detection and reporting on one side and thyroid dose values (individual and averaged over the settlements) on the other side. Nested in the population data, a cohort study was simulated with 10 000 individuals and an average thyroid dose of 0.3 Gy. The present study underlines the different scopes of the ecologic and cohort study designs performed in the aftermath of the Chernobyl accident. Whereas the ecologic studies give an estimate of the excess thyroid cancer risk per unit dose under the conditions of a health care system typical of the affected countries after the Chernobyl accident, the cohort study gives risk estimates within a well-screened cohort. Due to the strong screening effects, excess absolute risks in the ecological study cohort are …

  8. The Geophysical, Anthropogenic, and Social Dimensions of Delta Risk: Estimating Contemporary and Future Risks at the Global Scale

    Science.gov (United States)

    Tessler, Z. D.; Vorosmarty, C. J.; Grossberg, M.; Gladkova, I.; Aizenman, H.; Syvitski, J. P.; Foufoula-Georgiou, E.

    2015-12-01

Deltas are highly sensitive to increasing risks arising from local human activities, land subsidence, regional water management, global sea-level rise, and climate extremes. We extended a delta risk framework to include the impact of relative sea-level rise on exposure to flood conditions. We apply this framework to an integrated set of global environmental, geophysical, and social indicators over 48 major deltas to quantify how delta flood risk due to extreme events is changing over time. Although geophysical and relative sea-level rise-derived risks are distributed across all levels of economic development, wealthy countries effectively limit their present-day threat by gross domestic product-enabled infrastructure and coastal defense investments. However, when investments do not address the long-term drivers of land subsidence and relative sea-level rise, overall risk can be very sensitive to changes in protective capability. For instance, we show how in an energy-constrained future scenario, such protections will probably prove to be unsustainable, raising relative risks by four to eight times in the Mississippi and Rhine deltas and by one-and-a-half to four times in the Chao Phraya and Yangtze deltas. The current emphasis on short-term solutions on the world's deltas will greatly constrain options for designing sustainable solutions in the long term.

  9. Method of estimating mechanical stress on Si body of MOSFET using drain–body junction current

    Science.gov (United States)

    Seo, Ji-Hoon; Kim, Gang-Jun; Son, Donghee; Lee, Nam-Hyun; Kang, Bongkoo

    2017-01-01

A simple and accurate method of estimating the mechanical stress σ on the Si body of a MOSFET is proposed. This method measures the doping concentration of the body, N_d, and the onset voltage V_hl for the high-level injection of the drain–body junction, uses N_d, the ideality factor η, and the Fermi potential ϕ_f ≈ V_hl/(2η) to calculate the intrinsic carrier concentration n_i of the Si body, and then uses the calculated n_i to obtain the bandgap energy E_g of the Si body. σ is estimated from E_g using the deformation potential theory. The estimates of σ agree well with those obtained using previous methods. The proposed method requires one MOSFET, whereas the others require at least two MOSFETs, so the proposed method can give an absolute measurement of σ on the Si body of a MOSFET.

  10. submitter Estimation of stepping motor current from long distances through cable-length-adaptive piecewise affine virtual sensor

    CERN Document Server

    Oliveri, Alberto; Masi, Alessandro; Storace, Marco

    2015-01-01

    In this paper a piecewise affine virtual sensor is used for the estimation of the motor-side current of hybrid stepper motors, which actuate the LHC (Large Hadron Collider) collimators at CERN. The estimation is performed starting from measurements of the current in the driver, which is connected to the motor by a long cable (up to 720 m). The measured current is therefore affected by noise and ringing phenomena. The proposed method does not require a model of the cable, since it is only based on measured data and can be used with cables of different length. A circuit architecture suitable for FPGA implementation has been designed and the effects of fixed point representation of data are analyzed.

  11. ESTIMATION OF INDUCED CURRENTS IN THE SHIELDS OF ELECTRICAL POWER CABLES WITH XLPE INSULATION

    Directory of Open Access Journals (Sweden)

    I. V. Oleksyuk

    2015-01-01

Full Text Available Electrical power cables with cross-linked polyethylene (XLPE) insulation are currently utilized in the design of electric-power supply systems for modern facilities. However, higher costs, an incomplete normative-technical basis for design, installation and maintenance, as well as certain constructional features of XLPE-insulated cable lines hinder their large-scale implementation. Cables with XLPE insulation are mostly produced in a single-conductor version provided with a composite copper shield whose cross-section may vary while the conductor cross-section remains uniform. Earthing the cable shields at both ends causes current to flow in them. Operational experience with XLPE-insulated cable lines has revealed that the currents induced in the cable shields can reach levels comparable with those in the conductor cores themselves. That, in turn, reduces the level of electrical safety and leads to cable-line failures and economic losses. The currents induced in the shields may occur both in symmetric (normal and emergency) and asymmetric operating modes of the power grid, with values of the induced currents reaching 80 % of the conductor-core currents. Many factors affect the level of the current induced in the shield: the mode of the midpoint (neutral) conductor, the magnitude of the longitudinal core currents in normal and emergency operating modes, the failure mode, the cross-section area of the shield, the mutual disposition of the cables, and the distance between them. The paper provides experimental confirmation of the shield current induced by the conductor-core current and reports its measured value. The author establishes that the induction of dangerous currents in the cable shields demands the elaboration of measures to reduce their level.

  12. MITRA Virtual laboratory for operative application of satellite time series for land degradation risk estimation

    Science.gov (United States)

    Nole, Gabriele; Scorza, Francesco; Lanorte, Antonio; Manzi, Teresa; Lasaponara, Rosa

    2015-04-01

This paper presents the development of a tool to integrate time series from active and passive satellite sensors (such as MODIS, SPOT Vegetation, Landsat, ASTER, COSMO, Sentinel) into a virtual laboratory to support studies on landscapes and archaeological landscapes, the investigation of environmental changes, and the estimation and monitoring of natural and anthropogenic risks. The virtual laboratory is composed of both data and open source tools specifically developed for the above-mentioned applications. Results obtained for investigations carried out using the implemented tools for monitoring land degradation issues and subtle changes ongoing in forestry and natural areas are herein presented. In detail, MODIS, SPOT Vegetation and Landsat time series were analyzed by comparing the results of different statistical analyses; the results were integrated with ancillary data and evaluated against field surveys. The comparison of the outputs we obtained for the Basilicata Region from the satellite data analyses and independent data sets clearly pointed out the reliability of the diverse change analyses we performed, at the pixel level, using MODIS, SPOT Vegetation and Landsat TM data. Next steps will further advance the current Virtual Laboratory tools by extending the current facilities, adding new computational algorithms and applying them to other geographic regions. Acknowledgement This research was performed within the framework of the project PO FESR Basilicata 2007/2013 - Progetto di cooperazione internazionale MITRA "Remote Sensing tecnologies for Natural and Cultural heritage Degradation Monitoring for Preservation and valorization" funded by Basilicata Region Reference 1. A. Lanorte, R Lasaponara, M Lovallo, L Telesca 2014 Fisher-Shannon information plane analysis of SPOT/VEGETATION Normalized Difference Vegetation Index (NDVI) time series to characterize vegetation recovery after fire disturbance International Journal of Applied Earth Observation and

  13. Using data clustering as a method of estimating the risk of establishment of bacterial crop diseases

    Directory of Open Access Journals (Sweden)

    Michael J. Watts

    2011-04-01

    Full Text Available Previous work has investigated the use of data clustering of regional species assemblages to estimate the relative risk of establishment of insect crop pest species. This paper describes the use of these techniques to estimate the risk posed by bacterial crop plant diseases. Two widely-used clustering algorithms, the Kohonen Self-Organising Map and the k-means clustering algorithm, were investigated. It describes how a wider variety of SOM architectures than previously used were investigated, and how both of these algorithms reacted to the addition of small amounts of random 'noise' to the species assemblages. The results indicate that the k-means clustering algorithm is much more computationally efficient, produces better clusters as determined by an objective measure of cluster quality and is more resistant to noise in the data than equivalent Kohonen SOM. Therefore k-means is considered to be the better algorithm for this problem.
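
    A minimal sketch of the clustering step, assuming a synthetic binary region-by-species presence/absence matrix in place of the real assemblage data: cluster the regional assemblages with k-means, score cluster quality with the silhouette coefficient (one objective measure of cluster quality; the paper's measure may differ), and repeat after flipping a small fraction of entries to mimic noisy records.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.metrics import silhouette_score

        rng = np.random.default_rng(3)

        # Synthetic presence/absence matrix: 120 regions x 40 pathogen/pest species.
        assemblages = (rng.random((120, 40)) < 0.3).astype(float)

        def cluster_quality(data, k=6, seed=0):
            km = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(data)
            return silhouette_score(data, km.labels_)

        print("silhouette, clean data:", round(cluster_quality(assemblages), 3))

        # Flip a small fraction of entries to mimic noise in the assemblage data.
        noisy = assemblages.copy()
        flip = rng.random(noisy.shape) < 0.02
        noisy[flip] = 1.0 - noisy[flip]
        print("silhouette, 2% noise  :", round(cluster_quality(noisy), 3))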

  14. Quantitative Cyber Risk Reduction Estimation Methodology for a Small Scada Control System

    Energy Technology Data Exchange (ETDEWEB)

    Miles A. McQueen; Wayne F. Boyer; Mark A. Flynn; George A. Beitel

    2006-01-01

    We propose a new methodology for obtaining a quick quantitative measurement of the risk reduction achieved when a control system is modified with the intent to improve cyber security defense against external attackers. The proposed methodology employs a directed graph called a compromise graph, where the nodes represent stages of a potential attack and the edges represent the expected time-to-compromise for differing attacker skill levels. Time-to-compromise is modeled as a function of known vulnerabilities and attacker skill level. The methodology was used to calculate risk reduction estimates for a specific SCADA system and for a specific set of control system security remedial actions. Despite an 86% reduction in the total number of vulnerabilities, the estimated time-to-compromise was increased only by about 3 to 30% depending on target and attacker skill level.
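
    A sketch of the compromise-graph calculation, under assumptions: attack stages become nodes of a directed graph, edge weights are expected times-to-compromise for a given attacker skill level, and the quantity of interest is the minimum total time from the initial stage to the target. The stages, times, and remedial action below are invented placeholders, not the SCADA system from the report.

        import networkx as nx

        def total_time_to_compromise(edges, source, target):
            """Minimum expected time-to-compromise along any attack path."""
            g = nx.DiGraph()
            g.add_weighted_edges_from(edges, weight="ttc_days")
            return nx.shortest_path_length(g, source, target, weight="ttc_days")

        # Hypothetical attack stages and expected times (days) for two skill levels.
        novice = [("internet", "dmz", 20), ("dmz", "control_lan", 35),
                  ("internet", "vpn", 60), ("vpn", "control_lan", 15),
                  ("control_lan", "plc", 25)]
        expert = [(a, b, t / 4) for a, b, t in novice]  # experts assumed ~4x faster

        before = total_time_to_compromise(novice, "internet", "plc")
        print("novice attacker, baseline system:", before, "days")

        # Remedial action: assume hardening the DMZ path doubles its compromise time.
        hardened = [(a, b, t * 2 if (a, b) == ("dmz", "control_lan") else t)
                    for a, b, t in novice]
        after = total_time_to_compromise(hardened, "internet", "plc")
        print("novice attacker, hardened system:", after, "days",
              f"({100 * (after - before) / before:.0f}% increase)")
        print("expert attacker, baseline system:",
              total_time_to_compromise(expert, "internet", "plc"), "days")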

  15. Marginal estimation for multi-stage models: waiting time distributions and competing risks analyses.

    Science.gov (United States)

    Satten, Glen A; Datta, Somnath

    2002-01-15

    We provide non-parametric estimates of the marginal cumulative distribution of stage occupation times (waiting times) and non-parametric estimates of marginal cumulative incidence function (proportion of persons who leave stage j for stage j' within time t of entering stage j) using right-censored data from a multi-stage model. We allow for stage and path dependent censoring where the censoring hazard for an individual may depend on his or her natural covariate history such as the collection of stages visited before the current stage and their occupation times. Additional external time dependent covariates that may induce dependent censoring can also be incorporated into our estimates, if available. Our approach requires modelling the censoring hazard so that an estimate of the integrated censoring hazard can be used in constructing the estimates of the waiting times distributions. For this purpose, we propose the use of an additive hazard model which results in very flexible (robust) estimates. Examples based on data from burn patients and simulated data with tracking are also provided to demonstrate the performance of our estimators.

  16. ESTIMATION OF PROCESSES REALIZATION RISK AS A MANNER OF SAFETY MANAGEMENT IN THE INTEGRATED SYSTEMS

    Directory of Open Access Journals (Sweden)

    Tatiana Karkoszka

    2011-06-01

Full Text Available Realization of quality, environmental, and occupational health and safety policy using the proposed model of integrated process-risk estimation leads to improvement of the analyzed production processes through preventive and corrective actions and, in consequence, to their optimization with respect to product quality, environmental impact, and occupational health and safety.

  17. Determining treatment needs at different spatial scales using geostatistical model-based risk estimates of schistosomiasis.

    Directory of Open Access Journals (Sweden)

    Nadine Schur

Full Text Available BACKGROUND: After many years of neglect, schistosomiasis control is going to scale. The strategy of choice is preventive chemotherapy, that is, the repeated large-scale administration of praziquantel (a safe and highly efficacious drug) to at-risk populations. The frequency of praziquantel administration is based on endemicity, which usually is defined by prevalence data summarized at an arbitrarily chosen administrative level. METHODOLOGY: For an ensemble of 29 West and East African countries, we determined the annualized praziquantel treatment needs for the school-aged population, adhering to World Health Organization guidelines. Different administrative levels of prevalence aggregation were considered: country, province, district, and pixel level. Previously published results on spatially explicit schistosomiasis risk in the selected countries were employed to classify each area into distinct endemicity classes that govern the frequency of praziquantel administration. PRINCIPAL FINDINGS: Estimates of infection prevalence adjusted for the school-aged population in 2010 revealed that most countries are classified as moderately endemic for schistosomiasis (prevalence 10-50%), while four countries (i.e., Ghana, Liberia, Mozambique, and Sierra Leone) are highly endemic (>50%). Overall, 72.7 million annualized praziquantel treatments (50% confidence interval (CI): 68.8-100.7 million) are required for the school-aged population if country-level schistosomiasis prevalence estimates are considered, and 81.5 million treatments (50% CI: 67.3-107.5 million) if estimation is based on a more refined spatial scale at the provincial level. CONCLUSIONS/SIGNIFICANCE: Praziquantel treatment needs may be over- or underestimated depending on the level of spatial aggregation. The distribution of schistosomiasis in Ethiopia, Liberia, Mauritania, Uganda, and Zambia is rather uniform, and hence country-level risk estimates are sufficient to calculate treatment needs. On the
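
    The aggregation effect reported above can be sketched by classifying each spatial unit into an endemicity class and summing annualized treatments over its school-aged population. The populations, prevalences, and per-class annualization factors below are illustrative placeholders (WHO guidance keys treatment frequency to the <10%, 10-50%, and >50% prevalence classes, but the exact factors should be taken from the guidelines themselves).

        # Hypothetical provinces of one country: (school-age population, prevalence %).
        provinces = [(400_000, 5.0), (650_000, 22.0), (300_000, 61.0), (250_000, 35.0)]

        # Illustrative annualized treatments per school-aged child by endemicity class.
        # Placeholder values chosen for the example, not the official WHO figures.
        def annual_factor(prevalence):
            if prevalence >= 50.0:   # high endemicity
                return 1.0
            if prevalence >= 10.0:   # moderate endemicity
                return 0.5
            return 0.2               # low endemicity

        # Province-level aggregation.
        province_needs = sum(pop * annual_factor(prev) for pop, prev in provinces)

        # Country-level aggregation: one population-weighted prevalence for everyone.
        total_pop = sum(pop for pop, _ in provinces)
        country_prev = sum(pop * prev for pop, prev in provinces) / total_pop
        country_needs = total_pop * annual_factor(country_prev)

        print(f"country-level prevalence   : {country_prev:.1f}%")
        print(f"treatments (country level) : {country_needs:,.0f}")
        print(f"treatments (province level): {province_needs:,.0f}")

    Even in this toy setting the two aggregation levels give different annual treatment totals, which is the over- or underestimation effect the abstract describes.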

  18. Fundamentals of Semantic Web Technologies in Medical Environments: a case in breast cancer risk estimation

    CERN Document Server

    Huerga, Iker; Gerrikagoitia, Jon Kepa

    2010-01-01

Estimating the risk of developing breast cancer serves as a first-line prevention measure supporting early diagnosis. Furthermore, data integration across the different departments involved in the process plays a key role. In order to guarantee patient safety, the whole process should be orchestrated and monitored automatically. The solution is supported by a linked data cloud, composed of all the departments that take part in the process, combined with rule engines.

  19. New Estimates of the Duration and Risk of Unemployment for West-Germany

    OpenAIRE

    Wilke, Ralf A.

    2004-01-01

This paper analyzes changes in the risk of unemployment and changes in the distribution of unemployment duration for the 26 to 41 year old working population in West-Germany during the 1980s and 1990s. The comprehensive IAB employment subsample 1975-1997 is used for the analysis. It contains employment and unemployment trajectories of about 500,000 individuals from West-Germany. The application of flexible nonparametric estimators yields results which are less sensitive to misspecification...

  20. Impact of neutrino properties on the estimation of inflationary parameters from current and future observations

    CERN Document Server

    Gerbino, Martina; Vagnozzi, Sunny; Lattanzi, Massimiliano; Mena, Olga; Giusarma, Elena; Ho, Shirley

    2016-01-01

We study the impact of assumptions about neutrino properties on the estimation of inflationary parameters from cosmological data, with a specific focus on the allowed contours in the $n_s/r$ plane. We study the following neutrino properties: (i) the total neutrino mass $M_\nu$ ...

  1. Estimating the risks of cancer mortality and genetic defects resulting from exposures to low levels of ionizing radiation

    Energy Technology Data Exchange (ETDEWEB)

    Buhl, T.E.; Hansen, W.R.

    1984-05-01

    Estimators for calculating the risk of cancer and genetic disorders induced by exposure to ionizing radiation have been recommended by the US National Academy of Sciences Committee on the Biological Effects of Ionizing Radiations, the UN Scientific Committee on the Effects of Atomic Radiation, and the International Committee on Radiological Protection. These groups have also considered the risks of somatic effects other than cancer. The US National Council on Radiation Protection and Measurements has discussed risk estimate procedures for radiation-induced health effects. The recommendations of these national and international advisory committees are summarized and compared in this report. Based on this review, two procedures for risk estimation are presented for use in radiological assessments performed by the US Department of Energy under the National Environmental Policy Act of 1969 (NEPA). In the first procedure, age- and sex-averaged risk estimators calculated with US average demographic statistics would be used with estimates of radiation dose to calculate the projected risk of cancer and genetic disorders that would result from the operation being reviewed under NEPA. If more site-specific risk estimators are needed, and the demographic information is available, a second procedure is described that would involve direct calculation of the risk estimators using recommended risk-rate factors. The computer program REPCAL has been written to perform this calculation and is described in this report. 25 references, 16 tables.

  2. Current and emerging strategies in the management of venous thromboembolism: benefit-risk assessment of dabigatran.

    Science.gov (United States)

    Fanola, Christina L

    2015-01-01

    Venous thromboembolism (VTE) is a disease state that carries significant morbidity and mortality, and is a known cause of preventable death in hospitalized and orthopedic surgical patients. There are many identifiable risk factors for VTE, yet up to half of VTE incident cases have no identifiable risk factor and carry a high likelihood of recurrence, which may warrant extended therapy. For many years, parenteral unfractionated heparin, low-molecular weight heparin, fondaparinux, and oral vitamin K antagonists (VKAs) have been the standard of care in VTE management. However, limitations in current drug therapy options have led to suboptimal treatment, so there has been a need for rapid-onset, fixed-dosing novel oral anticoagulants in both VTE treatment and prophylaxis. Oral VKAs have historically been challenging to use in clinical practice, with their narrow therapeutic range, unpredictable dose responsiveness, and many drug-drug and drug-food interactions. As such, there has also been a need for novel anticoagulant therapies with fewer limitations, which has recently been met. Dabigatran etexilate is a fixed-dose oral direct thrombin inhibitor available for use in acute and extended treatment of VTE, as well as prophylaxis in high-risk orthopedic surgical patients. In this review, the risks and overall benefits of dabigatran in VTE management are addressed, with special emphasis on clinical trial data and their application to general clinical practice and special patient populations. Current and emerging therapies in the management of VTE and monitoring of dabigatran anticoagulant-effect reversal are also discussed.

  3. Current Conditions Risk Assessment for the 300-FF-5 Groundwater Operable Unit

    Energy Technology Data Exchange (ETDEWEB)

    Miley, Terri B.; Bunn, Amoret L.; Napier, Bruce A.; Peterson, Robert E.; Becker, James M.

    2007-11-01

    This report updates a baseline risk assessment for the 300 Area prepared in 1994. The update includes consideration of changes in contaminants of interest and in the environment that have occurred during the period of interim remedial action, i.e., 1996 to the present, as well as the sub-regions, for which no initial risk assessments have been conducted. In 1996, a record of decision (ROD) stipulated interim remedial action for groundwater affected by releases from 300 Area sources, as follows: (a) continued monitoring of groundwater that is contaminated above health-based levels to ensure that concentrations continue to decrease, and (b) institutional controls to ensure that groundwater use is restricted to prevent unacceptable exposure to groundwater contamination. In 2000, the groundwater beneath the two outlying sub-regions was added to the operable unit. In 2001, the first 5-year review of the ROD found that the interim remedy and remedial action objectives were still appropriate, although the review called for additional characterization activities. This report includes a current conditions baseline ecological and human health risk assessment using maximum concentrations in the environmental media of the 300-FF-5 Operable Unit and downstream conditions at the City of Richland, Washington. The scope for this assessment includes only current measured environmental concentrations and current use scenarios. Future environmental concentrations and future land uses are not considered in this assessment.

  4. Health Risk Assessment of Dietary Cadmium Intake: Do Current Guidelines Indicate How Much is Safe?

    Science.gov (United States)

    Satarug, Soisungwan; Vesey, David A.; Gobe, Glenda C.

    2017-01-01

Background: Cadmium (Cd), a food-chain contaminant, is a significant health hazard. The kidney is one of the primary sites of injury after chronic Cd exposure. Kidney-based risk assessment establishes the urinary Cd threshold at 5.24 μg/g creatinine, and tolerable dietary intake of Cd at 62 μg/day per 70-kg person. However, cohort studies show that dietary Cd intake below the threshold limit and tolerable levels may increase the risk of death from cancer, cardiovascular disease, and Alzheimer’s disease. Objective: We evaluated whether the current tolerable dietary Cd intake guideline and urinary Cd threshold limit provide sufficient health protection. Discussion: Staple foods constitute 40–60% of total dietary Cd intake by average consumers. Diets high in shellfish, crustaceans, mollusks, spinach, and offal add to dietary Cd sources. Modeling studies predict that the current tolerable dietary intake corresponds to urinary Cd of 0.70–1.85 μg/g creatinine in men and 0.95–3.07 μg/g creatinine in women. Urinary Cd levels below the current threshold limit may therefore not assure protection from this pervasive toxic metal. Citation: Satarug S, Vesey DA, Gobe GC. 2017. Health risk assessment of dietary cadmium intake: do current guidelines indicate how much is safe? Environ Health Perspect 125:284–288; http://dx.doi.org/10.1289/EHP108 PMID:28248635

  5. [Development and Validation of Estimate Equations for Adverse Drug Reactions Using Risk Factors and Subjective Symptoms].

    Science.gov (United States)

    Suzuki, Ryohei; Ohtsu, Fumiko; Goto, Nobuyuki

    2015-01-01

      The purpose of this study was to develop and validate estimate equations for preventing adverse drug reactions (ADRs). We conducted five case-control studies to identify individual risk factors and subjective symptoms associated with the following five ADRs: drug-induced ischemic heart disease; renal damage; muscle disorder; interstitial pneumonia; and leucopenia. We performed logistic regression analysis and obtained eight regression equations for each ADR. We converted these to ADR estimate equations for predicting the likelihood of ADRs. We randomly selected 50 cases with non-individual ADRs from the Case Reports of Adverse Drug Reactions and Poisoning Information System (CARPIS) database of over 65000 case reports of ADRs, and assigned these cases to a validation case group. We then calculated the predictive probability for 50 cases using the eight estimate equations for each ADR. The highest probability for each ADR was set as the probability of each ADR. If the probability was over 50%, the case was interpreted as ADR-positive. We calculated and evaluated the sensitivity, specificity, and positive likelihood ratio of this system. Sensitivity of the estimate equations for muscle disorder and interstitial pneumonia were ≥90%. Specificity and positive likelihood ratios of estimate equations for renal damage, interstitial pneumonia and leucopenia were ≥80% and ≥5, respectively. Our estimate equations thus showed high validity, and are therefore helpful for the prevention or early detection of ADRs.
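
    A minimal sketch of the estimate-equation idea, under assumptions: a logistic regression of ADR occurrence on binary risk factors and subjective symptoms supplies a predictive probability, a case is interpreted as ADR-positive when that probability exceeds 50%, and validity is summarized by sensitivity, specificity, and the positive likelihood ratio. The data are synthetic, not CARPIS case reports.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(4)

        # Synthetic case data: binary risk factors / subjective symptoms (placeholders).
        n = 400
        X = rng.integers(0, 2, size=(n, 4)).astype(float)
        logit = -2.0 + 1.5 * X[:, 0] + 1.0 * X[:, 1] + 0.5 * X[:, 2]
        y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)   # ADR yes/no

        # "Estimate equation": fitted logistic model giving a predictive probability.
        model = LogisticRegression().fit(X, y)
        prob = model.predict_proba(X)[:, 1]
        positive = prob > 0.5          # interpret probability over 50% as ADR-positive

        tp = np.sum(positive & (y == 1))
        fn = np.sum(~positive & (y == 1))
        tn = np.sum(~positive & (y == 0))
        fp = np.sum(positive & (y == 0))
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        print(f"sensitivity={sensitivity:.2f}  specificity={specificity:.2f}  "
              f"LR+={sensitivity / (1.0 - specificity):.1f}")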

  6. Estimating ionospheric currents by inversion from ground-based geomagnetic data and calculating geoelectric fields for studies of geomagnetically induced currents

    Science.gov (United States)

    de Villiers, J. S.; Pirjola, R. J.; Cilliers, P. J.

    2016-09-01

    This research focuses on the inversion of geomagnetic variation field measurements to obtain the source currents in the ionosphere and magnetosphere, and to determine the geoelectric fields at the Earth's surface. During geomagnetic storms, the geoelectric fields create geomagnetically induced currents (GIC) in power networks. These GIC may disturb the operation of power systems, cause damage to power transformers, and even result in power blackouts. In this model, line currents running east-west along given latitudes are postulated to exist at a certain height above the Earth's surface. This physical arrangement results in the fields on the ground being composed of a zero magnetic east component and a nonzero electric east component. The line current parameters are estimated by inverting Fourier integrals (over wavenumber) of elementary geomagnetic fields using the Levenberg-Marquardt technique. The output parameters of the model are the ionospheric current strength and the geoelectric east component at the Earth's surface. A conductivity profile of the Earth is adapted from a shallow layered-Earth model for one observatory, together with a deep-layer model derived from satellite observations. This profile is used to obtain the ground surface impedance and therefore the reflection coefficient in the integrals. The inputs for the model are a spectrum of the geomagnetic data for 31 May 2013. The output parameters of the model are spectrums of the ionospheric current strength and of the surface geoelectric field. The inverse Fourier transforms of these spectra provide the time variations on the same day. The geoelectric field data can be used as a proxy for GIC in the prediction of GIC for power utilities. The current strength data can assist in the interpretation of upstream solar wind behaviour.
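
    The forward model behind this kind of inversion can be reduced to a toy version: a single infinite east-west line current above a flat, non-conducting ground (induced image currents and the layered-Earth impedance are ignored), with SciPy's Levenberg-Marquardt least-squares routine recovering current strength and height from noisy ground magnetic data. All numbers are synthetic placeholders, not the 31 May 2013 observations.

        import numpy as np
        from scipy.optimize import least_squares

        MU0 = 4e-7 * np.pi  # vacuum permeability (H/m)

        def ground_b_north(params, x_north_m):
            """Northward B (T) on the ground below an E-W line current.

            params = (current_A, height_m); field of an infinite straight wire,
            keeping only the horizontal (north) component at the surface.
            """
            current, height = params
            r2 = x_north_m**2 + height**2
            return MU0 * current * height / (2.0 * np.pi * r2)

        # Synthetic "observations": a 0.5 MA electrojet at 110 km altitude, plus noise.
        rng = np.random.default_rng(5)
        x_obs = np.linspace(-600e3, 600e3, 41)                 # ground profile (m)
        b_true = ground_b_north((5e5, 110e3), x_obs)
        b_obs = b_true + rng.normal(0.0, 5e-9, x_obs.size)     # 5 nT noise

        # Invert for current strength and height (Levenberg-Marquardt).
        fit = least_squares(lambda p: ground_b_north(p, x_obs) - b_obs,
                            x0=(1e5, 150e3), method="lm")
        print("estimated current: %.2e A, height: %.0f km" % (fit.x[0], fit.x[1] / 1e3))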

  7. Practical risk estimation method and applications of aerial photographs for preventing slope disasters in railways; Tetsudo ni okeru shamen saigai to sono taiosaku

    Energy Technology Data Exchange (ETDEWEB)

    Sugiyama, T.; Noguchi, T. [Railway Technical Research Institute, Tokyo (Japan)

    1997-12-01

This paper describes the actual state of slope disasters in railways and the current status of the measures taken to prevent them. It also describes present and prospective countermeasures, focusing on two points: (1) a recently developed method for evaluating risks in embankment and cutting slopes, and (2) an information management technology that uses remote sensing to prevent slope-collapse disasters, together with an attempt to estimate locations at risk of collapse using this technology. A method has been developed for evaluating rain-disaster risk that can quantify the rain-resisting strength of slopes, reflect that strength in operation rules, and estimate disasters in real time. The method uses a statistical approach to estimate the critical rainfall for collapse (the triggering cause) from the predisposing causes of embankment and cutting slopes, based on records of nationwide disasters. The application of the latest remote sensing technologies to slopes along railway lines is discussed. Furthermore, an attempt was made to estimate collapse risks along railway lines utilizing slope information extracted from these remote sensing applications. 13 refs., 12 figs., 7 tabs.

  8. Estimating risks of importation and local transmission of Zika virus infection

    Directory of Open Access Journals (Sweden)

    Kyeongah Nah

    2016-04-01

Full Text Available Background. An international spread of Zika virus (ZIKV) infection has attracted global attention. ZIKV is conveyed by a mosquito vector, Aedes species, which also acts as the vector species of dengue and chikungunya viruses. Methods. Arrival time of ZIKV importation (i.e., the time at which the first imported case was diagnosed) in each imported country was collected from publicly available data sources. Employing a survival analysis model in which the hazard is an inverse function of the effective distance as informed by the airline transportation network data, and using dengue and chikungunya virus transmission data, risks of importation and local transmission were estimated. Results. A total of 78 countries with imported case(s) have been identified, with the arrival time ranging from 1 to 44 weeks since the first ZIKV was identified in Brazil, 2015. Whereas the risk of importation was well explained by the airline transportation network data, the risk of local transmission appeared to be best captured by additionally accounting for the presence of dengue and chikungunya viruses. Discussion. The risk of importation may be high given continued global travel of mildly infected travelers but, considering that the public health concerns over ZIKV infection stems from microcephaly, it is more important to focus on the risk of local and widespread transmission that could involve pregnant women. The predicted risk of local transmission was frequently seen in tropical and subtropical countries with dengue or chikungunya epidemic experience.

  9. Estimating risks of importation and local transmission of Zika virus infection.

    Science.gov (United States)

    Nah, Kyeongah; Mizumoto, Kenji; Miyamatsu, Yuichiro; Yasuda, Yohei; Kinoshita, Ryo; Nishiura, Hiroshi

    2016-01-01

    Background. An international spread of Zika virus (ZIKV) infection has attracted global attention. ZIKV is conveyed by a mosquito vector, Aedes species, which also acts as the vector species of dengue and chikungunya viruses. Methods. Arrival time of ZIKV importation (i.e., the time at which the first imported case was diagnosed) in each imported country was collected from publicly available data sources. Employing a survival analysis model in which the hazard is an inverse function of the effective distance as informed by the airline transportation network data, and using dengue and chikungunya virus transmission data, risks of importation and local transmission were estimated. Results. A total of 78 countries with imported case(s) have been identified, with the arrival time ranging from 1 to 44 weeks since the first ZIKV was identified in Brazil, 2015. Whereas the risk of importation was well explained by the airline transportation network data, the risk of local transmission appeared to be best captured by additionally accounting for the presence of dengue and chikungunya viruses. Discussion. The risk of importation may be high given continued global travel of mildly infected travelers but, considering that the public health concerns over ZIKV infection stems from microcephaly, it is more important to focus on the risk of local and widespread transmission that could involve pregnant women. The predicted risk of local transmission was frequently seen in tropical and subtropical countries with dengue or chikungunya epidemic experience.
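
    A simplified sketch of the survival model described above, under assumptions: each country's hazard of receiving its first imported case is constant in time and inversely proportional to its effective distance from the source country, countries without a reported case are right-censored at 44 weeks, and a single rate constant is estimated by maximum likelihood. The effective distances and arrival times are synthetic placeholders, not the airline-network data.

        import numpy as np
        from scipy.optimize import minimize_scalar

        rng = np.random.default_rng(6)

        # Synthetic data: effective distance from the source country and either the
        # arrival week of the first imported case or a censoring time of 44 weeks.
        n = 120
        d_eff = rng.uniform(2.0, 15.0, n)
        true_lambda = 3.0
        t_event = rng.exponential(d_eff / true_lambda)      # hazard_i = lambda / d_eff_i
        observed = t_event <= 44.0
        t = np.minimum(t_event, 44.0)

        def neg_log_lik(lam):
            haz = lam / d_eff
            # events contribute log(hazard) - cumulative hazard; censored only the latter
            return -(np.sum(np.log(haz[observed])) - np.sum(haz * t))

        fit = minimize_scalar(neg_log_lik, bounds=(1e-3, 50.0), method="bounded")
        lam_hat = fit.x
        print(f"estimated rate constant: {lam_hat:.2f} (true {true_lambda})")

        # Predicted probability of importation within one year for a new country.
        d_new = 6.0
        print("P(arrival <= 52 weeks | d_eff=6):",
              round(1.0 - np.exp(-lam_hat / d_new * 52.0), 3))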

  10. A Study on the Estimation Method of Risk Based Area for Jetty Safety Monitoring

    Directory of Open Access Journals (Sweden)

    Byeong-Wook Nam

    2015-09-01

    Full Text Available Recently, the importance of safety-monitoring systems was highlighted by the unprecedented collision between a ship and a jetty in Yeosu. Accordingly, in this study, we introduce the concept of risk based area and develop a methodology for a jetty safety-monitoring system. By calculating the risk based areas for a ship and a jetty, the risk of collision was evaluated. To calculate the risk based areas, we employed an automatic identification system for the ship, stopping-distance equations, and the regulation velocity near the jetty. In this paper, we suggest a risk calculation method for jetty safety monitoring that can determine the collision probability in real time and predict collisions using the amount of overlap between the two calculated risk based areas. A test was conducted at a jetty control center at GS Caltex, and the effectiveness of the proposed risk calculation method was verified. The method is currently applied to the jetty-monitoring system at GS Caltex in Yeosu for the prevention of collisions.
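
    A toy version of the risk-based-area check, under assumptions: both risk-based areas are approximated as circles, the ship's radius grows with a stopping distance computed from a simple constant-deceleration formula (a stand-in for the paper's stopping-distance equations and regulation velocity), and a potential collision is flagged when the circles overlap. Positions, speeds, and the deceleration value are placeholders.

        import math

        def stopping_distance_m(speed_knots, decel_ms2=0.05):
            """Rough stopping distance assuming constant deceleration (placeholder model)."""
            v = speed_knots * 0.5144  # knots -> m/s
            return v * v / (2.0 * decel_ms2)

        def risk_areas_overlap(ship_pos, ship_speed_knots, jetty_pos,
                               ship_margin_m=50.0, jetty_radius_m=150.0):
            """True if the ship's and the jetty's circular risk-based areas intersect."""
            ship_radius = ship_margin_m + stopping_distance_m(ship_speed_knots)
            dx = ship_pos[0] - jetty_pos[0]
            dy = ship_pos[1] - jetty_pos[1]
            return math.hypot(dx, dy) <= ship_radius + jetty_radius_m

        # Example: ship approaching the jetty at 8 knots (AIS-style position/speed input).
        print(round(stopping_distance_m(8.0)))                        # ~169 m
        print(risk_areas_overlap((0.0, 0.0), 8.0, (1500.0, 0.0)))     # False -> no alarm
        print(risk_areas_overlap((0.0, 0.0), 8.0, (300.0, 0.0)))      # True  -> collision risk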

  11. Cancer risk estimates from radiation therapy for heterotopic ossification prophylaxis after total hip arthroplasty

    Energy Technology Data Exchange (ETDEWEB)

    Mazonakis, Michalis; Berris, Theoharris; Damilakis, John [Department of Medical Physics, Faculty of Medicine, University of Crete, P.O. Box 2208, 71003 Iraklion, Crete (Greece); Lyraraki, Efrossyni [Department of Radiotherapy and Oncology, University Hospital of Iraklion, 71110 Iraklion, Crete (Greece)

    2013-10-15

Purpose: Heterotopic ossification (HO) is a frequent complication following total hip arthroplasty. This study was conducted to calculate the radiation dose to organs-at-risk and estimate the probability of cancer induction from radiotherapy for HO prophylaxis. Methods: Hip irradiation for HO with a 6 MV photon beam was simulated with the aid of a Monte Carlo model. A realistic humanoid phantom representing an average adult patient was implemented in the Monte Carlo environment for dosimetric calculations. The average out-of-field radiation dose to stomach, liver, lung, prostate, bladder, thyroid, breast, uterus, and ovary was calculated. The organ-equivalent-dose to colon, that was partly included within the treatment field, was also determined. Organ dose calculations were carried out using three different field sizes. The dependence of organ doses upon the block insertion into the primary beam for shielding colon and prosthesis was investigated. The lifetime attributable risk for cancer development was estimated using organ, age, and gender-specific risk coefficients. Results: For a typical target dose of 7 Gy, organ doses varied from 1.0 to 741.1 mGy by the field dimensions and organ location relative to the field edge. Blocked field irradiations resulted in a dose range of 1.4–146.3 mGy. The most probable detriment from open field treatment of male patients was colon cancer with a high risk of 564.3 × 10⁻⁵ to 837.4 × 10⁻⁵ depending upon the organ dose magnitude and the patient's age. The corresponding colon cancer risk for female patients was (372.2–541.0) × 10⁻⁵. The probability of bladder cancer development was more than 113.7 × 10⁻⁵ and 110.3 × 10⁻⁵ for males and females, respectively. The cancer risk range to other individual organs was reduced to (0.003–68.5) × 10⁻⁵. Conclusions: The risk for cancer induction from radiation therapy for HO prophylaxis after total hip arthroplasty varies considerably by
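
    The final step of such an estimate, combining organ doses with age- and gender-specific risk coefficients into a lifetime attributable risk, reduces to a weighted sum. The sketch below uses arbitrary placeholder doses and coefficients, not values from this study or from ICRP/BEIR tables.

        # Lifetime attributable risk (LAR) as a sum of organ dose x risk coefficient.
        # All numbers below are illustrative placeholders.
        organ_dose_mgy = {"colon": 700.0, "bladder": 150.0, "stomach": 40.0, "liver": 12.0}
        # Hypothetical age/sex-specific LAR coefficients, cases per 100,000 per mGy.
        lar_per_100k_per_mgy = {"colon": 1.1, "bladder": 0.8, "stomach": 0.6, "liver": 0.3}

        lar_by_organ = {o: organ_dose_mgy[o] * lar_per_100k_per_mgy[o]
                        for o in organ_dose_mgy}
        for organ, lar in sorted(lar_by_organ.items(), key=lambda kv: -kv[1]):
            print(f"{organ:8s} {lar:8.1f} per 100,000")
        print(f"total    {sum(lar_by_organ.values()):8.1f} per 100,000")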

  12. Genetic risk and longitudinal disease activity in systemic lupus erythematosus using targeted maximum likelihood estimation.

    Science.gov (United States)

    Gianfrancesco, M A; Balzer, L; Taylor, K E; Trupin, L; Nititham, J; Seldin, M F; Singer, A W; Criswell, L A; Barcellos, L F

    2016-09-01

Systemic lupus erythematosus (SLE) is a chronic autoimmune disease associated with genetic and environmental risk factors. However, the extent to which genetic risk is causally associated with disease activity is unknown. We utilized longitudinal targeted maximum likelihood estimation to estimate the causal association between a genetic risk score (GRS) comprising 41 established SLE variants and clinically important disease activity as measured by the validated Systemic Lupus Activity Questionnaire (SLAQ) in a multiethnic cohort of 942 individuals with SLE. We did not find evidence of a clinically important SLAQ score difference (>4.0) for individuals with a high GRS compared with those with a low GRS across nine time points after controlling for sex, ancestry, renal status, dialysis, disease duration, treatment, depression, smoking and education, as well as time-dependent confounding of missing visits. Individual single-nucleotide polymorphism (SNP) analyses revealed that 12 of the 41 variants were significantly associated with clinically relevant changes in SLAQ scores across time points eight and nine after controlling for multiple testing. Results based on sophisticated causal modeling of longitudinal data in a large patient cohort suggest that individual SLE risk variants may influence disease activity over time. Our findings also emphasize a role for other biological or environmental factors.

  13. Evidence that Risk Adjustment is Unnecessary in Estimates of the User Cost of Money

    Directory of Open Access Journals (Sweden)

    Diego A. Restrepo-Tobón

    2015-12-01

    Full Text Available Investors value the  special attributes of monetary assets (e.g.,  exchangeability, liquidity, and safety  and pay a premium for holding them in the form of a lower return rate. The user cost of holding monetary assets can be measured approximately by the difference between the  returns on illiquid risky assets and  those of safer liquid assets. A more appropriate measure should adjust this difference by the  differential risk of the  assets in question. We investigate the  impact that time  non-separable preferences has on the  estimation of the  risk-adjusted user cost of money. Using U.K. data from 1965Q1 to 2011Q1, we estimate a habit-based asset pricing model  with money  in the utility function and  find that the  risk  adjustment for risky monetary assets is negligible. Thus, researchers can dispense with risk adjusting the  user cost of money  in constructing monetary aggregate indexes.

  14. Estimates of Radiation Doses and Cancer Risk from Food Intake in Korea.

    Science.gov (United States)

    Moon, Eun-Kyeong; Ha, Wi-Ho; Seo, Songwon; Jin, Young Woo; Jeong, Kyu Hwan; Yoon, Hae-Jung; Kim, Hyoung-Soo; Hwang, Myung-Sil; Choi, Hoon; Lee, Won Jin

    2016-01-01

The aim of this study was to estimate internal radiation doses and lifetime cancer risk from food ingestion. Radiation doses from food intake were calculated using the Korea National Health and Nutrition Examination Survey and the measured radioactivity of (134)Cs, (137)Cs, and (131)I from the Ministry of Food and Drug Safety in Korea. The total number of measurements was 8,496 (3,643 for agricultural products, 644 for livestock products, 43 for milk products, 3,193 for marine products, and 973 for processed food). Cancer risk was calculated by multiplying the estimated committed effective dose by the detriment-adjusted nominal risk coefficients recommended by the International Commission on Radiological Protection. The lifetime committed effective doses from the daily diet range from 2.957 to 3.710 mSv. Excess lifetime cancer risks are 14.4-18.1, 0.4-0.5, and 1.8-2.3 per 100,000 for all solid cancers combined, thyroid cancer, and leukemia, respectively.
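
    The risk calculation described here is the product of the committed effective dose and a detriment-adjusted nominal risk coefficient. The sketch below reproduces the order of magnitude with the ICRP Publication 103 whole-population coefficient of about 5.5 × 10⁻² per Sv for all cancers combined; the study's age-, sex- and site-specific coefficients will differ, so this is only a plausibility check.

        # Lifetime committed effective dose from the daily diet (from the abstract), in mSv.
        dose_msv_low, dose_msv_high = 2.957, 3.710

        # ICRP 103 detriment-adjusted nominal risk coefficient, whole population,
        # all cancers combined (~5.5e-2 per Sv); site-specific values would differ.
        risk_per_sv = 5.5e-2

        for dose_msv in (dose_msv_low, dose_msv_high):
            excess_risk = dose_msv * 1e-3 * risk_per_sv    # mSv -> Sv, then x coefficient
            print(f"dose {dose_msv:.3f} mSv -> excess lifetime cancer risk "
                  f"≈ {excess_risk * 1e5:.1f} per 100,000")

    This rough calculation gives roughly 16-20 per 100,000, the same order of magnitude as the 14.4-18.1 per 100,000 reported above for all solid cancers combined.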

  15. Injury Risk Estimation Expertise: Cognitive-Perceptual Mechanisms of ACL-IQ.

    Science.gov (United States)

    Petushek, Erich J; Cokely, Edward T; Ward, Paul; Myer, Gregory D

    2015-06-01

    Instrument-based biomechanical movement analysis is an effective injury screening method but relies on expensive equipment and time-consuming analysis. Screening methods that rely on visual inspection and perceptual skill for prognosticating injury risk provide an alternative approach that can significantly reduce cost and time. However, substantial individual differences exist in skill when estimating injury risk performance via observation. The underlying perceptual-cognitive mechanisms of injury risk identification were explored to better understand the nature of this skill and provide a foundation for improving performance. Quantitative structural and process modeling of risk estimation indicated that superior performance was largely mediated by specific strategies and skills (e.g., irrelevant information reduction), and independent of domain-general cognitive abilities (e.g., mental rotation, general decision skill). These cognitive models suggest that injury prediction expertise (i.e., ACL-IQ) is a trainable skill, and provide a foundation for future research and applications in training, decision support, and ultimately clinical screening investigations.

  16. Relative risk estimation for malaria disease mapping based on stochastic SIR-SI model in Malaysia

    Science.gov (United States)

    Samat, Nor Azah; Ma'arof, Syafiqah Husna Mohd Imam

    2016-10-01

Disease mapping is a study of the geographical distribution of a disease, representing the epidemiological data spatially. The production of maps is important to identify areas that deserve closer scrutiny or more attention. In this study, a mosquito-borne disease, malaria, is the focus of our application. Malaria is caused by parasites of the genus Plasmodium and is transmitted to people through the bites of infected female Anopheles mosquitoes. Precautionary steps need to be considered in order to prevent malaria from spreading around the world, especially in tropical and subtropical countries, which would otherwise see an increase in the number of malaria cases. Thus, the purpose of this paper is to discuss a stochastic model employed to estimate the relative risk of malaria in Malaysia. The outcomes of the analysis include a malaria risk map for all 16 states in Malaysia, revealing the high and low risk areas of malaria occurrence.

  17. Risk and size estimation of debris flow caused by storm rainfall in mountain regions

    Institute of Scientific and Technical Information of China (English)

    CHENG Genwei

    2003-01-01

    Debris flow is a common disaster in mountain regions. The valley slope, storm rainfall, and accumulated sand and rock materials in a watershed may influence the type of debris flow. The occurrence of debris flow is not a purely random event: field investigations suggest periodicity in its occurrence, although no direct evidence has been found yet. A risk definition for debris flow is proposed here based upon the accumulation and the starting conditions of loose material in the channel. According to this definition, the risk of debris flow is quasi-periodic. A formula for risk estimation is derived. Analysis of the relevant factors reveals the relationship between the frequency and size of debris flows: for a given debris-flow creek, the longer the interval between two occurrences, the larger the subsequent event will be.

  18. Performance of models for estimating absolute risk difference in multicenter trials with binary outcome

    Directory of Open Access Journals (Sweden)

    Claudia Pedroza

    2016-08-01

    Full Text Available Abstract Background Reporting of the absolute risk difference (RD) is recommended for clinical and epidemiological prospective studies. In analyses of multicenter studies, adjustment for center is necessary when randomization is stratified by center or when there is large variation in patient outcomes across centers. While regression methods are used to estimate RD adjusted for baseline predictors and clustering, no formal evaluation of their performance has previously been conducted. Methods We performed a simulation study to evaluate 6 regression methods fitted under a generalized estimating equation framework: binomial identity, Poisson identity, Normal identity, log binomial, log Poisson, and logistic regression model. We compared the model estimates to unadjusted estimates. We varied the true response function (identity or log), number of subjects per center, true risk difference, control outcome rate, effect of the baseline predictor, and intracenter correlation. We compared the models in terms of convergence, absolute bias, and coverage of 95 % confidence intervals for RD. Results The 6 models performed very similarly to one another for the majority of scenarios. However, the log binomial model did not converge for a large portion of the scenarios that included a baseline predictor. In scenarios with outcome rates close to the parameter boundary, the binomial and Poisson identity models had the best performance, but differences from other models were negligible. The unadjusted method introduced little bias into the RD estimates, but its coverage was larger than the nominal value in some scenarios with an identity response. Under the log response, coverage from the unadjusted method was well below the nominal value (<80 %) for some scenarios. Conclusions We recommend the use of a binomial or Poisson GEE model with identity link to estimate RD for correlated binary outcome data. If these models fail to run, then either a logistic regression, log Poisson
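
    A minimal sketch of the recommended analysis, assuming a data set with columns y, treat, x and center (all hypothetical): a GEE with a binomial family and identity link, so that the treatment coefficient is the center-adjusted risk difference. Class names follow recent statsmodels releases (older versions spell the link `identity`), and identity-link binomial fits can fail to converge when fitted probabilities approach the 0/1 boundary, as the record notes.

```python
# Sketch of the recommended approach: a GEE with identity link so the treatment
# coefficient is the adjusted risk difference. The simulated data frame columns
# (y, treat, x, center) are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_centers, n_per = 20, 50
center = np.repeat(np.arange(n_centers), n_per)
treat = rng.integers(0, 2, size=center.size)
x = rng.normal(size=center.size)
center_eff = rng.normal(0, 0.03, size=n_centers)[center]
p = np.clip(0.30 + 0.10 * treat + 0.02 * x + center_eff, 0.01, 0.99)  # true RD = 0.10
y = rng.binomial(1, p)
df = pd.DataFrame(dict(y=y, treat=treat, x=x, center=center))

model = sm.GEE.from_formula(
    "y ~ treat + x", groups="center", data=df,
    family=sm.families.Binomial(link=sm.families.links.Identity()),
    cov_struct=sm.cov_struct.Exchangeable(),
)
res = model.fit()
print(res.params["treat"])            # adjusted risk difference
print(res.conf_int().loc["treat"])    # its 95% confidence interval
```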

  19. Clinical Risk Assessment in the Antiphospholipid Syndrome: Current Landscape and Emerging Biomarkers.

    Science.gov (United States)

    Chaturvedi, Shruti; McCrae, Keith R

    2017-07-01

    Laboratory criteria for the classification of antiphospholipid syndrome include the detection of a lupus anticoagulant and/or anticardiolipin and anti-β2-glycoprotein I antibodies. However, the majority of patients who test positive in these assays do not have thrombosis. Current risk-stratification tools are largely limited to the antiphospholipid antibody profile and traditional thrombotic risk factors. Novel biomarkers that correlate with disease activity and potentially provide insight into future clinical events include domain 1 specific anti-β2GPI antibodies, antibodies to other phospholipids or phospholipid/protein antigens (such as anti-PS/PT), and functional/biological assays such as thrombin generation, complement activation, levels of circulating microparticles, and annexin A5 resistance. Clinical risk scores may also have value in predicting clinical events. Biomarkers that predict thrombosis risk in patients with antiphospholipid antibodies have been long sought, and several biomarkers have been proposed. Ultimately, integration of biomarkers with established assays and clinical characteristics may offer the best chance of identifying patients at highest risk of APS-related complications.

  20. Transcranial direct current stimulation over prefrontal cortex diminishes degree of risk aversion.

    Science.gov (United States)

    Ye, Hang; Chen, Shu; Huang, Daqiang; Wang, Siqi; Jia, Yongmin; Luo, Jun

    2015-06-26

    Previous studies have established that transcranial direct current stimulation (tDCS) is a powerful technique for manipulating the activity of the human cerebral cortex. Many studies have found that weighing the risks and benefits in decision-making involves a complex neural network that includes the dorsolateral prefrontal cortex (DLPFC). We studied whether participants change the balance of risky and safe responses after receiving tDCS applied over the right and left prefrontal cortex. A total of 60 healthy volunteers performed a risk task while they received either anodal tDCS over the right prefrontal cortex, with cathodal over the left; anodal tDCS over the left prefrontal cortex, with cathodal over the right; or sham stimulation. The participants tended to choose less risky options after receiving sham stimulation, demonstrating that the task might be highly influenced by the "wealth effect". In contrast, there was no statistically significant change after either right anodal/left cathodal or left anodal/right cathodal tDCS, indicating that both types of tDCS affected the participants' degree of risk aversion and thereby counteracted the wealth effect. We also found gender differences in the participants' choices. These findings extend the notion that DLPFC activity is critical for risk decision-making. Application of tDCS to the right/left DLPFC may affect a person's attitude toward taking risks.

  1. See Project for Testing Gravity in Space Current Status and New Estimates

    CERN Document Server

    Alexeev, A D; Kolosnitsyn, N I; Konstantinov, M Yu; Melnikov, V N; Sanders, A J

    1999-01-01

    We describe some new estimates concerning the recently proposed SEE (Satellite Energy Exchange) experiment for measuring the gravitational interaction parameters in space. The experiment entails precision tracking of the relative motion of two test bodies (a heavy "Shepherd" and a light "Particle") on board a drag-free capsule. The new estimates include (i) the sensitivity of Particle trajectories and the G measurement to uncertainties in the Shepherd quadrupole moment; (ii) the measurement errors of G and of the strength of a putative Yukawa-type force whose range parameter $\lambda$ may be either of the order of a few metres or close to the Earth radius; and (iii) a possible effect of the Van Allen radiation belts on the SEE experiment due to test body electric charging.

  2. The issues of current rainfall estimation techniques in mountain natural multi-hazard investigation

    Science.gov (United States)

    Zhuo, Lu; Han, Dawei; Chen, Ningsheng; Wang, Tao

    2017-04-01

    Mountain hazards (e.g., landslides, debris flows, and floods) induced by rainfall are complex phenomena that require good knowledge of rainfall representation at different spatiotemporal scales. This study shows that rainfall estimation from gauges is rather unrepresentative over large spatial areas in mountain regions. As a result, the conventional practice of adopting a triggering threshold for hazard early warning purposes is insufficient, mainly because of the strong orographic influence on rainfall distribution. Modern rainfall estimation methods, such as numerical weather prediction modelling and remote sensing using radar from space or on land, are able to provide spatially more representative rainfall information in mountain areas. But unlike rain gauges, they only provide rainfall measurements indirectly. Remote sensing suffers from many sources of error, such as weather conditions, attenuation, and sampling methods, while numerical weather prediction models suffer from spatiotemporal and amplitude errors depending on the model physics, dynamics, and configuration. A case study based on Sichuan, China is used to illustrate the significant differences among the three aforementioned rainfall estimation methods. We argue that none of these methods can be relied on individually; the challenge is how to make full use of the three methods in combination, because each of them provides only partial information. We propose that a data fusion approach be adopted based on Bayesian inference. However, such an approach requires uncertainty information from all of these estimation techniques, which still needs extensive research. We hope this study will raise awareness of this important issue and highlight the knowledge gap that should be filled so that such a challenging problem can be tackled collectively by the community.
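
    The fusion approach proposed above is Bayesian; in the simplest case of independent Gaussian errors it reduces to inverse-variance weighting of the three estimates. The sketch below shows only that special case, with made-up rainfall values and error variances.

```python
# Minimal Gaussian data-fusion sketch in the spirit of the Bayesian approach the
# record proposes: combine gauge, radar and NWP rainfall estimates by
# inverse-variance weighting (the posterior mean when all errors are Gaussian
# and independent). The estimates and error variances below are illustrative.
import numpy as np

estimates = np.array([12.0, 18.0, 9.0])   # mm/day: gauge, radar, NWP (hypothetical)
variances = np.array([4.0, 16.0, 25.0])   # assumed error variance of each source

weights = (1.0 / variances) / np.sum(1.0 / variances)
fused_mean = np.sum(weights * estimates)
fused_var = 1.0 / np.sum(1.0 / variances)
print(f"fused rainfall = {fused_mean:.1f} mm/day (sd {fused_var ** 0.5:.1f})")
```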

  3. FACTORS OF RESISTANCE AGAINST AIR CURRENT ESTABLISHED BY MEASUREMENT AND FACTOR ESTIMATED BY LITERATURE

    Directory of Open Access Journals (Sweden)

    Vladimir Rendulić

    1996-12-01

    Full Text Available The specific resistance (R100) of underground mine rooms can be calculated using the constant (C). The constant (C) enables the resistance to be defined for any ratio of the dimensions F3/U in similar mine rooms. Considerable anomalies were found when the resistance factor calculated from measurement data was compared with the value estimated from the specialized literature on planning similar ventilation structures (the paper is published in Croatian).

  4. ESTIMATION OF CURRENT AND PERSPECTIVE ACCUMULATION LEVEL OF NATIONAL HUMAN CAPITAL IN RUSSIA

    Directory of Open Access Journals (Sweden)

    Баранова, Н.М.

    2016-10-01

    Full Text Available The article analyzes the main indicators of the knowledge economy in selected countries, which determine the economic condition of the state. In the "Concept 2020" long-term development strategy of Russia, a large role is assigned to human capital as a key factor of long-term sustainable economic growth. The article examines the relationship between an individual's level of education and the income obtained over the course of life, examines the main drivers of human capital development, and provides estimates of its accumulation.

  5. Risk estimation with epidemiologic data when response attenuates at high-exposure levels.

    Science.gov (United States)

    Steenland, Kyle; Seals, Ryan; Klein, Mitch; Jinot, Jennifer; Kahn, Henry D

    2011-06-01

    In occupational studies, which are commonly used for risk assessment for environmental settings, estimated exposure-response relationships often attenuate at high exposures. Relative risk (RR) models with transformed (e.g., log- or square root-transformed) exposures can provide a good fit to such data, but resulting exposure-response curves that are supralinear in the low-dose region may overestimate low-dose risks. Conversely, a model of untransformed (linear) exposure may underestimate risks attributable to exposures in the low-dose region. We examined several models, seeking simple parametric models that fit attenuating exposure-response data well. We have illustrated the use of both log-linear and linear RR models using cohort study data on breast cancer and exposure to ethylene oxide. Linear RR models fit the data better than do corresponding log-linear models. Among linear RR models, linear (untransformed), log-transformed, square root-transformed, linear-exponential, and two-piece linear exposure models all fit the data reasonably well. However, the slopes of the predicted exposure-response relations were very different in the low-exposure range, which resulted in different estimates of the exposure concentration associated with a 1% lifetime excess risk (0.0400, 0.00005, 0.0016, 0.0113, and 0.0100 ppm, respectively). The linear (in exposure) model underestimated the categorical exposure-response in the low-dose region, whereas log-transformed and square root-transformed exposure models overestimated it. Although a number of models may fit attenuating data well, models that assume linear or nearly linear exposure-response relations in the low-dose region of interest may be preferred by risk assessors, because they do not depend on the choice of a point of departure for linear low-dose extrapolation and are relatively easy to interpret.
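
    The low-dose sensitivity to the exposure transform can be illustrated with a toy calculation: linear RR models RR(x) = 1 + beta*f(x) with different f are calibrated to agree at a high exposure and then compared at a low exposure. The exposures and calibration point below are hypothetical, not the ethylene oxide estimates from the record.

```python
# Illustration of why the exposure transform matters at low doses: linear RR
# models RR(x) = 1 + beta * f(x) with f(x) = x, sqrt(x) or log(1 + x) are
# calibrated to give the same excess RR at a high exposure, then compared at a
# low exposure. All numbers are hypothetical.
import numpy as np

transforms = {
    "linear": lambda x: x,
    "sqrt":   np.sqrt,
    "log":    np.log1p,
}

x_high, err_high = 10.0, 0.50          # calibrate: all models give excess RR 0.5 at x = 10
x_low = 0.1                            # low-dose region of interest

for name, f in transforms.items():
    beta = err_high / f(x_high)        # slope chosen so RR(x_high) matches
    err_low = beta * f(x_low)
    print(f"{name:6s}: excess RR at x={x_low} is {err_low:.4f}")
```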

  6. Current asthma control predicts future risk of asthma exacerbation: a 12-month prospective cohort study

    Institute of Scientific and Technical Information of China (English)

    WEI Hua-hua; ZHOU Ting; WANG Lan; ZHANG Hong-ping; FU Juan-juan; WANG Lei; JI Yu-lin; WANG Gang

    2012-01-01

    Background The performance of the asthma control test (ACT) at baseline for predicting future risk of asthma exacerbation has not been previously demonstrated. This study was designed to explore the ability of the baseline ACT score to predict future risk of asthma exacerbation during a 12-month follow-up. Methods This post hoc analysis included data from a 12-month prospective cohort study in patients with asthma (n=290). The time to the first asthma exacerbation was analyzed, and the association between baseline ACT scores and future risk of asthma exacerbation was calculated as an adjusted odds ratio (OR) using logistic regression models. Further, sensitivity and specificity were estimated at each cut-point of the ACT score for predicting asthma exacerbations. Results The subjects were divided into three groups: uncontrolled (U, n=128), partly controlled (PC, n=111), and well controlled (C, n=51) asthma. After adjustment, decreased ACT scores at baseline in the U and PC groups were associated with an increased probability of asthma exacerbations (OR 3.65 and OR 5.75, respectively), unplanned visits (OR 8.03 and OR 8.21, respectively) and emergency visits (OR 20.00 and OR 22.60, respectively) over the 12-month follow-up period. The time to the first asthma exacerbation was shorter in the groups with U and PC asthma (all P<0.05). A baseline ACT score of 20, identified as the cut-point for screening patients at high risk of asthma exacerbations, had a high sensitivity of over 90.0% but a lower specificity of about 30.0%. Conclusion Our findings indicate that the baseline ACT score, with its high sensitivity, could rule out patients at low risk of asthma exacerbations and predict future risk of asthma exacerbations in clinical practice.

  7. The Prevalence of Age-Related Eye Diseases and Visual Impairment in Aging: Current Estimates

    Science.gov (United States)

    Klein, Ronald; Klein, Barbara E. K.

    2013-01-01

    Purpose. To examine prevalence of five age-related eye conditions (age-related cataract, AMD, open-angle glaucoma, diabetic retinopathy [DR], and visual impairment) in the United States. Methods. Review of published scientific articles and unpublished research findings. Results. Cataract, AMD, open-angle glaucoma, DR, and visual impairment prevalences are high in four different studies of these conditions, especially in people over 75 years of age. There are disparities among racial/ethnic groups with higher age-specific prevalence of DR, open-angle glaucoma, and visual impairment in Hispanics and blacks compared with whites, higher prevalence of age-related cataract in whites compared with blacks, and higher prevalence of late AMD in whites compared with Hispanics and blacks. The estimates are based on old data and do not reflect recent changes in the distribution of age and race/ethnicity in the United States population. There are no epidemiologic estimates of prevalence for many visually-impairing conditions. Conclusions. Ongoing prevalence surveys designed to provide reliable estimates of visual impairment, AMD, age-related cataract, open-angle glaucoma, and DR are needed. It is important to collect objective data on these and other conditions that affect vision and quality of life in order to plan for health care needs and identify areas for further research. PMID:24335069

  8. Using climate model simulations to assess the current climate risk to maize production

    Science.gov (United States)

    Kent, Chris; Pope, Edward; Thompson, Vikki; Lewis, Kirsty; Scaife, Adam A.; Dunstone, Nick

    2017-05-01

    The relationship between the climate and agricultural production is of considerable importance to global food security. However, there has been relatively little exploration of climate-variability related yield shocks. The short observational yield record does not adequately sample natural inter-annual variability thereby limiting the accuracy of probability assessments. Focusing on the United States and China, we present an innovative use of initialised ensemble climate simulations and a new agro-climatic indicator, to calculate the risk of severe water stress. Combined, these regions provide 60% of the world’s maize, and therefore, are crucial to global food security. To probe a greater range of inter-annual variability, the indicator is applied to 1400 simulations of the present day climate. The probability of severe water stress in the major maize producing regions is quantified, and in many regions an increased risk is found compared to calculations from observed historical data. Analysis suggests that the present day climate is also capable of producing unprecedented severe water stress conditions. Therefore, adaptation plans and policies based solely on observed events from the recent past may considerably under-estimate the true risk of climate-related maize shocks. The probability of a major impact event occurring simultaneously across both regions—a multi-breadbasket failure—is estimated to be up to 6% per decade and arises from a physically plausible climate state. This novel approach highlights the significance of climate impacts on crop production shocks and provides a platform for considerably improving food security assessments, in the present day or under a changing climate, as well as development of new risk based climate services.

  9. Estimating past hepatitis C infection risk from reported risk factor histories: implications for imputing age of infection and modeling fibrosis progression

    Directory of Open Access Journals (Sweden)

    Busch Michael P

    2007-12-01

    Full Text Available Abstract Background Chronic hepatitis C virus infection is prevalent and often causes hepatic fibrosis, which can progress to cirrhosis and cause liver cancer or liver failure. Study of fibrosis progression often relies on imputing the time of infection, often as the reported age of first injection drug use. We sought to examine the accuracy of such imputation and implications for modeling factors that influence progression rates. Methods We analyzed cross-sectional data on hepatitis C antibody status and reported risk factor histories from two large studies, the Women's Interagency HIV Study and the Urban Health Study, using modern survival analysis methods for current status data to model past infection risk year by year. We compared fitted distributions of past infection risk to reported age of first injection drug use. Results Although injection drug use appeared to be a very strong risk factor, models for both studies showed that many subjects had considerable probability of having been infected substantially before or after their reported age of first injection drug use. Persons reporting younger age of first injection drug use were more likely to have been infected after, and persons reporting older age of first injection drug use were more likely to have been infected before. Conclusion In cross-sectional studies of fibrosis progression where date of HCV infection is estimated from risk factor histories, modern methods such as multiple imputation should be used to account for the substantial uncertainty about when infection occurred. The models presented here can provide the inputs needed by such methods. Using reported age of first injection drug use as the time of infection in studies of fibrosis progression is likely to produce a spuriously strong association of younger age of infection with slower rate of progression.

  10. Temperature estimation in a ferromagnetic Fe-Ni nanowire involving a current-driven domain wall motion.

    Science.gov (United States)

    Yamaguchi, A; Hirohata, A; Ono, T; Miyajima, H

    2012-01-18

    We observed magnetic domain wall (DW) motion induced by spin-polarized pulsed currents in a nanoscale Fe(19)Ni(81) wire using a magnetic force microscope. A high current density, on the order of 10(11) A m(-2), was required for DW motion. A simple method to estimate the temperature of the wire was developed by comparing the wire resistance measured during DW motion with the temperature dependence of the wire resistance. Using this method, we found that the temperature of the wire was proportional to the square of the current density and, at the threshold current density, reached just below the Curie temperature. Our experimental data qualitatively support an analytical model in which the temperature is proportional to the resistivity, thickness, and width of the wire and to the square of the current density, and inversely proportional to the thermal conductivity.
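
    The temperature-estimation method described above amounts to inverting a resistance-versus-temperature calibration curve. A minimal sketch, with placeholder calibration points and a placeholder measured resistance, is given below.

```python
# Sketch of the temperature-estimation method described in the record: measure
# the wire resistance during the current pulse and read the temperature off a
# separately measured R(T) calibration curve. The calibration points and the
# measured resistance below are illustrative placeholders.
import numpy as np

T_cal = np.array([300.0, 400.0, 500.0, 600.0, 700.0])     # K, calibration temperatures
R_cal = np.array([100.0, 118.0, 135.0, 151.0, 166.0])     # ohm, resistance at each T

def temperature_from_resistance(r_measured: float) -> float:
    # R(T) is monotonic for a metallic wire, so inverse interpolation is valid
    return float(np.interp(r_measured, R_cal, T_cal))

r_during_pulse = 142.0                                     # ohm, during the DW-motion pulse
print(f"estimated wire temperature: {temperature_from_resistance(r_during_pulse):.0f} K")
```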

  11. Estimating risk factors of urban malaria in Blantyre, Malawi:A spatial regression analysis

    Institute of Scientific and Technical Information of China (English)

    Lawrence N Kazembe; Don P Mathanga

    2016-01-01

    Objective: To estimate risk factors of urban malaria in Blantyre, Malawi, with the goal of understanding the epidemiology and ecology of the disease, and informing malaria elimination policies for African urban cities that have markedly low prevalence of malaria. Methods: We used a case-control study design, with cases being children under the age of five years diagnosed with malaria, and matched controls obtained at hospitals and in communities. The data were obtained from the Ndirande health facility catchment area. We then fitted a multivariate spatial logistic model of malaria risk. Covariates and risk factors in the model included child-specific, household, and environmental risk factors (nearness to a garden, standing water, a river, and swamps). The spatial component was assumed to follow a Gaussian process, and the model was fitted using Bayesian inference. Results: Our findings showed that children who visited rural areas were 6 times more likely to have malaria than those who did not [odds ratio (OR)=6.66, 95% confidence interval (CI): 4.79–9.61]. The risk of malaria increased with the age of the child (OR=1.01, 95% CI: 1.003–1.020), but was reduced with high socio-economic status compared to lower status (OR=0.39, 95% CI: 0.25–0.54 for the highest level and OR=0.67, 95% CI: 0.47–0.94 for the medium level). Although nearness to a garden, river, and standing water showed increased risk, these effects were not significant. Furthermore, significant spatial clusters of risk emerged, which suggests that factors other than those established above also explain variability in malaria risk. Conclusions: As malaria in urban areas is highly fuelled by rural-urban migration, emphasis should be placed on optimizing information, education, and communication prevention strategies, particularly targeting children from lower socio-economic positions.

  13. Local land-use change based risk estimation for future glacier lake outburst flood

    Directory of Open Access Journals (Sweden)

    S. Nussbaumer

    2013-08-01

    Full Text Available Effects of climate change are particularly strong in high-mountain regions. Most visibly, glaciers are shrinking at a rapid pace, and as a consequence, glacier lakes are forming or growing. At the same time the stability of mountain slopes is reduced by glacier retreat, permafrost thaw and other factors, resulting in an increasing risk of landslides which can potentially impact lakes and therewith trigger far-reaching and devastating outburst floods. To manage risks from existing or future lakes, strategies need to be developed to plan in time for adequate risk reduction measures at a local level. However, methods to assess risks from future lake outbursts are not available. It is actually a challenge to develop methods to evaluate both future hazard potential and future damage potential. Here we present an analysis of future risks related to glacier lake outbursts for a local site in southern Switzerland (Naters, Valais). To estimate two hazard scenarios, we used glacier shrinkage and lake formation modelling, simple flood modelling and field work. Further, we developed a land-use model to quantify and allocate land-use changes based on local-to-regional storylines and three scenarios of land-use driving forces. Results are conceptualized in a matrix of three land-use and two hazard scenarios for a time period of 2045, and show the distribution of risk in the community of Naters, including high and very high risk areas. The study corroborates the importance of land-use planning to effectively reduce future risks related to lake outburst floods.

  14. Local land-use change based risk estimation for future glacier lake outburst flood

    Science.gov (United States)

    Nussbaumer, S.; Huggel, C.; Schaub, Y.; Walz, A.

    2013-08-01

    Effects of climate change are particularly strong in high-mountain regions. Most visibly, glaciers are shrinking at a rapid pace, and as a consequence, glacier lakes are forming or growing. At the same time the stability of mountain slopes is reduced by glacier retreat, permafrost thaw and other factors, resulting in an increasing risk of landslides which can potentially impact lakes and therewith trigger far-reaching and devastating outburst floods. To manage risks from existing or future lakes, strategies need to be developed to plan in time for adequate risk reduction measures at a local level. However, methods to assess risks from future lake outbursts are not available. It is actually a challenge to develop methods to evaluate both future hazard potential and future damage potential. Here we present an analysis of future risks related to glacier lake outbursts for a local site in southern Switzerland (Naters, Valais). To estimate two hazard scenarios, we used glacier shrinkage and lake formation modelling, simple flood modelling and field work. Further, we developed a land-use model to quantify and allocate land-use changes based on local-to-regional storylines and three scenarios of land-use driving forces. Results are conceptualized in a matrix of three land-use and two hazard scenarios for a time period of 2045, and show the distribution of risk in the community of Naters, including high and very high risk areas. The study corroborates the importance of land-use planning to effectively reduce future risks related to lake outburst floods.

  15. Risk estimation for future glacier lake outburst floods based on local land-use changes

    Science.gov (United States)

    Nussbaumer, S.; Schaub, Y.; Huggel, C.; Walz, A.

    2014-06-01

    Effects of climate change are particularly strong in high-mountain regions. Most visibly, glaciers are shrinking at a rapid pace, and as a consequence, glacier lakes are forming or growing. At the same time the stability of mountain slopes is reduced by glacier retreat, permafrost thaw and other factors, resulting in an increasing landslide hazard which can potentially impact lakes and therewith trigger far-reaching and devastating outburst floods. To manage risks from existing or future lakes, strategies need to be developed to plan in time for adequate risk reduction measures at a local level. However, methods to assess risks from future lake outbursts are not available and need to be developed to evaluate both future hazard and future damage potential. Here a method is presented to estimate future risks related to glacier lake outbursts for a local site in southern Switzerland (Naters, Valais). To generate two hazard scenarios, glacier shrinkage and lake formation modelling was applied, combined with simple flood modelling and field work. Furthermore, a land-use model was developed to quantify and allocate land-use changes based on local-to-regional storylines and three scenarios of land-use driving forces. Results are conceptualized in a matrix of three land-use and two hazard scenarios for the year 2045, and show the distribution of risk in the community of Naters, including high and very high risk areas. The study underlines the importance of combined risk management strategies focusing on land-use planning, on vulnerability reduction, as well as on structural measures (where necessary) to effectively reduce future risks related to lake outburst floods.

  16. Colorectal cancer screening in high-risk groups is increasing, although current smokers fall behind.

    Science.gov (United States)

    Oluyemi, Aminat O; Welch, Amy R; Yoo, Lisa J; Lehman, Erik B; McGarrity, Thomas J; Chuang, Cynthia H

    2014-07-15

    There is limited information about colorectal cancer (CRC) screening trends in high-risk groups, including the black, obese, diabetic, and smoking populations. For this study, the authors evaluated national CRC screening trends in these high-risk groups to provide insights into whether screening resources are being appropriately used. This was a nationally representative, population-based study using the Behavioral Risk Factor Surveillance System from the Centers for Disease Control. Data analysis was performed using bivariate analyses with weighted logistic regression. In the general population, CRC screening increased significantly from 59% to 65% during the years 2006 to 2010. The screening prevalence in non-Hispanic blacks was 58% in 2006 and 65% in 2010. Among obese individuals, the prevalence of up-to-date CRC screening increased significantly from 59% in 2006 to 66% in 2010. Screening prevalence in individuals with diabetes was 63% in 2006 and 69% in 2010. The CRC screening prevalence in current smokers was 45% in 2006 and 50% in 2010. The odds of CRC screening in the non-Hispanic black population, the obese population, and the diabetic population were higher than in non-Hispanic whites, normal weight individuals, and the population without diabetes, respectively. Current smokers had significantly lower odds of CRC screening than never-smokers in the years studied (2006: odds ratio [OR], 0.71; 95% confidence interval [CI], 0.66-0.76; 2008: OR, 0.67; 95% CI, 0.63-0.71; 2010: OR, 0.69; 95% CI, 0.66-0.73). The prevalence of CRC screening in high-risk groups is trending upward. Despite this, current smokers have significantly lower odds of CRC screening compared with the general population. © 2014 American Cancer Society.

  17. Estimating the budget impact of new technologies added to the National List of Health Services in Israel: stakeholders' incentives for adopting a financial risk-sharing mechanism.

    Science.gov (United States)

    Hammerman, Ariel; Greenberg, Dan

    2009-01-01

    The Israeli National List of Health Services (NLHS) is updated annually according to a government-allocated budget. The estimated annual cost of each new technology added to this list is based on budget-impact estimations provided by the HMOs and the manufacturers. The HMOs argue that once a new technology is reimbursed, extensive marketing efforts by industry expand demand and render the allocated budget insufficient. Industry claims that HMOs, in order to secure a sufficient budget, tend to overestimate the number of target patients. We provide a framework for a financial risk-sharing mechanism between HMOs and the industry, which may be able to balance these incentives and result in more accurate early budget-impact estimates. Our aims were to explore the current stakeholders' incentives and behaviors under the existing process of updating the NLHS, and to examine the possible incentives for adopting a financial risk-sharing mechanism for early budget-impact estimations. According to the financial risk-sharing mechanism, HMOs will be partially compensated by the industry if actual use of a technology is substantially higher than what was projected. HMOs will partially refund the government for a budget that was not fully used. To maintain profits, we assume that the industry will present a more realistic budget-impact analysis. HMOs will be less apprehensive of technology promotion, as they would be compensated in case of budget under-estimation. In the case of over-estimation of technology use, the re-allocated budget will be used to enlarge the NLHS, which is in the best interest of the health technology industry. Our proposed risk-sharing mechanism is expected to counterbalance incentives and disincentives that currently exist in adopting new health technologies in the Israeli healthcare system.
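
    The settlement rule sketched below is one hypothetical way to implement the compensation/refund mechanism described in the record; the sharing fractions and budget figures are illustrative assumptions, not values from the Israeli system.

```python
# Hypothetical settlement rule for the risk-sharing mechanism sketched in the
# record: industry compensates the HMOs for part of any overrun beyond the
# projected budget, and HMOs refund the government part of any unused budget.
# The sharing fractions and amounts are illustrative assumptions only.

def settle(projected: float, actual: float,
           industry_share: float = 0.5, refund_share: float = 0.5):
    industry_to_hmo = industry_share * max(actual - projected, 0.0)
    hmo_to_government = refund_share * max(projected - actual, 0.0)
    return industry_to_hmo, hmo_to_government

for actual in (8.0, 12.0):                    # hypothetical actual spend (currency units)
    comp, refund = settle(projected=10.0, actual=actual)
    print(f"actual={actual}: industry pays HMO {comp:.1f}, HMO refunds {refund:.1f}")
```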

  18. Are current UK National Institute for Health and Clinical Excellence (NICE) obesity risk guidelines useful? Cross-sectional associations with cardiovascular disease risk factors in a large, representative English population.

    Science.gov (United States)

    Tabassum, Faiza; Batty, G David

    2013-01-01

    The National Institute for Health and Clinical Excellence (NICE) has recently released obesity guidelines for health risk. For the first time in the UK, we estimate the utility of these guidelines by relating them to the established cardiovascular disease (CVD) risk factors. The Health Survey for England (HSE) 2006, a population-based cross-sectional study in England, was used with a sample size of 7225 men and women aged ≥35 years (age range: 35-97 years). The following CVD risk factor outcomes were used: hypertension, diabetes, total and high density lipoprotein cholesterol, glycated haemoglobin, fibrinogen, C-reactive protein and Framingham risk score. Four NICE categories of obesity were created based on body mass index (BMI) and waist circumference (WC): no risk (up to normal BMI and low/high WC); increased risk (normal BMI & very high WC, or obese & low WC); high risk (overweight & very high WC, or obese & high WC); and very high risk (obese I & very high WC, or obese II/III with any level of WC). Men and women in the very high risk category had the highest odds ratios (OR) of having unfavourable CVD risk factors compared to those in the no risk category. For example, the OR of having hypertension for those in the very high risk category of the NICE obesity groupings was 2.57 (95% confidence interval 2.06 to 3.21) in men, and 2.15 (1.75 to 2.64) in women. Moreover, a dose-response association between the adiposity groups and most of the CVD risk factors was observed except total cholesterol in men and low HDL in women. Similar results were apparent when the Framingham risk score was the outcome of interest. In conclusion, the current NICE definitions of obesity show utility for a range of CVD risk factors and CVD risk in both men and women.

  19. Biokinetic and dosimetric modelling in the estimation of radiation risks from internal emitters.

    Science.gov (United States)

    Harrison, John

    2009-06-01

    The International Commission on Radiological Protection (ICRP) has developed biokinetic and dosimetric models that enable the calculation of organ and tissue doses for a wide range of radionuclides. These are used to calculate equivalent and effective dose coefficients (dose in Sv Bq(-1) intake), considering occupational and environmental exposures. Dose coefficients have also been given for a range of radiopharmaceuticals used in diagnostic medicine. Using equivalent and effective dose, exposures from external sources and from different radionuclides can be summed for comparison with dose limits, constraints and reference levels that relate to risks from whole-body radiation exposure. Risk estimates are derived largely from follow-up studies of the survivors of the atomic bombings at Hiroshima and Nagasaki in 1945. New dose coefficients will be required following the publication in 2007 of new ICRP recommendations. ICRP biokinetic and dosimetric models are subject to continuing review and improvement, although it is arguable that the degree of sophistication of some of the most recent models is greater than required for the calculation of effective dose to a reference person for the purposes of regulatory control. However, the models are also used in the calculation of best estimates of doses and risks to individuals, in epidemiological studies and to determine probability of cancer causation. Models are then adjusted to best fit the characteristics of the individuals and population under consideration. For example, doses resulting from massive discharges of strontium-90 and other radionuclides to the Techa River from the Russian Mayak plutonium plant in the early years of its operation are being estimated using models adapted to take account of measurements on local residents and other population-specific data. Best estimates of doses to haemopoietic bone marrow, in utero and postnatally, are being used in epidemiological studies of radiation-induced leukaemia

  20. On the estimation of population-specific synaptic currents from laminar multielectrode recordings

    Directory of Open Access Journals (Sweden)

    Sergey L Gratiy

    2011-12-01

    Full Text Available Multielectrode array recordings of extracellular electrical field potentials along the depth axis of the cerebral cortex are an up-and-coming approach for investigating the activity of cortical neuronal circuits. The low-frequency band of the extracellular potential, i.e., the local field potential (LFP), is assumed to reflect synaptic activity and can be used to extract the current source density (CSD) profile. However, physiological interpretation of CSD profiles is uncertain because the analysis does not disambiguate synaptic inputs from passive return currents. Here we present a novel mathematical framework for identifying excited neuronal populations and for separating synaptic input currents from return currents based on LFP recordings. This involves a combination of the linear forward model, which predicts the population-specific laminar LFP in response to sinusoidal synaptic inputs applied at different locations along population cells with realistic morphologies, and the linear inverse model, which reconstructs laminar profiles of synaptic inputs from the Fourier spectrum of the laminar LFP data based on the forward prediction. The model allows reconstruction of synaptic input profiles on a spatial scale comparable to the known anatomical organization of synaptic projections within a cortical column. Assuming spatial correlation of synaptic inputs within individual populations, the model decomposes the columnar LFP into population-specific contributions. Constraining the solution with a priori knowledge of the spatial distribution of synaptic connectivity further allows prediction of active projections from the composite LFP profile. This modeling framework successfully delineates the main relationships between the synaptic input currents and the evoked LFP and can serve as a foundation for modeling more realistic processing of active dendritic conductances.
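
    The record builds on the conventional current source density (CSD) analysis mentioned at the start of the abstract. The sketch below shows only that conventional step, the second spatial derivative of the laminar LFP, not the paper's population-specific forward/inverse framework; conductivity, contact spacing and LFP values are assumed.

```python
# Standard current-source-density estimate from a laminar LFP profile (the
# conventional step the record builds on, not its population-specific inverse
# model): CSD(z) ~= -sigma * d2(phi)/dz2, approximated by a second difference
# across contacts. Conductivity, contact spacing and LFP values are illustrative.
import numpy as np

sigma = 0.3                 # S/m, assumed extracellular conductivity
dz = 100e-6                 # m, electrode contact spacing
phi = np.array([0.0, -5.0, -12.0, -9.0, -3.0, 1.0, 2.0]) * 1e-6   # V, toy LFP profile

csd = -sigma * (phi[2:] - 2.0 * phi[1:-1] + phi[:-2]) / dz ** 2    # A/m^3 at inner contacts
print(np.round(csd, 1))
```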

  1. Avian mortalities due to transmission line collisions: a review of current estimates and field methods with an emphasis on applications to the Canadian electric network

    Directory of Open Access Journals (Sweden)

    Sébastien Rioux

    2013-12-01

    Full Text Available Birds are vulnerable to collisions with human-made fixed structures. Despite ongoing development and increases in infrastructure, we have few estimates of the magnitude of collision mortality. We reviewed the existing literature on avian mortality associated with transmission lines and derived an initial estimate for Canada. Estimating mortality from collisions with power lines is challenging due to the lack of studies, especially from sites within Canada, and due to uncertainty about the magnitude of detection biases. Detection of bird collisions with transmission lines varies due to habitat type, species size, and scavenging rates. In addition, birds can be crippled by the impact and subsequently die, although crippling rates are poorly known and rarely incorporated into estimates. We used existing data to derive a range of estimates of avian mortality associated with collisions with transmission lines in Canada by incorporating detection, scavenging, and crippling biases. There are 231,966 km of transmission lines across Canada, mostly in the boreal forest. Mortality estimates ranged from 1 million to 229.5 million birds per year, depending on the bias corrections applied. We consider our most realistic estimate, taking into account variation in risk across Canada, to range from 2.5 million to 25.6 million birds killed per year. Data from multiple studies across Canada and the northern U.S. indicate that the most vulnerable bird groups are (1) waterfowl, (2) grebes, (3) shorebirds, and (4) cranes, which is consistent with other studies. Populations of several groups that are vulnerable to collisions are increasing across Canada (e.g., waterfowl, raptors), which suggests that collision mortality, at current levels, is not limiting population growth. However, there may be impacts on other declining species, such as shorebirds and some species at risk, including Alberta's Trumpeter Swans (Cygnus buccinator) and western Canada's endangered Whooping
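
    The bias-corrected estimation described above can be sketched as scaling carcass counts by detection and carcass-persistence probabilities, adding a crippling adjustment, and multiplying by the total line length. The correction factors and carcass rate below are hypothetical; only the 231,966 km of line comes from the record.

```python
# Sketch of the bias-corrected mortality calculation described in the record:
# scale carcasses found per km by detection and scavenging (carcass persistence)
# probabilities, add a crippling adjustment, and multiply by total line length.
# All rates and correction factors below are hypothetical placeholders.

def corrected_mortality(carcasses_per_km: float, km_of_line: float,
                        p_detect: float, p_persist: float,
                        crippling_rate: float) -> float:
    per_km = carcasses_per_km / (p_detect * p_persist)   # correct for missed carcasses
    per_km *= (1.0 + crippling_rate)                     # birds that died away from the line
    return per_km * km_of_line

est = corrected_mortality(carcasses_per_km=0.5, km_of_line=231_966,
                          p_detect=0.6, p_persist=0.7, crippling_rate=0.2)
print(f"estimated annual mortality: {est:,.0f} birds")
```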

  2. Evaluation of an algorithm for estimating a patient's life threat risk from an ambulance call

    Directory of Open Access Journals (Sweden)

    Moriwaki Yoshihiro

    2009-10-01

    Full Text Available Abstract Background Utilizing a computer algorithm, information from calls to an ambulance service was used to calculate the risk of patients being in a life-threatening condition (life threat risk) at the time of the call. If the estimated life threat risk was higher than 10%, the probability that a patient faced a risk of dying was recognized as very high and categorized as category A+. The present study aimed to review the accuracy of the algorithm. Methods Data collected for six months from the Yokohama new emergency system was used. In the system, emergency call workers interviewed ambulance callers to obtain information necessary to assess triage, which included consciousness level, breathing status, walking ability, position, and complexion. An emergency patient's life threat risk was then estimated by a computer algorithm applying logistic models. This study compared the estimated life threat risk occurring at the time of the emergency call to the patients' state or severity of condition, i.e. death confirmed at the scene by ambulance crews, resulted in death at emergency departments, life-threatening condition with occurrence of cardiac and/or pulmonary arrest (CPA), life-threatening condition without CPA, serious but not life-threatening condition, moderate condition, and mild condition. The sensitivity, specificity, predictive values, and likelihood ratios of the algorithm for categorizing A+ were calculated. Results The number of emergency dispatches over the six months was 73,992. Triage assessment was conducted for 68,692 of these calls. The study targets account for 88.8% of patients who were involved in triage calls. There were 2,349 cases where the patient had died or had suffered CPA. The sensitivity, specificity, positive predictive value, negative predictive value, positive likelihood ratio and negative likelihood ratio of the algorithm at predicting cases that would result in a death or CPA were 80.2% (95% confidence interval
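
    The evaluation metrics reported above follow directly from a 2x2 table of category A+ against death/CPA. The sketch below computes them from hypothetical cell counts, not the Yokohama data.

```python
# The record evaluates category A+ with sensitivity, specificity, predictive
# values and likelihood ratios; this sketch computes them from a 2x2 table.
# The cell counts below are hypothetical placeholders.

def triage_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "lr_positive": sens / (1.0 - spec),
        "lr_negative": (1.0 - sens) / spec,
    }

for name, value in triage_metrics(tp=80, fp=300, fn=20, tn=600).items():
    print(f"{name}: {value:.3f}")
```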

  3. The relevance of the early history of probability theory to current risk assessment practices in mental health care.

    Science.gov (United States)

    Large, Matthew

    2013-12-01

    Probability theory is at the base of modern concepts of risk assessment in mental health. The aim of the current paper is to review the key developments in the early history of probability theory in order to enrich our understanding of current risk assessment practices.

  4. 78 FR 48636 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Science.gov (United States)

    2013-08-09

    ..., and 211 RIN 0910-AG36 Current Good Manufacturing Practice and Hazard Analysis and Risk- Based... Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food... period. These two proposals are related to the proposed rule ``Current Good Manufacturing Practice...

  5. Sensitivity Analysis of Median Lifetime on Radiation Risks Estimates for Cancer and Circulatory Disease amongst Never-Smokers

    Science.gov (United States)

    Chappell, Lori J.; Cucinotta, Francis A.

    2011-01-01

    Radiation risks are estimated in a competing-risk formalism where age- or time-after-exposure estimates of increased risks for cancer and circulatory diseases are folded with the probability of surviving to a given age. The survival function, also called the life-table, changes with calendar year, gender, smoking status and other demographic variables. An outstanding problem in risk estimation is the method of risk transfer between exposed populations and a second population where risks are to be estimated. Approaches used to transfer risks are based on: 1) multiplicative risk transfer models, in which excess risks are proportional to background disease rates, and 2) additive risk transfer models, in which excess risks are independent of background rates. In addition, a Mixture model is often considered where the multiplicative and additive transfer assumptions are given weighted contributions. We studied the influence of the survival probability on the risk of exposure-induced cancer and circulatory disease morbidity and mortality in the Multiplicative transfer model and the Mixture model. Risks for never-smokers (NS) compared to the average U.S. population are estimated to be reduced by between 30% and 60%, depending on model assumptions. Lung cancer is the major contributor to the reduction for NS, with additional contributions from circulatory diseases and cancers of the stomach, liver, bladder, oral cavity, esophagus, colon, a portion of the solid cancer remainder, and leukemia. Greater improvements in risk estimates for NS are possible, and would depend on an improved understanding of risk transfer models and on elucidating the role of space radiation in the various stages of disease formation (e.g., initiation, promotion, and progression).
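
    The transfer models contrasted above can be written in one line each: multiplicative transfer scales an excess relative risk by the target population's background rate, additive transfer carries an excess absolute rate over unchanged, and the mixture model weights the two. The sketch below uses illustrative rates and an illustrative weight, not the values from the study.

```python
# Sketch of the three risk-transfer assumptions discussed in the record:
# multiplicative transfer scales an excess relative risk (ERR) by the target
# population's background rate, additive transfer carries the excess absolute
# rate (EAR) over unchanged, and the mixture model weights the two. The ERR,
# EAR, background rates and weight below are illustrative assumptions.

def transferred_excess_rate(err: float, ear: float, background_target: float,
                            weight_multiplicative: float) -> float:
    multiplicative = err * background_target        # excess cases per person-year
    additive = ear
    w = weight_multiplicative
    return w * multiplicative + (1.0 - w) * additive

# e.g. never-smokers have a much lower background lung-cancer rate than the
# average population, which lowers the multiplicative component:
for label, background in (("average population", 6e-4), ("never-smokers", 1.5e-4)):
    rate = transferred_excess_rate(err=0.5, ear=2e-4, background_target=background,
                                   weight_multiplicative=0.7)
    print(f"{label}: transferred excess rate = {rate:.2e} per person-year")
```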

  6. Estimating Turbulent Surface Fluxes from Small Unmanned Aircraft: Evaluation of Current Abilities

    Science.gov (United States)

    de Boer, G.; Lawrence, D.; Elston, J.; Cassano, J. J.; Mack, J.; Wildmann, N.; Nigro, M. A.; Ivey, M.; Wolfe, D. E.; Muschinski, A.

    2014-12-01

    Heat transfer between the atmosphere and Earth's surface represents a key component to understanding Earth energy balance, making it important in understanding and simulating climate. Arguably, the oceanic air-sea interface and Polar sea-ice-air interface are amongst the most challenging in which to measure these fluxes. This difficulty results partially from challenges associated with infrastructure deployment on these surfaces and partially from an inability to obtain spatially representative values over a potentially inhomogeneous surface. Traditionally, sensible (temperature) and latent (moisture) fluxes are estimated using one of several techniques. A preferred method involves eddy-correlation where cross-correlation between anomalies in vertical motion (w) and temperature (T) or moisture (q) is used to estimate heat transfer. High-frequency measurements of these quantities can be derived using tower-mounted instrumentation. Such systems have historically been deployed over land surfaces or on ships and buoys to calculate fluxes at the air-land or air-sea interface, but such deployments are expensive and challenging to execute, resulting in a lack of spatially diverse measurements. A second ("bulk") technique involves the observation of horizontal windspeed, temperature and moisture at a given altitude over an extended time period in order to estimate the surface fluxes. Small Unmanned Aircraft Systems (sUAS) represent a unique platform from which to derive these fluxes. These sUAS can be small (~1 m), lightweight (~700 g), low cost (~$2000) and relatively easy to deploy to remote locations and over inhomogeneous surfaces. We will give an overview of the ability of sUAS to provide measurements necessary for estimating surface turbulent fluxes. This discussion is based on flights in the vicinity of the 1000 ft. Boulder Atmospheric Observatory (BAO) tower, and over the US Department of Energy facility at Oliktok Point, Alaska. We will present initial comparisons
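
    The eddy-correlation method described above estimates sensible heat flux as H = rho * cp * mean(w'T'). The sketch below applies that formula to a synthetic 10 Hz series standing in for sUAS or tower data; air density and heat capacity are the usual near-surface values.

```python
# Eddy-correlation sensible heat flux as described in the record:
# H = rho * cp * mean(w' * T'), with primes denoting deviations from the mean
# over the averaging window. The synthetic 10 Hz series below stand in for
# sUAS or tower measurements.
import numpy as np

rng = np.random.default_rng(1)
n = 10 * 60 * 30                              # 30 min of 10 Hz samples
w = 0.4 * rng.standard_normal(n)              # m/s, vertical wind fluctuations
T = 293.0 + 0.3 * rng.standard_normal(n) + 0.2 * (w / 0.4)   # K, correlated with w

rho, cp = 1.2, 1005.0                         # kg/m^3, J/(kg K): near-surface air
w_prime = w - w.mean()
T_prime = T - T.mean()
H = rho * cp * np.mean(w_prime * T_prime)     # W/m^2, sensible heat flux
print(f"sensible heat flux: {H:.1f} W/m^2")
```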

  7. A Study on Maneuvering Obstacle Motion State Estimation for Intelligent Vehicle Using Adaptive Kalman Filter Based on Current Statistical Model

    Directory of Open Access Journals (Sweden)

    Bao Han

    2015-01-01

    Full Text Available Obstacle motion state estimation is an essential task for intelligent vehicles. The ASCL group has developed such a system using radar and GPS/INS. When running on the road, the acceleration of the vehicle is always changing, so it is hard for the constant velocity (CV) model and the constant acceleration (CA) model to describe the motion state of the vehicle. This paper introduces the Current Statistical (CS) model from the military field, which uses a modified Rayleigh distribution to describe acceleration. An adaptive Kalman filter based on the CS model was used to estimate the motion state of the target. We conducted simulation experiments and real vehicle tests, and the results showed that the estimation of position, velocity, and acceleration can be precise.
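
    As context for the record's adaptive filter, the sketch below implements only a plain one-dimensional constant-acceleration Kalman filter; the Current Statistical model additionally adapts the acceleration statistics online, which is not reproduced. All noise settings and measurements are illustrative assumptions.

```python
# Baseline for the record's tracker: a 1-D constant-acceleration Kalman filter
# (state = position, velocity, acceleration). The Current Statistical model in
# the record additionally adapts the acceleration statistics online, which is
# not reproduced here. All noise levels and measurements are illustrative.
import numpy as np

dt = 0.1
F = np.array([[1, dt, 0.5 * dt**2],
              [0, 1,  dt],
              [0, 0,  1]])                  # state transition (CA model)
H = np.array([[1.0, 0.0, 0.0]])             # only position is measured (e.g. radar range)
Q = 0.5 * np.eye(3)                         # process noise (assumed)
R = np.array([[4.0]])                       # measurement noise (assumed)

x = np.zeros((3, 1))                        # initial state
P = 10.0 * np.eye(3)                        # initial covariance

def kf_step(x, P, z):
    # predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # update
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(3) - K @ H) @ P_pred
    return x_new, P_new

for z in [1.0, 2.3, 3.9, 6.1, 8.6]:         # toy position measurements
    x, P = kf_step(x, P, np.array([[z]]))
print("estimated [pos, vel, acc]:", x.ravel().round(2))
```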

  8. Estimation of Esfarayen Farmers Risk Aversion Coefficient and Its Influencing Factors (Nonparametric Approach)

    Directory of Open Access Journals (Sweden)

    Z. Nematollahi

    2016-03-01

    Full Text Available Introduction: Due to the existence of risk and uncertainty in agriculture, risk management is crucial for farm management. The present study was therefore designed to determine the risk aversion coefficient of Esfarayen farmers. Materials and Methods: The following approaches have been used to assess risk attitudes: (1) direct elicitation of utility functions, (2) experimental procedures in which individuals are presented with hypothetical questionnaires regarding risky alternatives with or without real payments, and (3) inference from observation of economic behavior. In this paper, we focused on approach (3), inference from observation of economic behavior, based on the assumption that a relationship exists between the actual behavior of a decision maker and the behavior predicted from empirically specified models. A new non-parametric method and the QP method were used to calculate the coefficient of risk aversion. We maximized the decision maker's expected utility with the E-V formulation (Freund, 1956). Ideally, in constructing a QP model, the variance-covariance matrix should be formed for each individual farmer. For this purpose, a sample of 100 farmers was selected using random sampling and their data on 14 products for the years 2008-2012 were assembled. The lowlands of Esfarayen were used since, within this area, production possibilities are rather homogeneous. Results and Discussion: The results of this study showed that there was low correlation between some of the activities, which implies opportunities for income stabilization through diversification. With respect to transitory income, Ra varies from 0.000006 to 0.000361, and the absolute coefficient of risk aversion in our sample was 0.00005. The estimated Ra values vary considerably from farm to farm. The results showed that the estimated Ra for the subsample consisting of 'non-wealthy' farmers was 0.00010. The subsample with farmers in the 'wealthy' group had an

  9. Epidemiology of Hepatitis C Infection in Pakistan: Current Estimate and Major Risk Factors.

    Science.gov (United States)

    Arshad, Aiman; Ashfaq, Usman Ali

    2017-01-01

    In Pakistan, hepatitis C virus (HCV) is a major healthcare problem, with acute and chronic infections responsible for liver damage, cirrhosis, and hepatocellular carcinoma. Under the Human Development Index of the United Nations, Pakistan is ranked 134th of 174 countries due to its poor educational and health standards. This study was designed to examine HCV and its genotype prevalence in different cities and provinces of Pakistan and describe the major routes of HCV transmission. Literature searches were performed in PubMed, Mendeley, and Google Scholar. Ninety different studies published between 2000 and 2013 were screened for this review. By averaging across all studies, it was clear that the HCV prevalence was 11.55% in the adult population, 10.10% in blood donors, 4.65% in pregnant women, 1.6% in children, and 24.97% in patients with different diseases, while injecting drug users had the highest prevalence at 51.0%. HCV genotype 3a prevalence was found to be 63.45%, the highest of all genotypes. The prevalence of HCV found for the provinces was Punjab: 5.46%, Sindh: 2.55%, Khyber Pakhtunkhwa: 6.07%, Balochistan: 25.77%, and federally administered tribal areas: 3.37%. This study shows that the overall prevalence of HCV in the provinces of Pakistan is 8.64% and suggests that the major routes of HCV transmission are reuse of syringes and needles and unchecked blood transfusions. Awareness and economic growth are required to help decrease HCV infection and improve health standards in Pakistan.

  10. Quantitative Estimation of Risks for Production Unit Based on OSHMS and Process Resilience

    Science.gov (United States)

    Nyambayar, D.; Koshijima, I.; Eguchi, H.

    2017-06-01

    The three principal elements in the production field of the chemical/petrochemical industry are (i) production units, (ii) production plant personnel, and (iii) the production support system (a computer system introduced to improve productivity). Each principal element has production process resilience, i.e., a capability to restrain disruptive signals occurring in and outside the production field. For each principal element, risk assessment is indispensable in the production field. In a production facility, an occupational safety and health management system (hereafter referred to as OSHMS) is introduced to reduce the risk of accidents and troubles that may occur during production. In OSHMS, a risk assessment is specified to reduce potential risks in a production facility such as a factory, and PDCA activities are required for continual improvement of the safety of production environments. However, there is no clear statement on how to adopt the OSHMS standard in the production field. This study introduces a metric to estimate the resilience of the production field, using the resilience generated by the production plant personnel and the results of the risk assessment in the production field. A method for evaluating how systematically OSHMS functions are installed in the production field is also discussed, based on the resilience of the three principal elements.

  11. Estimation of Credit Risk for Business Firms of Nationalized Bank by Neural Network Approach

    Directory of Open Access Journals (Sweden)

    Ms. A. R. Ghatge

    2013-06-01

    Full Text Available Financial credit risk assessment has gained a great deal of attention, and many different parties have an interest in it. Banking authorities are interested because it helps them determine the overall strength of the banking system and its ability to handle adverse conditions. Because of the importance of credit risk analysis, many methods have been applied to credit risk measurement tasks, among which the artificial neural network (ANN) plays an important role in analyzing the credit default problem. Artificial neural networks represent an easily customizable tool for modeling the learning behavior of agents and for studying many problems that are very difficult to analyze with standard economic models, and ANNs have many advantages over conventional methods of analysis. According to Shachmurove (2002), they have the ability to analyze complex patterns quickly and with a high degree of accuracy. The focus of this paper is to determine whether a neural network is a suitable modelling technique for predicting whether a business firm's loan is satisfactory or not. This paper shows that an ANN approach can classify an applicant as default or non-default and predict a credit default allowance amount more closely aligned with the credit default expense incurred during the fiscal period than traditional management approaches to estimating the allowance. The results show that credit risk evaluation using a back-propagation neural network and expert evaluation are highly consistent.
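
    A minimal Python sketch of the kind of back-propagation classifier the paper describes: a small multilayer perceptron trained on firm-level financial features to separate default from non-default loans. The feature set and synthetic data are illustrative assumptions, not the bank data used in the study.

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n = 500
        X = rng.normal(size=(n, 4))      # e.g. leverage, liquidity, profitability, turnover (placeholders)
        y = (X[:, 0] - 0.5 * X[:, 2] + rng.normal(scale=0.5, size=n) > 0).astype(int)  # 1 = default

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        model = make_pipeline(StandardScaler(),
                              MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0))
        model.fit(X_tr, y_tr)
        print("holdout accuracy:", model.score(X_te, y_te))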

  12. Comparison of methods for estimating the attributable risk in the context of survival analysis

    Directory of Open Access Journals (Sweden)

    Malamine Gassama

    2017-01-01

    Full Text Available Abstract Background The attributable risk (AR) measures the proportion of disease cases that can be attributed to an exposure in the population. Several definitions and estimation methods have been proposed for survival data. Methods Using simulations, we compared four methods for estimating the AR defined in terms of survival functions: two nonparametric methods based on the Kaplan-Meier estimator, one semiparametric method based on Cox's model, and one parametric method based on the piecewise constant hazards model, as well as one simpler method based on the estimated exposure prevalence at baseline and the Cox model hazard ratio. We considered a fixed binary exposure with varying exposure probabilities and strengths of association, and generated event times from a proportional hazards model with constant or monotonic (decreasing or increasing) Weibull baseline hazard, as well as from a nonproportional hazards model. We simulated 1,000 independent samples of size 1,000 or 10,000. The methods were compared in terms of mean bias, mean estimated standard error, empirical standard deviation and 95% confidence interval coverage probability at four equally spaced time points. Results Under proportional hazards, all five methods yielded unbiased results regardless of sample size. Nonparametric methods displayed greater variability than the other approaches. All methods showed satisfactory coverage except for the nonparametric methods, especially at the end of follow-up for a sample size of 1,000. With nonproportional hazards, nonparametric methods yielded similar results to those under proportional hazards, whereas the semiparametric and parametric approaches, which both rely on the proportional hazards assumption, performed poorly. These methods were applied to estimate the AR of breast cancer due to menopausal hormone therapy in 38,359 women of the E3N cohort. Conclusion In practice, our study suggests using the semiparametric or parametric approaches to estimate AR as a function of
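
    As a small illustration of the simpler method mentioned above, a Levin-type formula combines the baseline exposure prevalence with the Cox hazard ratio; whether this exact formula matches the authors' definition is an assumption, and the numbers are placeholders rather than the E3N results.

        def attributable_risk(prevalence, hazard_ratio):
            """Levin-type AR: p*(HR - 1) / (1 + p*(HR - 1))."""
            excess = prevalence * (hazard_ratio - 1.0)
            return excess / (1.0 + excess)

        # Example: 40% exposed at baseline with a hazard ratio of 1.3 gives an AR of about 11%.
        print(attributable_risk(prevalence=0.4, hazard_ratio=1.3))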

  13. Estimating suspended solids concentrations from backscatter intensity measured by acoustic Doppler current profiler in San Francisco Bay, California

    Science.gov (United States)

    Gartner, J.W.

    2004-01-01

    The estimation of the mass concentration of suspended solids is one of the properties needed to understand the characteristics of sediment transport in bays and estuaries. However, useful measurements or estimates of this property are often problematic when employing the usual methods of determination from collected water samples or optical sensors. Analysis of water samples tends to undersample the highly variable character of suspended solids, and optical sensors often become useless due to biological fouling in highly productive regions. Acoustic sensors, such as the acoustic Doppler current profilers that are now routinely used to measure water velocity, have been shown to hold promise as a means of quantitatively estimating suspended solids from acoustic backscatter intensity, a parameter used in velocity measurement. To further evaluate the application of this technique using commercially available instruments, profiles of suspended solids concentrations are estimated from the acoustic backscatter intensity recorded by 1200- and 2400-kHz broadband acoustic Doppler current profilers located at two sites in San Francisco Bay, California. ADCP backscatter intensity is calibrated using optical backscatterance data from an instrument located at a depth close to the ADCP transducers. In addition to losses from spherical spreading and water absorption, calculations of acoustic transmission losses account for attenuation from suspended sediment and a correction for nonspherical spreading in the near field of the acoustic transducer. Acoustic estimates of suspended solids consisting of cohesive and noncohesive sediments are found to agree within about 8-10% (of the total range of concentration) with values estimated by a second optical backscatterance sensor located at a depth further from the ADCP transducers. The success of this approach using commercially available Doppler profilers provides promise that this technique might be appropriate and useful under certain conditions in
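
    A minimal Python sketch of the calibration workflow described above: correct the recorded echo intensity for spreading and absorption losses to obtain a range-corrected backscatter, then regress log10(SSC) against it using concentrations from an optical backscatterance sensor. The echo counts, ranges, scale factor and absorption coefficient are placeholder assumptions, not the San Francisco Bay values.

        import numpy as np

        def corrected_backscatter(echo_counts, r, kc=0.43, alpha=0.5):
            """Range-corrected backscatter (dB): Kc*E + 20*log10(r) + 2*alpha*r (two-way losses)."""
            return kc * echo_counts + 20.0 * np.log10(r) + 2.0 * alpha * r

        # Calibration against concentrations from an optical backscatterance (OBS) sensor.
        r = np.array([1.0, 2.0, 3.0, 4.0])            # range to bin centre (m)
        echo = np.array([180.0, 160.0, 150.0, 140.0]) # echo intensity (counts)
        ssc_obs = np.array([60.0, 45.0, 40.0, 35.0])  # OBS-derived SSC (mg/L)

        rb = corrected_backscatter(echo, r)
        b, a = np.polyfit(rb, np.log10(ssc_obs), 1)   # fit log10(SSC) = a + b*RB
        ssc_acoustic = 10.0 ** (a + b * rb)           # acoustic estimate of SSC
        print(ssc_acoustic)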

  14. A method for state of energy estimation of lithium-ion batteries at dynamic currents and temperatures

    Science.gov (United States)

    Liu, Xingtao; Wu, Ji; Zhang, Chenbin; Chen, Zonghai

    2014-12-01

    The state of energy (SOE) of Li-ion batteries is a critical index for energy optimization and management. In applied battery systems, changes in the discharge current and the temperature caused by dynamic loads lead to errors in the estimation of the battery's residual energy. To address this issue, a new method based on the back-propagation neural network (BPNN) is presented for SOE estimation. In the proposed approach, in order to take into account the energy loss on the internal resistance, the electrochemical reactions and the decrease of the open-circuit voltage (OCV), the SOE is introduced in place of the state of charge (SOC) to describe the residual energy of the battery. Additionally, the discharge current and temperature are taken as training inputs of the BPNN to overcome their interference with the SOE estimation. Simulation experiments on LiFePO4 batteries indicate that the proposed method based on the BPNN can estimate the SOE much more reliably and accurately.
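
    A minimal Python sketch, under stated assumptions, of a BPNN-style regressor mapping measured current, temperature and terminal voltage to SOE. The input set (the paper lists current and temperature; adding voltage is an assumption), the network size and the synthetic training data are all placeholders, not the paper's configuration.

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(1)
        n = 2000
        current = rng.uniform(0.5, 3.0, n)     # discharge current (C-rate)
        temp = rng.uniform(0.0, 40.0, n)       # cell temperature (deg C)
        voltage = rng.uniform(2.8, 3.4, n)     # terminal voltage (V)
        # Toy ground-truth SOE, only so the example is runnable end to end.
        soe = np.clip((voltage - 2.8) / 0.6 - 0.02 * current + 0.001 * temp, 0.0, 1.0)

        X = np.column_stack([current, temp, voltage])
        model = make_pipeline(StandardScaler(),
                              MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000, random_state=0))
        model.fit(X, soe)
        print(model.predict([[1.0, 25.0, 3.2]]))   # estimated SOE for one operating point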

  15. Discriminating hand gesture motor imagery tasks using cortical current density estimation.

    Science.gov (United States)

    Edelman, Bradley; Baxter, Bryan; He, Bin

    2014-01-01

    Current EEG-based brain-computer interface (BCI) systems have achieved successful control in up to three dimensions; however, the current paradigm may be unnatural for many rehabilitative and recreational applications. There is therefore a great need to find motor imagery (MI) tasks that are realistic for output device control. In this paper we present our results on classifying hand gesture MI tasks, including right hand flexion, extension, supination and pronation, using a novel EEG inverse imaging approach. By using both temporal and spatial specificity in the source domain, we were able to separate MI tasks with up to 95% accuracy for binary classification of any two tasks, compared to a maximum of only 79% in the sensor domain.
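
    For orientation, a generic Tikhonov-regularised minimum-norm estimate is one textbook way of mapping sensor data to cortical current density; this sketch is not the authors' specific inverse imaging method, and the lead field and sensor snapshot are random placeholders.

        import numpy as np

        rng = np.random.default_rng(2)
        n_sensors, n_sources = 64, 500
        L = rng.normal(size=(n_sensors, n_sources))   # lead-field (forward) matrix, placeholder
        y = rng.normal(size=n_sensors)                # one EEG sample across sensors, placeholder
        lam = 0.1 * np.trace(L @ L.T) / n_sensors     # simple regularisation scale

        # Minimum-norm estimate: J = L^T (L L^T + lam*I)^-1 y
        J = L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n_sensors), y)
        print(J.shape)   # estimated current density at each cortical source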

  16. Parameter estimation in Probabilistic Seismic Hazard Analysis: current problems and some solutions

    Science.gov (United States)

    Vermeulen, Petrus

    2017-04-01

    A typical Probabilistic Seismic Hazard Analysis (PSHA) comprises identification of seismic source zones, determination of hazard parameters for these zones, selection of an appropriate ground motion prediction equation (GMPE), and integration over probabilities according to the Cornell-McGuire procedure. Determination of hazard parameters often does not receive the attention it deserves, and, therefore, problems therein are often overlooked. Here, many of these problems are identified, and some of them are addressed. The parameters that need to be identified are those associated with the frequency-magnitude law, those associated with the earthquake recurrence law in time, and the parameters controlling the GMPE. This study is concerned with the frequency-magnitude law and the temporal distribution of earthquakes, not with GMPEs. The Gutenberg-Richter frequency-magnitude law is usually adopted for the frequency-magnitude law, and a Poisson process for earthquake recurrence in time. Accordingly, the parameters that need to be determined are the slope parameter of the Gutenberg-Richter frequency-magnitude law, i.e. the b-value, the maximum magnitude at which the Gutenberg-Richter law applies, mmax, and the mean recurrence frequency, λ, of earthquakes. If, instead of the Cornell-McGuire procedure, the "Parametric-Historic procedure" is used, these parameters do not have to be known before the PSHA computations; they are estimated directly during the PSHA computation. The resulting relation for the frequency of ground motion vibration parameters has a functional form analogous to the frequency-magnitude law, described by a parameter γ (analogous to the b-value of the Gutenberg-Richter law) and the maximum possible ground motion amax (analogous to mmax). Originally the approach could be applied only to simple GMPEs; recently, however, the method was extended to incorporate more complex forms of GMPEs. With regard to the parameter mmax, there are numerous methods of estimation
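
    A minimal Python sketch of two of the parameter estimates discussed above: Aki's maximum-likelihood b-value and the mean recurrence rate λ above a completeness magnitude. The catalogue magnitudes, completeness magnitude and duration are illustrative assumptions.

        import numpy as np

        mags = np.array([4.1, 4.3, 4.0, 5.2, 4.6, 4.8, 4.2, 5.7, 4.4, 4.9])  # placeholder catalogue
        m_min = 4.0                      # magnitude of completeness
        years = 25.0                     # catalogue duration (yr)

        b_value = np.log10(np.e) / (mags.mean() - m_min)   # Aki's maximum-likelihood estimate
        rate = len(mags) / years                           # mean annual rate of events with M >= m_min
        print(f"b = {b_value:.2f}, lambda = {rate:.2f} events/yr")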

  17. Revised estimates of the risk of fetal loss following a prenatal diagnosis of trisomy 13 or trisomy 18.

    Science.gov (United States)

    Cavadino, Alana; Morris, Joan K

    2017-04-01

    Edwards syndrome (trisomy 18) and Patau syndrome (trisomy 13) both have high natural fetal loss rates. The aim of this study was to provide estimates of these fetal loss rates by single gestational week of age using data from the National Down Syndrome Cytogenetic Register. Data from all pregnancies with Edwards or Patau syndrome that were prenatally detected in England and Wales from 2004 to 2014 were analyzed using Kaplan-Meier survival estimates. Pregnancies entered the analysis at the gestational age at diagnosis and were considered "under observation" until the gestational age at outcome. There were 4088 prenatal diagnoses of trisomy 18 and 1471 of trisomy 13 in the analysis. For trisomy 18, 30% (95%CI: 25-34%) of viable fetuses at 12 weeks will result in a live birth, and at 39 weeks gestation 67% (60-73%) will result in a live birth. For trisomy 13 the survival is 50% (41-58%) at 12 weeks and 84% (73-90%) at 39 weeks. There was no significant difference in survival between males and females when diagnosed at 12 weeks for trisomy 18 (P-value = 0.27) or trisomy 13 (P-value = 0.47). This paper provides the most precise gestational age-specific estimates currently available for the risk of fetal loss in trisomy 13 and trisomy 18 pregnancies in a general population. © 2017 Wiley Periodicals, Inc.
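
    A minimal Python sketch of the survival set-up described above: a Kaplan-Meier estimate with delayed entry at the gestational week of diagnosis (left truncation), here using the lifelines library. The handful of records is invented for illustration, and survival is expressed as the probability of not yet having experienced fetal loss.

        from lifelines import KaplanMeierFitter

        entry_week = [12, 13, 15, 12, 20, 16]     # gestation at diagnosis (delayed entry)
        exit_week = [22, 39, 30, 38, 40, 25]      # gestation at outcome
        fetal_loss = [1, 0, 1, 0, 0, 1]           # 1 = fetal loss, 0 = live birth (censored for loss)

        kmf = KaplanMeierFitter()
        kmf.fit(durations=exit_week, event_observed=fetal_loss, entry=entry_week)
        # Probability of no fetal loss by selected gestational weeks.
        print(kmf.survival_function_at_times([20, 30, 39]))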

  18. Impact of ground motion characterization on conservatism and variability in seismic risk estimates

    Energy Technology Data Exchange (ETDEWEB)

    Sewell, R.T.; Toro, G.R.; McGuire, R.K.

    1996-07-01

    This study evaluates the impact, on estimates of seismic risk and its uncertainty, of alternative methods of treating and characterizing earthquake ground motions. The objective of this study is to delineate specific procedures and characterizations that may lead to less biased and more precise seismic risk results. This report focuses on sources of conservatism and variability in risk that may be introduced through the analytical processes and ground-motion descriptions which are commonly implemented at the interface of seismic hazard and fragility assessments. In particular, the implications of the common practice of using a single, composite spectral shape to characterize motions of different magnitudes are investigated. Also, the impact of the parameterization of ground motion on fragility and hazard assessments is shown. Examination of these results demonstrates the following. (1) There is significant conservatism in the review spectra (usually spectra characteristic of western U.S. earthquakes) that have been used in past seismic risk assessments and seismic margin assessments for eastern U.S. nuclear power plants. (2) There is a strong dependence of seismic fragility on earthquake magnitude when PGA is used as the ground-motion characterization. When, however, magnitude-dependent spectra are anchored to a common measure of elastic spectral acceleration averaged over the appropriate frequency range, seismic fragility shows no important or consistent dependence on either magnitude or strong-motion duration. Use of inelastic spectral acceleration (at the proper frequency) as the ground spectrum anchor yields a very similar result. This study concludes that a single, composite-magnitude spectrum can generally be used to characterize ground motion for fragility assessment without introducing significant bias or uncertainty in seismic risk estimates.

  19. Mercury risk from fluorescent lamps in China: current status and future perspective.

    Science.gov (United States)

    Hu, Yuanan; Cheng, Hefa

    2012-09-01

    Energy-efficient lighting is one of the key measures for addressing electric power shortages and climate change mitigation, and fluorescent lamps are expected to dominate the lighting market in China over the next several years. This review presents an overview of the emissions and risk of mercury from fluorescent lamps during production and disposal, and discusses measures for reducing the mercury risk through solid waste management and source reduction. Fluorescent lamps produced in China used to contain relatively large amounts of mercury (up to 40 mg per lamp) due to the prevalence of liquid mercury dosing, which also released significant amounts of mercury to the environment. Upgrades of the mercury dosing technologies and manufacturing facilities have significantly reduced the mercury content of fluorescent lamps, with most of them now containing less than 10 or even 5 mg per lamp. Occupational hygiene studies showed that mercury emissions occurred during fluorescent lamp production, particularly in facilities using liquid mercury dosing, which polluted the environmental media at and surrounding the production sites and posed a chronic health risk to the workers by causing neuropsychological and motor impairments. It is estimated that spent fluorescent lamps account for approximately 20% of the mercury input into municipal solid waste (MSW) in China. Even though recycling of fluorescent lamps presents an important opportunity to capture the mercury they contain, it is difficult and not cost-effective at reducing the mercury risk under the broader context of mercury pollution control in China. In light of the significant mercury emissions associated with electricity generation in China, we propose that reduction of mercury emissions and risk associated with fluorescent lamps should be achieved primarily through lowering their mercury content by the manufacturers, while recycling programs should focus on elemental mercury-containing waste products instead of fluorescent lamps to recapture

  20. Exposure of unionid mussels to electric current: Assessing risks associated with electrofishing

    Science.gov (United States)

    Holliman, F.M.; Kwak, T.J.; Cope, W.G.; Levine, J.F.

    2007-01-01

    Electric current is routinely applied in freshwater for scientific sampling of fish populations (i.e., electrofishing). Freshwater mussels (families Margaritiferidae and Unionidae) are distributed worldwide, but their recent declines in diversity and abundance constitute an imperilment of global significance. Freshwater mussels are not targeted for capture by electrofishing, and any exposure to electric current is unintentional. The effects of electric shock are not fully understood for mussels but could disrupt vital physiological processes and represent an additional threat to their survival. In a controlled laboratory environment, we examined the consequences of exposure to two typical electrofishing currents, 60-Hz pulsed DC and 60-Hz AC, for the survival of adult and early life stages of three unionid species; we included fish as a quality control measure. The outcomes suggest that electrical exposure associated with typical electrofishing poses little direct risk to freshwater mussels. That is, adult mussel survival and righting behaviors (indicators of sublethal stress) were not adversely affected by electrical exposure. Glochidia (larvae that attach to and become parasites on fish gills or fins) showed minimal immediate reduction in viability after exposure. Metamorphosis from glochidia to free-living juvenile mussels was not impaired after electric current simulated capture-prone behaviors (stunning) in infested host fish. In addition, the short-term survival of juvenile mussels was not adversely influenced by exposure to electric current. Any minimal risk to imperiled mussels must be weighed at the population level against the benefits gained by using the gear for scientific sampling of fish in the same waters. However, scientists sampling fish by electrofishing should be aware of mussel reproductive periods and processes in order to minimize the harmful effects to host fish, especially in areas where mussel conservation is a concern. © Copyright by the

  1. Longer genotypically-estimated leukocyte telomere length is associated with increased adult glioma risk.

    Science.gov (United States)

    Walsh, Kyle M; Codd, Veryan; Rice, Terri; Nelson, Christopher P; Smirnov, Ivan V; McCoy, Lucie S; Hansen, Helen M; Elhauge, Edward; Ojha, Juhi; Francis, Stephen S; Madsen, Nils R; Bracci, Paige M; Pico, Alexander R; Molinaro, Annette M; Tihan, Tarik; Berger, Mitchel S; Chang, Susan M; Prados, Michael D; Jenkins, Robert B; Wiemels, Joseph L; Samani, Nilesh J; Wiencke, John K; Wrensch, Margaret R

    2015-12-15

    Telomere maintenance has emerged as an important molecular feature with impacts on adult glioma susceptibility and prognosis. Whether longer or shorter leukocyte telomere length (LTL) is associated with glioma risk remains elusive and is often confounded by the effects of age and patient treatment. We sought to determine if genotypically-estimated LTL is associated with glioma risk and if inherited single nucleotide polymorphisms (SNPs) that are associated with LTL are glioma risk factors. Using a Mendelian randomization approach, we assessed differences in genotypically-estimated relative LTL in two independent glioma case-control datasets from the UCSF Adult Glioma Study (652 patients and 3735 controls) and The Cancer Genome Atlas (478 non-overlapping patients and 2559 controls). LTL estimates were based on a weighted linear combination of subject genotype at eight SNPs previously associated with LTL in the ENGAGE Consortium Telomere Project. Mean estimated LTL was 31 bp (5.7%) longer in glioma patients than controls in discovery analyses (P = 7.82x10-8) and 27 bp (5.0%) longer in glioma patients than controls in replication analyses (P = 1.48x10-3). Glioma risk increased monotonically with each increasing septile of LTL (O.R.=1.12; P = 3.83x10-12). Four LTL-associated SNPs were significantly associated with glioma risk in pooled analyses, including those in the telomerase component genes TERC (O.R.=1.14; 95% C.I.=1.03-1.28) and TERT (O.R.=1.39; 95% C.I.=1.27-1.52), and those in the CST complex genes OBFC1 (O.R.=1.18; 95% C.I.=1.05-1.33) and CTC1 (O.R.=1.14; 95% C.I.=1.02-1.28). Future work is needed to characterize the role of the CST complex in gliomagenesis and to further elucidate the complex balance between ageing, telomere length, and molecular carcinogenesis.
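
    A minimal Python sketch of a genotypically estimated LTL score of the kind described above: a weighted linear combination of allele counts at LTL-associated SNPs. The SNP identifiers and per-allele weights are placeholders, not the ENGAGE Consortium estimates.

        # Per-allele effect on relative LTL (arbitrary units) for each scored SNP; placeholder values.
        weights = {"rsA": 0.08, "rsB": 0.06, "rsC": 0.05, "rsD": 0.04}

        def genotypic_ltl_score(genotypes):
            """genotypes maps SNP id -> count of LTL-lengthening alleles (0, 1 or 2)."""
            return sum(weights[snp] * genotypes.get(snp, 0) for snp in weights)

        subject = {"rsA": 2, "rsB": 1, "rsC": 0, "rsD": 2}
        print(genotypic_ltl_score(subject))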

  2. Potential Risk Estimation Drowning Index for Children (PREDIC): a pilot study from Matlab, Bangladesh.

    Science.gov (United States)

    Borse, N N; Hyder, A A; Bishai, D; Baker, T; Arifeen, S E

    2011-11-01

    Childhood drowning is a major public health problem that has been neglected in many low- and middle-income countries. In Matlab, rural Bangladesh, more than 40% of deaths among children aged 1-4 years are due to drowning. The main objective of this paper was to develop and evaluate a childhood drowning risk prediction index. A literature review was carried out to document risk factors identified for childhood drowning in Bangladesh. The Newacheck model for children with special health care needs was adapted and applied to construct a childhood drowning risk index called the "Potential Risk Estimation Drowning Index for Children" (PREDIC). Finally, the proposed PREDIC index was applied to childhood drowning deaths and compared with a comparison group of children living in Matlab, Bangladesh. This pilot study used t-tests and a Receiver Operating Characteristic (ROC) curve to analyze the results. The PREDIC index was applied to 302 drowning deaths and 624 children aged 0-4 years living in Matlab. The t-test indicated that the drowned children had a statistically significantly higher mean PREDIC score (6.01) than the comparison group (5.26) (t = -8.58, p = 0.0001). Sixty-eight percent of the drowning cases had a PREDIC score of 6 or more, compared with 43% of the comparison group, a statistically significant difference (t = -7.36, p < 0.001). The area under the Receiver Operating Characteristic curve was 0.662. Index score construction was scientifically plausible, and the index is relatively complete, fairly accurate, and practical. The risk index can help identify and target high-risk children with drowning prevention programs. The PREDIC index needs to be further tested for its accuracy, feasibility and effectiveness in drowning risk reduction in Bangladesh and other countries. Copyright © 2011 Elsevier Ltd. All rights reserved.
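
    A minimal Python sketch of the evaluation reported above: a two-sample t-test on index scores for cases versus the comparison group, plus the area under the ROC curve. The score distributions are invented placeholders, not the Matlab data.

        import numpy as np
        from scipy.stats import ttest_ind
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(3)
        cases = rng.normal(6.0, 1.2, 300)       # index scores, drowning cases (placeholder)
        controls = rng.normal(5.3, 1.2, 600)    # index scores, comparison group (placeholder)

        t, p = ttest_ind(cases, controls)
        labels = np.r_[np.ones_like(cases), np.zeros_like(controls)]
        auc = roc_auc_score(labels, np.r_[cases, controls])
        print(f"t = {t:.2f}, p = {p:.4f}, AUC = {auc:.3f}")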

  3. Geostatistical analysis of disease data: estimation of cancer mortality risk from empirical frequencies using Poisson kriging

    Directory of Open Access Journals (Sweden)

    Goovaerts Pierre

    2005-12-01

    Full Text Available Abstract Background Cancer mortality maps are used by public health officials to identify areas of excess mortality and to guide surveillance and control activities. The quality of decision-making thus relies on an accurate quantification of risks from observed rates, which can be very unreliable when computed from sparsely populated geographical units or recorded for minority populations. This paper presents a geostatistical methodology that accounts for spatially varying population sizes and spatial patterns in the processing of cancer mortality data. Simulation studies are conducted to compare the performance of Poisson kriging to a few simple smoothers (i.e. population-weighted estimators and empirical Bayes smoothers) under different scenarios for the disease frequency, the population size, and the spatial pattern of risk. A public-domain executable with example datasets is provided. Results The analysis of age-adjusted mortality rates for breast and cervix cancers illustrated some key features of commonly used smoothing techniques. Because of the small weight assigned to the rate observed over the entity being smoothed (the kernel weight), the population-weighted average leads to risk maps that show little variability. Other techniques assign larger and similar kernel weights but use a different piece of auxiliary information in the prediction: global or local means for global or local empirical Bayes smoothers, and a spatial combination of surrounding rates for the geostatistical estimator. Simulation studies indicated that Poisson kriging outperforms the other approaches for most scenarios, with a clear benefit when the risk values are spatially correlated. Global empirical Bayes smoothers provide more accurate predictions under the least frequent scenario of spatially random risk. Conclusion The approach presented in this paper enables researchers to incorporate the pattern of spatial dependence of mortality rates into the mapping of risk values and the
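
    For contrast with Poisson kriging, a minimal Python sketch of a global empirical Bayes (Marshall-type) smoother of the kind the simulation study uses as a benchmark: each area's raw rate is shrunk toward the global mean, with stronger shrinkage for small populations. The counts and populations are illustrative placeholders, and the exact smoother used by the authors is an assumption.

        import numpy as np

        deaths = np.array([2, 15, 7, 40, 1])                   # observed deaths per area (placeholder)
        pop = np.array([1_000, 20_000, 5_000, 60_000, 800])    # population at risk per area

        raw = deaths / pop
        m = deaths.sum() / pop.sum()                           # global mean rate
        s2 = np.sum(pop * (raw - m) ** 2) / pop.sum()          # between-area variability of rates
        a = max(s2 - m / pop.mean(), 0.0)                      # estimated "signal" variance
        w = a / (a + m / pop)                                  # shrinkage weight per area
        smoothed = m + w * (raw - m)                           # empirical Bayes smoothed rates
        print(smoothed)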

  4. Physical activity estimated by the bone-specific physical activity questionnaire is also associated with cardiovascular risk.

    Science.gov (United States)

    Weeks, Benjamin K; Purvis, Meredith; Beck, Belinda R

    2016-11-01

    The nature of physical activity that benefits bone is traditionally thought to differ from that benefiting cardiovascular health. Accordingly, exercise recommendations for improving bone health and cardiovascular health are largely incongruent. Our aim was to determine the associations between high-impact physical activity participation and both cardiovascular disease risk factors and bone mass. We recruited 94 men and women (age 34.0 ± 13.3 years) to undergo measures of cardiovascular disease risk (BMI, total cholesterol, fasting blood glucose, waist-to-hip ratio, and mean arterial pressure) and dual-energy X-ray absorptiometry (DXA XR-800, Norland) measures of bone mass (femoral neck, lumbar spine, and whole body BMD) and body composition (whole body lean mass and fat mass). Physical activity participation was estimated using the bone-specific physical activity questionnaire (BPAQ). Those in the upper tertile for current BPAQ score exhibited lower total cholesterol, waist-to-hip ratio, and mean arterial pressure than those in the lower tertiles (P < 0.05). These findings suggest that high-impact physical activity as captured by the BPAQ may be beneficial for both bone health and for attenuating cardiovascular disease risk.

  5. Smoke exposure as a risk factor for asthma in childhood: a review of current evidence.

    Science.gov (United States)

    Ferrante, Giuliana; Antona, Roberta; Malizia, Velia; Montalbano, Laura; Corsello, Giovanni; La Grutta, Stefania

    2014-01-01

    Asthma is a common chronic multifactorial disease that affects more than 300 million people worldwide. Outdoor and indoor pollution exposure has been associated with respiratory health effects in adults and children. Smoking still represents a huge public health problem, and millions of children suffer the detrimental effects of passive smoke exposure. This study was designed to review the current evidence on exposure to passive smoke as a risk factor for asthma onset in childhood. A review of the most recent studies on this topic was undertaken to provide evidence about the magnitude of the effect of passive smoking on the risk of incident asthma in children. The effects of passive smoking differ depending on individual and environmental factors. Environmental tobacco smoke (ETS) is one of the most important indoor air pollutants and can interact with other air pollutants in eliciting respiratory outcomes during childhood. The increased risk of respiratory outcomes in children exposed to prenatal and early postnatal passive smoke might be caused by an adverse effect on both the immune system and the structural and functional development of the lung; this may explain the subsequent increased risk of incident asthma. The magnitude of the exposure is quite difficult to quantify precisely because it is significantly influenced by the child's daily activities. Because exposure to ETS is a likely cause of asthma onset in childhood, there is a strong need to prevent infants and children from breathing air contaminated with tobacco smoke.

  6. Yesterday's Japan: A system of flood risk estimation over Japan with remote-sensing precipitation data

    Science.gov (United States)

    Kanae, S.; Seto, S.; Yoshimura, K.; Oki, T.

    2008-12-01

    A new river discharge prediction and hindcast system covering all of Japan has been developed in order to issue flood risk alerts. It utilizes the Japan Meteorological Agency's meso-scale model outputs and remote-sensing precipitation data. A statistical approach that compromises the bias and uncertainty of models is proposed for interpreting the simulated river discharge as a flood risk. A 29-year simulation was implemented to estimate the parameters of the Gumbel distribution for the probability of extreme discharge, and the estimated discharge probability index (DPI) showed good agreement with that estimated from observations. Even more strikingly, high DPI in the simulation corresponded to actual flood damage records. This indicates that real-time simulation of the DPI could potentially provide reasonable flood warnings. A method to overcome the lack of sufficiently long simulation data through the use of a pre-existing long-term simulation, and to estimate the statistical parameters from it, is also proposed. A preliminary flood risk prediction that used operational weather forecast data for 2003 and 2004 gave results similar to those of the 29-year simulation for the Typhoon T0423 event in October 2004, demonstrating the transferability of the technique to real-time prediction. In addition, the usefulness of satellite precipitation data for flood estimation is evaluated via hindcast, using several satellite precipitation datasets. The GSMaP product can detect heavy precipitation events, but floods were not well simulated in many cases because of GSMaP's underestimation. The GSMaP product adjusted using monthly, 1-degree rain gauge information can be used to detect flood events as well as hourly rain gauge observations can. Another quantitative issue is also discussed: when remote-sensing-based precipitation data are used as input for the hindcast, the precipitation amount is underestimated. The effort toward improvement will be shown
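
    A minimal Python sketch of the Gumbel-based idea behind the discharge probability index: fit a Gumbel distribution to simulated annual maximum discharge and express the current discharge as an exceedance probability (or return period). The annual maxima are invented placeholders, and the exact DPI definition used by the authors is an assumption.

        import numpy as np
        from scipy.stats import gumbel_r

        # Simulated annual maximum discharge (m^3/s), one value per year; placeholder data.
        annual_max = np.array([850.0, 920.0, 640.0, 1100.0, 780.0, 990.0,
                               870.0, 1300.0, 720.0, 940.0, 810.0, 1020.0])
        loc, scale = gumbel_r.fit(annual_max)

        def exceedance_prob(q):
            """Probability that the annual maximum discharge exceeds q."""
            return gumbel_r.sf(q, loc=loc, scale=scale)

        q_now = 1250.0
        p = exceedance_prob(q_now)
        print(f"exceedance probability {p:.3f}  (return period ~{1.0 / p:.0f} yr)")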

  7. Risk Consideration and Cost Estimation in Construction Projects Using Monte Carlo Simulation

    Directory of Open Access Journals (Sweden)

    Claudius A. Peleskei

    2015-06-01

    Full Text Available Construction projects usually involve high investments. They are therefore a risky venture for companies, as the actual costs of construction projects nearly always exceed the planned scenario. This is due to the various risks and the large uncertainty existing within this industry. Determining and quantifying risks and their impact on project costs is described as one of the most difficult areas within the construction industry. This paper analyses how the cost of construction projects can be estimated using Monte Carlo Simulation. It investigates whether the different cost elements in a construction project follow a specific probability distribution. The research examines the effect of correlation between different project costs on the result of the Monte Carlo Simulation. The paper finds that Monte Carlo Simulation can be a helpful tool for risk managers and can be used for cost estimation of construction projects. The research has shown that cost distributions are positively skewed and that cost elements seem to have some interdependent relationships.
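
    A minimal Python sketch of a Monte Carlo cost estimate with positively skewed cost elements and a specified correlation between them, here imposed through a Gaussian copula over lognormal marginals. The cost figures, distributions and correlation are illustrative assumptions, not the paper's data.

        import numpy as np
        from scipy.stats import norm, lognorm

        rng = np.random.default_rng(4)
        n = 100_000

        # Correlation between, say, structural works and finishing works (placeholder value).
        corr = np.array([[1.0, 0.6],
                         [0.6, 1.0]])
        z = rng.multivariate_normal(mean=[0.0, 0.0], cov=corr, size=n)
        u = norm.cdf(z)                                          # correlated uniforms (Gaussian copula)

        cost_a = lognorm.ppf(u[:, 0], s=0.3, scale=1_000_000)    # cost element A (currency units)
        cost_b = lognorm.ppf(u[:, 1], s=0.4, scale=600_000)      # cost element B (currency units)
        total = cost_a + cost_b

        print("mean total cost:", total.mean())
        print("80% quantile (possible contingency level):", np.quantile(total, 0.8))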

  8. Portfolio Value at Risk Estimate for Crude Oil Markets: A Multivariate Wavelet Denoising Approach

    Directory of Open Access Journals (Sweden)

    Kin Keung Lai

    2012-04-01

    Full Text Available In today's increasingly globalized economy, the major crude oil markets worldwide are seeing a higher level of integration, which results in a higher level of dependency and transmission of risks among different markets. Thus the risk of the typical multi-asset crude oil portfolio is influenced by the dynamic correlation among different assets, which has both normal and transient behaviors. This paper proposes a novel multivariate wavelet denoising based approach for estimating Portfolio Value at Risk (PVaR). Multivariate wavelet analysis is introduced to analyze the multi-scale behaviors of the correlation among different markets and the portfolio volatility behavior in the higher dimensional time-scale domain. The heterogeneous data and noise behavior are addressed in the proposed multi-scale denoising based PVaR estimation algorithm, which also incorporates mainstream time series models to address other well-known data features such as autocorrelation and volatility clustering. Empirical studies suggest that the proposed algorithm outperforms the benchmark Exponentially Weighted Moving Average (EWMA) and DCC-GARCH models in terms of conventional performance evaluation criteria for model reliability.
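
    A minimal Python sketch of the wavelet-denoising step combined with a simple historical-simulation portfolio VaR, using PyWavelets. The random return series, portfolio weights, wavelet choice and universal threshold are assumptions for illustration; this is not the paper's full multivariate denoising scheme or its time series modelling.

        import numpy as np
        import pywt

        rng = np.random.default_rng(5)
        returns = rng.normal(0.0, 0.02, size=(500, 2))   # two crude-oil return series (placeholders)
        weights = np.array([0.5, 0.5])                   # portfolio weights (placeholder)

        def wavelet_denoise(x, wavelet="db4", level=3):
            """Soft-threshold the detail coefficients, keep the approximation, reconstruct."""
            coeffs = pywt.wavedec(x, wavelet, level=level)
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # robust noise scale
            thr = sigma * np.sqrt(2.0 * np.log(len(x)))           # universal threshold
            coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
            return pywt.waverec(coeffs, wavelet)[: len(x)]

        denoised = np.column_stack([wavelet_denoise(returns[:, i]) for i in range(returns.shape[1])])
        portfolio = denoised @ weights
        var_99 = -np.quantile(portfolio, 0.01)           # 1-day 99% portfolio VaR (historical simulation)
        print(f"99% PVaR: {var_99:.4f}")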

  9. Prospect theory based estimation of drivers' risk attitudes in route choice behaviors.

    Science.gov (United States)

    Zhou, Lizhen; Zhong, Shiquan; Ma, Shoufeng; Jia, Ning

    2014-12-01

    This paper applied prospect theory (PT) to describe drivers' route choice behavior under Variable Message Signs (VMS), which present visual traffic information to assist drivers in making route choice decisions. A rich set of empirical data from questionnaires and field observations was used to estimate the parameters of PT. In order to make the parameters better reflect drivers' attitudes, drivers were classified into different types according to significant factors influencing their behavior. Based on the travel time distribution of the alternative routes and the route choice results from the questionnaire, the parameterized value function of each category was estimated, representing drivers' risk attitudes and choice characteristics. The empirical verification showed that the estimates were acceptable and effective. The results showed that drivers' risk attitudes and route choice characteristics can be captured by PT under real-time information shown on VMS. For practical application, once drivers' route choice characteristics and parameters are identified, their route choice behavior under different road conditions can be predicted accurately, which is the basis for formulating and implementing traffic guidance measures for targeted traffic management. Moreover, the heterogeneous risk attitudes among drivers should be considered when releasing traffic information and regulating traffic flow.
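
    A minimal Python sketch of a Tversky-Kahneman style value function of the kind estimated for each driver class: v(x) = x**alpha for gains and v(x) = -lam*(-x)**beta for losses relative to a travel-time reference point. The parameter values, the outcomes and the omission of probability weighting are assumptions for illustration, not the paper's estimates.

        def pt_value(x, alpha=0.88, beta=0.88, lam=2.25):
            """Prospect-theory value of a gain/loss x (e.g. minutes saved or lost vs. reference)."""
            return x ** alpha if x >= 0 else -lam * ((-x) ** beta)

        # Prospect value of a route with two possible travel-time outcomes relative to the
        # reference time (probability weighting omitted for brevity).
        outcomes = [(+5.0, 0.7), (-10.0, 0.3)]     # (minutes saved/lost, probability)
        print(sum(p * pt_value(x) for x, p in outcomes))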

  10. Breast cancer size estimation with MRI in BRCA mutation carriers and other high risk patients

    Energy Technology Data Exchange (ETDEWEB)

    Mann, R.M., E-mail: r.mann@rad.umcn.nl [Radboud University Nijmegen Medical Centre, Department of Radiology, Nijmegen (Netherlands); Bult, P., E-mail: p.bult@path.umcn.nl [Radboud University Nijmegen Medical Centre, Department of Pathology, Nijmegen (Netherlands); Laarhoven, H.W.M. van, E-mail: h.vanlaarhoven@amc.uva.nl [Academic Medical Centre, University of Amsterdam, Department of Medical Oncology, Amsterdam (Netherlands); Radboud University Nijmegen Medical Centre, Department of Medical Oncology, Nijmegen (Netherlands); Span, P.N., E-mail: p.span@rther.umcn.nl [Radboud University Nijmegen Medical Centre, Department of Radiation Oncology, Nijmegen (Netherlands); Schlooz, M., E-mail: m.schlooz@chir.umcn.nl [Radboud University Nijmegen Medical Centre, Department of Surgery, Nijmegen (Netherlands); Veltman, J., E-mail: j.veltman@zgt.nl [Hospital group Twente (ZGT), Department of Radiology, Almelo (Netherlands); Hoogerbrugge, N., E-mail: n.hoogerbrugge@gen.umcn.nl [Radboud University Nijmegen Medical Centre, Department of Human Genetics, Nijmegen (Netherlands)

    2013-09-15

    Objective: To assess the value of breast MRI for size assessment of breast cancers in high risk patients, including those with a BRCA1 or BRCA2 mutation. Guidelines invariably recommend breast MRI screening for these patients, and therapy is thus based on these findings. However, the accuracy of breast MRI for staging purposes has only been tested in sporadic cancers. Methods: We assessed the concordance of radiologic staging using MRI with histopathology in 49 tumors in 46 high risk patients (23 BRCA1, 12 BRCA2 and 11 non-BRCA patients). The size of the total tumor area (TTA) was compared to pathology. In invasive carcinomas (n = 45) the size of the largest focus (LF) was also addressed. Results: The correlation of MRI measurements with pathology was 0.862 for TTA and 0.793 for LF. TTA was underestimated in 8 (16%), overestimated in 5 (10%), and correctly measured in 36 (73%) cases. LF was underestimated in 4 (9%), overestimated in 5 (11%), and correctly measured in 36 (80%) cases. No impact of BRCA1 or BRCA2 mutations on the quality of size estimation was observed. Conclusions: Tumor size estimation using breas