WorldWideScience

Sample records for generalized mantel-haenszel analysis

  1. An Empirical Bayes Approach to Mantel-Haenszel DIF Analysis.

    Science.gov (United States)

    Zwick, Rebecca; Thayer, Dorothy T.; Lewis, Charles

    1999-01-01

    Developed an empirical Bayes enhancement to Mantel-Haenszel (MH) analysis of differential item functioning (DIF) in which it is assumed that the MH statistics are normally distributed and that the prior distribution of underlying DIF parameters is also normal. (Author/SLD)
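
    The normal-normal setup described in this record translates into a very small amount of code. The snippet below is a minimal illustration, not the authors' implementation: the prior mean, prior variance, and standard errors are placeholder assumptions, and real applications estimate the prior from the full set of items.

```python
import numpy as np

def eb_shrink_mh_dif(mh_dif, se, prior_mean=0.0, prior_var=0.05):
    """Empirical Bayes shrinkage of MH D-DIF statistics (normal-normal model).

    mh_dif : observed MH D-DIF values, assumed ~ N(true DIF, se**2)
    se     : their standard errors
    The underlying DIF parameters are assumed ~ N(prior_mean, prior_var).
    Returns posterior means, pulled toward prior_mean for imprecise items.
    """
    mh_dif, se = np.asarray(mh_dif, float), np.asarray(se, float)
    weight = prior_var / (prior_var + se ** 2)   # reliability of each observed value
    return weight * mh_dif + (1.0 - weight) * prior_mean

# Hypothetical example: the noisier item (se = 0.8) is shrunk much more strongly
print(eb_shrink_mh_dif([1.2, 1.2], se=[0.2, 0.8]))
```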

  2. An adaptive Mantel-Haenszel test for sensitivity analysis in observational studies.

    Science.gov (United States)

    Rosenbaum, Paul R; Small, Dylan S

    2017-06-01

    In a sensitivity analysis in an observational study with a binary outcome, is it better to use all of the data or to focus on subgroups that are expected to experience the largest treatment effects? The answer depends on features of the data that may be difficult to anticipate, a trade-off between unknown effect-sizes and known sample sizes. We propose a sensitivity analysis for an adaptive test similar to the Mantel-Haenszel test. The adaptive test performs two highly correlated analyses, one focused analysis using a subgroup, one combined analysis using all of the data, correcting for multiple testing using the joint distribution of the two test statistics. Because the two component tests are highly correlated, this correction for multiple testing is small compared with, for instance, the Bonferroni inequality. The test has the maximum design sensitivity of two component tests. A simulation evaluates the power of a sensitivity analysis using the adaptive test. Two examples are presented. An R package, sensitivity2x2xk, implements the procedure. © 2016, The International Biometric Society.
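
    For orientation, the ordinary (non-adaptive) Mantel-Haenszel test that the combined analysis builds on is easy to compute directly. The sketch below is a generic stratified 2x2 test with continuity correction, not the adaptive procedure or the sensitivity analysis itself; for those, the abstract points to the R package sensitivity2x2xk.

```python
import numpy as np
from scipy.stats import chi2

def mantel_haenszel_test(tables, continuity=True):
    """Cochran-Mantel-Haenszel test for K stratified 2x2 tables.

    tables: iterable of 2x2 arrays [[a, b], [c, d]], one per stratum,
            rows = treated/control, columns = event/no event.
    Returns the 1-df MH chi-square statistic and its p-value.
    """
    a_sum = e_sum = v_sum = 0.0
    for t in tables:
        (a, b), (c, d) = np.asarray(t, dtype=float)
        n = a + b + c + d
        a_sum += a
        e_sum += (a + b) * (a + c) / n                      # expected events in treated row
        v_sum += (a + b) * (c + d) * (a + c) * (b + d) / (n**2 * (n - 1))
    num = abs(a_sum - e_sum) - (0.5 if continuity else 0.0)
    stat = num**2 / v_sum
    return stat, chi2.sf(stat, df=1)

# Hypothetical two-stratum example
print(mantel_haenszel_test([[[12, 38], [5, 45]], [[20, 30], [10, 40]]]))
```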

  3. Mantel-Haenszel analysis of Oxford data. II. Independent effects of fetal irradiation subfactors

    International Nuclear Information System (INIS)

    Kneale, G.W.; Stewart, A.M.

    1976-01-01

    A Mantel-Haenszel analysis of fetal irradiation subfactors indicated that most of the extra x-rayed cases in the Oxford Survey of Childhood Cancers were radiation induced. First trimester exposures were rare but probably ten times more dangerous than later exposures. Ratios of observed to expected numbers of cancer deaths were lower for children with abnormal x-rays than for other x-rayed children, and lower for recent than for remote exposures. The first of these differences was probably due to several antenatal conditions having positive associations with both obstetric radiography and several causes of early (noncancer) deaths; the second was probably due to a progressive lowering of film doses between 1940 and the present time. A rare cause of fetal irradiation (hydramnios), whose associations with congenital defects are well documented, led to the discovery that two faults in the International Classification of Diseases and Causes of Death have contributed to mistaken ideas about the etiology of childhood cancers: neoplasms were not listed among the official causes of stillbirths, and cystic tumors of the kidneys and lungs of infants were not listed as neoplasms.

  4. Comparison of Three Software Programs for Evaluating DIF by Means of the Mantel-Haenszel Procedure: EASY-DIF, DIFAS and EZDIF

    Science.gov (United States)

    Padilla, Jose Luis; Hidalgo, M. Dolores; Benitez, Isabel; Gomez-Benito, Juana

    2012-01-01

    The analysis of differential item functioning (DIF) examines whether people of matching ability levels but differing characteristics, such as language or ethnicity, respond differently to test items. This analysis can be performed by calculating various statistics, one of the most important being the Mantel-Haenszel,…
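
    As a point of reference for what such programs compute, the core Mantel-Haenszel DIF summary pools item-level 2x2 tables across matched-ability strata. The sketch below uses the standard Holland-Thayer formulas and is not tied to EASY-DIF, DIFAS, or EZDIF; the example strata are hypothetical.

```python
import numpy as np

def mh_dif(tables):
    """Mantel-Haenszel DIF summary across ability strata for one item.

    tables: list of 2x2 arrays, one per stratum,
            rows = (reference group, focal group), cols = (correct, incorrect).
    Returns the common odds ratio alpha_MH and the ETS delta metric
    MH D-DIF = -2.35 * ln(alpha_MH); negative values indicate DIF against
    the focal group.
    """
    num = den = 0.0
    for t in tables:
        (r_right, r_wrong), (f_right, f_wrong) = np.asarray(t, dtype=float)
        n = r_right + r_wrong + f_right + f_wrong
        num += r_right * f_wrong / n
        den += r_wrong * f_right / n
    alpha = num / den
    return alpha, -2.35 * np.log(alpha)

# Hypothetical three score strata for a single item
strata = [[[40, 10], [30, 20]], [[60, 20], [50, 30]], [[80, 5], [70, 15]]]
print(mh_dif(strata))
```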

  5. Dose-Weighted Adjusted Mantel-Haenszel Tests for Numeric Scaled Strata in a Randomized Trial

    Science.gov (United States)

    Gansky, Stuart A.; Cheng, Nancy F.; Koch, Gary G.

    2011-01-01

    A recent three-arm parallel groups randomized clinical prevention trial had a protocol deviation causing participants to have fewer active doses of an in-office treatment than planned. The original statistical analysis plan stipulated a minimal assumption randomization-based extended Mantel-Haenszel (EMH) trend test of the high frequency, low frequency, and zero frequency treatment groups and a binary outcome. Thus a dose-weighted adjusted EMH (DWAEMH) test was developed with an extra set of weights corresponding to the number of active doses actually available, in the spirit of a pattern mixture model. The method can easily be implemented using standard statistical software. A set of Monte Carlo simulations using a logistic model was undertaken with (and without) actual dose-response effects through 1000 replicates for empirical power estimates (and 2100 for empirical size). Results showed size was maintained and power was improved for DWAEMH versus EMH and logistic regression Wald tests in the presence of a dose effect and treatment by dose interaction. PMID:21709814
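
    A bare-bones version of the randomization-based extended Mantel-Haenszel trend statistic that both EMH and DWAEMH build on is sketched below. The group scores are where dose information enters; the scores and counts used here are illustrative assumptions, not the trial's data or the authors' exact weighting scheme.

```python
import numpy as np
from scipy.stats import chi2

def emh_trend_test(strata, scores):
    """Extended Mantel-Haenszel (Mantel) trend test for a binary outcome.

    strata : list of (events, totals) pairs, each an array over the ordered
             treatment groups within one stratum.
    scores : numeric group scores (e.g. planned or dose-weighted frequencies).
    Returns the 1-df chi-square statistic and its p-value.
    """
    x = np.asarray(scores, dtype=float)
    t_sum = e_sum = v_sum = 0.0
    for events, totals in strata:
        y, n = np.asarray(events, dtype=float), np.asarray(totals, dtype=float)
        N, Y = n.sum(), y.sum()
        t_sum += (x * y).sum()                  # observed score-weighted events
        e_sum += Y * (x * n).sum() / N          # expectation under no trend
        v_sum += (Y * (N - Y) / (N - 1)) * ((x**2 * n).sum() / N - ((x * n).sum() / N) ** 2)
    stat = (t_sum - e_sum) ** 2 / v_sum
    return stat, chi2.sf(stat, df=1)

# Hypothetical: two strata, three frequency groups scored 0, 1, 2
strata = [([5, 9, 14], [50, 50, 50]), ([4, 7, 11], [40, 40, 40])]
print(emh_trend_test(strata, scores=[0, 1, 2]))
```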

  6. Type I Error Inflation in DIF Identification with Mantel-Haenszel: An Explanation and a Solution

    Science.gov (United States)

    Magis, David; De Boeck, Paul

    2014-01-01

    It is known that sum score-based methods for the identification of differential item functioning (DIF), such as the Mantel-Haenszel (MH) approach, can be affected by Type I error inflation in the absence of any DIF effect. This may happen when the items differ in discrimination and when there is item impact. On the other hand, outlier DIF methods…

  7. Sensitivity of Mantel Haenszel Model and Rasch Model as Viewed From Sample Size

    OpenAIRE

    ALWI, IDRUS

    2011-01-01

    The aim of this research is to compare the sensitivity of the Mantel-Haenszel and Rasch Model approaches for detecting differential item functioning (DIF), viewed from the sample size. The two DIF methods were compared using simulated binary item response data sets of varying sample size; 200 and 400 examinees were used in the analyses, with DIF detection based on gender difference. These test conditions were replicated 4 tim...

  8. MANTEL-HAENSZEL TYPE ESTIMATORS FOR THE COUNTER-MATCHED SAMPLING DESIGN IN NESTED CASE-CONTROL STUDY

    OpenAIRE

    Fujii, Yoshinori; Zhang, Zhong-Zhan; 藤井, 良宜

    2001-01-01

    We are concerned with a counter-matched nested case-control study. Assuming the proportional hazards model, the Mantel-Haenszel estimators of hazard rates are presented in two situations. The proposed estimators can be calculated without estimating the nuisance parameter. Consistent estimators of the variance of the proposed hazard rate estimators are also developed. We compare these estimators to the maximum partial likelihood estimators in terms of asymptotic variance. The methods are illustrated...

  9. Power and sample size evaluation for the Cochran-Mantel-Haenszel mean score (Wilcoxon rank sum) test and the Cochran-Armitage test for trend.

    Science.gov (United States)

    Lachin, John M

    2011-11-10

    The power of a chi-square test, and thus the required sample size, are a function of the noncentrality parameter that can be obtained as the limiting expectation of the test statistic under an alternative hypothesis specification. Herein, we apply this principle to derive simple expressions for two tests that are commonly applied to discrete ordinal data. The Wilcoxon rank sum test for the equality of distributions in two groups is algebraically equivalent to the Mann-Whitney test. The Kruskal-Wallis test applies to multiple groups. These tests are equivalent to a Cochran-Mantel-Haenszel mean score test using rank scores for a set of C discrete categories. Although various authors have assessed the power function of the Wilcoxon and Mann-Whitney tests, herein it is shown that the power of these tests with discrete observations, that is, with tied ranks, is readily provided by the power function of the corresponding Cochran-Mantel-Haenszel mean scores test for two and R > 2 groups. These expressions yield results virtually identical to those derived previously for rank scores and also apply to other score functions. The Cochran-Armitage test for trend assesses whether there is a monotonically increasing or decreasing trend in the proportions with a positive outcome or response over the C-ordered categories of an ordinal independent variable, for example, dose. Herein, it is shown that the power of the test is a function of the slope of the response probabilities over the ordinal scores assigned to the groups, which yields simple expressions for the power of the test. Copyright © 2011 John Wiley & Sons, Ltd.
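
    The opening principle, that power follows from the noncentrality parameter of the chi-square statistic, can be written out directly. The sketch below is generic: the noncentrality value would come from the paper's expressions for the mean score or trend test, not from this snippet.

```python
from scipy.stats import chi2, ncx2

def chi2_power(noncentrality, df=1, alpha=0.05):
    """Power of a chi-square test given its noncentrality parameter."""
    crit = chi2.ppf(1 - alpha, df)           # rejection threshold under H0
    return ncx2.sf(crit, df, noncentrality)  # P(reject) under the alternative

# A noncentrality of about 7.85 gives roughly 80% power at alpha = 0.05 with 1 df
print(chi2_power(7.85))
```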

  10. Nonparametric analysis of blocked ordered categories data: some examples revisited

    Directory of Open Access Journals (Sweden)

    O. Thas

    2006-08-01

    Nonparametric analysis for general block designs can be given by using the Cochran-Mantel-Haenszel (CMH) statistics. We demonstrate this with four examples and note that several well-known nonparametric statistics are special cases of CMH statistics.

  11. Evaluation of the effectiveness of laser in situ keratomileusis and photorefractive keratectomy for myopia : A meta-analysis

    OpenAIRE

    Yang, Xin-Jun; Yan, Hong-Tao; Nakahori, Yutaka

    2003-01-01

    Objective: To evaluate the effectiveness of laser in situ keratomileusis (LASIK) and photorefractive keratectomy (PRK) for correcting myopia. Methods: Study selection, data extraction, and quality assessment were performed by two of the authors independently. Summary odds ratios and 95% confidence intervals were calculated by the DerSimonian & Laird random-effects model and the Mantel-Haenszel (fixed-effects) model. All calculations were based on an intention-to-treat and per protocol analysis. Result...

  12. Many multicenter trials had few events per center, requiring analysis via random-effects models or GEEs.

    Science.gov (United States)

    Kahan, Brennan C; Harhay, Michael O

    2015-12-01

    Adjustment for center in multicenter trials is recommended when there are between-center differences or when randomization has been stratified by center. However, common methods of analysis (such as fixed-effects, Mantel-Haenszel, or stratified Cox models) often require a large number of patients or events per center to perform well. We reviewed 206 multicenter randomized trials published in four general medical journals to assess the average number of patients and events per center and determine whether appropriate methods of analysis were used in trials with few patients or events per center. The median number of events per center/treatment arm combination for trials using a binary or survival outcome was 3 (interquartile range, 1-10). Sixteen percent of trials had less than 1 event per center/treatment combination, 50% fewer than 3, and 63% fewer than 5. Of the trials which adjusted for center using a method of analysis which requires a large number of events per center, 6% had less than 1 event per center-treatment combination, 25% fewer than 3, and 50% fewer than 5. Methods of analysis that allow for few events per center, such as random-effects models or generalized estimating equations (GEEs), were rarely used. Many multicenter trials contain few events per center. Adjustment for center using random-effects models or GEE with model-based (non-robust) standard errors may be beneficial in these scenarios. Copyright © 2015 Elsevier Inc. All rights reserved.
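
    As an illustration of the analyses the review recommends, a GEE that adjusts for center can be fit with standard software. The sketch below uses statsmodels' GEE with an exchangeable working correlation on a tiny made-up data frame; the column names are assumptions, and the 'naive' covariance option (statsmodels' name for model-based standard errors) should be checked against the installed version.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Toy data: one row per patient with a binary outcome, treatment arm, and center
df = pd.DataFrame({
    "outcome": [0, 1, 1, 0, 1, 0, 1, 1],
    "treat":   [0, 1, 0, 1, 0, 1, 0, 1],
    "center":  [1, 1, 1, 1, 2, 2, 2, 2],
})

model = smf.gee(
    "outcome ~ treat",
    groups="center",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
# With few events per center, the review suggests model-based (non-robust)
# standard errors; 'robust' sandwich errors are the statsmodels default.
result = model.fit(cov_type="naive")
print(result.summary())
```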

  13. Efficacy of aprepitant for prevention of postoperative nausea and vomiting. Systematic review and meta-analysis of randomized clinical trials

    Directory of Open Access Journals (Sweden)

    Berrío Valencia, Marta Inés

    2014-10-01

    Objective: To evaluate the efficacy of aprepitant compared with other antiemetics for the prevention of postoperative nausea and vomiting in adults who underwent general anesthesia. Methods: Systematic review of randomized clinical trials with meta-analysis that evaluated the efficacy of aprepitant in comparison with other antiemetics for the prevention of postoperative nausea and vomiting, antiemetic rescue and adverse effects. The search was done in The Cochrane Library, EBSCO, EMBASE, LILACS, OVID, PubMed, SciELO, ScienceDirect, Scopus and Google Scholar. Heterogeneity was assessed with the Cochran Q and I² statistics; fixed and random effects models were used, and the Mantel-Haenszel method was used to estimate the relative risk for each outcome with its respective 95% confidence interval. Results: There was a significant difference in favor of aprepitant for the prevention of vomiting at 24 hours (RR 0.52; 95% CI: 0.38-0.70) and at 48 hours (RR 0.51; 95% CI: 0.39-0.67), but not for nausea at 24 hours (RR 1.16; 95% CI: 0.85-1.60). Conclusions: Aprepitant prevents postoperative vomiting, but not nausea, at 24 and 48 hours.
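
    The Mantel-Haenszel pooling of relative risks named in the methods follows a standard weighted formula. The sketch below is a generic fixed-effect MH pooling of 2x2 trial counts with made-up numbers, and omits the confidence interval and heterogeneity steps the authors also report.

```python
def mh_pooled_rr(studies):
    """Fixed-effect Mantel-Haenszel pooled relative risk.

    studies: list of (events_trt, n_trt, events_ctl, n_ctl) tuples, one per trial.
    """
    num = den = 0.0
    for a, n1, c, n0 in studies:
        n = n1 + n0
        num += a * n0 / n   # trial term a_i * n0_i / N_i
        den += c * n1 / n   # trial term c_i * n1_i / N_i
    return num / den

# Hypothetical trials: (events on aprepitant, N) vs (events on comparator, N)
print(mh_pooled_rr([(12, 100, 25, 100), (8, 80, 15, 85)]))
```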

  14. Beta-binomial model for meta-analysis of odds ratios.

    Science.gov (United States)

    Bakbergenuly, Ilyas; Kulinskaya, Elena

    2017-05-20

    In meta-analysis of odds ratios (ORs), heterogeneity between the studies is usually modelled via the additive random effects model (REM). An alternative, multiplicative REM for ORs uses overdispersion. The multiplicative factor in this overdispersion model (ODM) can be interpreted as an intra-class correlation (ICC) parameter. This model naturally arises when the probabilities of an event in one or both arms of a comparative study are themselves beta-distributed, resulting in beta-binomial distributions. We propose two new estimators of the ICC for meta-analysis in this setting. One is based on the inverted Breslow-Day test, and the other on the improved gamma approximation by Kulinskaya and Dollinger (2015, p. 26) to the distribution of Cochran's Q. The performance of these and several other estimators of ICC on bias and coverage is studied by simulation. Additionally, the Mantel-Haenszel approach to estimation of ORs is extended to the beta-binomial model, and we study performance of various ICC estimators when used in the Mantel-Haenszel or the inverse-variance method to combine ORs in meta-analysis. The results of the simulations show that the improved gamma-based estimator of ICC is superior for small sample sizes, and the Breslow-Day-based estimator is the best for n⩾100. The Mantel-Haenszel-based estimator of OR is very biased and is not recommended. The inverse-variance approach is also somewhat biased for ORs≠1, but this bias is not very large in practical settings. Developed methods and R programs, provided in the Web Appendix, make the beta-binomial model a feasible alternative to the standard REM for meta-analysis of ORs. © 2017 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.

  15. A systematic review and meta-analysis to compare the efficacy of acyclovir 3% ophthalmic ointment to idoxuridine in curing herpetic keratitis by Day 7 of treatment.

    Science.gov (United States)

    Balderson, Diane E; Cai, Gengqian; Fries, Michael A; Kleinman, David M; McLaughlin, Megan M; Trivedi, Trupti M; Wurzelmann, John I; Young, Sheila B

    2015-04-17

    The objective of the review and analysis is to demonstrate that acyclovir (ACV) 3% ophthalmic ointment is superior to idoxuridine (IDU) in treating herpetic keratitis (HK) presenting as dendritic and geographic ulcer sub-types. Publications in human subjects were identified by searching the Ovid MEDLINE database through April 2011, combining medical subject headings (MESH) "Keratitis, Herpetic/" AND "Acyclovir/", limiting by the key words "topical" OR "ointment" and also restricting to MESH "Administration, Topical/" OR "Ointments/". The results were cross-checked with the references used in the Cochrane Database Syst Rev. 1:1-134, 2009 and GlaxoSmithKline clinical documents related to acyclovir. Randomized, double-masked studies in subjects diagnosed with HK with head to head comparator arms of ACV ophthalmic ointment and topical IDU that had actual or calculable healing rates at Day seven were included. Data were independently extracted from identified articles by two authors of this manuscript. Data from seven randomized, controlled trials (RCT) evaluating 432 subjects met the inclusion criteria (214 were treated with ACV and 218 were treated with IDU) and had Day seven healing rates calculable. All sub-classified lesions were identified as either dendritic ulcers (n = 185) or geographic ulcers (n = 35). The Cochran-Mantel-Haenszel (CMH) method in Biometrics 10:417-51, 1954 and JNCI 22:719-48, 1959, controlling for study, was performed as the primary analysis using SAS v9. Homogeneity was assessed using the Breslow-Day-Tarone (BDT) test in IARC 1:1-32, 1980 and Biometrika 72:91-5, 1985. The analysis was performed with outliers removed to assess their impact. ACV showed statistically significantly greater odds of healing HK at Day seven in all subjects (Odds Ratio 3.95, 95% CI 2.60, 6.00, p p p = 0.0244). ACV 3% ophthalmic ointment is a valuable intervention for dendritic and geographic corneal ulcers. ACV and IDU were generally well tolerated in the studies reviewed.

  16. Effects of Exercise on Mild-to-Moderate Depressive Symptoms in the Postpartum Period: A Meta-analysis.

    Science.gov (United States)

    McCurdy, Ashley P; Boulé, Normand G; Sivak, Allison; Davenport, Margie H

    2017-06-01

    To examine the influence of exercise on depressive symptoms and the prevalence of depression in the postpartum period. A structured search of MEDLINE, EMBASE, CINAHL, Sport Discus, Ovid's All EBM Reviews, and ClinicalTrials.gov databases was performed with dates from the beginning of the databases until June 16, 2016. The search combined keywords and MeSH-like terms including, but not limited to, "exercise," "postpartum," "depression," and "randomized controlled trial." Randomized controlled trials comparing postpartum exercise (structured, planned, repetitive physical activity) with standard care, for which outcomes assessing depressive symptoms or depressive episodes (as defined by trial authors) were assessed, were included. Trials were identified as prevention trials (women from the general postpartum population) or treatment trials (women were classified as having depression by the trial authors). Effect sizes with 95% confidence intervals (CIs) were calculated using Hedges' g method, and standardized mean differences in postintervention depression outcomes were pooled using a random-effects model. Across all 16 trials (1,327 women), the pooled standardized mean difference was -0.34 (95% CI -0.50 to -0.19, I² = 37%), suggesting a small effect of exercise among all postpartum women on depressive symptoms. Among the 10 treatment trials, a moderate effect size of exercise on depressive symptoms was found (standardized mean difference -0.48, 95% CI -0.73 to -0.22, I² = 42%). In six prevention trials, a small effect (standardized mean difference -0.22, 95% CI -0.36 to -0.08, I² = 2%) was found. In women with depression preintervention, exercise increased the odds of resolving depression postintervention by 54% (odds ratio 0.46, Mantel-Haenszel method, 95% CI 0.25-0.84, I² = 0%). The trials included in this meta-analysis were small and some had methodologic limitations. Light-to-moderate intensity aerobic exercise improves mild-to-moderate depressive symptoms and increases the likelihood that

  17. Terbinafine in the treatment of dermatophyte toenail onychomycosis: a meta-analysis of efficacy for continuous and intermittent regimens.

    Science.gov (United States)

    Gupta, A K; Paquet, M; Simpson, F; Tavakkol, A

    2013-03-01

    To compare mycological and complete cures of terbinafine continuous and intermittent regimens in the treatment of toenail onychomycosis. The PubMed database was searched using the terms "terbinafine", "onychomycosis", "continuous" and "pulse(d)" or "intermittent". The inclusion criteria were head-to-head comparison of terbinafine pulse and continuous regimens for dermatophyte toenail infections. Risk ratios were calculated for intention-to-treat and evaluable patient analyses, when possible. Pooled estimates for total and subgroup analyses were calculated using a random-effects model (Mantel-Haenszel method), and their probabilities were calculated with z-statistics. Nine studies from eight publications were included. Two continuous regimens and four intermittent regimens were investigated. A pooled risk ratio of 0.87 was obtained for intention-to-treat (95% CI: 0.79-0.96, P = 0.004, n = 6) and evaluable patient (95% CI: 0.80-0.96, P = 0.003, n = 8) analyses of mycological cure, favouring continuous terbinafine. For complete cure, pooled risk ratios of 0.97 (95% CI: 0.77-1.23, P = 0.82, n = 7) for intention-to-treat and 0.93 (95% CI: 0.76-1.13, P = 0.44, n = 9) for evaluable patient analyses showed equality of the two regimens. The pulse regimen that demonstrated consistently comparable results to the continuous terbinafine regimen was two pulses of terbinafine 250 mg/day for 4 weeks on/4 weeks off. Meta-analysis of published studies of toenail onychomycosis showed that a continuous terbinafine regimen is generally significantly superior to a pulsed terbinafine regimen for mycological cure. In contrast, some pulse terbinafine regimens were as effective as continuous terbinafine regimens for complete cure. © 2012 The Authors. Journal of the European Academy of Dermatology and Venereology © 2012 European Academy of Dermatology and Venereology.

  18. Effects of Misbehaving Common Items on Aggregate Scores and an Application of the Mantel-Haenszel Statistic in Test Equating. CSE Report 688

    Science.gov (United States)

    Michaelides, Michalis P.

    2006-01-01

    Consistent behavior is a desirable characteristic that common items are expected to have when administered to different groups. Findings from the literature have established that items do not always behave in consistent ways; item indices and IRT item parameter estimates of the same items differ when obtained from different administrations.…

  19. Sensitivity analysis for missing dichotomous outcome data in multi-visit randomized clinical trial with randomization-based covariance adjustment.

    Science.gov (United States)

    Li, Siying; Koch, Gary G; Preisser, John S; Lam, Diana; Sanchez-Kam, Matilde

    2017-01-01

    Dichotomous endpoints in clinical trials have only two possible outcomes, either directly or via categorization of an ordinal or continuous observation. It is common to have missing data for one or more visits during a multi-visit study. This paper presents a closed form method for sensitivity analysis of a randomized multi-visit clinical trial that possibly has missing not at random (MNAR) dichotomous data. Counts of missing data are redistributed to the favorable and unfavorable outcomes mathematically to address possibly informative missing data. Adjusted proportion estimates and their closed form covariance matrix estimates are provided. Treatment comparisons over time are addressed with Mantel-Haenszel adjustment for a stratification factor and/or randomization-based adjustment for baseline covariables. The application of such sensitivity analyses is illustrated with an example. An appendix outlines an extension of the methodology to ordinal endpoints.
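
    The redistribution idea can be illustrated with a simple tipping-point style sweep. This is a generic illustration of reallocating missing counts between the favorable and unfavorable categories, not the paper's closed-form covariance machinery; the counts are hypothetical.

```python
import numpy as np

def adjusted_proportion(successes, missing, n_randomized, share_favorable):
    """Favorable-outcome proportion after assigning a fraction share_favorable
    of the missing outcomes to the favorable category (the rest count as
    unfavorable)."""
    return (successes + share_favorable * missing) / n_randomized

# Hypothetical arm: 45 observed successes among 90 completers, 10 missing, 100 randomized.
# Sweep from 'all missing unfavorable' to 'all missing favorable'.
for share in np.linspace(0.0, 1.0, 5):
    print(f"share favorable = {share:.2f} -> adjusted proportion = "
          f"{adjusted_proportion(45, 10, 100, share):.3f}")
```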

  20. Adjusting for multiple prognostic factors in the analysis of randomised trials

    Science.gov (United States)

    2013-01-01

    Background When multiple prognostic factors are adjusted for in the analysis of a randomised trial, it is unclear (1) whether it is necessary to account for each of the strata, formed by all combinations of the prognostic factors (stratified analysis), when randomisation has been balanced within each stratum (stratified randomisation), or whether adjusting for the main effects alone will suffice, and (2) the best method of adjustment in terms of type I error rate and power, irrespective of the randomisation method. Methods We used simulation to (1) determine if a stratified analysis is necessary after stratified randomisation, and (2) to compare different methods of adjustment in terms of power and type I error rate. We considered the following methods of analysis: adjusting for covariates in a regression model, adjusting for each stratum using either fixed or random effects, and Mantel-Haenszel or a stratified Cox model depending on outcome. Results Stratified analysis is required after stratified randomisation to maintain correct type I error rates when (a) there are strong interactions between prognostic factors, and (b) there are approximately equal numbers of patients in each stratum. However, simulations based on real trial data found that type I error rates were unaffected by the method of analysis (stratified vs unstratified), indicating these conditions were not met in real datasets. Comparison of different analysis methods found that with small sample sizes and a binary or time-to-event outcome, most analysis methods lead to either inflated type I error rates or a reduction in power; the lone exception was a stratified analysis using random effects for strata, which gave nominal type I error rates and adequate power. Conclusions It is unlikely that a stratified analysis is necessary after stratified randomisation except in extreme scenarios. Therefore, the method of analysis (accounting for the strata, or adjusting only for the covariates) will not

  1. [Contrast-induced nephropathy in patients at risk of renal failure undergoing computed tomography: systematic review and meta-analysis of randomized controlled trials].

    Science.gov (United States)

    Arana, Estanislao; Catalá-López, Ferrán

    2010-09-11

    We evaluated and quantified by meta-analysis techniques the incidence of contrast-induced nephropathy (CIN) in patients at risk undergoing computed tomography (CT). We conducted a systematic review of randomized controlled clinical trials designed to evaluate the nephrotoxicity related to iso-osmolar contrast media (IOCM) compared to low-osmolar contrast media (LOCM). Main electronic databases searched included PubMed/MEDLINE, EMBASE, ISI Web of Knowledge and Virtual Health Library (BVS-BIREME), as well as abstracts presented at related scientific societies meetings. Prior to data extraction, definitions of nephrotoxicity and risk population were established. Besides meta-analysis, the global agreement between CIN definitions was evaluated with the Mantel-Haenszel stratified test. Five studies were included with 716 randomized patients. When CIN was defined as an increase in serum creatinine (SCr) ≥ 25%, the relative risk (RR) was 0.71 (95% CI: 0.40-1.26), in favor of IOCM, and when it was defined as an SCr increase ≥ 0.5 mg/dL it showed a RR of 1.48 (95% CI: 0.37-5.87), favoring LOCM, in the four studies that used this criterion. The Mantel-Haenszel stratified test gave chi-square = 2.51 (p = 0.8). In patients with renal failure undergoing CT there is a similar risk of CIN with the administration of any of the contrast media studied. CIN incidence depends on the chosen criteria and is lower with the definition of an SCr increase ≥ 0.5 mg/dL at 24-72 h. No agreement was found between the CIN definitions adopted. Copyright © 2009 Elsevier España, S.L. All rights reserved.

  2. Generalized Linear Covariance Analysis

    Science.gov (United States)

    Carpenter, James R.; Markley, F. Landis

    2014-01-01

    This talk presents a comprehensive approach to filter modeling for generalized covariance analysis of both batch least-squares and sequential estimators. We review and extend in two directions the results of prior work that allowed for partitioning of the state space into "solve-for" and "consider" parameters, accounted for differences between the formal values and the true values of the measurement noise, process noise, and a priori solve-for and consider covariances, and explicitly partitioned the errors into subspaces containing only the influence of the measurement noise, process noise, and solve-for and consider covariances. In this work, we explicitly add sensitivity analysis to this prior work, and relax an implicit assumption that the batch estimator's epoch time occurs prior to the definitive span. We also apply the method to an integrated orbit and attitude problem, in which gyro and accelerometer errors, though not estimated, influence the orbit determination performance. We illustrate our results using two graphical presentations, which we call the "variance sandpile" and the "sensitivity mosaic," and we compare the linear covariance results to confidence intervals associated with ensemble statistics from a Monte Carlo analysis.

  3. Effect of gargling with tea and ingredients of tea on the prevention of influenza infection: a meta-analysis

    Directory of Open Access Journals (Sweden)

    Kazuki Ide

    2016-05-01

    Background: Influenza viruses can spread easily from person to person, and annual influenza epidemics are serious public health issues worldwide. Non-pharmaceutical public health interventions could potentially be effective for combatting influenza epidemics, but combined interventions and/or interventions with greater effectiveness are needed. Experimental studies have reported that tea and its ingredients (especially catechins) have antiviral activities. Although several clinical studies have investigated the use of tea or its ingredients to prevent influenza infections, the effect of gargling these substances has remained uncertain. Methods: We conducted a meta-analysis of randomized controlled studies and prospective cohort studies to assess the effect of gargling with tea and its ingredients on the prevention of influenza infection. The published literature was searched using the Cochrane Library, PubMed/MEDLINE (1966 to September 2015), Web of Science (1981 to September 2015), and Ichu-shi Web (1983 to September 2015). The extracted studies were read by two reviewers independently, and their overall scientific quality was evaluated. Studies meeting our inclusion criteria were pooled using the Mantel-Haenszel method in a fixed effects model and were also analyzed in a random effects model. The qualities of the model fits were assessed using the Akaike information criterion (AIC) and Bayesian information criterion (BIC). Results: The literature search and review identified 5 studies that met the inclusion criteria for the meta-analysis (total number of participants, 1890; mean age range, 16–83 years). The participants who gargled with tea or its ingredients showed a lower risk of influenza infection than did participants who gargled with placebo/water or who did not gargle (fixed effects model, Mantel-Haenszel method: relative risk [RR] = 0.70, 95 % confidence interval [CI] = 0.54–0.89; random effects model: RR = 0.71, 95

  4. Evaluation of the effectiveness of laser in situ keratomileusis and photorefractive keratectomy for myopia: a meta-analysis.

    Science.gov (United States)

    Yang, Xin-Jun; Yan, Hong-Tao; Nakahori, Yutaka

    2003-08-01

    To evaluate the effectiveness of laser in situ keratomileusis (LASIK) and photorefractive keratectomy (PRK) for correcting myopia. Study selection, data extraction, and quality assessment were performed by two of the authors independently. Summary odds ratios and 95% confidence intervals were calculated by the DerSimonian & Laird random-effects model and the Mantel-Haenszel (fixed-effects) model. All calculations were based on an intention-to-treat and per protocol analysis. Five hundred and eighty eyes (476 patients) from 5 randomized controlled trials were included in this study. At ≥6 months of follow-up, by random-effects model, the pooled odds ratios (OR, for LASIK vs. PRK) of postoperative uncorrected visual acuity (UCVA) of 20/20 or better for all trials were 1.31 (95% CI=0.77-2.22) by per protocol analysis and 1.18 (95% CI=0.74-1.88) by intention-to-treat analysis. In the refractive outcome, the pooled OR of the postoperative spherical equivalent refraction within ±0.5 diopter (D) of emmetropia did not show any statistical significance, for which the OR were 0.75 (95% CI=0.48-1.18) by per protocol analysis and 0.70 (95% CI=0.47-1.04) by intention-to-treat analysis. LASIK and PRK were found to be similarly effective for the correction of myopia from -1.5 to -15.0 D at greater than 6 months of follow-up.

  5. Patellar denervation with electrocautery in total knee arthroplasty without patellar resurfacing: a meta-analysis.

    Science.gov (United States)

    Cheng, Tao; Zhu, Chen; Guo, Yongyuan; Shi, Sifeng; Chen, Desheng; Zhang, Xianlong

    2014-11-01

    The impact of patellar denervation with electrocautery in total knee arthroplasty (TKA) on post-operative outcomes has been under debate. This study aims to conduct a meta-analysis and systematic review to compare the benefits and risks of circumpatellar electrocautery with those of non-electrocautery in primary TKAs. Comparative and randomized clinical studies were identified by conducting an electronic search of articles dated up to September 2012 in PubMed, EMBASE, Scopus, and the Cochrane databases. Six studies that focus on a total of 849 knees were analysed. A random-effects model was conducted using the inverse-variance method for continuous variables and the Mantel-Haenszel method for dichotomous variables. There was no significant difference in the incidence of anterior knee pain between the electrocautery and non-electrocautery groups. In terms of patellar score and Knee Society Score, circumpatellar electrocautery improved clinical outcomes compared with non-electrocautery in TKAs. The statistical differences were in favour of the electrocautery group but have minimal clinical significance. In addition, the overall complications indicate no statistical significance between the two groups. This study shows no strong evidence either for or against electrocautery compared with non-electrocautery in TKAs. Therapeutic study (systematic review and meta-analysis), Level III.

  6. Transversus abdominis plane (TAP) block in laparoscopic colorectal surgery improves postoperative pain management: a meta-analysis.

    Science.gov (United States)

    Hain, E; Maggiori, L; Prost À la Denise, J; Panis, Y

    2018-04-01

    Transversus abdominis plane (TAP) block is a locoregional anaesthesia technique of growing interest in abdominal surgery. However, its efficacy following laparoscopic colorectal surgery is still debated. This meta-analysis aimed to assess the efficacy of TAP block after laparoscopic colorectal surgery. All comparative studies focusing on TAP block after laparoscopic colorectal surgery were systematically identified through the MEDLINE database, reviewed and included. Meta-analysis was performed according to the Mantel-Haenszel method for random effects. End-points included postoperative opioid consumption, morbidity, time to first bowel movement and length of hospital stay. A total of 13 studies, including 7 randomized controlled trials, were included, comprising a total of 600 patients who underwent laparoscopic colorectal surgery with TAP block, compared with 762 patients without TAP block. Meta-analysis of these studies showed that TAP block was associated with a significantly reduced postoperative opioid consumption on the first day after surgery [weighted mean difference (WMD) -14.54 (-25.14; -3.94); P = 0.007] and a significantly shorter time to first bowel movement [WMD -0.53 (-0.61; -0.44)]. Transversus abdominis plane (TAP) block in laparoscopic colorectal surgery thus improves postoperative opioid consumption and recovery of postoperative digestive function without any significant drawback. Colorectal Disease © 2018 The Association of Coloproctology of Great Britain and Ireland.

  7. Generalized seismic analysis

    Science.gov (United States)

    Butler, Thomas G.

    1993-09-01

    There is a constant need to be able to solve for enforced motion of structures. Spacecraft need to be qualified for acceleration inputs. Truck cargoes need to be safeguarded from road mishaps. Office buildings need to withstand earthquake shocks. Marine machinery needs to be able to withstand hull shocks. All of these kinds of enforced motions are being grouped together under the heading of seismic inputs. Attempts have been made to cope with this problem over the years and they usually have ended up with some limiting or compromise conditions. The crudest approach was to limit the problem to acceleration occurring only at a base of a structure, constrained to be rigid. The analyst would assign arbitrarily outsized masses to base points. He would then calculate the magnitude of force to apply to the base mass (or masses) in order to produce the specified acceleration. He would of necessity have to sacrifice the determination of stresses in the vicinity of the base, because of the artificial nature of the input forces. The author followed the lead of John M. Biggs by using relative coordinates for a rigid base in a 1975 paper, and again in a 1981 paper. This method of relative coordinates was extended and made operational as DMAP ALTER packets to rigid formats 9, 10, 11, and 12 under contract N60921-82-C-0128. This method was presented at the twelfth NASTRAN Colloquium. Another analyst in the field developed a method that computed the forces from enforced motion then applied them as a forcing to the remaining unknowns after the knowns were partitioned off. The method was translated into DMAP ALTERs but was never made operational. All of this activity jelled into the current effort. Much thought was invested in working out ways to unshackle the analysis of enforced motions from the limitations that persisted.

  8. Maternal SSRI exposure increases the risk of autistic offspring: A meta-analysis and systematic review.

    Science.gov (United States)

    Andalib, S; Emamhadi, M R; Yousefzadeh-Chabok, S; Shakouri, S K; Høilund-Carlsen, P F; Vafaee, M S; Michel, T M

    2017-09-01

    Selective serotonin reuptake inhibitors (SSRIs) are the antidepressants most commonly used to treat maternal depression during pregnancy. There is a growing body of literature assessing the association of prenatal exposure to SSRIs with autism spectrum disorder (ASD). The present systematic review and meta-analysis reviewed the medical literature and pooled the results of the association of prenatal exposure to SSRIs with ASD. Published investigations in English by June 2016 with keywords of selective serotonin reuptake inhibitors, SSRI, autism spectrum disorder, ASD, pregnancy, childhood, children, neurodevelopment were identified using the databases PubMed and PMC, MEDLINE, EMBASE, SCOPUS, and Google Scholar. Cochran's Q statistic (Q), degrees of freedom (df), and the I² index (variation in odds ratio [OR] attributable to heterogeneity) were calculated to assess the heterogeneity of the within- and between-study variability. The pooled odds ratio (OR) and 95% confidence interval (CI) were reported by a Mantel-Haenszel test. There was non-significant heterogeneity among the included studies ([Q = 3.61, df = 6, P = 0.730], I² = 0%). The pooled results showed a significant association between prenatal SSRI exposure and ASD (OR = 1.82, 95% CI = 1.59-2.10, Z = 8.49, P < 0.001). The evidence from the present study suggests that prenatal exposure to SSRIs is associated with a higher risk of ASD. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  9. Prognostic value of HER-2/neu expression in epithelial ovarian cancer: a systematic review and meta-analysis.

    Science.gov (United States)

    Wang, Kai; Guan, Chenan; Yu, Junhui; Jin, Xiaoxiao; Sun, Ling; Zheng, Lingzhi; Xia, Liang; Zhang, Yuquan

    2017-09-26

    This study aimed to conduct a meta-analysis to investigate the association between human epidermal growth factor receptor 2 (HER-2/neu) expression and survival in patients with epithelial ovarian cancer (EOC). HER-2/neu is one of the most frequently studied molecular biological parameters in EOC, but its prognostic impact has not been fully assessed. PubMed and Embase were searched for studies that reported HER-2/neu expression and survival in patients with EOC. The primary outcome was overall survival (OS), and the secondary outcome was progression-free survival (PFS). Hazard ratios (HRs) with 95% confidence intervals (CIs) were determined using a Mantel-Haenszel random-effects model. Publication bias was investigated using funnel plots and Egger's test. A total of 56 studies (N=7212) were included in the analysis. The results showed that patients with HER-2/neu expression had a significant disadvantage in OS (HR = 1.41; 95% CI, 1.31 to 1.51). The present study findings provided further indication that HER-2/neu expression in patients with EOC has an adverse impact on OS and PFS.

  10. Sodium-glucose cotransporter 2 (SGLT2) inhibitors and fracture risk in patients with type 2 diabetes mellitus: A meta-analysis.

    Science.gov (United States)

    Ruanpeng, Darin; Ungprasert, Patompong; Sangtian, Jutarat; Harindhanavudhi, Tasma

    2017-09-01

    Sodium-glucose cotransporter 2 (SGLT2) inhibitors could potentially alter calcium and phosphate homeostasis and may increase the risk of bone fracture. The current meta-analysis was conducted to investigate the fracture risk among patients with type 2 diabetes mellitus treated with SGLT2 inhibitors. Randomized controlled trials that compared the efficacy of SGLT2 inhibitors to placebo were identified. The risk ratios of fracture among patients who received SGLT2 inhibitors versus placebo were extracted from each study. Pooled risk ratios and 95% confidence intervals were calculated using a random-effect, Mantel-Haenszel analysis. A total of 20 studies with 8286 patients treated with SGLT2 inhibitors were included. The pooled risk ratio of bone fracture in patients receiving SGLT2 inhibitors versus placebo was 0.67 (95% confidence interval, 0.42-1.07). The pooled risk ratio for canagliflozin, dapagliflozin, and empagliflozin was 0.66 (95% confidence interval, 0.37-1.19), 0.84 (95% confidence interval, 0.22-3.18), and 0.57 (95% confidence interval, 0.20-1.59), respectively. Increased risk of bone fracture among patients with type 2 diabetes mellitus treated with SGLT2 inhibitors compared with placebo was not observed in this meta-analysis. However, the results were limited by short duration of treatment/follow-up and low incidence of the event of interest. Copyright © 2017 John Wiley & Sons, Ltd.

  11. Risk factors for postoperative delirium in patients undergoing major head and neck cancer surgery: a meta-analysis.

    Science.gov (United States)

    Zhu, Yun; Wang, Gangpu; Liu, Shengwen; Zhou, Shanghui; Lian, Ying; Zhang, Chenping; Yang, Wenjun

    2017-06-01

    Postoperative delirium is common after extensive surgery. This study aimed to collate and synthesize published literature on risk factors for delirium in patients with head and neck cancer surgery. Three databases were searched (MEDLINE, Embase, and Cochrane Library) between January 1987 and July 2016. The Newcastle-Ottawa Scale (NOS) was adopted to evaluate study quality. Pooled odds ratios or mean differences for individual risk factors were estimated using the Mantel-Haenszel and inverse-variance methods. The included studies provided a total of 1940 patients (286 with delirium and 1654 without), and predominantly included patients undergoing head and neck cancer surgery. The incidence of postoperative delirium ranged from 11.50% to 36.11%. Ten statistically significant risk factors were identified in the pooled analysis: old age, age >70 years, male sex, duration of surgery, history of hypertension, blood transfusions, tracheotomy, American Society of Anesthesiologists physical status grade at least III, flap reconstruction and neck dissection were associated with a greater likelihood of delirium after head and neck cancer surgery. Delirium is common in patients undergoing major head and neck cancer surgery. Several risk factors were consistently associated with postoperative delirium. These factors help to highlight patients at risk of developing delirium and are suitable for preventive action. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  12. Transform analysis of generalized functions

    CERN Document Server

    Misra, O P

    1986-01-01

    Transform Analysis of Generalized Functions concentrates on finite parts of integrals, generalized functions and distributions. It gives a unified treatment of the distributional setting with transform analysis, i.e. Fourier, Laplace, Stieltjes, Mellin, Hankel and Bessel series. Included are accounts of applications of the theory of integral transforms in a distributional setting to the solution of problems arising in mathematical physics. Information on distributional solutions of differential, partial differential equations and integral equations is conveniently collected here. The volume will

  13. Efficacy and safety profile of antibiotic prophylaxis usage in clean and clean-contaminated plastic and reconstructive surgery: a meta-analysis of randomized controlled trials.

    Science.gov (United States)

    Zhang, Yi; Dong, Jiasheng; Qiao, Yufei; He, Jinguang; Wang, Tao; Ma, Sunxiang

    2014-01-01

    There is no consensus with regard to antibiotic prophylaxis usage in clean and clean-contaminated plastic and reconstructive surgery. This meta-analysis sought to assess the efficacy and safety of antibiotic prophylaxis and to determine the appropriate duration of prophylaxis. An English language literature search was conducted using PubMed and the Cochrane Collaboration for randomized controlled trials (RCTs) that evaluated the use of antibiotic prophylaxis to prevent postoperative surgical site infection (SSI) in patients undergoing clean and clean-contaminated plastic and reconstructive surgery. Data from intention-to-treat analyses were used where available. For the dichotomous data, results for each study were expressed as odds ratios (ORs) with 95% confidence intervals (CIs) and combined for meta-analysis using the Mantel-Haenszel method or the DerSimonian and Laird method. Study quality was critically appraised by 2 reviewers using established criteria. STATA version 12 was used for meta-analyses. Twelve RCTs involving 2395 patients were included, of which 8 trials were considered to be of high methodological quality. The effect of antibiotic prophylaxis in plastic and reconstructive surgery was found favorable over placebo in SSI prevention (13 studies; 2449 participants; OR, 0.53; 95% CI, 0.4-0.7), particularly in clean plastic surgeries with high-risk factors and clean-contaminated plastic surgeries. Besides, a short-course administration regimen seemed to be of adequate efficacy and safety. High-quality prospective trials on a larger scale are needed to further confirm these findings.

  14. Does prolonged β-lactam infusions improve clinical outcomes compared to intermittent infusions? A meta-analysis and systematic review of randomized, controlled trials

    Directory of Open Access Journals (Sweden)

    Van Arendonk Kyle J

    2011-06-01

    Background: The emergence of multi-drug resistant Gram-negatives (MDRGNs) coupled with an alarming scarcity of new antibiotics has forced the optimization of the therapeutic potential of available antibiotics. To exploit the time above the minimum inhibitory concentration mechanism of β-lactams, prolonging their infusion may improve outcomes. The primary objective of this meta-analysis was to determine if prolonged β-lactam infusion resulted in decreased mortality and improved clinical cure compared to intermittent β-lactam infusion. Methods: Relevant studies were identified from searches of MEDLINE, EMBASE, and CENTRAL. Heterogeneity was assessed qualitatively, in addition to I² and Chi-square statistics. Pooled relative risks (RR) and 95% confidence intervals (CI) were calculated using Mantel-Haenszel random-effects models. Results: Fourteen randomized controlled trials (RCTs) were included. Prolonged infusion β-lactams were not associated with decreased mortality (n = 982; RR 0.92; 95% CI: 0.61-1.37) or clinical cure (n = 1380; RR 1.00; 95% CI: 0.94-1.06) compared to intermittent infusions. Subgroup analysis for β-lactam subclasses and equivalent total daily β-lactam doses yielded similar results. Most studies had notable methodological flaws. Conclusions: No clinical advantage was observed for prolonged infusion β-lactams. The limited number of studies with MDRGNs precluded evaluation of prolonged infusion of β-lactams for this subgroup. A large, multicenter RCT with critically ill patients infected with MDRGNs is needed.

  15. Is Ki-67 of Diagnostic Value in Distinguishing Between Partial and Complete Hydatidiform Moles? A Systematic Review and Meta-analysis.

    Science.gov (United States)

    Zhao, Yue; Xiong, Guang-Wu; Zhang, Xiao-Wei; Hang, B O

    2018-02-01

    To demonstrate the value of Ki-67 in distinguishing between partial and complete hydatidiform moles. We searched electronic databases including Medline, WOK, Cochrane Library and CNKI, through January 24, 2015. Experts were consulted, and references from related articles were examined. The meta-analysis was conducted with RevMan5.3, according to the PRISMA guidelines. Mantel-Haenszel estimates were calculated and pooled under a random effect model, with data expressed as odds ratio (OR) and 95% confidence interval (CI). We analyzed eight trials with a total of 337 participants who underwent uterine curettage and met the inclusion criteria. A significantly higher expression of Ki-67 was observed in complete than in partial hydatidiform moles (OR=3.28; 95% CI=1.80-5.96), suggesting that Ki-67 has value in distinguishing between partial and complete hydatidiform moles. However, the present study had only a limited number of samples, so investigation of a greater number of cases is needed to confirm this conclusion. Copyright© 2018, International Institute of Anticancer Research (Dr. George J. Delinasios), All rights reserved.

  16. Effect of socioeconomic level on knowledge of stroke in the general population: A social inequality gradient.

    Science.gov (United States)

    Ramírez-Moreno, J M; Alonso-González, R; Peral Pacheco, D; Millán-Nuñez, M V; Roa-Montero, A; Constantino-Silva, A B; Aguirre-Sánchez, J J

    2016-01-01

    Socioeconomic status is a factor that influences health-related behaviour in individuals as well as health conditions in entire populations. The objective of the present study was to analyse the sociodemographic factors that may influence knowledge of stroke. Cross-sectional study. A representative sample was selected by double randomisation. Face-to-face interviews were carried out by previously trained medical students using a structured questionnaire with open- and closed-ended questions. Adequate knowledge was previously defined. The Mantel-Haenszel test and adjusted logistic regression analysis were used to assess the association between knowledge of stroke and the study variables. 2411 subjects were interviewed (59.9% women; mean age 49.0 [SD 17.3] years). Seventy-three per cent were residents of urban areas, 24.7% had a university education, and 15.2% had a low level of schooling. Only 2.1% reported earning more than 40 000 euros/year, with 29.9% earning less than 10 000. Nearly 74% reported having an excellent or good state of health. The unemployment rate was 17.0%. Prevalence of "adequate knowledge" was 39.7% (95% CI: 37.7%-41.6%). Trend analysis showed an association between knowledge of stroke and income (z=10.14, P<0.0001); educational level (z=15.95, P<0.0001); state of health (z=7.92, P<0.0001); and employment status (z=8.98, P<0.0001). Educational level, income, employment status, and state of health are independent factors for adequate knowledge of stroke. Public awareness campaigns should present material using simple language and efforts should be directed toward the most disadvantaged social strata in particular. Copyright © 2014 Sociedad Española de Neurología. Published by Elsevier España, S.L.U. All rights reserved.

  17. Risk of endometrial, ovarian and breast cancer in women with polycystic ovary syndrome: a systematic review and meta-analysis.

    Science.gov (United States)

    Barry, John A; Azizia, Mallika M; Hardiman, Paul J

    2014-01-01

    Polycystic ovary syndrome (PCOS) is a common condition affecting ∼8% of women. The objective of the present study was to quantify separately the risk of endometrial cancer, ovarian cancer and breast cancer in women with PCOS compared with non-PCOS controls, and quantify separately the risk to women of all ages as well as the risk to premenopausal women. We conducted a systematic review and meta-analysis of observational studies. Studies were eligible for inclusion if they compared women with PCOS to non-PCOS groups for fatal or non-fatal gynaecological cancers. Studies listed in MEDLINE and EMBASE published up to 7 October 2013 in any language were identified, and relevant papers were also searched by hand. Relevant data (for example, study design, source of control data, diagnostic criteria) were extracted and tabulated. From 698 references, 11 studies (5 of endometrial cancer and 3 each of ovarian and breast cancer) met the inclusion criteria for the meta-analysis (919 women with PCOS and 72054 non-PCOS controls). Using the Mantel-Haenszel method, with fixed or random effects model as appropriate, women with PCOS were at a significantly increased risk of endometrial cancer (odds ratio (OR), 2.79; 95% confidence interval (CI), 1.31-5.95), whereas the risk of ovarian cancer (OR, 1.41; 95% CI, 0.93-2.15) and breast cancer (OR, 0.78; 95% CI, 0.46-1.32) was not significantly increased overall; the odds of endometrial cancer (OR, 4.05; 95% CI, 2.42-6.76) and ovarian cancer (OR, 2.52; 95% CI, 1.08-5.89) were increased in women with PCOS younger than 54 years of age compared with controls of similar age. Current data suggest that women of all ages with PCOS are at an increased risk of endometrial cancer but the risk of ovarian and breast cancer was not significantly increased overall. These results highlight the potential risk of gynaecological cancer morbidities associated with PCOS. However, the available evidence is far from robust and variation in diagnostic criteria for PCOS, associated risk factors (particularly obesity), and selection bias

  18. Delayed initiation of antenatal care and associated factors in Ethiopia: a systematic review and meta-analysis.

    Science.gov (United States)

    Tesfaye, Gezahegn; Loxton, Deborah; Chojenta, Catherine; Semahegn, Agumasie; Smith, Roger

    2017-11-15

    Antenatal care uptake is among the key indicators for monitoring the progress of maternal outcomes. Early initiation of antenatal care facilitates the timely management and treatment of pregnancy complications to reduce maternal deaths. In Ethiopia, antenatal care utilization is generally low, and delayed initiation of care is very common. We aimed to systematically identify and synthesize available evidence on delayed initiation of antenatal care and the associated factors in Ethiopia. Studies published in English from 1 January 2002 to 30 April 2017 were systematically searched from PubMed, Medline, EMBASE, CINAHL and other relevant sources. Two authors independently reviewed the identified studies against the eligibility criteria. The included studies were critically appraised using the Joanna Briggs-MAStARI instrument for observational studies. Meta-analysis was conducted in RevMan v5.3 for Windows using a Mantel-Haenszel random effects model. The presence of statistical heterogeneity was checked using the Cochran Q test, and its level was quantified using the I² statistic. A pooled estimate of the proportion of the outcome variable was calculated. Pooled odds ratios with 95% CI were calculated to measure the effect sizes. The pooled magnitude of delayed antenatal care in Ethiopia was 64% (95% CI: 57%, 70%). Maternal age (OR = 0.70; 95% CI: 0.53, 0.93), place of residence (OR = 0.29, 95% CI: 0.16, 0.50), maternal education (OR = 0.49; 95% CI: 0.38, 0.63), husband's education (OR = 0.44; 95% CI: 0.23, 0.85), maternal occupation (OR = 0.75; 95% CI: 0.61, 0.93), monthly income (OR = 2.06; 95% CI: 1.23, 3.45), pregnancy intention (OR = 0.49; 95% CI: 0.40, 0.60), parity (OR = 0.46; 95% CI: 0.36, 0.58), knowledge of antenatal care (OR = 0.40; 95% CI: 0.32, 0.51), women's autonomy (OR = 0.38; 95% CI: 0.15, 0.94), partner involvement (OR = 0.24; 95% CI: 0.07, 0.75), pregnancy complications (OR = 0.23; 95% CI: 0.06, 0

  19. Functional Generalized Structured Component Analysis.

    Science.gov (United States)

    Suk, Hye Won; Hwang, Heungsun

    2016-12-01

    An extension of Generalized Structured Component Analysis (GSCA), called Functional GSCA, is proposed to analyze functional data that are considered to arise from an underlying smooth curve varying over time or other continua. GSCA has been geared for the analysis of multivariate data. Accordingly, it cannot deal with functional data that often involve different measurement occasions across participants and a large number of measurement occasions that exceed the number of participants. Functional GSCA addresses these issues by integrating GSCA with spline basis function expansions that represent infinite-dimensional curves onto a finite-dimensional space. For parameter estimation, functional GSCA minimizes a penalized least squares criterion by using an alternating penalized least squares estimation algorithm. The usefulness of functional GSCA is illustrated with gait data.

  20. Generalized canonical correlation analysis with missing values

    NARCIS (Netherlands)

    M. van de Velden (Michel); Y. Takane

    2012-01-01

    textabstractGeneralized canonical correlation analysis is a versatile technique that allows the joint analysis of several sets of data matrices. The generalized canonical correlation analysis solution can be obtained through an eigenequation and distributional assumptions are not required. When

  1. Using Loss Functions for DIF Detection: An Empirical Bayes Approach.

    Science.gov (United States)

    Zwick, Rebecca; Thayer, Dorothy; Lewis, Charles

    2000-01-01

    Studied a method for flagging differential item functioning (DIF) based on loss functions. Builds on earlier research that led to the development of an empirical Bayes enhancement to the Mantel-Haenszel DIF analysis. Tested the method through simulation and found its performance better than some commonly used DIF classification systems. (SLD)

  2. Is desvenlafaxine effective and safe in the treatment of menopausal ...

    African Journals Online (AJOL)

    Meta-analysis was conducted by including double-blind randomized controlled studies on the effectiveness and safety of desvenlafaxine in the treatment of hot flashes. The effectiveness, safety and tolerability of desvenlafaxine were determined by standardized mean differences (SMDs) and Mantel-Haenszel odds ratio.

  3. DIF Trees: Using Classification Trees to Detect Differential Item Functioning

    Science.gov (United States)

    Vaughn, Brandon K.; Wang, Qiu

    2010-01-01

    A nonparametric tree classification procedure is used to detect differential item functioning for items that are dichotomously scored. Classification trees are shown to be an alternative procedure to detect differential item functioning other than the use of traditional Mantel-Haenszel and logistic regression analysis. A nonparametric…

  4. Clinical Validation of the "Sedentary Lifestyle" Nursing Diagnosis in Secondary School Students

    Science.gov (United States)

    de Oliveira, Marcos Renato; da Silva, Viviane Martins; Guedes, Nirla Gomes; de Oliveira Lopes, Marcos Venícios

    2016-01-01

    This study clinically validated the nursing diagnosis of "sedentary lifestyle" (SL) among 564 Brazilian adolescents. Measures of diagnostic accuracy were calculated for defining characteristics, and Mantel-Haenszel analysis was used to identify related factors. The measures of diagnostic accuracy showed that the following defining…

  5. Prophylactic mesh to prevent parastomal hernia after end colostomy: a meta-analysis and trial sequential analysis.

    Science.gov (United States)

    López-Cano, M; Brandsma, H-T; Bury, K; Hansson, B; Kyle-Leinhase, I; Alamino, J G; Muysoms, F

    2017-04-01

    Prevention of parastomal hernia (PSH) formation is crucial, given the high prevalence and difficulties in the surgical repair of PSH. To investigate the effect of a preventive mesh in PSH formation after an end colostomy, we aimed to meta-analyze all relevant randomized controlled trials (RCTs). We searched five databases. For each trial, we extracted risk ratios (RRs) of the effects of mesh or no mesh. The primary outcome was incidence of PSH with a minimum follow-up of 12 months with a clinical and/or computed tomography diagnosis. RRs were combined using the random-effect model (Mantel-Haenszel). To control the risk of type I error, we performed a trial sequential analysis (TSA). Seven RCTs with low risk of bias (451 patients) were included. Meta-analysis for primary outcome showed a significant reduction of the incidence of PSH using a mesh (RR 0.43, 95% CI 0.26-0.71; P = 0.0009). Regarding TSA calculation for the primary outcome, the accrued information size (451) was 187.1% of the estimated required information size (RIS) (241). Wound infection showed no statistical differences between groups (RR 0.77, 95% CI 0.39-1.54; P = 0.46). PSH repair rate showed a significant reduction in the mesh group (RR 0.28 (95% CI 0.10-0.78; P = 0.01). PSH prevention with mesh when creating an end colostomy reduces the incidence of PSH, the risk for subsequent PSH repair and does not increase wound infections. TSA shows that the RIS is reached for the primary outcome. Additional RCTs in the previous context are not needed.
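
    As an illustration of the Mantel-Haenszel pooling used above, the following minimal Python sketch computes the classic fixed-effect Mantel-Haenszel pooled risk ratio from per-trial 2x2 counts (the review itself used the random-effects variant in RevMan). The trial counts are hypothetical placeholders, not data from the seven included RCTs.

        # Fixed-effect Mantel-Haenszel pooled risk ratio from per-trial 2x2 counts.
        # Each trial is (events_mesh, n_mesh, events_no_mesh, n_no_mesh); values are hypothetical.
        def mh_pooled_rr(trials):
            num = den = 0.0
            for a, n1, c, n0 in trials:
                n = n1 + n0
                num += a * n0 / n   # MH weight applied to the treatment-arm risk
                den += c * n1 / n   # MH weight applied to the control-arm risk
            return num / den

        example_trials = [(5, 30, 12, 32), (3, 40, 10, 38), (8, 55, 15, 50)]
        print(round(mh_pooled_rr(example_trials), 2))  # pooled RR for the hypothetical data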

  6. Dimensional Analysis and General Relativity

    Science.gov (United States)

    Lovatt, Ian

    2009-01-01

    Newton's law of gravitation is a central topic in the first-year physics curriculum. A lecturer can go beyond the physical details and use the history of gravitation to discuss the development of scientific ideas; unfortunately, the most recent chapter in this history, general relativity, is not covered in first-year courses. This paper discusses…

  7. Human papillomavirus infection and the malignant transformation of sinonasal inverted papilloma: A meta-analysis.

    Science.gov (United States)

    Zhao, Ren-Wu; Guo, Zhi-Qiang; Zhang, Ru-Xin

    2016-06-01

    A growing number of molecular epidemiological studies have been conducted to evaluate the association between human papillomavirus (HPV) infection and the malignancy of sinonasal inverted papilloma (SNIP). However, the results remain inconclusive. Here, a meta-analysis was conducted to quantitatively assess this association. Case-control studies investigating SNIP tissues for presence of HPV DNA were identified. The odds ratios (ORs) and 95% confidence intervals (CIs) were calculated by the Mantel-Haenszel method. An assessment of publication bias and sensitivity analysis were also performed. We calculated a pooled OR of 2.16 (95% CI=1.46-3.21, P<0.001) without statistically significant heterogeneity or publication bias. Stratification by HPV type showed a stronger association for patients with high-risk HPV (hrHPV) types, HPV-16, HPV-18, and HPV-16/18 infection (OR=8.8 [95% CI: 4.73-16.38], 8.04 [95% CI: 3.34-19.39], 18.57 [95% CI: 4.56-75.70], and 26.24 [4.35-158.47], respectively). When only using PCR studies, pooled ORs for patients with hrHPV, HPV-16, and HPV18 infection still reached statistical significance. However, Egger's test reflected significant publication bias in the HPV-16 sub-analysis (P=0.06), and the adjusted OR was no longer statistically significant (OR=1.65, 95%CI: 0.58-4.63). These results suggest that HPV infection, especially hrHPV (HPV-18), is significantly associated with malignant SNIP. Copyright © 2016 Elsevier B.V. All rights reserved.
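
    The publication-bias check mentioned above relies on Egger's regression asymmetry test. The sketch below is a minimal, generic version of that test applied to hypothetical study log odds ratios and standard errors; it is not the data from the HPV-16 sub-analysis.

        # Egger's regression asymmetry test on hypothetical log odds ratios and standard errors.
        import numpy as np
        from scipy import stats

        def eggers_test(log_or, se):
            # Regress the standardized effect (logOR/SE) on precision (1/SE);
            # an intercept far from zero suggests small-study / publication bias.
            y = np.asarray(log_or, float) / np.asarray(se, float)
            x = 1.0 / np.asarray(se, float)
            n = len(x)
            X = np.column_stack([np.ones(n), x])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            resid = y - X @ beta
            sigma2 = resid @ resid / (n - 2)
            cov = sigma2 * np.linalg.inv(X.T @ X)
            t_stat = beta[0] / np.sqrt(cov[0, 0])           # t test of the intercept
            p_value = 2 * stats.t.sf(abs(t_stat), df=n - 2)
            return beta[0], p_value

        intercept, p = eggers_test([0.9, 1.4, 2.1, 0.3, 2.8], [0.45, 0.60, 0.85, 0.30, 1.10])
        print(round(intercept, 2), round(p, 3))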

  8. Outcomes following polyetheretherketone (PEEK) cranioplasty: Systematic review and meta-analysis.

    Science.gov (United States)

    Punchak, Maria; Chung, Lawrance K; Lagman, Carlito; Bui, Timothy T; Lazareff, Jorge; Rezzadeh, Kameron; Jarrahy, Reza; Yang, Isaac

    2017-07-01

    Polyetheretherketone (PEEK) has been used in cranioplasty since the early 2000s. However, there remains limited data that compares its long-term complication rate to autologous grafts and titanium mesh implants. To compare complication and implant failure rates after PEEK, autologous and titanium mesh cranioplasties, the authors of this study conducted a systematic review using the PubMed database. Studies that contained outcome data on complication rates of PEEK cranioplasty patients and studies that compared outcomes of patients who underwent PEEK cranioplasties versus other materials were included in the meta-analysis. Pooled odds ratios using the Mantel-Haenszel method were used for analysis. Fifteen articles, comprised of 183 PEEK cranioplasty patients were included. Of these patients, 15.3% developed post-operative complications and 8.7% experienced implant failure requiring reoperation. Patients who underwent cranioplasties with PEEK implants had 0.130 times the odds of developing post-operative complications (P=0.065) and 0.574 times the odds of implant failure compared to patients with autologous bone graft cranioplasties (P=0.629). Patients who had undergone PEEK cranioplasties had 0.127 times the odds of developing post-op complications (P=0.360) and 0.170 times the odds of implant failure compared to individuals who had undergone titanium mesh cranioplasties (P=0.168). The analysis was severely limited by the paucity in literature. However, there was a trend toward lower post-operative complication rates following PEEK cranioplasty versus autologous grafts, and lower implant failure rates with PEEK versus titanium mesh implants. Copyright © 2017. Published by Elsevier Ltd.

  9. Multivariate Generalized Multiscale Entropy Analysis

    Directory of Open Access Journals (Sweden)

    Anne Humeau-Heurtier

    2016-11-01

    Full Text Available Multiscale entropy (MSE) was introduced in the 2000s to quantify systems’ complexity. MSE relies on (i) a coarse-graining procedure to derive a set of time series representing the system dynamics on different time scales; (ii) the computation of the sample entropy for each coarse-grained time series. A refined composite MSE (rcMSE), based on the same steps as MSE, also exists. Compared to MSE, rcMSE increases the accuracy of entropy estimation and reduces the probability of inducing undefined entropy for short time series. The multivariate versions of MSE (MMSE) and rcMSE (MrcMSE) have also been introduced. In the coarse-graining step used in MSE, rcMSE, MMSE, and MrcMSE, the mean value is used to derive representations of the original data at different resolutions. A generalization of MSE was recently published, using the computation of different moments in the coarse-graining procedure. However, so far, this generalization only exists for univariate signals. We therefore herein propose an extension of this generalized MSE to multivariate data. The multivariate generalized algorithms of MMSE and MrcMSE presented herein (MGMSE and MGrcMSE, respectively) are first analyzed through the processing of synthetic signals. We reveal that MGrcMSE shows better performance than MGMSE for short multivariate data. We then study the performance of MGrcMSE on two sets of short multivariate electroencephalograms (EEG) available in the public domain. We report that MGrcMSE may show better performance than MrcMSE in distinguishing different types of multivariate EEG data. MGrcMSE could therefore supplement MMSE or MrcMSE in the processing of multivariate datasets.
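
    The coarse-graining step shared by the MSE-family methods described above can be written compactly. The sketch below assumes a univariate series for readability (the paper extends this to multivariate data); the "moment" argument switches between the classic mean-based coarse-graining and one generalized variant based on the variance.

        # Coarse-graining step shared by MSE-type methods, for a univariate series.
        import numpy as np

        def coarse_grain(x, scale, moment="mean"):
            # Split the series into non-overlapping windows of length `scale` and
            # summarize each window: mean = classic MSE, var = one generalized variant.
            x = np.asarray(x, dtype=float)
            n_windows = len(x) // scale
            windows = x[:n_windows * scale].reshape(n_windows, scale)
            if moment == "mean":
                return windows.mean(axis=1)
            if moment == "var":
                return windows.var(axis=1)
            raise ValueError("unsupported moment")

        rng = np.random.default_rng(0)
        signal = rng.normal(size=1000)
        print(coarse_grain(signal, 5).shape, coarse_grain(signal, 5, moment="var").shape)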

  10. Total abdominal hysterectomy versus minimal-invasive hysterectomy: a systemic review and meta-analysis

    International Nuclear Information System (INIS)

    Aragon Palmero, Felipe Jorge; Exposito Exposito, Moises

    2011-01-01

    INTRODUCTION. Three types of hysterectomy are currently used: the total abdominal hysterectomy, the vaginal hysterectomy and the minimally invasive hysterectomy (MIH). The objective of the present research was to compare MIH with total abdominal hysterectomy (TAH) in women presenting with benign uterine diseases. METHODS. A systematic review and meta-analysis were carried out using the following databases: MEDLINE, EBSCO HOST and The Cochrane Central Register of Controlled Trials. Only randomized controlled studies were selected. The data from all studies were combined, and the relative risk (RR) with a 95% CI, calculated with the Mantel-Haenszel method, was used as the effect measure for dichotomous variables. For the analysis of continuous variables the mean difference was used. In all the comparisons performed, results were obtained with both fixed-effect and random-effects models. RESULTS. A total of 53 intraoperative complications were registered in the MIH group versus 17 in the TAH group (RR: 1.78; 95% CI: 1.04-3.05). Postoperative complications evolved similarly in both groups, without statistically significant differences. Blood loss, hospital stay and the patients' return to usual and work activities were lower in the laparoscopy group; however, operative time was longer when compared with TAH (mean difference: 37.36; 95% CI: 34.36-39.93). CONCLUSIONS. Both techniques have advantages and disadvantages. The indication for MIH must be individualized according to the clinical situation of each patient, and it should not be performed in centers without a properly trained surgical staff experienced in advanced minimally invasive surgery. (author)

  11. Traditional cardiovascular risk factors and coronary collateral circulation: Protocol for a systematic review and meta-analysis of case-control studies.

    Science.gov (United States)

    Xing, Zhenhua; Pei, Junyu; Tang, Liang; Hu, Xinqun

    2018-04-01

    Well-developed coronary collateral circulation usually results in smaller infarct size, improved cardiac function, and lower mortality. Traditional coronary risk factors (diabetes, hypertension, and smoking) have some effects on coronary collateral circulation. However, the associations between these risk factors and coronary collateral circulation are controversial. Given the conflicting evidence regarding the effect of traditional cardiovascular risk factors on coronary collateral circulation, we prepared this meta-analysis protocol to investigate the relationship between traditional risk factors for coronary artery disease and coronary collateral circulation. MEDLINE, EMBASE, and Science Citation Index will be searched to identify relevant studies. The primary outcome of this meta-analysis is well-developed coronary collateral circulation. Meta-analysis will be performed to calculate the odds ratio (OR) and 95% confidence interval (CI) for traditional coronary risk factors (diabetes, smoking, hypertension). Pooled ORs will be computed as the Mantel-Haenszel-weighted average of the ORs for all included studies. Sensitivity analysis, quality assessment, publication bias analysis, and the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach will be performed to ensure the reliability of our results. This study will provide a high-quality synthesis of current evidence on traditional risk factors and collateral circulation. The conclusion of our systematic review and meta-analysis will provide evidence to judge whether traditional risk factors affect coronary collateral circulation. Ethics and dissemination: Ethical approval is not required because our systematic review and meta-analysis will be based on published data without interventions on patients. The findings of this study will be published in a peer-reviewed journal.

  12. Angioplasty Guided by Intravascular Ultrasound: Meta-Analysis of Randomized Clinical Trials

    Energy Technology Data Exchange (ETDEWEB)

    Figueiredo, José Albuquerque Neto de, E-mail: jafneto@cardiol.br; Nogueira, Iara Antonia Lustosa [Universidade Federal do Maranhão, São Luiz, MA (Brazil); Figueiro, Mabel Fernandes; Buehler, Anna Maria; Berwanger, Otavio [Instituto de Ensino e Pesquisa do Hospital do Coração, São Paulo, SP (Brazil)

    2013-08-15

    The impact of intravascular ultrasound (IVUS) use on stenting has shown inconclusive results. Systematic review and meta-analysis of the impact of IVUS on stenting regarding the clinical and angiographic evolution. A search was performed in Medline/Pubmed, CENTRAL, Embase, Lilacs, Scopus and Web of Science databases. It included randomized clinical trials (RCTs) that evaluated the implantation of stents guided by IVUS, compared with those using angiography alone (ANGIO). The minimum follow-up duration was six months and the following outcomes were assessed: thrombosis, mortality, myocardial infarction, percutaneous and surgical revascularization, major adverse cardiovascular events (MACE) and restenosis. The binary outcomes were presented considering the number of events in each group; the estimates were generated by a random effects model, considering Mantel-Haenszel statistics as weighting agent and magnitude of effect for the relative risk (RR) with its respective 95% confidence interval (95%CI). Higgins I2 test was used to quantify the consistency between the results of each study. A total of 2,689 articles were evaluated, including 8 RCTs. There was a 27% reduction in angiographic restenosis (RR: 0.73, 95% CI: 0.54-0.97, I2 = 51%) and statistically significant reduction in the rates of percutaneous revascularization and overall (RR: 0.88; 95% CI: 0.51 to 1.53, I2 = 61%, RR: 0.73, 95% CI: 0.54 to 0.99, I2 = 55%), with no statistical difference in surgical revascularization (RR: 0.95, 95% CI: 0.52-1.74, I2 = 0%) in favor of IVUS vs. ANGIO. There were no differences regarding the other outcomes in the comparison between the two strategies. Angioplasty with stenting guided by IVUS decreases the rates of restenosis and revascularization, with no impact on MACE, acute myocardial infarction, mortality or thrombosis outcomes.

  13. Identifying group-sensitive physical activities: a differential item functioning analysis of NHANES data.

    Science.gov (United States)

    Gao, Yong; Zhu, Weimo

    2011-05-01

    The purpose of this study was to identify subgroup-sensitive physical activities (PA) using differential item functioning (DIF) analysis. A sub-unweighted sample of 1857 (men=923 and women=934) from the 2003-2004 National Health and Nutrition Examination Survey PA questionnaire data was used for the analyses. Using the Mantel-Haenszel, the simultaneous item bias test, and the ANOVA DIF methods, 33 specific leisure-time moderate and/or vigorous PA (MVPA) items were analyzed for DIF across race/ethnicity, gender, education, income, and age groups. Many leisure-time MVPA items were identified as large DIF items. When participating in the same amount of leisure-time MVPA, non-Hispanic blacks were more likely to participate in basketball and dance activities than non-Hispanic whites (NHW); NHW were more likely to participated in golf and hiking than non-Hispanic blacks; Hispanics were more likely to participate in dancing, hiking, and soccer than NHW, whereas NHW were more likely to engage in bicycling, golf, swimming, and walking than Hispanics; women were more likely to participate in aerobics, dancing, stretching, and walking than men, whereas men were more likely to engage in basketball, fishing, golf, running, soccer, weightlifting, and hunting than women; educated persons were more likely to participate in jogging and treadmill exercise than less educated persons; persons with higher incomes were more likely to engage in golf than those with lower incomes; and adults (20-59 yr) were more likely to participate in basketball, dancing, jogging, running, and weightlifting than older adults (60+ yr), whereas older adults were more likely to participate in walking and golf than younger adults. DIF methods are able to identify subgroup-sensitive PA and thus provide useful information to help design group-sensitive, targeted interventions for disadvantaged PA subgroups. © 2011 by the American College of Sports Medicine
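
    A minimal sketch of the Mantel-Haenszel DIF statistic used above, assuming strata are formed by matching respondents on a total activity score. The per-stratum counts (reference yes/no, focal yes/no) are hypothetical; the delta transformation shown is the ETS metric commonly used to grade DIF size.

        # Mantel-Haenszel common odds ratio across matched-score strata, plus the ETS delta metric.
        import math

        def mh_common_odds_ratio(strata):
            num = den = 0.0
            for a, b, c, d in strata:   # a,b: reference group yes/no; c,d: focal group yes/no
                n = a + b + c + d
                num += a * d / n
                den += b * c / n
            return num / den

        strata = [(40, 10, 30, 20), (35, 15, 28, 22), (20, 30, 15, 35)]  # hypothetical counts
        alpha = mh_common_odds_ratio(strata)
        delta = -2.35 * math.log(alpha)   # |delta| >= 1.5 is one ingredient of a "large DIF" (ETS C) flag
        print(round(alpha, 2), round(delta, 2))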

  14. Are There Gender Differences in Emotion Comprehension? Analysis of the Test of Emotion Comprehension.

    Science.gov (United States)

    Fidalgo, Angel M; Tenenbaum, Harriet R; Aznar, Ana

    2018-01-01

    This article examines whether there are gender differences in understanding the emotions evaluated by the Test of Emotion Comprehension (TEC). The TEC provides a global index of emotion comprehension in children 3-11 years of age, which is the sum of the nine components that constitute emotion comprehension: (1) recognition of facial expressions, (2) understanding of external causes of emotions, (3) understanding of desire-based emotions, (4) understanding of belief-based emotions, (5) understanding of the influence of a reminder on present emotional states, (6) understanding of the possibility to regulate emotional states, (7) understanding of the possibility of hiding emotional states, (8) understanding of mixed emotions, and (9) understanding of moral emotions. We used the answers to the TEC given by 172 English girls and 181 boys from 3 to 8 years of age. First, the nine components into which the TEC is subdivided were analysed for differential item functioning (DIF), taking gender as the grouping variable. To evaluate DIF, the Mantel-Haenszel method and logistic regression analysis were used applying the Educational Testing Service DIF classification criteria. The results show that the TEC did not display gender DIF. Second, when absence of DIF had been corroborated, it was analysed for differences between boys and girls in the total TEC score and its components controlling for age. Our data are compatible with the hypothesis of independence between gender and level of comprehension in 8 of the 9 components of the TEC. Several hypotheses are discussed that could explain the differences found between boys and girls in the belief component. Given that the Belief component is basically a false belief task, the differences found seem to support findings in the literature indicating that girls perform better on this task.

  15. Efficacy of ibuprofen on prevention of high altitude headache: A systematic review and meta-analysis.

    Directory of Open Access Journals (Sweden)

    Juan Xiong

    Full Text Available Ibuprofen is used to prevent high altitude headache (HAH) but its efficacy remains controversial. We conducted a systematic review and meta-analysis of randomized, placebo-controlled trials (RCTs) of ibuprofen for the prevention of HAH. Studies reporting the efficacy of ibuprofen for prevention of HAH were identified by searching electronic databases (until December 2016). The primary outcome was the difference in incidence of HAH between ibuprofen and placebo groups. Risk ratios (RR) were aggregated using a Mantel-Haenszel random effects model. Heterogeneity of the included trials was assessed using the I2 statistic. In three randomized controlled clinical trials involving 407 subjects, HAH occurred in 101 of 239 subjects (42%) who received ibuprofen and 96 of 168 (57%) who received placebo (RR = 0.79, 95% CI 0.66 to 0.96, Z = 2.43, P = 0.02, I2 = 0%). The absolute risk reduction (ARR) was 15%. The number needed to treat (NNT) to prevent HAH was 7. Similarly, the incidence of severe HAH differed significantly between the two groups (RR = 0.40, 95% CI 0.17 to 0.93, Z = 2.14, P = 0.03, I2 = 0%). Severe HAH occurred in 3% treated with ibuprofen and 10% with placebo. The ARR was 8%. The NNT to prevent severe HAH was 13. Headache severity using a visual analogue scale did not differ between ibuprofen and placebo, nor did the change in SpO2 from baseline to altitude differ between the two groups. One included RCT reported one participant with black stools and three participants with stomach pain in the ibuprofen group, while seven participants reported stomach pain in the placebo group. Based on a limited number of studies, ibuprofen seems efficacious for the prevention of HAH and may therefore represent an alternative to acetazolamide or dexamethasone for preventing HAH.
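
    The absolute risk reduction and number needed to treat quoted above follow directly from the pooled counts given in the abstract (101/239 with ibuprofen vs. 96/168 with placebo), as this small sketch shows.

        import math

        def arr_nnt(events_trt, n_trt, events_ctl, n_ctl):
            # Absolute risk reduction (control risk minus treatment risk) and NNT = 1/ARR.
            arr = events_ctl / n_ctl - events_trt / n_trt
            nnt = math.ceil(1.0 / arr) if arr > 0 else float("inf")
            return arr, nnt

        arr, nnt = arr_nnt(101, 239, 96, 168)   # counts quoted in the abstract
        print(f"ARR = {arr:.0%}, NNT = {nnt}")  # roughly 15% and 7, as reported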

  16. Generalized canonical correlation analysis with missing values

    NARCIS (Netherlands)

    M. van de Velden (Michel); Y. Takane

    2009-01-01

    textabstractTwo new methods for dealing with missing values in generalized canonical correlation analysis are introduced. The first approach, which does not require iterations, is a generalization of the Test Equating method available for principal component analysis. In the second approach,

  17. Clipping Versus Coiling in the Management of Posterior Communicating Artery Aneurysms with Third Nerve Palsy: A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Gaberel, Thomas; Borha, Alin; di Palma, Camille; Emery, Evelyne

    2016-03-01

    To compare surgical clipping with endovascular coiling in terms of recovery from oculomotor nerve palsy (ONP) in the management of posterior communicating artery (PCoA) aneurysms causing third nerve palsy. We conducted a systematic review of the literature and meta-analysis. The meta-analysis included 11 relevant studies involving 384 patients with third nerve palsy caused by PCoA aneurysms at baseline, of whom 257 (67.0%) were treated by clipping and 127 were treated by coiling (33.0%). Pooled odds ratios of the impact of clipping or coiling on complete ONP recovery, lack of ONP recovery, and procedure-related death were calculated. The overall complete ONP recovery rate was 42.5% in the coiling group compared with 83.6% in the clipping group. The increase in complete ONP recovery in the clipping group corresponds to an overall pooled Mantel-Haenszel odds ratio of 4.44 (95% confidence interval = 1.66-11.84). Subgroup analysis revealed a clear benefit of clipping over coiling in patients with ruptured aneurysms, but not in patients with unruptured aneurysms. No procedure-related deaths were reported by any of the 11 studies. Surgical clipping of PCoA aneurysms causing third nerve palsy achieves better ONP recovery than endovascular coiling; this could be particularly true in the case of ruptured aneurysms. In view of the purely observational data, statements about this effect should be made with great caution. A randomized trial would better address the therapeutic dilemma, but pending the results of such a trial, we recommend treating PCoA aneurysms causing ONP with surgery. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. Safety and efficacy of antibiotics compared with appendicectomy for treatment of uncomplicated acute appendicitis: meta-analysis of randomised controlled trials

    Science.gov (United States)

    Varadhan, Krishna K; Neal, Keith R

    2012-01-01

    Objective To compare the safety and efficacy of antibiotic treatment versus appendicectomy for the primary treatment of uncomplicated acute appendicitis. Design Meta-analysis of randomised controlled trials. Population Randomised controlled trials of adult patients presenting with uncomplicated acute appendicitis, diagnosed by haematological and radiological investigations. Interventions Antibiotic treatment versus appendicectomy. Outcome measures The primary outcome measure was complications. The secondary outcome measures were efficacy of treatment, length of stay, and incidence of complicated appendicitis and readmissions. Results Four randomised controlled trials with a total of 900 patients (470 antibiotic treatment, 430 appendicectomy) met the inclusion criteria. Antibiotic treatment was associated with a 63% (277/438) success rate at one year. Meta-analysis of complications showed a relative risk reduction of 31% for antibiotic treatment compared with appendicectomy (risk ratio (Mantel-Haenszel, fixed) 0.69 (95% confidence interval 0.54 to 0.89); I2=0%; P=0.004). A secondary analysis, excluding the study with crossover of patients between the two interventions after randomisation, showed a significant relative risk reduction of 39% for antibiotic therapy (risk ratio 0.61 (0.40 to 0.92); I2=0%; P=0.02). Of the 65 (20%) patients who had appendicectomy after readmission, nine had perforated appendicitis and four had gangrenous appendicitis. No significant differences were seen for treatment efficacy, length of stay, or risk of developing complicated appendicitis. Conclusion Antibiotics are both effective and safe as primary treatment for patients with uncomplicated acute appendicitis. Initial antibiotic treatment merits consideration as a primary treatment option for early uncomplicated appendicitis. PMID:22491789

  19. Clinical and Echocardiographic Outcomes Following Permanent Pacemaker Implantation After Transcatheter Aortic Valve Replacement: Meta-Analysis and Meta-Regression.

    Science.gov (United States)

    Mohananey, Divyanshu; Jobanputra, Yash; Kumar, Arnav; Krishnaswamy, Amar; Mick, Stephanie; White, Jonathon M; Kapadia, Samir R

    2017-07-01

    Transcatheter aortic valve replacement has become the procedure of choice for inoperable, high-risk, and many intermediate-risk patients with aortic stenosis. Conduction abnormalities are a common finding after transcatheter aortic valve replacement and often result in permanent pacemaker (PPM) implantation. Data pertaining to the clinical impact of PPM implantation are controversial. We used meta-analysis techniques to summarize the effect of PPM implantation on clinical and echocardiographic outcomes after transcatheter aortic valve replacement. Data were summarized as Mantel-Haenszel relative risk (RR) and 95% confidence intervals (CIs) for dichotomous variables and as standardized mean difference and 95% CI for continuous variables. We used the Higgins I2 statistic to evaluate heterogeneity. We found that patients with and without PPM have similar all-cause mortality (RR, 0.85; 95% CI, 0.70-1.03), cardiovascular mortality (RR, 0.84; 95% CI, 0.59-1.18), myocardial infarction (RR, 0.47; 95% CI, 0.20-1.11), and stroke (RR, 1.26; 95% CI, 0.70-2.26) at 30 days. The groups were also comparable in all-cause mortality (RR, 1.03; 95% CI, 0.92-1.16), cardiovascular mortality (RR, 0.69; 95% CI, 0.39-1.24), myocardial infarction (RR, 0.58; 95% CI, 0.30-1.13), and stroke (RR, 0.70; 95% CI, 0.47-1.04) at 1 year. We observed that the improvement in left ventricular ejection fraction was significantly greater in the patients without PPM (standardized mean difference, 0.22; 95% CI, 0.12-0.32). PPM implantation is not associated with increased risk of all-cause mortality, cardiovascular mortality, stroke, or myocardial infarction at both short- and long-term follow-up. However, PPM is associated with impaired left ventricular ejection fraction recovery post-transcatheter aortic valve replacement. © 2017 American Heart Association, Inc.
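
    Continuous outcomes above are pooled as standardized mean differences. The sketch below shows the basic Cohen's d form of the SMD with a pooled standard deviation (many meta-analyses add Hedges' small-sample correction); the group summaries are hypothetical, not values from this meta-analysis.

        import math

        def smd(mean1, sd1, n1, mean2, sd2, n2):
            # Cohen's d: difference in means divided by the pooled standard deviation.
            pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
            return (mean1 - mean2) / pooled_sd

        # Hypothetical LVEF-change summaries for a no-PPM and a PPM group.
        print(round(smd(6.8, 7.2, 150, 5.0, 7.6, 130), 2))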

  20. Laparoscopic esophageal myotomy versus pneumatic dilation in the treatment of idiopathic achalasia: a meta-analysis of randomized controlled trials

    Directory of Open Access Journals (Sweden)

    Baniya R

    2017-09-01

    Full Text Available Ramkaji Baniya, Sunil Upadhaya, Jahangir Khan, Suresh Kumar Subedi, Tabrez Shaik Mohammed, Balvant K Ganatra, Ghassan Bachuwa Department of Internal Medicine, Hurley Medical Center, Michigan State University, Flint, MI, USA Background: Achalasia is a primary esophageal motility disorder of unknown etiology associated with abnormalities in peristalsis and lower esophageal sphincter relaxation. The disease is incurable; however, definitive treatment procedures like pneumatic dilation (PD)/balloon dilation and laparoscopic esophageal myotomy (LEM) are performed to relieve dysphagia and related symptoms. Currently, there is a paucity of data comparing the outcomes of these procedures. The aim of this meta-analysis is to compare the short- and long-term success rates of PD and LEM. Methods: A thorough systematic search of PubMed, Scopus, clinicaltrials.gov, and the Cochrane library was conducted for randomized controlled trials (RCTs) comparing the outcomes of PD versus LEM in the treatment of achalasia. The Mantel-Haenszel method and a random effects model were used to analyze the data. RCTs with outcome data at 3-month, 1-year, and 5-year intervals were analyzed. Results: A total of 437, 378, and 254 patients at the 3-month, 1-year, and 5-year intervals, respectively, were analyzed for outcome data. At 3 months and 1 year, PD was not as effective as LEM (odds ratio [OR]: 0.50; confidence interval [CI]: 0.31–0.82; P = 0.009, and OR: 0.47; CI: 0.22–0.99; P = 0.21), but at 5 years, one procedure was non-inferior to the other (OR: 0.62; CI: 0.33–1.19; P = 0.34). Conclusion: PD was as effective as LEM in relieving symptoms of achalasia in the long term. Keywords: achalasia, balloon dilation, pneumatic dilation, laparoscopic myotomy, Heller’s myotomy

  1. Tidal volume and mortality in mechanically ventilated children: a systematic review and meta-analysis of observational studies*.

    Science.gov (United States)

    de Jager, Pauline; Burgerhof, Johannes G M; van Heerde, Marc; Albers, Marcel J I J; Markhorst, Dick G; Kneyber, Martin C J

    2014-12-01

    To determine whether tidal volume is associated with mortality in critically ill, mechanically ventilated children. MEDLINE, EMBASE, and CINAHL databases from inception until July 2013 and bibliographies of included studies without language restrictions. Randomized clinical trials and observational studies reporting mortality in mechanically ventilated PICU patients. Two authors independently selected studies and extracted data on study methodology, quality, and patient outcomes. Meta-analyses were performed using the Mantel-Haenszel random-effects model. Heterogeneity was quantified using I2. Study quality was assessed using the Newcastle-Ottawa Score for cohort studies. Out of 142 citations, seven studies met the inclusion criteria, and an additional two articles were identified from the references of the retrieved articles. One was excluded. These eight studies included 1,756 patients. Mortality rates ranged from 13% to 42%. There was no association between tidal volume and mortality when tidal volume was dichotomized at 7, 8, 10, or 12 mL/kg. Comparing patients ventilated with tidal volume less than 7 mL/kg and greater than 10 mL/kg or greater than 12 mL/kg and tidal volume less than 8 mL/kg and greater than 10 mL/kg or greater than 12 mL/kg also showed no association between tidal volume and mortality. Limiting the analysis to patients with acute lung injury/acute respiratory distress syndrome did not change these results. Heterogeneity was observed in all pooled analyses. A relationship between tidal volume and mortality in mechanically ventilated children could not be identified, irrespective of the severity of disease. The significant heterogeneity observed in the pooled analyses necessitates future studies in well-defined patient populations to understand the effects of tidal volume on patient outcome.
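
    A minimal sketch of the I2 heterogeneity summary referred to above, computed from Cochran's Q with inverse-variance weights. The study effect sizes and standard errors are hypothetical placeholders.

        # Cochran's Q and I-squared from inverse-variance (fixed-effect) weights.
        import numpy as np

        def i_squared(effects, se):
            effects = np.asarray(effects, float)
            w = 1.0 / np.asarray(se, float) ** 2
            pooled = np.sum(w * effects) / np.sum(w)
            q = np.sum(w * (effects - pooled) ** 2)   # Cochran's Q
            df = len(effects) - 1
            return 100.0 * max(0.0, (q - df) / q) if q > 0 else 0.0

        print(round(i_squared([0.10, 0.35, -0.05, 0.60], [0.20, 0.25, 0.15, 0.30]), 1))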

  2. Meta-analysis reveals association between most common class II haplotype in full-heritage Native Americans and rheumatoid arthritis.

    Science.gov (United States)

    Williams, R C; Jacobsson, L T; Knowler, W C; del Puente, A; Kostyu, D; McAuley, J E; Bennett, P H; Pettitt, D J

    1995-01-01

    The association of RA with the alleles at the HLA system was tested among Pima and Tohono O'odham Indians (Pimans) of the Gila River Indian Community of Arizona. Serologic class I (HLA-A, -B, and -C) alleles were typed in 51 individuals with RA and in 302 without RA. Serologic class II (HLA-DR, DQ; DR52 DR53) alleles were typed in a subset of 47 with RA and 147 without RA. Molecular subtypes of DR3X6, DRB1*1402, and *1406 were determined in 29 individuals, 16 with RA and 13 without RA. Among the cases with RA, 46 of 47 had the serologic antigen HLA-DR3X6, as did 140 of 147 of those without the disease. However, this association was not statistically significant because of the high prevalence of the antigen in the controls. Data from Pimans were analyzed with similar results from the Tlingit and Yakima Indians. A meta-analysis employing the Mantel-Haenszel procedure, stratified by tribe, revealed a statistically significant association between the most common haplotype, DRB1*1402 DQA1*0501 DQB1*0301 DRB3*0101, and RA (summary odds ratio = 2.63, 95% confidence interval = 1.08, 6.46). There was also a statistically significant difference in the genotype distributions of one class I locus, HLA-C, between those with and without RA (chi 2 = 12.4, 5 df; p = 0.03). It is concluded that the association with the most common class II haplotype in full-heritage Native Americans might help explain their high prevalence of RA.

  3. Risk of wound infection and safety profile of amoxicillin in healthy patients which required third molar surgery: a systematic review and meta-analysis.

    Science.gov (United States)

    Isiordia-Espinoza, M A; Aragon-Martinez, O H; Martínez-Morales, J F; Zapata-Morales, J R

    2015-11-01

    The aim of this systematic review and meta-analysis was to assess the risk of surgical wound infection and the adverse effects of amoxicillin in healthy patients who required excision of third molars. We identified eligible reports from searches of PubMed, Medline®, the Cochrane Library, Imbiomed, LILACS, and Google Scholar. Studies that met our minimum requirements were evaluated using inclusion and exclusion criteria and the Oxford Quality Scale. Those with a score of 3 or more on this Scale were included and their data were extracted and analysed. For evaluation of the risk of infection the absolute risk reduction, number needed to treat, and 95% CI were calculated. For evaluation of the risk of an adverse effect the absolute risk increase, number needed to harm, and 95% CI were calculated using the Risk Reduction Calculator. Each meta-analysis was made with the help of the Mantel-Haenszel random effects model, and estimates of risk (OR) and 95% CI were calculated using the Review Manager 5.3, from the Cochrane Library. A significant risk was assumed when the lower limit of the 95% CI was greater than 1. Probabilities of less than 0.05 were accepted as significant. The results showed that there was no reduction in the risk of infection when amoxicillin was given before or after operation compared with an untreated group or placebo. In conclusion, this study suggests that amoxicillin given prophylactically or postoperatively does not reduce the risk of infection in healthy patients having their third molars extracted. Copyright © 2015 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  4. Efficacy and safety of solifenacin plus tamsulosin oral controlled absorption system in men with lower urinary tract symptoms: a meta-analysis

    Directory of Open Access Journals (Sweden)

    Ming-Chao Li

    2015-02-01

    Full Text Available We performed a meta-analysis to compare treatment with a combination of solifenacin plus tamsulosin oral controlled absorption system (TOCAS) with placebo or TOCAS monotherapy. The aim of the meta-analysis was to clarify the efficacy and safety of the combination treatment for lower urinary tract symptoms (LUTS). We searched for trials of men with LUTS that were randomized to combination treatment compared with TOCAS monotherapy or placebo. We pooled data from three placebo-controlled trials meeting the inclusion criteria. Primary outcomes of interest included changes in International Prostate Symptom Score (IPSS) and urinary frequency. We also assessed postvoid residual, maximum urinary flow rate, incidence of urinary retention (UR), and adverse events. Data were pooled using random or fixed effect models for continuous outcomes and the Mantel-Haenszel method to generate risk ratios. Reductions in IPSS storage subscore and total urgency and frequency score (TUFS) were observed with solifenacin 6 mg plus TOCAS compared with placebo (P < 0.0001 and P < 0.0001, respectively). Reductions in IPSS storage subscore and TUFS were observed with solifenacin 9 mg plus TOCAS compared with placebo (P = 0.003 and P = 0.0006, respectively). A reduction in TUFS was observed with solifenacin 6 mg plus TOCAS compared with TOCAS alone (P = 0.01). Both combination treatments were well tolerated, with a low incidence of UR. Solifenacin 6 mg plus TOCAS significantly improved total IPSS, storage and voiding symptoms compared with placebo. Solifenacin 6 mg plus TOCAS also improved storage symptoms compared with TOCAS alone. There was no additional benefit of solifenacin 9 mg compared with 6 mg when used in combination with TOCAS.

  5. Accuracy of Lung Ultrasonography versus Chest Radiography for the Diagnosis of Adult Community-Acquired Pneumonia: Review of the Literature and Meta-Analysis.

    Directory of Open Access Journals (Sweden)

    Xiong Ye

    Full Text Available Lung ultrasonography (LUS) is being increasingly utilized in emergency and critical care settings. We performed a systematic review of the current literature to compare the accuracy of LUS and chest radiography (CR) for the diagnosis of adult community-acquired pneumonia (CAP). We searched PubMed and EMBASE for studies dealing with both LUS and CR for the diagnosis of adult CAP, and conducted a meta-analysis to evaluate the diagnostic accuracy of LUS in comparison with CR. The reference standard against which the index test was compared was the hospital discharge diagnosis or the result of a chest computed tomography scan as a "gold standard". We calculated pooled sensitivity and specificity using the Mantel-Haenszel method and the pooled diagnostic odds ratio using the DerSimonian-Laird method. Five articles met our inclusion criteria and were included in the final analysis. Using the hospital discharge diagnosis as reference, LUS had a pooled sensitivity of 0.95 (0.93-0.97) and a specificity of 0.90 (0.86 to 0.94); CR had a pooled sensitivity of 0.77 (0.73 to 0.80) and a specificity of 0.91 (0.87 to 0.94). When LUS and CR were compared with the computed tomography scan in a total of 138 patients, the Z statistic for the two summary receiver operating characteristic curves was 3.093 (P = 0.002), and the areas under the curve for LUS and CR were 0.901 and 0.590, respectively. Our study indicates that LUS can help clinicians diagnose adult CAP, and its accuracy was better than that of CR when a chest computed tomography scan was used as the gold standard.
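
    The pooled diagnostic odds ratio above is obtained with the DerSimonian-Laird random-effects method. The sketch below applies that estimator to hypothetical study-level log diagnostic odds ratios and standard errors, not to the five included articles.

        # DerSimonian-Laird random-effects pooling of study-level log diagnostic odds ratios.
        import numpy as np

        def dersimonian_laird(effects, se):
            effects = np.asarray(effects, float)
            se = np.asarray(se, float)
            w = 1.0 / se**2                                # fixed-effect weights
            fixed = np.sum(w * effects) / np.sum(w)
            q = np.sum(w * (effects - fixed) ** 2)
            c = np.sum(w) - np.sum(w**2) / np.sum(w)
            tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # between-study variance
            w_star = 1.0 / (se**2 + tau2)                  # random-effects weights
            pooled = np.sum(w_star * effects) / np.sum(w_star)
            return pooled, np.sqrt(1.0 / np.sum(w_star))

        log_dor, se_pooled = dersimonian_laird([3.1, 2.6, 3.8, 2.9, 3.3], [0.40, 0.55, 0.35, 0.60, 0.45])
        print(round(float(np.exp(log_dor)), 1), round(float(se_pooled), 2))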

  6. Is there an increased risk of post-operative surgical site infection after orthopaedic surgery in HIV patients? A systematic review and meta-analysis.

    Science.gov (United States)

    Kigera, James W M; Straetemans, Masja; Vuhaka, Simplice K; Nagel, Ingeborg M; Naddumba, Edward K; Boer, Kimberly

    2012-01-01

    There is a dilemma as to whether patients infected with the Human Immunodeficiency Virus (HIV) requiring implant orthopaedic surgery are at an increased risk of post-operative surgical site infection (SSI). We conducted a systematic review to determine the effect of HIV on the risk of post-operative SSI and sought to determine if this risk is altered by antibiotic use beyond 24 hours. We searched electronic databases, manually searched citations from relevant articles, and reviewed conference proceedings. The risk of postoperative SSI was pooled using the Mantel-Haenszel method. We identified 18 cohort studies, 16 of them small, addressing the subject. The pooled risk ratio of infection in HIV patients compared to non-HIV patients was 1.8 (95% Confidence Interval [CI] 1.3-2.4); in studies from Africa it was 2.3 (95% CI 1.5-3.5). In a sensitivity analysis the risk ratio was reduced to 1.4 (95% CI 0.5-3.8). The risk ratio of infection in patients receiving prolonged antibiotics compared to patients receiving antibiotics for up to 24 hours was 0.7 (95% CI 0.1-4.2). The results may indicate an increased risk in HIV-infected patients, but they are not robust and are inconclusive after a sensitivity analysis removing poor-quality studies. There is a need for larger, good-quality studies to provide conclusive evidence. To better develop surgical protocols, further studies should determine the effect of reduced CD4 counts, viral load suppression and prolonged antibiotics on the risk of infection.

  7. The status of rheumatoid factor and anti-cyclic citrullinated peptide antibody are not associated with the effect of anti-TNFα agent treatment in patients with rheumatoid arthritis: a meta-analysis.

    Directory of Open Access Journals (Sweden)

    Qianwen Lv

    Full Text Available OBJECTIVES: This meta-analysis was conducted to investigate whether the status of rheumatoid factor (RF) and anti-cyclic citrullinated peptide (anti-CCP) antibody is associated with the clinical response to anti-tumor necrosis factor (TNF) alpha treatment in rheumatoid arthritis (RA). METHODS: A systematic literature review was performed using the MEDLINE, SCOPUS, Cochrane Library, ISI Web of Knowledge, and Clinical Trials Register databases, and Hayden's criteria of quality assessment for prognostic studies were used to evaluate all of the studies. The correlation of the RF and anti-CCP antibody status with the treatment effect of anti-TNFα agents was analyzed separately using the Mantel-Haenszel method. A fixed-effects model was used when there was no significant heterogeneity; otherwise, a random-effects model was applied. Publication bias was assessed using Egger's linear regression and a funnel plot. RESULTS: A total of 14 studies involving 5561 RA patients meeting the inclusion criteria were included. The overall analysis showed that the pooled relative risk for the predictive effects of the RF and anti-CCP antibody status on patient response to anti-TNFα agents was 0.98 (95% CI: 0.91-1.05, p = 0.54) and 0.88 (95% CI: 0.76-1.03, p = 0.11), respectively, with I2 values of 43% (p = 0.05) and 67% (p < 0.01), respectively. Subgroup analyses of different anti-TNFα treatments (infliximab vs. etanercept vs. adalimumab vs. golimumab), response criteria (DAS28 vs. ACR20 vs. EULAR response), follow-up period (≥ 6 vs. <6 months), and ethnic group did not reveal a significant association for the status of RF and anti-CCP. CONCLUSIONS: Neither the RF nor the anti-CCP antibody status in RA patients is associated with the clinical response to anti-TNFα treatment.

  8. A General Approach to Causal Mediation Analysis

    Science.gov (United States)

    Imai, Kosuke; Keele, Luke; Tingley, Dustin

    2010-01-01

    Traditionally in the social sciences, causal mediation analysis has been formulated, understood, and implemented within the framework of linear structural equation models. We argue and demonstrate that this is problematic for 3 reasons: the lack of a general definition of causal mediation effects independent of a particular statistical model, the…

  9. Texture analysis using Renyi's generalized entropies

    NARCIS (Netherlands)

    Grigorescu, SE; Petkov, N

    2003-01-01

    We propose a texture analysis method based on Renyi's generalized entropies. The method aims at identifying texels in regular textures by searching for the smallest window through which the minimum number of different visual patterns is observed when moving the window over a given texture. The

  10. Automatic movie skimming with general tempo analysis

    Science.gov (United States)

    Lee, Shih-Hung; Yeh, Chia-Hung; Kuo, C. C. J.

    2003-11-01

    In this research, story units are extracted by general tempo analysis, including the tempos of audio and visual information. Although many schemes have been proposed to successfully segment video data into shots using basic low-level features, how to group shots into meaningful units called story units is still a challenging problem. By focusing on a certain type of video such as sport or news, we can explore models with specific application domain knowledge. For movie content, many heuristic rules based on audiovisual clues have been proposed with limited success. We propose a method to extract story units using general tempo analysis. Experimental results are given to demonstrate the feasibility and efficiency of the proposed technique.

  11. General aviation air traffic pattern safety analysis

    Science.gov (United States)

    Parker, L. C.

    1973-01-01

    A concept is described for evaluating the general aviation mid-air collision hazard in uncontrolled terminal airspace. Three-dimensional traffic pattern measurements were conducted at uncontrolled and controlled airports. Computer programs for data reduction, storage retrieval and statistical analysis have been developed. Initial general aviation air traffic pattern characteristics are presented. These preliminary results indicate that patterns are highly divergent from the expected standard pattern, and that pattern procedures observed can affect the ability of pilots to see and avoid each other.

  12. Functional data analysis of generalized regression quantiles

    KAUST Repository

    Guo, Mengmeng

    2013-11-05

    Generalized regression quantiles, including the conditional quantiles and expectiles as special cases, are useful alternatives to the conditional means for characterizing a conditional distribution, especially when the interest lies in the tails. We develop a functional data analysis approach to jointly estimate a family of generalized regression quantiles. Our approach assumes that the generalized regression quantiles share some common features that can be summarized by a small number of principal component functions. The principal component functions are modeled as splines and are estimated by minimizing a penalized asymmetric loss measure. An iterative least asymmetrically weighted squares algorithm is developed for computation. While separate estimation of individual generalized regression quantiles usually suffers from large variability due to lack of sufficient data, by borrowing strength across data sets, our joint estimation approach significantly improves the estimation efficiency, which is demonstrated in a simulation study. The proposed method is applied to data from 159 weather stations in China to obtain the generalized quantile curves of the volatility of the temperature at these stations. © 2013 Springer Science+Business Media New York.
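
    The penalized criterion described above is built on asymmetric loss functions. As a purely illustrative sketch, the code below shows the two losses that define generalized regression quantiles: the check loss (conditional quantiles) and the asymmetrically weighted squared loss (expectiles); the spline and principal-component machinery of the paper is omitted.

        import numpy as np

        def check_loss(residual, tau):
            # Quantile (check) loss at level tau.
            r = np.asarray(residual, float)
            return np.where(r >= 0, tau * r, (tau - 1) * r)

        def expectile_loss(residual, tau):
            # Asymmetrically weighted squared loss at level tau.
            r = np.asarray(residual, float)
            return np.where(r >= 0, tau, 1 - tau) * r**2

        r = np.array([-2.0, -0.5, 0.1, 1.5])
        print(check_loss(r, 0.9).round(2), expectile_loss(r, 0.9).round(2))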

  13. Functional data analysis of generalized regression quantiles

    KAUST Repository

    Guo, Mengmeng; Zhou, Lan; Huang, Jianhua Z.; Härdle, Wolfgang Karl

    2013-01-01

    Generalized regression quantiles, including the conditional quantiles and expectiles as special cases, are useful alternatives to the conditional means for characterizing a conditional distribution, especially when the interest lies in the tails. We develop a functional data analysis approach to jointly estimate a family of generalized regression quantiles. Our approach assumes that the generalized regression quantiles share some common features that can be summarized by a small number of principal component functions. The principal component functions are modeled as splines and are estimated by minimizing a penalized asymmetric loss measure. An iterative least asymmetrically weighted squares algorithm is developed for computation. While separate estimation of individual generalized regression quantiles usually suffers from large variability due to lack of sufficient data, by borrowing strength across data sets, our joint estimation approach significantly improves the estimation efficiency, which is demonstrated in a simulation study. The proposed method is applied to data from 159 weather stations in China to obtain the generalized quantile curves of the volatility of the temperature at these stations. © 2013 Springer Science+Business Media New York.

  14. Generalized Analysis of a Distribution Separation Method

    Directory of Open Access Journals (Sweden)

    Peng Zhang

    2016-04-01

    Full Text Available Separating two probability distributions from a mixture model that is made up of the combination of the two is essential to a wide range of applications. For example, in information retrieval (IR), there often exists a mixture distribution consisting of a relevance distribution that we need to estimate and an irrelevance distribution that we hope to get rid of. Recently, a distribution separation method (DSM) was proposed to approximate the relevance distribution by separating a seed irrelevance distribution from the mixture distribution. It was successfully applied to an IR task, namely pseudo-relevance feedback (PRF), where the query expansion model is often a mixture term distribution. Although initially developed in the context of IR, DSM is indeed a general mathematical formulation for probability distribution separation. Thus, it is important to further generalize its basic analysis and to explore its connections to other related methods. In this article, we first extend DSM’s theoretical analysis, which was originally based on the Pearson correlation coefficient, to entropy-related measures, including the KL-divergence (Kullback–Leibler divergence), the symmetrized KL-divergence and the JS-divergence (Jensen–Shannon divergence). Second, we investigate the distribution separation idea in a well-known method, namely the mixture model feedback (MMF) approach. We prove that MMF also complies with the linear combination assumption, and then show that DSM’s linear separation algorithm can largely simplify the EM algorithm in MMF. These theoretical analyses, as well as further empirical evaluation results, demonstrate the advantages of our DSM approach.
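
    The entropy-related measures mentioned above (KL-divergence, symmetrized KL-divergence and JS-divergence) can be computed directly for discrete distributions, as in the minimal sketch below; the two example distributions are hypothetical.

        import numpy as np

        def kl(p, q):
            # Kullback-Leibler divergence for discrete distributions (0*log0 treated as 0).
            p, q = np.asarray(p, float), np.asarray(q, float)
            mask = p > 0
            return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

        def symmetrized_kl(p, q):
            return kl(p, q) + kl(q, p)

        def js(p, q):
            m = 0.5 * (np.asarray(p, float) + np.asarray(q, float))
            return 0.5 * kl(p, m) + 0.5 * kl(q, m)

        p, q = [0.5, 0.3, 0.2], [0.4, 0.4, 0.2]
        print(round(kl(p, q), 4), round(symmetrized_kl(p, q), 4), round(js(p, q), 4))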

  15. Contributions to sensitivity analysis and generalized discriminant analysis

    International Nuclear Information System (INIS)

    Jacques, J.

    2005-12-01

    Two topics are studied in this thesis: sensitivity analysis and generalized discriminant analysis. Global sensitivity analysis of a mathematical model studies how its output variables react to variations of its inputs. Variance-based methods quantify the part of the variance of the model response that is due to each input variable and to each subset of input variables. The first subject of this thesis is the impact of model uncertainty on the results of a sensitivity analysis. Two particular forms of uncertainty are studied: that due to a change of the reference model, and that due to the use of a simplified model in place of the reference model. A second problem studied in this thesis is that of models with correlated inputs. Since classical sensitivity indices have no meaningful interpretation in the presence of correlated inputs, we propose a multidimensional approach consisting in expressing the sensitivity of the model output to groups of correlated variables. Applications in the field of nuclear engineering illustrate this work. Generalized discriminant analysis consists in classifying the individuals of a test sample into groups, using information contained in a training sample, when these two samples do not come from the same population. This work extends existing methods in a Gaussian context to the case of binary data. An application in public health illustrates the utility of the generalized discrimination models thus defined. (author)

  16. Bisphosphonates and risk of cardiovascular events: a meta-analysis.

    Directory of Open Access Journals (Sweden)

    Dae Hyun Kim

    Full Text Available Some evidence suggests that bisphosphonates may reduce atherosclerosis, while concerns have been raised about atrial fibrillation. We conducted a meta-analysis to determine the effects of bisphosphonates on total adverse cardiovascular (CV) events, atrial fibrillation, myocardial infarction (MI), stroke, and CV death in adults with or at risk for low bone mass. A systematic search of MEDLINE and EMBASE through July 2014 identified 58 randomized controlled trials longer than 6 months in duration that reported CV events. Absolute risks and the Mantel-Haenszel fixed-effects odds ratios (ORs) and 95% confidence intervals (CIs) of total CV events, atrial fibrillation, MI, stroke, and CV death were estimated. Subgroup analyses by follow-up duration, population characteristics, bisphosphonate types, and route were performed. Absolute risks over 25-36 months in bisphosphonate-treated versus control patients were 6.5% versus 6.2% for total CV events; 1.4% versus 1.5% for atrial fibrillation; 1.0% versus 1.2% for MI; 1.6% versus 1.9% for stroke; and 1.5% versus 1.4% for CV death. Bisphosphonate treatment up to 36 months did not have any significant effects on total CV events (14 trials; ORs [95% CI]: 0.98 [0.84-1.14]; I2 = 0.0%), atrial fibrillation (41 trials; 1.08 [0.92-1.25]; I2 = 0.0%), MI (10 trials; 0.96 [0.69-1.34]; I2 = 0.0%), stroke (10 trials; 0.99 [0.82-1.19]; I2 = 5.8%), or CV death (14 trials; 0.88 [0.72-1.07]; I2 = 0.0%), with little between-study heterogeneity. The risk of atrial fibrillation appears to be modestly elevated for zoledronic acid (6 trials; 1.24 [0.96-1.61]; I2 = 0.0%), but not for oral bisphosphonates (26 trials; 1.02 [0.83-1.24]; I2 = 0.0%). The CV effects did not vary by subgroups or study quality. Bisphosphonates do not have beneficial or harmful effects on atherosclerotic CV events, but zoledronic acid may modestly increase the risk of atrial fibrillation. Given the large reduction in fractures with bisphosphonates, changes in

  17. Transparent polyurethane film as an intravenous catheter dressing. A meta-analysis of the infection risks.

    Science.gov (United States)

    Hoffmann, K K; Weber, D J; Samsa, G P; Rutala, W A

    1992-04-15

    To obtain a quantitative estimate of the impact on infectious complications of using transparent dressings with intravenous catheters. Meta-analysis of all studies published in the English literature, including abstracts, letters, and reports that examined the primary research question of infection risks associated with transparent compared with gauze dressings for use on central and peripheral venous catheters. Studies were identified by use of the MEDLINE database using the indexing terms occlusive dressings, transparent dressings, and infection and by review of referenced bibliographies. Seven of the 15 studies (47%) of central venous catheters and seven of 12 studies (58%) of peripheral catheters met our inclusion criteria for analysis. All studies used a prospective cohort design, utilized hospitalized patients, and reported at least one of our defined outcomes. Data for each study were abstracted independently by three investigators. At least three studies were used in the analysis of each outcome. Applying a Mantel-Haenszel chi 2 analysis, use of transparent dressings on central venous catheters was significantly associated with an elevated relative risk (RR) of catheter tip infection (RR = 1.78; 95% confidence interval [CI], 1.38 to 2.30). Catheter-related sepsis (RR = 1.69; 95% CI, 0.97 to 2.95) and bacteremia (RR = 1.63; 95% CI, 0.76 to 3.47) were both associated with an elevated RR. Use of transparent dressings on peripheral catheters was associated with an elevated RR of catheter-tip infection (RR = 1.53; 95% CI, 1.18 to 1.99) but not phlebitis (RR = 1.02; 95% CI, 0.86 to 1.20), infiltration (RR = 1.12; 95% CI, 0.92 to 1.37), or skin colonization (RR = 0.99; 95% CI, 0.90 to 1.09). The results demonstrated a significantly increased risk of catheter-tip infection with the use of transparent compared with gauze dressings when used with either central or peripheral catheters. An increased risk of bacteremia and catheter sepsis associated with the use of

  18. Risk Factors for Acquired Rifamycin and Isoniazid Resistance: A Systematic Review and Meta-Analysis.

    Directory of Open Access Journals (Sweden)

    Neesha Rockwood

    Full Text Available Studies looking at acquired drug resistance (ADR) are diverse with respect to geographical distribution, HIV co-infection rates, retreatment status and programmatic factors such as regimens administered and directly observed therapy. Our objective was to examine and consolidate evidence from clinical studies of the multifactorial aetiology of acquired rifamycin and/or isoniazid resistance within the scope of a single systematic review. This is important to inform policy and identify key areas for further studies. Case-control and cohort studies and randomised controlled trials that reported ADR as an outcome during antitubercular treatment regimens including a rifamycin and examined the association of at least 1 risk factor were included. Post hoc, we carried out random-effects Mantel-Haenszel weighted meta-analyses of the impact of 2 key risk factors, (1) HIV and (2) baseline drug resistance, on the binary outcome of ADR. Heterogeneity was assessed using the I2 statistic. As a secondary outcome, we calculated the median cumulative incidence of ADR, weighted by the sample size of the studies. Meta-analysis of 15 studies showed increased risk of ADR with baseline mono- or polyresistance (RR 4.85, 95% CI 3.26 to 7.23; heterogeneity I2 58%, 95% CI 26 to 76%). Meta-analysis of 8 studies showed that HIV co-infection was associated with increased risk of ADR (RR 3.02, 95% CI 1.28 to 7.11); there was considerable heterogeneity amongst these studies (I2 81%, 95% CI 64 to 90%). Non-adherence, extrapulmonary/disseminated disease and advanced immunosuppression in HIV co-infection were other risk factors noted. The weighted median cumulative incidence of acquired multidrug resistance calculated in 24 studies (assuming the whole cohort as denominator, regardless of follow-up DST) was 0.1% (5th to 95th percentile 0.07 to 3.2%). Baseline drug resistance and HIV co-infection were significant risk factors for ADR. There was a trend of positive association with non-adherence which is likely
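
    The I2 statistic quoted above summarizes between-study heterogeneity as the share of total variability in the pooled estimate not attributable to sampling error. A minimal sketch of Cochran's Q and the I2 point estimate on hypothetical per-study log risk ratios and variances (the confidence intervals for I2 reported above require an additional step not shown here):

```python
# Minimal sketch of Cochran's Q and the I^2 heterogeneity statistic,
# I^2 = max(0, (Q - df) / Q) * 100, computed from hypothetical per-study
# log risk ratios and variances (not the studies pooled in this review).
log_rr = [1.58, 1.20, 1.90, 1.35, 1.72]     # placeholder study estimates
var    = [0.10, 0.25, 0.15, 0.30, 0.12]     # placeholder within-study variances

weights = [1.0 / v for v in var]
pooled = sum(w * y for w, y in zip(weights, log_rr)) / sum(weights)
Q = sum(w * (y - pooled) ** 2 for w, y in zip(weights, log_rr))
df = len(log_rr) - 1
I2 = max(0.0, (Q - df) / Q) * 100.0
print(f"Q = {Q:.2f} on {df} df, I^2 = {I2:.1f}%")
```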

  19. Harmonic Analysis Associated with the Generalized q-Bessel Operator

    Directory of Open Access Journals (Sweden)

    Ahmed Abouelaz

    2016-01-01

    Full Text Available In this article, we give a new harmonic analysis associated with the generalized q-Bessel operator. We introduce the generalized q-Bessel transform, the generalized q-Bessel translation and the generalized q-Bessel convolution product.

  20. Generalized indices for radiation risk analysis

    International Nuclear Information System (INIS)

    Bykov, A.A.; Demin, V.F.

    1989-01-01

    A new approach to ensuring nuclear safety began to take shape in the early eighties. This approach, based on probabilistic safety analysis, the principle of acceptable risk, the optimization of safety measures, etc., has required a complex of adequate quantitative methods for assessment, safety analysis and risk management to be developed. Methods of radiation risk assessment and analysis hold a prominent place in this complex. National and international research and regulatory organizations (ICRP, IAEA, WHO, UNSCEAR, OECD/NEA) have given much attention to developing the conceptual and methodological basis of these methods. Some resolutions of the National Commission of Radiological Protection (NCRP) and the Problem Commission on Radiation Hygiene of the USSR Ministry of Health should also be noted. Both CBA (cost-benefit analysis) and other methods of radiation risk analysis and safety management use a system of natural and socio-economic indices characterizing the radiation risk or damage. A number of problems are associated with the introduction, justification and use of these indices. For example, the price, a, of radiation damage, or of a collective dose unit, is a noteworthy index. The difficulties in its qualitative and quantitative determination are still an obstacle to a wide application of CBA in radiation risk analysis and management. Over the past 10-15 years these problems have been a subject of consideration for many authors. The present paper also considers the issues of the qualitative and quantitative justification of the indices of radiation risk analysis

  1. Cardiovascular and heart failure safety profile of vildagliptin: a meta-analysis of 17 000 patients.

    Science.gov (United States)

    McInnes, G; Evans, M; Del Prato, S; Stumvoll, M; Schweizer, A; Lukashevich, V; Shao, Q; Kothny, W

    2015-11-01

    To report the cardiovascular (CV) safety profile and heart failure (HF) risk of vildagliptin from a large pool of studies, including trials in high-risk patients with type 2 diabetes mellitus (T2DM), such as those with congestive HF and/or moderate/severe renal impairment. We conducted a retrospective meta-analysis of prospectively adjudicated CV events. Patient-level data were pooled from 40 double-blind, randomized controlled phase III and IV vildagliptin studies. The primary endpoint was occurrence of major adverse CV events (MACEs; myocardial infarction, stroke and CV death). Assessments of the individual MACE components and HF events (requiring hospitalization or new onset) were secondary endpoints. The risk ratio (RR) of vildagliptin (50 mg once- and twice-daily combined) versus comparators (placebo and all non-vildagliptin treatments) was calculated using the Mantel-Haenszel (M-H) method. Of the 17 446 patients, 9599 received vildagliptin (9251.4 subject-years of exposure) and 7847 received comparators (7317.0 subject-years of exposure). The mean age of the patients was 57 years, body mass index 30.5 kg/m² (nearly 50% obese), glycated haemoglobin concentration 8.1% and T2DM duration 5.5 years. A MACE occurred in 83 (0.86%) vildagliptin-treated patients and 85 (1.20%) comparator-treated patients, with an M-H RR of 0.82 [95% confidence interval (CI) 0.61-1.11]. Similar RRs were observed for the individual events. Confirmed HF events were reported in 41 (0.43%) vildagliptin-treated patients and 32 (0.45%) comparator-treated patients, with an M-H RR 1.08 (95% CI 0.68-1.70). This large meta-analysis indicates that vildagliptin is not associated with an increased risk of adjudicated MACEs relative to comparators. Moreover, this analysis did not find a significant increased risk of HF in vildagliptin-treated patients. © 2015 John Wiley & Sons Ltd.

  2. Oral versus intravenous methylprednisolone for the treatment of multiple sclerosis relapses: A meta-analysis of randomized controlled trials.

    Directory of Open Access Journals (Sweden)

    Shuo Liu

    Full Text Available Intravenous glucocorticoids are recommended for multiple sclerosis (MS). However, they can be inconvenient and expensive. Due to their convenience and low cost, oral glucocorticoids may be an alternative treatment. Recently, several studies have shown that there is no difference in efficacy and safety between oral methylprednisolone (oMP) and intravenous methylprednisolone (ivMP). We sought to assess the clinical efficacy, safety and tolerability of oral methylprednisolone versus intravenous methylprednisolone for MS relapses in this meta-analysis. Randomized controlled trials (RCTs) evaluating the clinical efficacy, safety and tolerability of oral methylprednisolone versus intravenous methylprednisolone for MS relapses were searched in PubMed, Cochrane Library, Medline, EMBASE and China Biology Medicine until October 25, 2016, without language restrictions. The proportion of patients who had improved by day 28 was chosen as the efficacy outcome. We chose the risk ratio (RR) to analyze each trial with the 95% confidence interval (95% CI). We also used the fixed-effects model (Mantel-Haenszel approach) to calculate the pooled relative effect estimates. A total of 5 trials were identified, which included 369 patients. The results of our meta-analysis revealed that no significant difference existed in relapse improvement at day 28 between oMP and ivMP (RR 0.96, 95% CI 0.84 to 1.10). No evidence of heterogeneity existed among the trials (P = 0.45, I2 = 0%). Both treatments were equally safe and well tolerated except that insomnia was more likely to occur in the oMP group compared to the ivMP group. Our meta-analysis reveals strong evidence that oMP is not inferior to ivMP in increasing the proportion of patients experiencing clinical improvement at day 28. In addition, both routes of administration are equally well tolerated and safe. These findings suggest that we may be able to replace ivMP with oMP to treat MS relapses.

  3. Systematic review and meta-analysis of the sero-epidemiological association between Epstein Barr virus and multiple sclerosis.

    Directory of Open Access Journals (Sweden)

    Yahya H Almohmeed

    Full Text Available BACKGROUND: A role for Epstein Barr virus (EBV) in multiple sclerosis (MS) has been postulated. Previous systematic reviews found higher prevalences of anti-EBV antibodies in MS patients compared to controls, but many studies have since been published, and there is a need to apply more rigorous systematic review methods. METHODOLOGY/PRINCIPAL FINDINGS: We examined the link between EBV and MS by conducting a systematic review and meta-analysis of case-control and cohort studies that examined the prevalence of anti-EBV antibodies in the serum of cases and controls. We searched Medline and Embase databases from 1960 to 2012, with no language restriction. The Mantel-Haenszel odds ratios (OR) for anti-EBV antibody sero-positivity were calculated, and meta-analysis conducted. Quality assessment was performed using a modified version of the Newcastle Ottawa scale. Thirty-nine studies were included. Quality assessment found most studies reported acceptable selection and comparability of cases and controls. However, the majority had poor reporting of ascertainment of exposure. Most studies found a higher sero-prevalence of anti-EBNA IgG and anti-VCA IgG in cases compared to controls. The results for anti-EA IgG were mixed, with only half the studies finding a higher sero-prevalence in cases. The meta-analysis showed a significant OR for sero-positivity to anti-EBNA IgG and anti-VCA IgG in MS cases (4.5 [95% confidence interval (CI) 3.3 to 6.6, p<0.00001] and 4.5 [95% CI 2.8 to 7.2, p<0.00001], respectively). However, funnel plot examination suggested publication bias for the reporting of the anti-EBNA IgG. No significant difference in the OR for sero-positivity to anti-EA IgG was found (1.4 [95% CI 0.9 to 2.1, p = 0.09]). CONCLUSION/SIGNIFICANCE: These findings support previous systematic reviews, however publication bias cannot be excluded. The methodological conduct of studies could be improved, particularly with regard to reporting and conduct of

  4. A meta-analysis of the prevalence of toxoplasma gondii igg antibodies in patients with mental disorders

    Directory of Open Access Journals (Sweden)

    Angela Dragomir

    2016-12-01

    Full Text Available OBJECTIVES AND BACKGROUND Toxoplasma gondii infection has been recently associated with schizophrenia and other psychiatric disorders. The aim of the present study was to evaluate the prevalence of T. gondii infection in psychiatric patients by using meta-analytical methods. MATERIALS AND METHODS By systematic research of the PUBMED database, we identified several articles on this issue. We included case-control studies focused on the seroprevalence of T. gondii (IgG) antibodies in patients with psychiatric disorders and healthy controls, published over the past 10 years. The R 3.2.2 free software for statistical computing and graphics was used to perform the meta-analysis. Data were pooled using a random effects model and the Mantel-Haenszel method. RESULTS The PUBMED database showed references to only seven scientific papers that investigated the prevalence of T. gondii IgG antibodies in psychiatric patients. Six hundred seventy-three patients and seven hundred seventy-four controls met the inclusion criteria and were used in our analysis. We found a significant increase of T. gondii IgG antibodies in patients with schizophrenia and other psychiatric disorders compared with controls (41.6% vs 24.54%, OR = 2.16, 95% CI = [1.45-3.21], P = .001). CONCLUSIONS An increased seroprevalence of T. gondii IgG antibodies has been reported in psychiatric patients. Our study suggests that T. gondii infection may be relevant in order to determine and understand the complex etiology of schizophrenia and other psychiatric disorders.

  5. Systematic Review of Infrapopliteal Drug-Eluting Stents: A Meta-Analysis of Randomized Controlled Trials

    Energy Technology Data Exchange (ETDEWEB)

    Katsanos, Konstantinos, E-mail: katsanos@med.upatras.gr [NHS Foundation Trust, King's Health Partners, Department of Interventional Radiology, Guy's and St. Thomas' Hospitals (United Kingdom); Spiliopoulos, Stavros [Patras University Hospital, Department of Interventional Radiology, School of Medicine (Greece); Diamantopoulos, Athanasios [NHS Foundation Trust, King's Health Partners, Department of Interventional Radiology, Guy's and St. Thomas' Hospitals (United Kingdom); Karnabatidis, Dimitris [Patras University Hospital, Department of Interventional Radiology, School of Medicine (Greece); Sabharwal, Tarun [NHS Foundation Trust, King's Health Partners, Department of Interventional Radiology, Guy's and St. Thomas' Hospitals (United Kingdom); Siablis, Dimitris [Patras University Hospital, Department of Interventional Radiology, School of Medicine (Greece)

    2013-06-15

    Introduction: Drug-eluting stents (DES) have been proposed for the treatment of infrapopliteal arterial disease. We performed a systematic review to provide a qualitative analysis and quantitative data synthesis of randomized controlled trials (RCTs) assessing infrapopliteal DES. Materials and Methods: PubMed (Medline), EMBASE (Excerpta Medical Database), AMED (Allied and Complementary Medicine Database), Scopus, CENTRAL (Cochrane Central Register of Controlled Trials), online content, and abstract meetings were searched in September 2012 for eligible RCTs according to the preferred reporting items for systematic reviews and meta-analyses selection process. Risk of bias was assessed using the Cochrane Collaboration's tool. Primary endpoint was primary patency, defined as absence of ≥50 % vessel restenosis at 1 year. Secondary outcome measures included patient survival, limb amputations, change of Rutherford-Becker class, target lesion revascularization (TLR) events, complete wound healing, and event-free survival at 1 year. Risk ratios (RRs) were calculated using the Mantel-Haenszel fixed effects model, and number-needed-to-treat values are reported. Results: Three RCTs involving 501 patients with focal infrapopliteal lesions were analyzed (YUKON-BTX, DESTINY, and ACHILLES trials). All three RCTs included relatively short and focal infrapopliteal lesions. At 1 year, there was clear superiority of infrapopliteal DES compared with control treatments in terms of significantly higher primary patency (80.0 vs. 58.5 %; pooled RR = 1.37, 95 % confidence interval [CI] = 1.18-1.58, p < 0.0001; number-needed-to-treat (NNT) value = 4.8), improvement of Rutherford-Becker class (79.0 vs. 69.6 %; pooled RR = 1.13, 95 % CI = 1.002-1.275, p = 0.045; NNT = 11.1), decreased TLR events (9.9 vs. 22.0 %; pooled RR = 0.45, 95 % CI = 0.28-0.73, p = 0.001; NNT = 8.3), improved wound healing (76.8 vs. 59.7 %; pooled RR = 1.29, 95 % CI = 1.02-1.62, p = 0.04; NNT = 5.9), and better overall
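
    The number-needed-to-treat (NNT) values above follow from the absolute risk difference, NNT = 1 / |risk difference|. A back-of-the-envelope sketch using the crude proportions quoted in the abstract; the published NNTs are based on pooled estimates, so a crude calculation only approximates them:

```python
# Back-of-the-envelope sketch: number needed to treat (NNT) from two event
# proportions, NNT = 1 / |absolute risk difference|. Uses the crude
# proportions quoted above; the trial-level pooled NNTs need not match
# this simple calculation exactly.
def nnt(p_treated: float, p_control: float) -> float:
    return 1.0 / abs(p_treated - p_control)

print(round(nnt(0.800, 0.585), 1))   # primary patency: about 4.7 patients
print(round(nnt(0.099, 0.220), 1))   # TLR events: about 8.3 patients
```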

  6. E-cigarettes and smoking cessation: evidence from a systematic review and meta-analysis.

    Directory of Open Access Journals (Sweden)

    Muhammad Aziz Rahman

    Full Text Available E-cigarettes are currently being debated regarding their possible role in smoking cessation and as they are becoming increasingly popular, the research to date requires investigation. To investigate whether the use of e-cigarettes is associated with smoking cessation or reduction, and whether there is any difference in efficacy of e-cigarettes with and without nicotine on smoking cessation. A systematic review of articles with no limit on publication date was conducted by searching PubMed, Web of Knowledge and Scopus databases. Published studies, those reported smoking abstinence or reduction in cigarette consumption after the use of e-cigarettes, were included. Studies were systematically reviewed, and meta-analyses were conducted using Mantel-Haenszel fixed-effect and random-effects models. Degree of heterogeneity among studies and quality of the selected studies were evaluated. Six studies were included involving 7,551 participants. Meta-analyses included 1,242 participants who had complete data on smoking cessation. Nicotine filled e-cigarettes were more effective for cessation than those without nicotine (pooled Risk Ratio 2.29, 95%CI 1.05-4.97). Amongst 1,242 smokers, 224 (18%) reported smoking cessation after using nicotine-enriched e-cigarettes for a minimum period of six months. Use of such e-cigarettes was positively associated with smoking cessation with a pooled Effect Size of 0.20 (95%CI 0.11-0.28). Use of e-cigarettes was also associated with a reduction in the number of cigarettes used. Included studies were heterogeneous, due to different study designs and gender variation. Whilst we were able to comment on the efficacy of nicotine vs. non-nicotine e-cigarettes for smoking cessation, we were unable to comment on the efficacy of e-cigarettes vs. other interventions for cessation, given the lack of comparator groups in the studies included in this meta-analysis. Use of e-cigarettes is associated with smoking cessation and reduction. More

  7. E-cigarettes and smoking cessation: evidence from a systematic review and meta-analysis.

    Science.gov (United States)

    Rahman, Muhammad Aziz; Hann, Nicholas; Wilson, Andrew; Mnatzaganian, George; Worrall-Carter, Linda

    2015-01-01

    E-cigarettes are currently being debated regarding their possible role in smoking cessation and as they are becoming increasingly popular, the research to date requires investigation. To investigate whether the use of e-cigarettes is associated with smoking cessation or reduction, and whether there is any difference in efficacy of e-cigarettes with and without nicotine on smoking cessation. A systematic review of articles with no limit on publication date was conducted by searching PubMed, Web of Knowledge and Scopus databases. Published studies, those reported smoking abstinence or reduction in cigarette consumption after the use of e-cigarettes, were included. Studies were systematically reviewed, and meta-analyses were conducted using Mantel-Haenszel fixed-effect and random-effects models. Degree of heterogeneity among studies and quality of the selected studies were evaluated. Six studies were included involving 7,551 participants. Meta-analyses included 1,242 participants who had complete data on smoking cessation. Nicotine filled e-cigarettes were more effective for cessation than those without nicotine (pooled Risk Ratio 2.29, 95%CI 1.05-4.97). Amongst 1,242 smokers, 224 (18%) reported smoking cessation after using nicotine-enriched e-cigarettes for a minimum period of six months. Use of such e-cigarettes was positively associated with smoking cessation with a pooled Effect Size of 0.20 (95%CI 0.11-0.28). Use of e-cigarettes was also associated with a reduction in the number of cigarettes used. Included studies were heterogeneous, due to different study designs and gender variation. Whilst we were able to comment on the efficacy of nicotine vs. non-nicotine e-cigarettes for smoking cessation, we were unable to comment on the efficacy of e-cigarettes vs. other interventions for cessation, given the lack of comparator groups in the studies included in this meta-analysis. Use of e-cigarettes is associated with smoking cessation and reduction. More randomised

  8. Mutation analysis of the ERCC4/FANCQ gene in hereditary breast cancer.

    Directory of Open Access Journals (Sweden)

    Sandra Kohlhase

    Full Text Available The ERCC4 protein forms a structure-specific endonuclease involved in the DNA damage response. Different cancer syndromes such as a subtype of Xeroderma pigmentosum, XPF, and recently a subtype of Fanconi Anemia, FA-Q, have been attributed to biallelic ERCC4 gene mutations. To investigate whether monoallelic ERCC4 gene defects play some role in the inherited component of breast cancer susceptibility, we sequenced the whole ERCC4 coding region and flanking untranslated portions in a series of 101 Byelorussian and German breast cancer patients selected for familial disease (set 1, n = 63) or for the presence of the rs1800067 risk haplotype (set 2, n = 38). This study confirmed six known and one novel exonic variants, including four missense substitutions but no truncating mutation. Missense substitution p.R415Q (rs1800067), a previously postulated breast cancer susceptibility allele, was subsequently screened for in a total of 3,698 breast cancer cases and 2,868 controls from Germany, Belarus or Russia. The Gln415 allele appeared protective against breast cancer in the German series, with the strongest effect for ductal histology (OR 0.67; 95% CI 0.49-0.92; p = 0.003), but this association was not confirmed in the other two series, with the combined analysis yielding an overall Mantel-Haenszel OR of 0.94 (95% CI 0.81-1.08). There was no significant effect of p.R415Q on breast cancer survival in the German patient series. The other three detected ERCC4 missense mutations included two known rare variants as well as a novel substitution, p.E17V, that we identified on a p.R415Q haplotype background. The p.E17V mutation is predicted to be probably damaging but was present in just one heterozygous patient. We conclude that the contribution of ERCC4/FANCQ coding mutations to hereditary breast cancer in Central and Eastern Europe is likely to be small.

  9. E-Cigarettes and Smoking Cessation: Evidence from a Systematic Review and Meta-Analysis

    Science.gov (United States)

    Rahman, Muhammad Aziz; Hann, Nicholas; Wilson, Andrew; Mnatzaganian, George; Worrall-Carter, Linda

    2015-01-01

    Background E-cigarettes are currently being debated regarding their possible role in smoking cessation and as they are becoming increasingly popular, the research to date requires investigation. Objectives To investigate whether the use of e-cigarettes is associated with smoking cessation or reduction, and whether there is any difference in efficacy of e-cigarettes with and without nicotine on smoking cessation. Data Sources A systematic review of articles with no limit on publication date was conducted by searching PubMed, Web of Knowledge and Scopus databases. Methods Published studies, those reported smoking abstinence or reduction in cigarette consumption after the use of e-cigarettes, were included. Studies were systematically reviewed, and meta-analyses were conducted using Mantel-Haenszel fixed-effect and random-effects models. Degree of heterogeneity among studies and quality of the selected studies were evaluated. Results Six studies were included involving 7,551 participants. Meta-analyses included 1,242 participants who had complete data on smoking cessation. Nicotine filled e-cigarettes were more effective for cessation than those without nicotine (pooled Risk Ratio 2.29, 95%CI 1.05-4.97). Amongst 1,242 smokers, 224 (18%) reported smoking cessation after using nicotine-enriched e-cigarettes for a minimum period of six months. Use of such e-cigarettes was positively associated with smoking cessation with a pooled Effect Size of 0.20 (95%CI 0.11-0.28). Use of e-cigarettes was also associated with a reduction in the number of cigarettes used. Limitations Included studies were heterogeneous, due to different study designs and gender variation. Whilst we were able to comment on the efficacy of nicotine vs. non-nicotine e-cigarettes for smoking cessation, we were unable to comment on the efficacy of e-cigarettes vs. other interventions for cessation, given the lack of comparator groups in the studies included in this meta-analysis. Conclusions Use of e

  10. Generalized Path Analysis and Generalized Simultaneous Equations Model for Recursive Systems with Responses of Mixed Types

    Science.gov (United States)

    Tsai, Tien-Lung; Shau, Wen-Yi; Hu, Fu-Chang

    2006-01-01

    This article generalizes linear path analysis (PA) and simultaneous equations models (SiEM) to deal with mixed responses of different types in a recursive or triangular system. An efficient instrumental variable (IV) method for estimating the structural coefficients of a 2-equation partially recursive generalized path analysis (GPA) model and…

  11. Impaired glucose tolerance, type 2 diabetes and metabolic syndrome in polycystic ovary syndrome: a systematic review and meta-analysis.

    Science.gov (United States)

    Moran, Lisa J; Misso, Marie L; Wild, Robert A; Norman, Robert J

    2010-01-01

    BACKGROUND Polycystic ovary syndrome (PCOS) is a common condition in reproductive-aged women associated with impaired glucose tolerance (IGT), type 2 diabetes mellitus (DM2) and the metabolic syndrome. METHODS A literature search was conducted (MEDLINE, CINAHL, EMBASE, clinical trial registries and hand-searching) identifying studies reporting prevalence or incidence of IGT, DM2 or metabolic syndrome in women with and without PCOS. Data were presented as odds ratio (OR) [95% confidence interval (CI)] with fixed- and random-effects meta-analysis by Mantel-Haenszel methods. Quality testing was based on Newcastle-Ottawa Scaling and The Cochrane Collaboration's risk of bias assessment tool. Literature searching, data abstraction and quality appraisal were performed by two investigators. RESULTS A total of 2192 studies were reviewed and 35 were selected for final analysis. Women with PCOS had increased prevalence of IGT (OR 2.48, 95% CI 1.63, 3.77; BMI-matched studies OR 2.54, 95% CI 1.44, 4.47), DM2 (OR 4.43, 95% CI 4.06, 4.82; BMI-matched studies OR 4.00, 95% CI 1.97, 8.10) and metabolic syndrome (OR 2.88, 95% CI 2.40, 3.45; BMI-matched studies OR 2.20, 95% CI 1.36, 3.56). One study assessed IGT/DM2 incidence and reported no significant differences in DM2 incidence (OR 2.07, 95% CI 0.68, 6.30). One study assessed conversion from normal glucose tolerance to IGT/DM2 (OR 2.4, 95% CI 0.7, 8.0). No studies reported metabolic syndrome incidence. CONCLUSIONS Women with PCOS had an elevated prevalence of IGT, DM2 and metabolic syndrome in both BMI and non-BMI-matched studies. Few studies have determined IGT/DM2 or metabolic syndrome incidence in women with and without PCOS and further research is required.

  12. How good is endoscopic ultrasound for TNM staging of gastric cancers? A meta-analysis and systematic review

    Institute of Scientific and Technical Information of China (English)

    Srinivas Reddy Puli; Jyotsna Batapati Krishna Reddy; Matthew L Bechtold; Mainor R Antillon; Jamal A Ibdah

    2008-01-01

    AIM: To evaluate the accuracy of endoscopic ultrasound (EUS) for staging of gastric cancers. METHODS: Only EUS studies confirmed by surgery were selected. Only studies from which a 2×2 table could be constructed for true positive, false negative, false positive and true negative values were included. Articles were searched in Medline, Pubmed, Ovid journals, Cumulative Index for Nursing & Allied Health Literature, International Pharmaceutical Abstracts, old Medline, Medline nonindexed citations, and the Cochrane controlled trial registry. Two reviewers independently searched and extracted data. The differences were resolved by mutual agreement. 2×2 tables were constructed with the data extracted from each study. Meta-analysis for the accuracy of EUS was analyzed by calculating pooled estimates of sensitivity, specificity, likelihood ratios, and diagnostic odds ratio. Pooling was conducted by both the Mantel-Haenszel method (fixed effects model) and the DerSimonian-Laird method (random effects model). The heterogeneity of studies was tested using Cochran's Q test based upon inverse variance weights. RESULTS: The initial search identified 1620 reference articles and of these, 376 relevant articles were selected and reviewed. Twenty-two studies (n = 1896) which met the inclusion criteria were included in this analysis. Pooled sensitivity for T1 was 88.1% (95% CI: 84.5-91.1) and for T2 was 82.3% (95% CI: 78.2-86.0). For T3, pooled sensitivity was 89.7% (95% CI: 87.1-92.0). T4 had a pooled sensitivity of 99.2% (95% CI: 97.1-99.9). For nodal staging, the pooled sensitivity for N1 was 58.2% (95% CI: 53.5-62.8) and for N2 was 64.9% (95% CI: 60.8-68.8). Pooled sensitivity to diagnose distant metastasis was 73.2% (95% CI: 63.2-81.7). The P for chi-squared heterogeneity for all the pooled accuracy estimates was >0.10. CONCLUSION: EUS results are more accurate with advanced disease than early disease. If EUS diagnoses advanced disease, such as T4 disease, the patient is 500 times more likely to have true
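
    Each study in this analysis contributes a 2×2 table of EUS staging against the surgical reference standard, from which sensitivity, specificity, likelihood ratios and the diagnostic odds ratio are computed before pooling. A minimal sketch on one hypothetical table (not data from the included studies):

```python
# Minimal sketch of the per-study diagnostic accuracy measures named above
# (sensitivity, specificity, likelihood ratios, diagnostic odds ratio),
# computed from one hypothetical surgery-confirmed 2x2 table. Pooling across
# studies would then use the fixed- or random-effects machinery cited here.
tp, fp, fn, tn = 88, 9, 12, 91      # hypothetical counts for one study

sens = tp / (tp + fn)
spec = tn / (tn + fp)
lr_pos = sens / (1.0 - spec)
lr_neg = (1.0 - sens) / spec
dor = lr_pos / lr_neg               # equivalently (tp * tn) / (fp * fn)

print(f"sensitivity = {sens:.3f}, specificity = {spec:.3f}")
print(f"LR+ = {lr_pos:.2f}, LR- = {lr_neg:.2f}, DOR = {dor:.1f}")
```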

  13. Associations between Methylenetetrahydrofolate Reductase (MTHFR Polymorphisms and Non-Alcoholic Fatty Liver Disease (NAFLD Risk: A Meta-Analysis.

    Directory of Open Access Journals (Sweden)

    Man-Yi Sun

    Full Text Available C677T and A1298C are the most common allelic variants of the Methylenetetrahydrofolate Reductase (MTHFR) gene. The association between MTHFR polymorphisms and the occurrence of non-alcoholic fatty liver disease (NAFLD) remains controversial. This study was thus performed to examine whether MTHFR mutations are associated with the susceptibility to NAFLD. A first meta-analysis on the association between the MTHFR polymorphisms and NAFLD risks was carried out via Review Manager 5.0 and Stata/SE 12.0 software. The on-line databases, such as PubMed, EMBASE, CENTRAL, WOS, Scopus and EBSCOhost (updated to April 1st, 2016), were searched for eligible case-control studies. The odds ratio (OR), 95% confidence interval (CI) and P value were calculated through Mantel-Haenszel statistics under a random- or fixed-effect model. Eight articles (785 cases and 1188 controls) contributed data to the current meta-analysis. For C677T, increased NAFLD risks were observed in the case group under the homozygote model (T/T vs C/C, OR = 1.49, 95% CI = 1.03~2.15, P = 0.04) and recessive model (T/T vs C/C+C/T, OR = 1.42, 95% CI = 1.07~1.88, P = 0.02), but not the other genetic models, compared with the control group. For A1298C, significantly increased NAFLD risks were detected in the allele model (C vs A, OR = 1.53, 95% CI = 1.13~2.07, P = 0.006), homozygote model (C/C vs A/A, OR = 2.81, 95% CI = 1.63~4.85, P = 0.0002), dominant model (A/C+C/C vs A/A, OR = 1.60, 95% CI = 1.06~2.41, P = 0.03) and recessive model (C/C vs A/A+A/C, OR = 2.08, 95% CI = 1.45~3.00, P<0.0001), but not the heterozygote model. The T/T genotype of the MTHFR C677T polymorphism and the C/C genotype of MTHFR A1298C are more likely to be associated with the susceptibility to NAFLD.

  14. Associations between Methylenetetrahydrofolate Reductase (MTHFR) Polymorphisms and Non-Alcoholic Fatty Liver Disease (NAFLD) Risk: A Meta-Analysis

    Science.gov (United States)

    Sun, Man-Yi; Zhang, Li; Shi, Song-Li; Lin, Jing-Na

    2016-01-01

    Background C677T and A1298C are the most common allelic variants of the Methylenetetrahydrofolate Reductase (MTHFR) gene. The association between MTHFR polymorphisms and the occurrence of non-alcoholic fatty liver disease (NAFLD) remains controversial. This study was thus performed to examine whether MTHFR mutations are associated with the susceptibility to NAFLD. Methods A first meta-analysis on the association between the MTHFR polymorphisms and NAFLD risks was carried out via Review Manager 5.0 and Stata/SE 12.0 software. The on-line databases, such as PubMed, EMBASE, CENTRAL, WOS, Scopus and EBSCOhost (updated to April 1st, 2016), were searched for eligible case-control studies. The odds ratio (OR), 95% confidence interval (CI) and P value were calculated through Mantel-Haenszel statistics under a random- or fixed-effect model. Results Eight articles (785 cases and 1188 controls) contributed data to the current meta-analysis. For C677T, increased NAFLD risks were observed in the case group under the homozygote model (T/T vs C/C, OR = 1.49, 95% CI = 1.03~2.15, P = 0.04) and recessive model (T/T vs C/C+C/T, OR = 1.42, 95% CI = 1.07~1.88, P = 0.02), but not the other genetic models, compared with the control group. For A1298C, significantly increased NAFLD risks were detected in the allele model (C vs A, OR = 1.53, 95% CI = 1.13~2.07, P = 0.006), homozygote model (C/C vs A/A, OR = 2.81, 95% CI = 1.63~4.85, P = 0.0002), dominant model (A/C+C/C vs A/A, OR = 1.60, 95% CI = 1.06~2.41, P = 0.03) and recessive model (C/C vs A/A+A/C, OR = 2.08, 95% CI = 1.45~3.00, P<0.0001), but not the heterozygote model. Conclusion The T/T genotype of the MTHFR C677T polymorphism and the C/C genotype of MTHFR A1298C are more likely to be associated with the susceptibility to NAFLD. PMID:27128842
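
    The genetic contrasts above (allele, homozygote, dominant and recessive models) are different ways of collapsing the same genotype counts into 2×2 tables before an odds ratio is computed. A minimal sketch with hypothetical A1298C genotype counts (not the pooled data of this meta-analysis):

```python
# Minimal sketch of how genotype counts (hypothetical) collapse into 2x2
# tables under the genetic models named above (allele, homozygote, dominant,
# recessive) before odds ratios are computed.
def odds_ratio(a, b, c, d):
    return (a * d) / (b * c)

# Hypothetical A1298C genotype counts ordered as (A/A, A/C, C/C)
cases    = (40, 45, 15)
controls = (70, 45, 5)

aa, ac, cc = cases
AA, AC, CC = controls

models = {
    # exposed vs unexposed counts for cases, then controls
    "allele (C vs A)":         (2 * cc + ac, 2 * aa + ac, 2 * CC + AC, 2 * AA + AC),
    "homozygote (CC vs AA)":   (cc, aa, CC, AA),
    "dominant (AC+CC vs AA)":  (ac + cc, aa, AC + CC, AA),
    "recessive (CC vs AA+AC)": (cc, aa + ac, CC, AA + AC),
}
for name, (a, b, c, d) in models.items():
    print(f"{name}: OR = {odds_ratio(a, b, c, d):.2f}")
```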

  15. Patients Receiving Prebiotics and Probiotics Before Liver Transplantation Develop Fewer Infections Than Controls: A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Sawas, Tarek; Al Halabi, Shadi; Hernaez, Ruben; Carey, William D; Cho, Won Kyoo

    2015-09-01

    Among patients who have received liver transplants, infections increase morbidity and mortality and prolong hospital stays. Administration of antibiotics and surgical trauma create intestinal barrier dysfunction and microbial imbalances that allow enteric bacteria to translocate to the blood. Probiotics are believed to prevent bacterial translocation by stabilizing the intestinal barrier and stimulating proliferation of the intestinal epithelium, mucus secretion, and motility. We performed a meta-analysis to determine the effects of probiotics on infections in patients receiving liver transplants. We searched PubMed and EMBASE for controlled trials that evaluated the effects of prebiotics and probiotics on infections in patients who underwent liver transplantation. Heterogeneity was analyzed by the Cochran Q statistic. Pooled Mantel-Haenszel relative risks were calculated with a fixed-effects model. We identified 4 controlled studies, comprising 246 participants (123 received probiotics, 123 served as controls), for inclusion in the meta-analysis. In these studies, the intervention groups received enteric nutrition and fiber (prebiotics) with probiotics, and the control groups received only enteric nutrition and fiber without probiotics. The infection rate was 7% in groups that received probiotics vs 35% in control groups (relative risk [RR], 0.21; 95% confidence interval [CI], 0.11-0.41; P = .001). The number needed to treat to prevent 1 infection was 3.6. In subgroup analyses, only 2% of subjects in the probiotic groups developed urinary tract infections, compared with 16% of controls (RR, 0.14; 95% CI, 0.04-0.47; P probiotic groups developed intra-abdominal infections, compared with 11% of controls (RR, 0.27; 95% CI, 0.09-0.78; P = .02). Subjects receiving probiotics also had shorter stays in the hospital than controls (mean difference, 1.41 d; P probiotics and prebiotics before, or on the day of, liver transplantation reduces the rate of infection after

  16. Characteristics and heterogeneity of schizoaffective disorder compared with unipolar depression and schizophrenia - a systematic literature review and meta-analysis.

    Science.gov (United States)

    Rink, Lena; Pagel, Tobias; Franklin, Jeremy; Baethge, Christopher

    2016-02-01

    Comparisons of illness characteristics between patients with schizoaffective disorder (SAD) and patients with unipolar depression (UD) are rare, even though UD is one of the most important differential diagnoses of SAD. Also, the variability of illness characteristics (heterogeneity) has not been compared. We compared illness characteristics and their heterogeneity among SAD, UD, and - as another important differential diagnosis - schizophrenia (S). In order to reduce sampling bias we systematically searched for studies simultaneously comparing samples of patients with SAD, UD, and S. Using random effects and Mantel-Haenszel models we estimated and compared demographic, illness course and psychopathology parameters, using pooled standard deviations as a measurement of heterogeneity. Out of 155 articles found by an earlier meta-analysis, 765 screened in Medline, 2738 screened in EMBASE, and 855 screened in PsycINFO, we selected 24 studies, covering 3714 patients diagnosed according to RDC, DSM-III, DSM-IIIR, DSM-IV, or ICD-10. In almost all key characteristics, samples with schizoaffective disorders fell between unipolar depression and schizophrenia, with a tendency towards schizophrenia. On average, UD patients were significantly older at illness onset (33.0 years, SAD: 25.2, S: 23.4), more often women (59% vs. 57% vs. 39%) and more often married (53% vs. 39% vs. 27%). Their psychopathology was also less severe, as measured by BPRS, GAS, and HAMD. In demographic and clinical variables heterogeneity was roughly 5% larger in UD than in SAD, and samples of patients with schizophrenia had the lowest pooled heterogeneity. A similar picture emerged in a sensitivity analysis with coefficient of variation as the measurement of heterogeneity. Relative to bipolar disorder there are fewer studies including unipolar patients. No studies based on DSM-5 could be included. Regarding unipolar affective disorder this study confirms what we have shown for bipolar disorders in earlier

  17. Morphosyntactic Neural Analysis for Generalized Lexical Normalization

    Science.gov (United States)

    Leeman-Munk, Samuel Paul

    2016-01-01

    The phenomenal growth of social media, web forums, and online reviews has spurred a growing interest in automated analysis of user-generated text. At the same time, a proliferation of voice recordings and efforts to archive culture heritage documents are fueling demand for effective automatic speech recognition (ASR) and optical character…

  18. General principles of neutron activation analysis

    International Nuclear Information System (INIS)

    Dostal, J.; Elson, C.

    1980-01-01

    Aspects of the principles of atomic and nuclear structure and the processes of radioactivity, nuclear transformation, and the interaction of radiations with matter which are of direct relevance to neutron activation analysis and its application to geologic materials are discussed. (L.L.)

  19. General analysis of HERA II data

    International Nuclear Information System (INIS)

    Schoening, A

    2008-01-01

    A model-independent search for deviations from the Standard Model prediction is performed in e±p collisions. Data collected in the years 2003-2007, corresponding to an integrated luminosity of about 340 pb-1, are analyzed. All event topologies involving isolated electrons, photons, muons, neutrinos and jets with high transverse momenta are investigated in a single analysis. Events are assigned to exclusive classes according to their final state. A statistical algorithm is applied to search for deviations from the Standard Model in the distributions of the scalar sum of transverse momenta or invariant mass of final state particles and to quantify their significance. A good agreement with the Standard Model prediction is observed in most of the event classes. No significant deviation is observed in the phase-space and in the event topologies covered by this analysis

  20. Generalized Fluid System Simulation Program (GFSSP) Version 6 - General Purpose Thermo-Fluid Network Analysis Software

    Science.gov (United States)

    Majumdar, Alok; Leclair, Andre; Moore, Ric; Schallhorn, Paul

    2011-01-01

    GFSSP stands for Generalized Fluid System Simulation Program. It is a general-purpose computer program to compute pressure, temperature and flow distribution in a flow network. GFSSP calculates pressure, temperature, and concentrations at nodes and calculates flow rates through branches. It was primarily developed to analyze internal flows of a turbopump and transient flows of a propulsion system. GFSSP development started in 1994 with the objective of providing a generalized and easy-to-use flow analysis tool for thermo-fluid systems.

  1. Basic general concepts in the network analysis

    Directory of Open Access Journals (Sweden)

    Boja Nicolae

    2004-01-01

    Full Text Available This survey is concerned with the study of those types of material networks which are met in civil engineering as well as in electrotechnics, mechanics, and hydrotechnics, and whose behavior leads to linear problems solvable by means of the Finite Element Method and adequate algorithms. A unitary theory of the networks met in the domains mentioned above is presented and illustrated with examples of structural networks in civil engineering, electric circuits, and water supply networks; planar or spatial mechanisms can also be covered by this theory. Attention is focused on making evident the essential properties and concepts of network analysis which differentiate networks under force from other types of material networks. With such a network a planar, connected, and directed or undirected graph is associated, and this graph is endowed with some vector fields on its vertex set.

  2. Analysis of 162 colon injuries in patients with penetrating abdominal trauma: concomitant stomach injury results in a higher rate of infection.

    Science.gov (United States)

    O'Neill, Patricia A; Kirton, Orlando C; Dresner, Lisa S; Tortella, Bartholomew; Kestner, Mark M

    2004-02-01

    Fecal contamination from colon injury has been thought to be the most significant factor for the development of surgical site infection (SSI) after trauma. However, there are increasing data to suggest that other factors may play a role in the development of postinjury infection in patients after colon injury. The purpose of this study was to determine the impact of gastric wounding on the development of SSI and nonsurgical site infection (NSSI) in patients with colon injury. Post hoc analysis was performed on data prospectively collected for 317 patients presenting with penetrating hollow viscus injury. One hundred sixty-two patients with colon injury were subdivided into one of three groups: patients with isolated colon wounds (C), patients with colon and stomach wounds with or without other organ injury (C+S), and patients with colon and other organ injury but no stomach injury (C-S) and assessed for the development of SSI and NSSI. Infection rates were also determined for patients who sustained isolated gastric injury (S) and gastric injury in combination with other injuries other than colon (S-C). Penetrating Abdominal Trauma Index, operative times, and transfusion were assessed. Discrete variables were analyzed by Cochran-Mantel-Haenszel chi2 test and Fisher's exact test. Risk factor analysis was performed by multivariate logistic regression. C+S patients had a higher rate of SSI infection (31%) than C patients (3.6%) (p=0.008) and C-S patients (13%) (p=0.021). Similarly, the incidence of NSSI was also significantly greater in the C+S group (37%) compared with the C patients (7.5%) (p=0.07) and the C-S patients (17%) (p=0.019). There was no difference in the rate of SSI or NSSI between the C and C-S groups (p=0.3 and p=0.24, respectively). The rate of SSI was significantly greater in the C+S patients when compared with the S-C patients (31% vs. 10%, p=0.008), but there was no statistical difference in the rate of NSSI in the C+S group and the S-C group (37
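
    The Cochran-Mantel-Haenszel test used above compares a binary outcome between two groups while stratifying on a third factor. A minimal sketch of that test on hypothetical stratified 2×2 counts (illustrative only, not this study's data):

```python
# Minimal sketch of the Cochran-Mantel-Haenszel chi-squared test (1 df,
# with continuity correction) on hypothetical stratified 2x2 tables; these
# counts are illustrative, not the study's data.
import math

# Each stratum: (a, b, c, d) with rows = injury group, columns = infection yes/no.
strata = [(16, 35, 5, 50),
          (9, 28, 4, 46)]

num = var = 0.0
for a, b, c, d in strata:
    N = a + b + c + d
    r1, r2 = a + b, c + d
    c1, c2 = a + c, b + d
    num += a - r1 * c1 / N
    var += r1 * r2 * c1 * c2 / (N**2 * (N - 1))

cmh = (abs(num) - 0.5) ** 2 / var
p_value = math.erfc(math.sqrt(cmh / 2.0))   # survival function of chi^2 with 1 df
print(f"CMH chi-squared = {cmh:.2f}, p = {p_value:.4f}")
```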

  3. Prophylactic therapy with omeprazole for prevention of equine gastric ulcer syndrome (EGUS) in horses in active training: A meta-analysis.

    Science.gov (United States)

    Mason, L V; Moroney, J R; Mason, R J

    2018-04-17

    Guidelines regarding the impact and value of prophylaxis or maintenance therapy in equine gastric ulcer syndrome (EGUS) are not well-established or defined. The merits and the magnitude of effects of prophylaxis for spontaneous or recurrent squamous gastric ulceration in horses in training are uncertain. To pool data from randomised controlled trials (RCTs) to eliminate reporting bias and evaluate the efficacy of prophylactic omeprazole in the prevention of EGUS in training horses, and secondarily to compare prophylactic dosages of omeprazole. Meta-analysis. This meta-analysis was conducted according to the recommendations of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. A systematic literature search identified RCTs comparing omeprazole prophylaxis with sham in prevention of EGUS. Data were analysed using the Mantel-Haenszel test method to calculate risk ratio (RR) or mean difference (MD) with 95% confidence intervals (CIs). Primary outcome was efficacy of prophylaxis. Secondary outcome was endoscopic severity of ulceration. The influence of study characteristics on the outcomes was examined by subgroup analyses. In preventing gastric ulcer occurrence, omeprazole prophylaxis was superior to sham in training horses (7 trials, 566 horses, RR 0.28, 95% CI 0.18-0.43; 23.4% in omeprazole prophylaxis vs. 77.2% in sham; high quality evidence). Prevalence of ulceration was 75.3 and 87.2% in the sham arms of the 1 mg/kg and 2 mg/kg omeprazole groups, respectively. Severity scores were significantly lower for omeprazole vs. sham (mean difference [MD] -1.05; 95% CI -1.35 to -0.69). Subgroup analyses comparing prophylactic omeprazole dosages resulted in a mean difference of -0.94 and -1.60 for the 1 and 2 mg/kg groups, respectively. Studies showed heterogeneity with regard to prophylactic dose. Omeprazole prophylaxis in active training horses significantly reduces gastric ulceration compared with no prophylaxis (sham) with the
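
    The pooled risk ratios above come from the Mantel-Haenszel fixed-effects method. A minimal sketch of the Mantel-Haenszel pooled risk ratio, with a Greenland-Robins-type variance for its logarithm, on hypothetical per-trial counts (not the trials pooled in this meta-analysis):

```python
# Minimal sketch of a Mantel-Haenszel fixed-effects pooled risk ratio with a
# Greenland-Robins variance estimate for the log risk ratio. The per-trial
# counts are hypothetical placeholders, not data from the trials above.
import math

# (events_treated, n_treated, events_control, n_control) per trial
trials = [(18, 80, 55, 75),
          (25, 120, 90, 118),
          (10, 60, 40, 62)]

R = S = P = 0.0
for a, n1, c, n2 in trials:
    N = n1 + n2
    R += a * n2 / N
    S += c * n1 / N
    P += (n1 * n2 * (a + c) - a * c * N) / N**2

rr_mh = R / S
se_log_rr = math.sqrt(P / (R * S))
lo = math.exp(math.log(rr_mh) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr_mh) + 1.96 * se_log_rr)
print(f"MH RR = {rr_mh:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```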

  4. Coronary Computed Tomography Angiography vs Functional Stress Testing for Patients With Suspected Coronary Artery Disease: A Systematic Review and Meta-analysis.

    Science.gov (United States)

    Foy, Andrew J; Dhruva, Sanket S; Peterson, Brandon; Mandrola, John M; Morgan, Daniel J; Redberg, Rita F

    2017-11-01

    Coronary computed tomography angiography (CCTA) is a new approach for the diagnosis of anatomical coronary artery disease (CAD), but it is unclear how CCTA performs compared with the standard approach of functional stress testing. To compare the clinical effectiveness of CCTA with that of functional stress testing for patients with suspected CAD. A systematic literature search was conducted in PubMed and MEDLINE for English-language randomized clinical trials of CCTA published from January 1, 2000, to July 10, 2016. Researchers selected randomized clinical trials that compared a primary strategy of CCTA with that of functional stress testing for patients with suspected CAD and reported data on patient clinical events and changes in therapy. Two reviewers independently extracted data from and assessed the quality of the trials. This analysis followed the PRISMA statement for reporting systematic reviews and meta-analyses and used the Cochrane Collaboration's tool for assessing risk of bias in randomized trials. The Mantel-Haenszel method was used to conduct the primary analysis. Summary relative risks were calculated with a random-effects model. The outcomes of interest were all-cause mortality, cardiac hospitalization, myocardial infarction, invasive coronary angiography, coronary revascularization, new CAD diagnoses, and change in prescription for aspirin and statins. Thirteen trials were included, with 10 315 patients in the CCTA arm and 9777 patients in the functional stress testing arm who were followed up for a mean duration of 18 months. There were no statistically significant differences between CCTA and functional stress testing in death (1.0% vs 1.1%; risk ratio [RR], 0.93; 95% CI, 0.71-1.21) or cardiac hospitalization (2.7% vs 2.7%; RR, 0.98; 95% CI, 0.79-1.21), but CCTA was associated with a reduction in the incidence of myocardial infarction (0.7% vs 1.1%; RR, 0.71; 95% CI, 0.53-0.96). Patients undergoing CCTA were significantly more likely to undergo
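
    The random-effects summary relative risks reported here are commonly obtained with a DerSimonian-Laird estimator; whether this review used that exact estimator is not stated, so the sketch below is illustrative only and is applied to hypothetical per-trial log risk ratios and variances:

```python
# Minimal sketch of DerSimonian-Laird random-effects pooling (as contrasted
# with the Mantel-Haenszel fixed-effects analysis named above), applied to
# hypothetical per-trial log risk ratios and their variances.
import math

log_rr = [-0.35, -0.10, -0.52, -0.20]   # placeholder trial estimates
var    = [0.04, 0.09, 0.06, 0.05]       # placeholder within-trial variances

w_fixed = [1.0 / v for v in var]
y_fixed = sum(w * y for w, y in zip(w_fixed, log_rr)) / sum(w_fixed)

# Cochran's Q and the method-of-moments between-trial variance tau^2.
Q = sum(w * (y - y_fixed) ** 2 for w, y in zip(w_fixed, log_rr))
k = len(log_rr)
c = sum(w_fixed) - sum(w * w for w in w_fixed) / sum(w_fixed)
tau2 = max(0.0, (Q - (k - 1)) / c)

w_rand = [1.0 / (v + tau2) for v in var]
y_rand = sum(w * y for w, y in zip(w_rand, log_rr)) / sum(w_rand)
se = math.sqrt(1.0 / sum(w_rand))
print(f"Q = {Q:.2f}, tau^2 = {tau2:.3f}")
print(f"pooled RR = {math.exp(y_rand):.2f} "
      f"(95% CI {math.exp(y_rand - 1.96 * se):.2f}-{math.exp(y_rand + 1.96 * se):.2f})")
```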

  5. The Safety of Artemisinin Derivatives for the Treatment of Malaria in the 2nd or 3rd Trimester of Pregnancy: A Systematic Review and Meta-Analysis.

    Directory of Open Access Journals (Sweden)

    Stephanie D Kovacs

    Full Text Available Given the high morbidity for mother and fetus associated with malaria in pregnancy, safe and efficacious drugs are needed for treatment. Artemisinin derivatives are the most effective antimalarials, but are associated with teratogenic and embryotoxic effects in animal models when used in early pregnancy. However, several organ systems are still under development later in pregnancy. We conducted a systematic review and meta-analysis of the occurrence of adverse pregnancy outcomes among women treated with artemisinins as monotherapy or as artemisinin-based combination therapy during the 2nd or 3rd trimesters relative to pregnant women who received non-artemisinin antimalarials or none at all. Pooled odds ratios (POR) were calculated using a Mantel-Haenszel fixed effects model with a 0.5 continuity correction for zero events. Eligible studies were identified through Medline, Embase, and the Malaria in Pregnancy Consortium Library. Twenty studies (11 cohort studies and 9 randomized controlled trials) contributed to the analysis, with 3,707 women receiving an artemisinin, 1,951 a non-artemisinin antimalarial, and 13,714 no antimalarial. The PORs (95% confidence interval (CI)) for stillbirth, fetal loss, and congenital anomalies when comparing artemisinin versus quinine were 0.49 (95% CI 0.24-0.97, I2 = 0%, 3 studies), 0.58 (95% CI 0.31-1.16, I2 = 0%, 6 studies), and 1.00 (95% CI 0.27-3.75, I2 = 0%, 3 studies), respectively. The PORs comparing artemisinin users to pregnant women who received no antimalarial were 1.13 (95% CI 0.77-1.66, I2 = 86.7%, 3 studies), 1.10 (95% CI 0.79-1.54, I2 = 0%, 4 studies), and 0.79 (95% CI 0.37-1.67, I2 = 0%, 3 studies) for miscarriage, stillbirth and congenital anomalies, respectively. Treatment with an artemisinin in the 2nd and 3rd trimester was not associated with increased risks of congenital malformations or miscarriage and may have been associated with a reduced risk of stillbirths compared to quinine. This study updates the reviews
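
    The 0.5 continuity correction mentioned above simply adds 0.5 to every cell of a 2×2 table that contains a zero before the odds ratio is computed. A minimal sketch with hypothetical counts (not taken from this review):

```python
# Minimal sketch: a 0.5 continuity correction for a 2x2 table containing a
# zero cell, as is conventional before computing an odds ratio. Counts are
# hypothetical, not taken from the review.
def odds_ratio(a, b, c, d, correction=0.5):
    if 0 in (a, b, c, d):
        a, b, c, d = (x + correction for x in (a, b, c, d))
    return (a * d) / (b * c)

# e.g. 0 stillbirths among 150 treated vs 4 among 148 comparator patients
print(round(odds_ratio(0, 150, 4, 144), 2))
```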

  6. Use of probiotics in the treatment of severe acute pancreatitis: a systematic review and meta-analysis of randomized controlled trials

    Science.gov (United States)

    2014-01-01

    Introduction Necrotic tissue infection can worsen the prognosis of severe acute pancreatitis (SAP), and probiotics have been shown to be beneficial in reducing the infection rate in animal experiments and primary clinical trials. However, the results of multicenter randomized clinical trials have been contradictory. Our aim in this study was to systematically review and quantitatively analyze all randomized controlled trials with regard to important outcomes in patients with predicted SAP who received probiotics. Methods A systematic literature search of the PubMed, Embase and Cochrane Library databases was conducted using specific search terms. Eligible studies were randomized controlled trials that compared the effects of probiotic with placebo treatment in patients with predicted SAP. Mean difference (MD), risk ratio (RR) and 95% confidence interval (95% CI) were calculated using the Mantel-Haenszel fixed- and random-effects models. A meta-analysis on the use of probiotics in the treatment of critically ill patients was also performed to serve as a reference. Results In this study, 6 trials comprising an aggregate total of 536 patients were analyzed. Significant heterogeneities were observed in the type, dose, treatment duration and clinical effects of probiotics in these trials. Systematic analysis showed that probiotics did not significantly affect the pancreatic infection rate (RR = 1.19, 95% CI = 0.74 to 1.93; P = 0.47), total infections (RR = 1.09, 95% CI = 0.80 to 1.48; P = 0.57), operation rate (RR = 1.42, 95% CI = 0.43 to 3.47; P = 0.71), length of hospital stay (MD = 2.45, 95% CI = −2.71 to 7.60; P = 0.35) or mortality (RR = 0.72, 95% CI = 0.42 to 1.45; P = 0.25). Conclusions Probiotics showed neither beneficial nor adverse effects on the clinical outcomes of patients with predicted SAP. However, significant heterogeneity was noted between the trials reviewed with regard to the type, dose and

  7. Discourse analysis in general practice: a sociolinguistic approach.

    Science.gov (United States)

    Nessa, J; Malterud, K

    1990-06-01

    It is a simple but important fact that as general practitioners we talk to our patients. The quality of the conversation is of vital importance for the outcome of the consultation. The purpose of this article is to discuss a methodological tool borrowed from sociolinguistics--discourse analysis. To assess the suitability of this method for analysis of general practice consultations, the authors have performed a discourse analysis of one single consultation. Our experiences are presented here.

  8. Systematic review and meta-analysis of serious infections with tofacitinib and biologic disease-modifying antirheumatic drug treatment in rheumatoid arthritis clinical trials.

    Science.gov (United States)

    Strand, Vibeke; Ahadieh, Sima; French, Jonathan; Geier, Jamie; Krishnaswami, Sriram; Menon, Sujatha; Checchio, Tina; Tensfeldt, Thomas G; Hoffman, Elaine; Riese, Richard; Boy, Mary; Gómez-Reino, Juan J

    2015-12-15

    Tofacitinib is an oral Janus kinase inhibitor for the treatment of rheumatoid arthritis (RA). Tofacitinib modulates the signaling of cytokines that are integral to lymphocyte activation, proliferation, and function. Thus, tofacitinib therapy may result in suppression of multiple elements of the immune response. Serious infections have been reported in tofacitinib RA trials. However, limited head-to-head comparator data were available within the tofacitinib RA development program to directly compare rates of serious infections with tofacitinib relative to biologic agents, and specifically adalimumab (employed as an active control agent in two randomized controlled trials of tofacitinib). A systematic literature search of data from interventional randomized controlled trials and long-term extension studies with biologics in RA was carried out. Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) consensus was followed for reporting results of the review and meta-analysis. Incidence rates (unique patients with events/100 patient-years) for each therapy were estimated based on data from randomized controlled trials and long-term extension studies using a random-effects model. Relative and absolute risk comparisons versus placebo used Mantel-Haenszel methods. The search produced 657 hits. In total, 66 randomized controlled trials and 22 long-term extension studies met the selection criteria. Estimated incidence rates (95% confidence intervals [CIs]) for abatacept, rituximab, tocilizumab, and tumor necrosis factor inhibitors were 3.04 (2.49, 3.72), 3.72 (2.99, 4.62), 5.45 (4.26, 6.96), and 4.90 (4.41, 5.44), respectively. Incidence rates (95% CIs) for tofacitinib 5 and 10 mg twice daily (BID) in phase 3 trials were 3.02 (2.25, 4.05) and 3.00 (2.24, 4.02), respectively. Corresponding incidence rates in long-term extension studies were 2.50 (2.05, 3.04) and 3.19 (2.74, 3.72). The risk ratios (95% CIs) versus placebo for tofacitinib 5 and 10 mg BID
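
    The incidence rates above are expressed as unique patients with events per 100 patient-years. A minimal sketch of that calculation, with an approximate log-normal 95% CI and illustrative numbers (not the trial data summarized here):

```python
# Minimal sketch: an incidence rate per 100 patient-years with an approximate
# 95% CI from a log-normal approximation, exp(ln(rate) +/- 1.96/sqrt(events)).
# Event and exposure numbers are illustrative only.
import math

def incidence_rate(events: int, patient_years: float):
    rate = 100.0 * events / patient_years
    half_width = 1.96 / math.sqrt(events)
    return rate, rate * math.exp(-half_width), rate * math.exp(half_width)

rate, lo, hi = incidence_rate(events=120, patient_years=4000.0)
print(f"{rate:.2f} per 100 patient-years (95% CI {lo:.2f}-{hi:.2f})")
```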

  9. Efficacy and Safety of Metronidazole Monotherapy versus Vancomycin Monotherapy or Combination Therapy in Patients with Clostridium difficile Infection: A Systematic Review and Meta-Analysis.

    Directory of Open Access Journals (Sweden)

    Rui Li

    Full Text Available Clostridium difficile infection (CDI) has become a global epidemiological problem for both hospitalized patients and outpatients. The most commonly used drugs to treat CDI are metronidazole and vancomycin. The aim of this study was to compare the efficacy and safety of metronidazole monotherapy with vancomycin monotherapy and combination therapy in CDI patients. A comprehensive search without publication status or other restrictions was conducted. Studies comparing metronidazole monotherapy with vancomycin monotherapy or combination therapy in patients with CDI were considered eligible. Meta-analysis was performed using the Mantel-Haenszel fixed-effects model, and odds ratios (ORs) with 95% confidence intervals (95% CIs) were calculated and reported. Of the 1910 records identified, seventeen studies from thirteen articles (n = 2501 patients) were included. No statistically significant difference in the rate of clinical cure was found between metronidazole and vancomycin for mild CDI (OR = 0.67, 95% CI (0.45, 1.00), p = 0.05) or between either monotherapy and combination therapy for CDI (OR = 1.07, 95% CI (0.58, 1.96), p = 0.83); however, the rate of clinical cure was lower for metronidazole than for vancomycin for severe CDI (OR = 0.46, 95% CI (0.26, 0.80), p = 0.006). No statistically significant difference in the rate of CDI recurrence was found between metronidazole and vancomycin for mild CDI (OR = 0.99, 95% CI (0.40, 2.45), p = 0.98) or severe CDI (OR = 0.98, 95% CI (0.63, 1.53), p = 0.94) or between either monotherapy and combination therapy for CDI (OR = 0.91, 95% CI (0.66, 1.26), p = 0.56). In addition, there was no significant difference in the rate of adverse events (AEs) between metronidazole and vancomycin (OR = 1.18, 95% CI (0.80, 1.74), p = 0.41). In contrast, the rate of AEs was significantly lower for either monotherapy than for combination therapy (OR = 0.30, 95% CI (0.17, 0.51), p < 0.0001). Metronidazole and vancomycin are equally effective for the

  10. MAGMA: generalized gene-set analysis of GWAS data.

    NARCIS (Netherlands)

    de Leeuw, C.A.; Mooij, J.M.; Heskes, T.; Posthuma, D.

    2015-01-01

    By aggregating data for complex traits in a biologically meaningful way, gene and gene-set analysis constitute a valuable addition to single-marker analysis. However, although various methods for gene and gene-set analysis currently exist, they generally suffer from a number of issues. Statistical

  11. MAGMA: Generalized Gene-Set Analysis of GWAS Data

    NARCIS (Netherlands)

    de Leeuw, C.A.; Mooij, J.M.; Heskes, T.; Posthuma, D.

    2015-01-01

    By aggregating data for complex traits in a biologically meaningful way, gene and gene-set analysis constitute a valuable addition to single-marker analysis. However, although various methods for gene and gene-set analysis currently exist, they generally suffer from a number of issues. Statistical

  12. Parallel Enhancements of the General Mission Analysis Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The General Mission Analysis Tool (GMAT) is a state of the art spacecraft mission design tool under active development at NASA's Goddard Space Flight Center (GSFC)....

  13. Systematic review and meta-analysis of L1-VLP-based human papillomavirus vaccine efficacy against anogenital pre-cancer in women with evidence of prior HPV exposure.

    Directory of Open Access Journals (Sweden)

    Ada Miltz

    Full Text Available BACKGROUND: It is unclear whether L1-VLP-based human papillomavirus (HPV) vaccines are efficacious in reducing the likelihood of anogenital pre-cancer in women with evidence of prior vaccine-type HPV exposure. This study aims to determine whether the combined results of the vaccine trials published to date provide evidence of efficacy compared with control (hepatitis A vaccine/placebo). METHODS: A systematic review and meta-analysis was conducted. Randomized-controlled trials (RCTs) were identified from MEDLINE, Embase, Web of Science, PubMed, Cochrane Central Register of Controlled Trials and references of identified studies. The bivalent vaccine containing HPV-16 and 18 VLPs from GlaxoSmithKline Biologicals (Rixenstart, Belgium), the quadrivalent vaccine containing HPV-6, 11, 16, and 18 VLPs from Merck & Co., Inc. (Whitehouse Station, NJ, USA), and the HPV-16 monovalent vaccine from Merck Research Laboratories (West Point, PA, USA) were evaluated. FINDINGS: Three RCT reports and two post-trial cohort studies were eligible, comprising data from 13,482 women who were included in the vaccine studies but had evidence of HPV infection at study entry. Data on efficacy were synthesized using the Mantel-Haenszel weighted fixed-effect approach or, where there was heterogeneity between studies, the DerSimonian and Laird weighted random-effect approach. The mean odds ratio (OR) and 95% confidence interval (CI) for the association between Cervarix, Gardasil and the HPV-16 monovalent vaccine and HPV-associated cervical intraepithelial neoplasia grade 3 or worse was 0.90 (95% CI: 0.56, 1.44). For the association between Gardasil and HPV-associated vulval/vaginal intraepithelial neoplasia grades 2-3, the overall OR and 95% CI was 2.25 (95% CI: 0.78, 6.50). Sample size and follow-up were limited. CONCLUSIONS: There was no evidence that HPV vaccines are effective in preventing vaccine-type HPV associated pre-cancer in women with evidence of prior HPV exposure. Small
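    Where heterogeneity was present, the record notes a switch from Mantel-Haenszel fixed-effect weighting to the DerSimonian and Laird random-effects approach. A schematic of that random-effects pooling on the log odds ratio scale (illustrative inputs, not the trial data):

    ```python
    import numpy as np

    def dersimonian_laird(log_or, var):
        """DerSimonian-Laird random-effects pooling of study log odds ratios."""
        log_or, var = np.asarray(log_or), np.asarray(var)
        w = 1.0 / var                                   # fixed-effect weights
        fixed = np.sum(w * log_or) / np.sum(w)
        q = np.sum(w * (log_or - fixed) ** 2)           # Cochran's Q
        df = len(log_or) - 1
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - df) / c)                   # between-study variance
        w_star = 1.0 / (var + tau2)                     # random-effects weights
        pooled = np.sum(w_star * log_or) / np.sum(w_star)
        se = np.sqrt(1.0 / np.sum(w_star))
        return np.exp(pooled), np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)

    # Hypothetical study-level log ORs and variances, for illustration only.
    print(dersimonian_laird([-0.11, 0.20, -0.25], [0.04, 0.09, 0.06]))
    ```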

  14. Harmonic Analysis Associated with the Generalized Weinstein Operator

    Directory of Open Access Journals (Sweden)

    Ahmed Abouelaz

    2015-11-01

    Full Text Available In this paper we consider a generalized Weinstein operator Δ_{d,α,n} on R^{d−1} × ]0,∞[, which generalizes the Weinstein operator Δ_{d,α}. We define the generalized Weinstein intertwining operator R_{α,n}, which turns out to be a transmutation operator between Δ_{d,α,n} and the Laplacian operator Δ_d. We construct the dual of the generalized Weinstein intertwining operator, ^tR_{α,n}, and we prove a formula relating R_{α,n} and ^tR_{α,n}. We exploit these transmutation operators to develop a new harmonic analysis corresponding to Δ_{d,α,n}.

  15. A new classification of HLA-DRB1 alleles based on acid-base properties of the amino acids located at positions 13, 70 and 71: impact on ACPA status or structural progression, and meta-analysis on 1235 patients with rheumatoid arthritis from two cohorts (ESPOIR and EAC cohort).

    Science.gov (United States)

    Ruyssen-Witrand, Adeline; van Steenbergen, Hanna W; van Heemst, Jurgen; Gourraud, Pierre-Antoine; Nigon, Delphine; Lukas, Cédric; Miceli-Richard, Corinne; Jamard, Bénédicte; Cambon-Thomsen, Anne; Cantagrel, Alain; Dieudé, Philippe; van der Helm-van Mil, Annette H M; Constantin, Arnaud

    2015-01-01

    To group HLA-DRB1 alleles based on the acid-base properties of the amino acids at positions 13, 70 and 71 and analyse their association with the presence of anticitrullinated peptide antibodies (ACPA) and structural progression in 2 cohorts of early rheumatoid arthritis (RA). Patients with RA from the ESPOIR cohort (n=612) and the EAC cohort (n=624) were genotyped for HLA-DRB1 alleles. The alleles containing the RAA sequence at positions 72-74 were classified into 3 groups according to the amino acids at positions 13, 70 and 71: BB, encoding basic amino acids at positions 13, 70 and 71; A, encoding acidic amino acids at positions 70 and 71; and BN, encoding either a neutral amino acid at position 13 and basic amino acids at positions 70 and 71, or a basic amino acid at position 13 and neutral amino acids at positions 70 and 71. The associations between the different alleles and (1) ACPA presence and (2) structural progression were assessed by χ² test; a meta-analysis was performed on the 2 cohorts using the Mantel-Haenszel method. After meta-analysis, BB alleles were significantly associated with ACPA presence (OR (95% CI) 4.08 (3.14 to 5.31)) and structural progression (OR (95% CI) 2.33 (1.76 to 3.09)). The A alleles protected significantly against ACPA presence (OR (95% CI) 0.37 (0.28 to 0.50)) and structural progression (OR (95% CI) 0.34 (0.23 to 0.50)). This acid-base classification made it possible to separate a further group, BN, with an intermediate risk of ACPA production (OR (95% CI) 1.14 (0.91 to 1.44)) and structural progression (OR (95% CI) 1.01 (0.77 to 1.33)). This new classification thus establishes a hierarchy of HLA-DRB1 alleles in terms of their association with ACPA presence and structural progression in early RA.

  16. Structural dynamic analysis with generalized damping models analysis

    CERN Document Server

    Adhikari , Sondipon

    2013-01-01

    Since Lord Rayleigh introduced the idea of viscous damping in his classic work "The Theory of Sound" in 1877, it has become standard practice to use this approach in dynamics, covering a wide range of applications from aerospace to civil engineering. However, in the majority of practical cases this approach is adopted more for mathematical convenience than for modeling the physics of vibration damping. Over the past decade, extensive research has been undertaken on more general "non-viscous" damping models and vibration of non-viscously damped systems. This book, along with a related book

  17. Analysis of Lamellar Structures with Application of Generalized Functions

    Directory of Open Access Journals (Sweden)

    Kipiani Gela

    2016-12-01

    Full Text Available The theory of differential equations over a functional domain rests on the basic concepts of generalized functions and splines. Basic concepts of the theory of generalized functions and their properties are considered here in relation to rod systems and lamellar structures. The application of generalized functions makes it possible to calculate lamellar structures of step-variable stiffness efficiently. Structures in which several parallel load-bearing layers are interconnected by discrete elastic links are also widely used. For the analysis of the system under study, discrete and discrete-continual models are applied as design diagrams.

  18. Multifaceted shared care intervention for late life depression in residential care: randomised controlled trial.

    Science.gov (United States)

    Llewellyn-Jones, R H; Baikie, K A; Smithers, H; Cohen, J; Snowdon, J; Tennant, C C

    1999-09-11

    To evaluate the effectiveness of a population based, multifaceted shared care intervention for late life depression in residential care. Randomised controlled trial, with control and intervention groups studied one after the other and blind follow up after 9.5 months. Population of residential facility in Sydney living in self care units and hostels. 220 depressed residents aged ≥65 without severe cognitive impairment. The shared care intervention included: (a) multidisciplinary consultation and collaboration, (b) training of general practitioners and carers in detection and management of depression, and (c) depression related health education and activity programmes for residents. The control group received routine care. Geriatric depression scale. Intention to treat analysis was used. There was significantly more movement to "less depressed" levels of depression at follow up in the intervention than control group (Mantel-Haenszel stratification test, P=0.0125). Multiple linear regression analysis found a significant intervention effect after controlling for possible confounders, with the intervention group showing an average improvement of 1.87 points on the geriatric depression scale compared with the control group (95% confidence interval 0.76 to 2.97, P=0.0011). The outcome of depression among elderly people in residential care can be improved by multidisciplinary collaboration, by enhancing the clinical skills of general practitioners and care staff, and by providing depression related health education and activity programmes for residents.

  19. MAGMA: generalized gene-set analysis of GWAS data.

    Science.gov (United States)

    de Leeuw, Christiaan A; Mooij, Joris M; Heskes, Tom; Posthuma, Danielle

    2015-04-01

    By aggregating data for complex traits in a biologically meaningful way, gene and gene-set analysis constitute a valuable addition to single-marker analysis. However, although various methods for gene and gene-set analysis currently exist, they generally suffer from a number of issues. Statistical power for most methods is strongly affected by linkage disequilibrium between markers, multi-marker associations are often hard to detect, and the reliance on permutation to compute p-values tends to make the analysis computationally very expensive. To address these issues we have developed MAGMA, a novel tool for gene and gene-set analysis. The gene analysis is based on a multiple regression model, to provide better statistical performance. The gene-set analysis is built as a separate layer around the gene analysis for additional flexibility. This gene-set analysis also uses a regression structure to allow generalization to analysis of continuous properties of genes and simultaneous analysis of multiple gene sets and other gene properties. Simulations and an analysis of Crohn's Disease data are used to evaluate the performance of MAGMA and to compare it to a number of other gene and gene-set analysis tools. The results show that MAGMA has significantly more power than other tools for both the gene and the gene-set analysis, identifying more genes and gene sets associated with Crohn's Disease while maintaining a correct type 1 error rate. Moreover, the MAGMA analysis of the Crohn's Disease data was found to be considerably faster as well.
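    The gene-set layer described above amounts to regressing gene-level association statistics on gene-set membership while adjusting for gene properties. A toy sketch of that competitive-test idea, using random data and an assumed log-gene-size covariate (not the MAGMA implementation itself):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_genes = 500
    z = rng.normal(size=n_genes)                 # gene-level association Z-scores
    in_set = rng.random(n_genes) < 0.1           # gene-set membership indicator
    gene_size = rng.integers(1, 50, n_genes)     # covariate, e.g. number of SNPs

    # Competitive gene-set test: regress Z on membership, adjusting for covariates.
    X = np.column_stack([np.ones(n_genes), in_set.astype(float), np.log(gene_size)])
    beta, *_ = np.linalg.lstsq(X, z, rcond=None)
    resid = z - X @ beta
    sigma2 = resid @ resid / (n_genes - X.shape[1])
    cov = sigma2 * np.linalg.inv(X.T @ X)
    t_stat = beta[1] / np.sqrt(cov[1, 1])        # test of set enrichment
    print(round(t_stat, 2))
    ```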

  20. Generalized fault tree analysis combined with state analysis

    International Nuclear Information System (INIS)

    Caldarola, L.

    1980-02-01

    An analytical theory has been developed which allows one to calculate the occurrence probability of the top event of a fault tree with multistate (two or more states) components. It is shown that, in order to correctly describe a system with multistate components, a special type of boolean algebra is required. This is called 'boolean algebra with restrictions on variables' and its basic rules are the same as those of the traditional boolean algebra, with some additional restrictions on the variables. These restrictions are extensively discussed in the paper. It is also shown that the boolean algebra with restrictions on variables facilitates the task of formally combining fault tree analysis with state analysis. The computer program MUSTAFA 1, based on the above theory, has been developed. It can analyse fault trees of systems containing statistically independent as well as dependent components with two or more states. MUSTAFA 1 can handle coherent as well as non-coherent boolean functions. (orig.)

  1. Variational analysis and generalized differentiation I basic theory

    CERN Document Server

    Mordukhovich, Boris S

    2006-01-01

    Contains a study of the basic concepts and principles of variational analysis and generalized differentiation in both finite-dimensional and infinite-dimensional spaces. This title presents many applications to problems in optimization, equilibria, stability and sensitivity, control theory, economics, mechanics, and more.

  2. Texture Analysis Using Rényi’s Generalized Entropies

    NARCIS (Netherlands)

    Grigorescu, S.E.; Petkov, N.

    2003-01-01

    We propose a texture analysis method based on Rényi’s generalized entropies. The method aims at identifying texels in regular textures by searching for the smallest window through which the minimum number of different visual patterns is observed when moving the window over a given texture. The

  3. Psychological treatment of generalized anxiety disorder: A meta-analysis.

    NARCIS (Netherlands)

    Cuijpers, P.; Sijbrandij, M.; Koole, S.L.; Huibers, M.J.H.; Berking, M.; Andersson, G.

    2014-01-01

    Recent years have seen a near-doubling of the number of studies examining the effects of psychotherapies for generalized anxiety disorder (GAD) in adults. The present article integrates this new evidence with the older literature through a quantitative meta-analysis. A total of 41 studies (with 2132

  4. Stability analysis for a general age-dependent vaccination model

    International Nuclear Information System (INIS)

    El Doma, M.

    1995-05-01

    An SIR epidemic model with a general age-dependent vaccination strategy is investigated when the fertility, mortality and removal rates depend on age. We give threshold criteria for the existence of equilibria and perform a stability analysis. Furthermore, a critical vaccination coverage that is sufficient to eradicate the disease is determined. (author). 12 refs

  5. Complexity analysis based on generalized deviation for financial markets

    Science.gov (United States)

    Li, Chao; Shang, Pengjian

    2018-03-01

    In this paper, a new modified method, complexity analysis based on generalized deviation, is proposed as a measure of the correlation between past price and future volatility in financial time series. In comparison with the earlier retarded volatility model, the new approach is both simple and computationally efficient. The method, based on the generalized deviation function, provides an exhaustive way of quantifying the rules of the financial market. The robustness of the method is verified by numerical experiments with both artificial and financial time series. Results show that the generalized deviation complexity analysis method not only identifies the volatility of financial time series, but also provides a comprehensive way of distinguishing the different characteristics of stock indices and individual stocks. Exponential functions can be used to successfully fit the volatility curves and quantify the changes of complexity for stock market data. We then study the influence of the negative domain of the deviation coefficient and the differences between volatile periods and calm periods. After analysing the experimental model, we find that the generalized deviation model has definite advantages in exploring the relationship between historical returns and future volatility.

  6. Extreme learning machine for ranking: generalization analysis and applications.

    Science.gov (United States)

    Chen, Hong; Peng, Jiangtao; Zhou, Yicong; Li, Luoqing; Pan, Zhibin

    2014-05-01

    The extreme learning machine (ELM) has attracted increasing attention recently with its successful applications in classification and regression. In this paper, we investigate the generalization performance of ELM-based ranking. A new regularized ranking algorithm is proposed based on the combinations of activation functions in ELM. The generalization analysis is established for the ELM-based ranking (ELMRank) in terms of the covering numbers of hypothesis space. Empirical results on the benchmark datasets show the competitive performance of the ELMRank over the state-of-the-art ranking methods. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Risk factors for developing tooth sensitivity and gingival irritation associated with nightguard vital bleaching.

    Science.gov (United States)

    Leonard, R H; Haywood, V B; Phillips, C

    1997-08-01

    The purpose of this study was to determine risk factors in the development of tooth sensitivity and gingival irritation associated with the nightguard vital bleaching technique. The potential risk factors evaluated (sex, age, reported allergy, whitening solution, number of times the solution was changed daily [its usage pattern], and dental arch) were collected from the daily log form turned in by each of the 64 participants after completion of the 6-week lightening process. Also evaluated for each participant, from color slides, were tooth characteristics such as gingival recession, defective restorations, abfraction lesions, enamel-cementum abrasion, etc, and reported side effects. The generalized Mantel-Haenszel statistic was used to assess the association between the potential risk factors and the development of tooth sensitivity and/or gingival irritation. No statistical relationship existed between age, sex, allergy, tooth characteristics, or the dental arch lightened and the development of side effects. Initially, a statistically significant association existed between side effects and the whitening solution used. However, when the analysis was controlled for usage pattern, this relationship disappeared. Patients who changed the whitening solution more than once a day reported statistically significantly more side effects than did those who did not change the whitening solution during their usage time.
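    For a binary outcome stratified on a matching variable such as usage pattern, the generalized Mantel-Haenszel approach reduces, in the simplest 2x2-per-stratum case, to the familiar Cochran-Mantel-Haenszel chi-square. A minimal sketch with invented counts (the study's generalized statistic also handles ordered and multi-category responses, which this toy version does not):

    ```python
    import numpy as np

    def cmh_chi2(tables):
        """Cochran-Mantel-Haenszel chi-square for 2x2 strata (continuity-corrected)."""
        a_sum = e_sum = v_sum = 0.0
        for a, b, c, d in tables:
            n = a + b + c + d
            a_sum += a
            e_sum += (a + b) * (a + c) / n                       # E[a] under H0
            v_sum += (a + b) * (c + d) * (a + c) * (b + d) / (n ** 2 * (n - 1))
        return (abs(a_sum - e_sum) - 0.5) ** 2 / v_sum           # ~ chi2 with 1 df

    # Hypothetical strata (side effect yes/no by solution, within usage pattern).
    print(round(cmh_chi2([(12, 18, 7, 23), (9, 21, 10, 20)]), 2))
    ```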

  8. A Community-Based Randomized Trial of Hepatitis B Screening Among High-Risk Vietnamese Americans.

    Science.gov (United States)

    Ma, Grace X; Fang, Carolyn Y; Seals, Brenda; Feng, Ziding; Tan, Yin; Siu, Philip; Yeh, Ming Chin; Golub, Sarit A; Nguyen, Minhhuyen T; Tran, Tam; Wang, Minqi

    2017-03-01

    To evaluate the effectiveness of a community-based liver cancer prevention program on hepatitis B virus (HBV) screening among low-income, underserved Vietnamese Americans at high risk. We conducted a cluster randomized trial involving 36 Vietnamese community-based organizations and 2337 participants in Pennsylvania, New Jersey, and New York City between 2009 and 2014. We randomly assigned 18 community-based organizations to a community-based multilevel HBV screening intervention (n = 1131). We randomly assigned the remaining 18 community-based organizations to a general cancer education program (n = 1206), which included information about HBV-related liver cancer prevention. We assessed HBV screening rates at 6-month follow-up. Intervention participants were significantly more likely to have undergone HBV screening (88.1%) than were control group participants (4.6%). In a Cochran-Mantel-Haenszel analysis, the intervention effect on screening outcomes remained statistically significant after adjustment for demographic and health care access variables, including income, having health insurance, having a regular health provider, and English proficiency. A community-based, culturally appropriate, multilevel HBV screening intervention effectively increases screening rates in a high-risk, hard-to-reach Vietnamese American population.

  9. Notes on testing equality and interval estimation in Poisson frequency data under a three-treatment three-period crossover trial.

    Science.gov (United States)

    Lui, Kung-Jong; Chang, Kuang-Chao

    2016-10-01

    When the frequency of event occurrences follows a Poisson distribution, we develop procedures for testing equality of treatments and interval estimators for the ratio of mean frequencies between treatments under a three-treatment three-period crossover design. Using Monte Carlo simulations, we evaluate the performance of these test procedures and interval estimators in various situations. We note that all test procedures developed here can perform well with respect to Type I error even when the number of patients per group is moderate. We further note that the two weighted-least-squares (WLS) test procedures derived here are generally preferable to the other two commonly used test procedures in the contingency table analysis. We also demonstrate that both interval estimators based on the WLS method and interval estimators based on Mantel-Haenszel (MH) approach can perform well, and are essentially of equal precision with respect to the average length. We use a double-blind randomized three-treatment three-period crossover trial comparing salbutamol and salmeterol with a placebo with respect to the number of exacerbations of asthma to illustrate the use of these test procedures and estimators. © The Author(s) 2014.
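    One building block behind such interval estimators is the large-sample interval for a ratio of Poisson mean frequencies between two treatments. A simplified two-arm sketch (not the paper's WLS or Mantel-Haenszel estimators for the three-treatment, three-period crossover):

    ```python
    import math

    def poisson_rate_ratio_ci(x1, t1, x2, t2, z=1.96):
        """Wald interval on the log scale for the ratio of two Poisson rates."""
        ratio = (x1 / t1) / (x2 / t2)
        se_log = math.sqrt(1.0 / x1 + 1.0 / x2)    # delta-method SE of log ratio
        lo = ratio * math.exp(-z * se_log)
        hi = ratio * math.exp(z * se_log)
        return ratio, lo, hi

    # Hypothetical exacerbation counts over equal follow-up, for illustration only.
    print(poisson_rate_ratio_ci(x1=18, t1=100.0, x2=30, t2=100.0))
    ```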

  10. Application of generalized function to dynamic analysis of thick plates

    International Nuclear Information System (INIS)

    Zheng, D.; Weng, Z.

    1987-01-01

    Structures with thick plates have been used extensively in national defence, mechanical engineering, chemical engineering, nuclear engineering, civil engineering, etc. Various theories have been established to deal with the problems of elastic plates, including the classical theory of thin plates, the improved theory of thick plates, and three-dimensional elasticity theory. In this paper, the derivative of the δ-function is handled using generalized functions. The dynamic analysis of thick plates subjected to a concentrated load is presented. The improved Donnell's equation for thick plates is deduced and employed as the basic equation. The generalized coordinates are solved using the method of weighted residuals (MWR). General expressions for the dynamic response of elastic thick plates subjected to a concentrated load are given. Numerical results for rectangular plates are given herein. The results are compared with those obtained from the improved theory and the classical theory of plates. (orig./GL)

  11. Generalized concavity in fuzzy optimization and decision analysis

    CERN Document Server

    Ramík, Jaroslav

    2002-01-01

    Convexity of sets in linear spaces, and concavity and convexity of functions, lie at the root of beautiful theoretical results that are at the same time extremely useful in the analysis and solution of optimization problems, including problems of either single objective or multiple objectives. Not all of these results rely necessarily on convexity and concavity; some of the results can guarantee that each local optimum is also a global optimum, giving these methods broader application to a wider class of problems. Hence, the focus of the first part of the book is concerned with several types of generalized convex sets and generalized concave functions. In addition to their applicability to nonconvex optimization, these convex sets and generalized concave functions are used in the book's second part, where decision-making and optimization problems under uncertainty are investigated. Uncertainty in the problem data often cannot be avoided when dealing with practical problems. Errors occur in real-world data for...

  12. General classification and analysis of neutron β-decay experiments

    International Nuclear Information System (INIS)

    Gudkov, V.; Greene, G.L.; Calarco, J.R.

    2006-01-01

    A general analysis of the sensitivities of neutron β-decay experiments to manifestations of possible interaction beyond the standard model is carried out. In a consistent fashion, we take into account all known radiative and recoil corrections arising in the standard model. This provides a description of angular correlations in neutron decay in terms of one parameter, which is accurate to the level of ∼10⁻⁵. Based on this general expression, we present an analysis of the sensitivities to new physics for selected neutron decay experiments. We emphasize that the usual parametrization of experiments in terms of the tree-level coefficients a, A, and B is inadequate when the experimental sensitivities are at the same or higher level relative to the size of the corrections to the tree-level description

  13. Tritium analysis of urine samples from the general Korean public.

    Science.gov (United States)

    Yoon, Seokwon; Ha, Wi-Ho; Lee, Seung-Sook

    2013-11-01

    The tritium concentrations of urine samples and the effective dose of the general Korean public were evaluated. To achieve accurate HTO analysis of urine samples, we established the optimal conditions for measuring the HTO content of urine samples. Urine samples from 50 Koreans who do not work at a nuclear facility were analyzed on the basis of the results. The average urine analysis result was 2.8 ± 1.4 Bq/L, and the range was 1.8-5.6 Bq/L. The measured values were lower than those reported for other countries. These results show that environmental factors and lifestyle differences are the main factors affecting the tritium level of the general public. © 2013 Elsevier Ltd. All rights reserved.

  14. A general numerical analysis of the superconducting quasiparticle mixer

    Science.gov (United States)

    Hicks, R. G.; Feldman, M. J.; Kerr, A. R.

    1985-01-01

    For very low noise millimeter-wave receivers, the superconductor-insulator-superconductor (SIS) quasiparticle mixer is now competitive with conventional Schottky mixers. Tucker (1979, 1980) has developed a quantum theory of mixing which has provided a basis for the rapid improvement in SIS mixer performance. The present paper is concerned with a general method of numerical analysis for SIS mixers which allows arbitrary terminating impedances for all the harmonic frequencies. This analysis provides an approach for an examination of the range of validity of the three-frequency results of the quantum mixer theory. The new method has been implemented with the aid of a Fortran computer program.

  15. General overview and perspectives of risk analysis in Cuba

    International Nuclear Information System (INIS)

    Torres, A.; Rodriguez, J.M.; Vilaragut, J.J.; Valhuerdi, C.

    1995-01-01

    This paper presents a general overview of the application of risk analysis techniques in some potentially dangerous industries in Cuba. It summarizes the experience of these sectors with risk analyses at different levels of detail and with different approaches. Some experience with the application of these analyses in the nuclear and aeronautical industries is described. Consequence analyses of accidents in the chemical industry, carried out in order to draw up and improve emergency plans for responding to accident situations, are presented more succinctly. The prospects for developing some of these lines of work, and forms of cooperation between the sectors, are also summarized

  16. A general numerical analysis program for the superconducting quasiparticle mixer

    Science.gov (United States)

    Hicks, R. G.; Feldman, M. J.; Kerr, A. R.

    1986-01-01

    A user-oriented computer program SISCAP (SIS Computer Analysis Program) for analyzing SIS mixers is described. The program allows arbitrary impedance terminations to be specified at all LO harmonics and sideband frequencies. It is therefore able to treat a much more general class of SIS mixers than the widely used three-frequency analysis, for which the harmonics are assumed to be short-circuited. An additional program, GETCHI, provides the necessary input data to program SISCAP. The SISCAP program performs a nonlinear analysis to determine the SIS junction voltage waveform produced by the local oscillator. The quantum theory of mixing is used in its most general form, treating the large signal properties of the mixer in the time domain. A small signal linear analysis is then used to find the conversion loss and port impedances. The noise analysis includes thermal noise from the termination resistances and shot noise from the periodic LO current. Quantum noise is not considered. Many aspects of the program have been adequately verified and found accurate.

  17. A general first-order global sensitivity analysis method

    International Nuclear Information System (INIS)

    Xu Chonggang; Gertner, George Zdzislaw

    2008-01-01

    Fourier amplitude sensitivity test (FAST) is one of the most popular global sensitivity analysis techniques. The main mechanism of FAST is to assign each parameter with a characteristic frequency through a search function. Then, for a specific parameter, the variance contribution can be singled out of the model output by the characteristic frequency. Although FAST has been widely applied, there are two limitations: (1) the aliasing effect among parameters by using integer characteristic frequencies and (2) the suitability for only models with independent parameters. In this paper, we synthesize the improvement to overcome the aliasing effect limitation [Tarantola S, Gatelli D, Mara TA. Random balance designs for the estimation of first order global sensitivity indices. Reliab Eng Syst Safety 2006; 91(6):717-27] and the improvement to overcome the independence limitation [Xu C, Gertner G. Extending a global sensitivity analysis technique to models with correlated parameters. Comput Stat Data Anal 2007, accepted for publication]. In this way, FAST can be a general first-order global sensitivity analysis method for linear/nonlinear models with as many correlated/uncorrelated parameters as the user specifies. We apply the general FAST to four test cases with correlated parameters. The results show that the sensitivity indices derived by the general FAST are in good agreement with the sensitivity indices derived by the correlation ratio method, which is a non-parametric method for models with correlated parameters
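    The correlation ratio method used above as a benchmark estimates the first-order index of a parameter as Var(E[Y|Xi]) / Var(Y). A crude binning sketch of that estimator on a toy model (not the paper's test cases or the FAST machinery itself):

    ```python
    import numpy as np

    def first_order_index(x, y, bins=20):
        """Correlation-ratio estimate of the first-order sensitivity index of x."""
        edges = np.quantile(x, np.linspace(0, 1, bins + 1))
        idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
        cond_means = np.array([y[idx == b].mean() for b in range(bins)])
        counts = np.array([(idx == b).sum() for b in range(bins)])
        var_cond_mean = np.average((cond_means - y.mean()) ** 2, weights=counts)
        return var_cond_mean / y.var()

    rng = np.random.default_rng(1)
    x1, x2 = rng.uniform(-1, 1, (2, 20000))
    y = 2.0 * x1 + 0.5 * x2 ** 2                 # toy model
    print(round(first_order_index(x1, y), 2))    # x1 dominates the output variance
    ```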

  18. Multigroup Moderation Test in Generalized Structured Component Analysis

    Directory of Open Access Journals (Sweden)

    Angga Dwi Mulyanto

    2016-05-01

    Full Text Available Generalized Structured Component Analysis (GSCA) is an alternative method for structural modeling using alternating least squares. GSCA can be used for complex analyses, including multigroup analysis. GSCA can be run with free software called GeSCA, but GeSCA provides no multigroup moderation test for comparing effects between groups. In this research we propose using the t test employed in PLS for testing multigroup moderation in GSCA. The t test only requires the sample size, estimated path coefficient and standard error of each group, all of which are already available in the GeSCA output, and the formula is simple, so the analysis takes little time; a sketch of the statistic is given below.
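    As a minimal illustration, a pooled-standard-error two-group t statistic of the kind used in parametric PLS multigroup comparisons (assumed here, since the record does not spell out the exact variant) can be computed from exactly the quantities listed above:

    ```python
    import math

    def multigroup_t(b1, se1, n1, b2, se2, n2):
        """t test for the difference between two groups' path coefficients."""
        # Pooled standard error, as commonly used in PLS-style multigroup tests.
        sp2 = ((n1 - 1) ** 2 * se1 ** 2 + (n2 - 1) ** 2 * se2 ** 2) / (n1 + n2 - 2)
        se_diff = math.sqrt(sp2 * (1.0 / n1 + 1.0 / n2))
        t = (b1 - b2) / se_diff
        return t, n1 + n2 - 2                     # statistic and degrees of freedom

    # Hypothetical GeSCA output per group (path coefficient, its SE, sample size).
    print(multigroup_t(b1=0.42, se1=0.08, n1=120, b2=0.21, se2=0.10, n2=95))
    ```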

  19. Nonlinear analysis of generalized cross-field current instability

    International Nuclear Information System (INIS)

    Yoon, P.H.; Lui, A.T.Y.

    1993-01-01

    Analysis of the generalized cross-field current instability is carried out in which the cross-field drift of both the ions and electrons and their temperatures are permitted to vary in time. The unstable mode under consideration is the electromagnetic generalization of the classical modified-two-stream instability. The generalized instability is made up of the modified-two-stream and ion-Weibel modes. The relative importance of the features associated with the ion-Weibel mode and those of the modified-two-stream mode is assessed. Specific applications are made to the Earth's neutral sheet prior to substorm onset and to the Earth's bow shock. The numerical solution indicates that the ion-Weibel mode dominates in the Earth's neutral sheet environment. In contrast, the situation for the bow shock is dominated by the modified-two-stream mode. Notable differences are found between the present calculation and previous results on the ion-Weibel mode which restrict the analysis to parallel propagating waves only. However, in the case of the Earth's bow shock, for which the ion-Weibel mode plays no important role, the inclusion of the electromagnetic ion response is found to differ little from previous results, which treat ions as responding only to the electrostatic component of the excited waves

  20. Recovering Intrinsic Fragmental Vibrations Using the Generalized Subsystem Vibrational Analysis.

    Science.gov (United States)

    Tao, Yunwen; Tian, Chuan; Verma, Niraj; Zou, Wenli; Wang, Chao; Cremer, Dieter; Kraka, Elfi

    2018-05-08

    Normal vibrational modes are generally delocalized over the molecular system, which makes it difficult to assign certain vibrations to specific fragments or functional groups. We introduce a new approach, the Generalized Subsystem Vibrational Analysis (GSVA), to extract the intrinsic fragmental vibrations of any fragment/subsystem from the whole system via the evaluation of the corresponding effective Hessian matrix. The retention of the curvature information with regard to the potential energy surface for the effective Hessian matrix endows our approach with a concrete physical basis and enables the normal vibrational modes of different molecular systems to be legitimately comparable. Furthermore, the intrinsic fragmental vibrations act as a new link between the Konkoli-Cremer local vibrational modes and the normal vibrational modes.

  1. Analysis of General Power Counting Rules in Effective Field Theory

    CERN Document Server

    Gavela, B M; Manohar, A V; Merlo, L

    2016-01-01

    We derive the general counting rules for a quantum effective field theory (EFT) in $\mathsf{d}$ dimensions. The rules are valid for strongly and weakly coupled theories, and predict that all kinetic energy terms are canonically normalized. They determine the energy dependence of scattering cross sections in the range of validity of the EFT expansion. The size of cross sections is controlled by the $\Lambda$ power counting of EFT, not by chiral counting, even for chiral perturbation theory ($\chi$PT). The relation between $\Lambda$ and $f$ is generalized to $\mathsf{d}$ dimensions. We show that the naive dimensional analysis $4\pi$ counting is related to $\hbar$ counting. The EFT counting rules are applied to $\chi$PT, to Standard Model EFT and to the non-trivial case of Higgs EFT, which combines the $\Lambda$ and chiral counting rules within a single theory.

  2. Computer code for general analysis of radon risks (GARR)

    International Nuclear Information System (INIS)

    Ginevan, M.

    1984-09-01

    This document presents a computer model for general analysis of radon risks that allows the user to specify a large number of possible models with a small number of simple commands. The model is written in a version of BASIC which conforms closely to the American National Standards Institute (ANSI) definition of minimal BASIC and thus is readily modified for use on a wide variety of computers and, in particular, microcomputers. Model capabilities include generation of single-year life tables from 5-year abridged data, calculation of multiple-decrement life tables for lung cancer for the general population, smokers, and nonsmokers, and a cohort lung cancer risk calculation that allows specification of the level and duration of radon exposure, the form of the risk model, and the specific population assumed at risk. 36 references, 8 figures, 7 tables

  3. Colombeau's generalized functions and non-standard analysis

    International Nuclear Information System (INIS)

    Todorov, T.D.

    1987-10-01

    Using some methods of the Non-Standard Analysis we modify one of Colombeau's classes of generalized functions. As a result we define a class ε-circumflex of the so-called meta-functions which possesses all good properties of Colombeau's generalized functions, i.e. (i) ε-circumflex is an associative and commutative algebra over the system of the so-called complex meta-numbers C-circumflex; (ii) Every meta-function has partial derivatives of any order (which are meta-functions again); (iii) Every meta-function is integrable on any compact set of R n and the integral is a number from C-circumflex; (iv) ε-circumflex contains all tempered distributions S', i.e. S' is contained in ε' isomorphically with respect to all linear operations (including the differentiation). Thus, within the class ε-circumflex the problem of multiplication of the tempered distributions is satisfactorily solved (every two distributions in S' have a well-defined product in ε-circumflex). The crucial point is that C-circumflex is a field in contrast to the system of Colombeau's generalized numbers C-bar which is a ring only (C-bar is the counterpart of C-circumflex in Colombeau's theory). In this way we simplify and improve slightly the properties of the integral and notion of ''values of the meta-functions'' as well as the properties of the whole class ε-circumflex itself if compared with the original Colombeau theory. And, what is maybe more important, we clarify the connection between the Non-Standard Analysis and Colombeau's theory of new generalized functions in the framework of which the problem of multiplication of distributions was recently solved. (author). 14 refs

  4. Generalized direct Lyapunov method for the analysis of stability and attraction in general time systems

    International Nuclear Information System (INIS)

    Druzhinina, O V; Shestakov, A A

    2002-01-01

    A generalized direct Lyapunov method is put forward for the study of stability and attraction in general time systems of the following types: the classical dynamical system in the sense of Birkhoff, the general system in the sense of Zubov, the general system in the sense of Seibert, the general system with delay, and the general 'input-output' system. For such systems, with the help of generalized Lyapunov functions with respect to two filters, two quasifilters, or two filter bases, necessary and sufficient conditions for stability and attraction are obtained under minimal assumptions about the mathematical structure of the general system

  5. Generalized Multi-Edge Analysis for K-Edge Densitometry

    International Nuclear Information System (INIS)

    Collins, M.

    1998-01-01

    In K-edge densitometry (KED), a continuous-energy x-ray beam is transmitted through a liquid sample. The actinide content of the sample can be measured through analysis of the transmitted portion of the x-ray beam. Traditional methods for KED analysis allow the simultaneous calculation of, at most, two actinide concentrations. A generalized multi-edge KED analytical method is presented, allowing up to six actinide concentrations to be calculated simultaneously. Applications of this method for hybrid KED/x-ray fluorescence (HKED) systems are discussed. Current HKED systems require the operator to know the approximate actinide content of each sample, and manually select the proper analysis mode. The new multi-edge KED technique allows rapid identification of the major actinide components in a sample, independent of actinide content. The proper HKED analysis mode can be selected automatically, without requiring sample content information from the user. Automatic HKED analysis would be especially useful in an analytical laboratory setting, where samples with truly unknown characteristics are encountered. Because this technique requires no hardware modifications, several facilities that use HKED may eventually benefit from this approach

  6. A κ-generalized statistical mechanics approach to income analysis

    International Nuclear Information System (INIS)

    Clementi, F; Gallegati, M; Kaniadakis, G

    2009-01-01

    This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low–middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters is revealed as very powerful

  7. A κ-generalized statistical mechanics approach to income analysis

    Science.gov (United States)

    Clementi, F.; Gallegati, M.; Kaniadakis, G.

    2009-02-01

    This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low-middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters is revealed as very powerful.
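    The distribution referred to above is built on Kaniadakis' κ-exponential; assuming the usual three-parameter form with CDF F(x) = 1 − exp_κ(−βx^α), a small numerical sketch (illustrative parameter values, not parameters fitted to US income data) is:

    ```python
    import numpy as np

    def exp_kappa(x, kappa):
        """Kaniadakis kappa-exponential; reduces to exp(x) as kappa -> 0."""
        if kappa == 0:
            return np.exp(x)
        return (np.sqrt(1.0 + kappa ** 2 * x ** 2) + kappa * x) ** (1.0 / kappa)

    def kappa_cdf(x, alpha, beta, kappa):
        """Assumed CDF of the kappa-generalized income distribution."""
        return 1.0 - exp_kappa(-beta * x ** alpha, kappa)

    # Illustrative parameters only (alpha: shape, beta: scale, kappa: tail index).
    incomes = np.array([0.5, 1.0, 2.0, 5.0])
    print(np.round(kappa_cdf(incomes, alpha=2.0, beta=1.0, kappa=0.6), 3))
    ```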

  8. Generalization in the XCSF classifier system: analysis, improvement, and extension.

    Science.gov (United States)

    Lanzi, Pier Luca; Loiacono, Daniele; Wilson, Stewart W; Goldberg, David E

    2007-01-01

    We analyze generalization in XCSF and introduce three improvements. We begin by showing that the types of generalizations evolved by XCSF can be influenced by the input range. To explain these results we present a theoretical analysis of the convergence of classifier weights in XCSF which highlights a broader issue. In XCSF, because of the mathematical properties of the Widrow-Hoff update, the convergence of classifier weights in a given subspace can be slow when the spread of the eigenvalues of the autocorrelation matrix associated with each classifier is large. As a major consequence, the system's accuracy pressure may act before classifier weights are adequately updated, so that XCSF may evolve piecewise constant approximations, instead of the intended, and more efficient, piecewise linear ones. We propose three different ways to update classifier weights in XCSF so as to increase the generalization capabilities of XCSF: one based on a condition-based normalization of the inputs, one based on linear least squares, and one based on the recursive version of linear least squares. Through a series of experiments we show that while all three approaches significantly improve XCSF, least squares approaches appear to be best performing and most robust. Finally we show how XCSF can be extended to include polynomial approximations.
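    The convergence issue described above contrasts the Widrow-Hoff (delta-rule) update with (recursive) least squares. A bare-bones sketch of the two update steps for a single linear predictor, with illustrative learning-rate and initialisation choices rather than XCSF's exact parameterisation:

    ```python
    import numpy as np

    def widrow_hoff_update(w, x, target, eta=0.2):
        """Delta-rule update: slow when the input eigenvalue spread is large."""
        return w + eta * (target - w @ x) * x

    def rls_update(w, P, x, target):
        """Recursive least squares: per-direction step sizes, faster convergence."""
        k = P @ x / (1.0 + x @ P @ x)            # gain vector
        w = w + k * (target - w @ x)
        P = P - np.outer(k, x @ P)               # covariance update
        return w, P

    rng = np.random.default_rng(2)
    w_lms, w_rls = np.zeros(3), np.zeros(3)
    P = np.eye(3) * 100.0                        # large initial covariance for RLS
    true_w = np.array([1.0, -2.0, 0.5])
    for _ in range(200):
        x = np.append(1.0, rng.uniform(0, 10, 2))   # augmented input [1, x1, x2]
        y = true_w @ x
        w_lms = widrow_hoff_update(w_lms, x, y, eta=0.001)
        w_rls, P = rls_update(w_rls, P, x, y)
    print(np.round(w_lms, 2), np.round(w_rls, 2))   # RLS reaches true_w much faster
    ```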

  9. Histerectomía total abdominal frente a histerectomía mínimamente invasiva: revisión sistemática y metaanálisis Total abdominal hysterectomy versus minimal-invasive hysterectomy: a systemic review and meta-analysis

    Directory of Open Access Journals (Sweden)

    Felipe Jorge Aragón Palmero

    2011-03-01

    three types of hysterectomies are used: the vaginal hysterectomy and the minimal-invasive hysterectomy (MIH). The objective of the present research was to compare MIH and total abdominal hysterectomy (TAH) in women presenting with benign uterine diseases. METHODS. A systematic review and meta-analysis were conducted using the following databases: MEDLINE, EBSCO HOST and The Cochrane Central Register of Controlled Trials. Only controlled and randomized studies were selected. The data of all studies were combined, and the relative risk (RR) with a 95% CI, computed with the Mantel-Haenszel method, was used as the effect measure for dichotomous variables. For the analysis of continuous variables the mean difference was used. In all the comparisons performed, results were obtained with both fixed-effect and random-effects models. RESULTS. A total of 53 transoperative complications were registered in the MIH group versus 17 in the TAH group (RR: 1.78; 95% CI: 1.04-3.05). Postoperative complications evolved in a similar way in both groups, without statistically significant differences. Blood loss, hospital stay and the patients' return to usual and work activities were lower in the laparoscopy group; however, the operative time was longer when compared with TAH (mean difference: 37.36; 95% CI: 34.36-39.93). CONCLUSIONS. Both techniques have advantages and disadvantages. The indication for MIH must be individualized according to the clinical situation of each patient, and it should not be performed in centers without a properly trained surgical staff experienced in advanced minimal-invasive surgery.

  10. Development of ocular hypertension secondary to tamponade with light versus heavy silicone oil: A systematic review

    Directory of Open Access Journals (Sweden)

    Vito Romano

    2015-01-01

    Full Text Available Aim: The intraocular silicone oil (SO) tamponades used in the treatment of retinal detachment (RD) have been associated with different rates of ocular hypertension (OH). To clarify whether this complication is associated with the use of standard SO (SSO) versus heavy SO (HSO), we performed a systematic review and meta-analysis of comparative studies of the two kinds of SO (standard or light vs. heavy) for the treatment of RD and macular hole, without restriction on study design. Materials and Methods: The methodological quality of two randomized clinical trials (RCTs) was evaluated using the criteria given in the Cochrane Handbook for Systematic Reviews of Interventions, while three non-RCTs were assessed with the Newcastle-Ottawa Scale and Strengthening the Reporting of Observational Studies in Epidemiology checklists. We calculated the Mantel-Haenszel risk ratio (RR) with 95% confidence intervals (95% CIs). The primary outcome was the rate of patients with OH treated with SSO compared to HSO. Results: The rate of OH was higher with HSO than with SSO. This difference was statistically significant with the fixed-effect model (Mantel-Haenszel RR 1.55; 95% CI, 1.06-2.28; P = 0.02), whereas it was not significant with the random-effects model (Mantel-Haenszel RR 1.51; 95% CI, 0.98-2.33; P = 0.06). Conclusion: We noted a trend towards a higher OH rate in the HSO group compared to the SSO group, but this finding, owing to the small size and variable design of the studies, needs to be confirmed in well-designed, large RCTs.

  11. Cost analysis of robotic versus laparoscopic general surgery procedures.

    Science.gov (United States)

    Higgins, Rana M; Frelich, Matthew J; Bosler, Matthew E; Gould, Jon C

    2017-01-01

    Robotic surgical systems have been used at a rapidly increasing rate in general surgery. Many of these procedures have been performed laparoscopically for years. In a surgical encounter, a significant portion of the total costs is associated with consumable supplies. Our hospital system has invested in a software program that can track the costs of consumable surgical supplies. We sought to determine the differences in cost of consumables with elective laparoscopic and robotic procedures for our health care organization. De-identified procedural cost and equipment utilization data were collected from the Surgical Profitability Compass Procedure Cost Manager System (The Advisory Board Company, Washington, DC) for our health care system for laparoscopic and robotic cholecystectomy, fundoplication, and inguinal hernia between the years 2013 and 2015. Outcomes were length of stay, case duration, and supply cost. Statistical analysis was performed using a t-test for continuous variables, and statistical significance was defined as p < 0.05. Consumable supply costs were higher for the robotic procedures. Length of stay did not differ for fundoplication or cholecystectomy. Length of stay was greater for robotic inguinal hernia repair. Case duration was similar for cholecystectomy (84.3 robotic and 75.5 min laparoscopic, p = 0.08), but significantly longer for robotic fundoplication (197.2 robotic and 162.1 min laparoscopic, p = 0.01) and inguinal hernia repair (124.0 robotic and 84.4 min laparoscopic, p < 0.01). We found a significantly increased cost of general surgery procedures for our health care system when cases commonly performed laparoscopically are instead performed robotically. Our analysis is limited by the fact that we only included costs associated with consumable surgical supplies. The initial acquisition cost (over $1 million for a robotic surgical system), depreciation, and service contract for the robotic and laparoscopic systems were not included in this analysis.

  12. Generalized causal mediation and path analysis: Extensions and practical considerations.

    Science.gov (United States)

    Albert, Jeffrey M; Cho, Jang Ik; Liu, Yiying; Nelson, Suchitra

    2018-01-01

    Causal mediation analysis seeks to decompose the effect of a treatment or exposure among multiple possible paths and provide casually interpretable path-specific effect estimates. Recent advances have extended causal mediation analysis to situations with a sequence of mediators or multiple contemporaneous mediators. However, available methods still have limitations, and computational and other challenges remain. The present paper provides an extended causal mediation and path analysis methodology. The new method, implemented in the new R package, gmediation (described in a companion paper), accommodates both a sequence (two stages) of mediators and multiple mediators at each stage, and allows for multiple types of outcomes following generalized linear models. The methodology can also handle unsaturated models and clustered data. Addressing other practical issues, we provide new guidelines for the choice of a decomposition, and for the choice of a reference group multiplier for the reduction of Monte Carlo error in mediation formula computations. The new method is applied to data from a cohort study to illuminate the contribution of alternative biological and behavioral paths in the effect of socioeconomic status on dental caries in adolescence.

  13. Vehicle technology under CO2 constraint: a general equilibrium analysis

    International Nuclear Information System (INIS)

    Schaefer, Andreas; Jacoby, Henry D.

    2006-01-01

    A study is presented of the rates of penetration of different transport technologies under policy constraints on CO 2 emissions. The response of this sector is analyzed within an overall national level of restriction, with a focus on automobiles, light trucks, and heavy freight trucks. Using the US as an example, a linked set of three models is used to carry out the analysis: a multi-sector computable general equilibrium model of the economy, a MARKAL-type model of vehicle and fuel supply technology, and a model simulating the split of personal and freight transport among modes. Results highlight the importance of incremental improvements in conventional internal combustion engine technology, and, in the absence of policies to overcome observed consumer discount rates, the very long time horizons before radical alternatives like the internal combustion engine hybrid drive train vehicle are likely to take substantial market share

  14. Stability analysis of embedded nonlinear predictor neural generalized predictive controller

    Directory of Open Access Journals (Sweden)

    Hesham F. Abdel Ghaffar

    2014-03-01

    Full Text Available The Nonlinear Predictor Neural Generalized Predictive Controller (NGPC) is one of the most advanced control techniques used with severely nonlinear processes. In this paper, a hybrid solution combining NGPC and the Internal Model Principle (IMP) is implemented to stabilize nonlinear, non-minimum-phase, variable-dead-time processes under high disturbance values over a wide range of operation. The superiority of NGPC over linear predictive controllers, such as GPC, is also demonstrated for severely nonlinear processes over a wide range of operation. The conditions necessary to stabilize NGPC are derived using Lyapunov stability analysis for nonlinear processes. The NGPC stability conditions and the improvement in disturbance suppression are verified both in simulation, using Duffing's nonlinear equation, and in real time, using a continuous stirred tank reactor. To our knowledge, this paper offers the first hardware-embedded neural GPC, which has been used to verify the NGPC-IMP improvement in real time.

  15. General analysis of slab lasers using geometrical optics.

    Science.gov (United States)

    Chung, Te-yuan; Bass, Michael

    2007-02-01

    A thorough and general geometrical optics analysis of a slab-shaped laser gain medium is presented. The length-to-thickness ratio is critical if one is to achieve the maximum utilization of absorbed pump power by the laser light in such a medium; e.g., the fill factor inside the slab is to be maximized. We point out that the conditions for a fill factor equal to 1, laser light entering and exiting parallel to the length of the slab, and Brewster angle incidence on the entrance and exit faces cannot all be satisfied at the same time. Deformed slabs are also studied. Deformation along the width direction of the largest surfaces is shown to significantly reduce the fill factor that is possible.

  16. Inverse dynamic analysis of general n-link robot manipulators

    International Nuclear Information System (INIS)

    Yih, T.C.; Wang, T.Y.; Burks, B.L.; Babcock, S.M.

    1996-01-01

    In this paper, a generalized matrix approach is derived to analyze the dynamic forces and moments (torques) required by the joint actuators. This method is general enough to solve the problems of any n-link open-chain robot manipulator with joint combinations of R (revolute), P (prismatic), and S (spherical). Moreover, the proposed matrix solution is applicable to both nonredundant and redundant robotic systems. The matrix notation is formulated based on the Newton-Euler equations under the condition of quasi-static equilibrium. The 4 x 4 homogeneous cylindrical coordinates-Bryant angles (C-B) notation is applied to model the robotic systems. Displacements, velocities, and accelerations of each joint and link center of gravity (CG) are calculated through kinematic analysis. The resultant external forces and moments exerted on the CG of each link are considered as known inputs. Subsequently, a 6n x 6n displacement coefficient matrix and a 6n x 1 external force/moment vector can be established. Finally, the joint forces and moments needed for the joint actuators to control the robotic system are determined through matrix inversion. Numerical examples are presented for the nonredundant industrial robots: Bendix AA/CNC (RRP/RRR) and Unimate 2000 spherical (SP/RRR) robots; and the redundant light duty utility arm (LDUA), modified LDUA, and tank waste retrieval manipulator system.
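    As a much smaller illustration of the quasi-static idea (not the paper's 6n x 6n cylindrical-coordinates/Bryant-angles formulation), the joint torques that balance a known external force at the end effector of a planar two-link arm follow from the Jacobian transpose; link lengths, angles and load below are arbitrary:

    ```python
    # Quasi-static joint torques for a planar RR arm: tau = J(q)^T * F_ext.
    import numpy as np

    def jacobian_2link(q1, q2, l1=1.0, l2=0.8):
        """Geometric Jacobian of the end-effector position for a planar RR arm."""
        s1, c1 = np.sin(q1), np.cos(q1)
        s12, c12 = np.sin(q1 + q2), np.cos(q1 + q2)
        return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                         [ l1 * c1 + l2 * c12,  l2 * c12]])

    q1, q2 = np.deg2rad(30), np.deg2rad(45)    # joint angles
    F_ext = np.array([0.0, -9.81 * 2.0])       # external force at the end effector [N]

    tau = jacobian_2link(q1, q2).T @ F_ext     # joint torques balancing F_ext
    print("joint torques [N m]:", tau)
    ```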

  17. Gerald: a general environment for radiation analysis and design

    International Nuclear Information System (INIS)

    Boyle, Ch.; Oliveira, P.I.E. de; Oliveira, C.R.E. de; Adams, M.L.; Galan, J.M.

    2005-01-01

    Full text of publication follows: This paper describes the status of the GERALD interactive workbench for the analysis of radiation transport problems. GERALD basically guides the user through the various steps that are necessary to solve a radiation transport problem, and is aimed at education, research and industry. The advantages of such a workbench are many: quality assurance of problem setup, interaction of the user with problem solution, preservation of theory and legacy research codes, and rapid prototyping and testing of new methods. The environment is of general applicability, catering for analytical, deterministic and stochastic analysis of the radiation problem, and is not tied to one specific solution method or code. However, GERALD is being developed as a portable, modular, open source framework which lends itself quite naturally to the coupling of existing computational tools through specifically developed plug-ins. By offering a common route for setting up, solving and analyzing radiation transport problems, GERALD offers the possibility of methods intercomparison and validation. Such a flexible radiation transport environment will also facilitate the coupling of radiation physics methods to other physical phenomena and their application to other areas such as medical physics and the environment. (authors)

  18. Generalization error analysis: deep convolutional neural network in mammography

    Science.gov (United States)

    Richter, Caleb D.; Samala, Ravi K.; Chan, Heang-Ping; Hadjiiski, Lubomir; Cha, Kenny

    2018-02-01

    We conducted a study to gain understanding of the generalizability of deep convolutional neural networks (DCNNs) given their inherent capability to memorize data. We examined empirically a specific DCNN trained for classification of masses on mammograms. Using a data set of 2,454 lesions from 2,242 mammographic views, a DCNN was trained to classify masses into malignant and benign classes using transfer learning from ImageNet LSVRC-2010. We performed experiments with varying amounts of label corruption and types of pixel randomization to analyze the generalization error for the DCNN. Performance was evaluated using the area under the receiver operating characteristic curve (AUC) with an N-fold cross validation. Comparisons were made between the convergence times, the inference AUCs for both the training set and the test set of the original image patches without corruption, and the root-mean-squared difference (RMSD) in the layer weights of the DCNN trained with different amounts and methods of corruption. Our experiments observed trends which revealed that the DCNN overfitted by memorizing corrupted data. More importantly, this study improved our understanding of DCNN weight updates when learning new patterns or new labels. Although we used a specific classification task with the ImageNet as example, similar methods may be useful for analysis of the DCNN learning processes, especially those that employ transfer learning for medical image analysis where sample size is limited and overfitting risk is high.

  19. Refined generalized multiscale entropy analysis for physiological signals

    Science.gov (United States)

    Liu, Yunxiao; Lin, Youfang; Wang, Jing; Shang, Pengjian

    2018-01-01

    Multiscale entropy analysis has become a prevalent complexity measure and has been successfully applied in various fields. However, it only takes into account the information of mean values (first moment) in the coarse-graining procedure. Generalized multiscale entropy (MSEn), which considers higher moments to coarse-grain a time series, was therefore proposed, and MSEσ² has been implemented. However, MSEσ² may sometimes yield an imprecise or undefined estimate of entropy, and the statistical reliability of the sample entropy estimate decreases as the scale factor increases. For this purpose, we developed the refined model, RMSEσ², to improve MSEσ². Simulations on both white noise and 1/f noise show that RMSEσ² provides higher entropy reliability and reduces the occurrence of undefined entropy, and is especially suitable for short time series. Besides, we discuss the effect of outliers, data loss and other signal-processing issues on RMSEσ² analysis. We apply the proposed model to evaluate the complexity of heartbeat interval time series derived from healthy young and elderly subjects, patients with congestive heart failure and patients with atrial fibrillation, respectively, compared with several popular complexity metrics. The results demonstrate that the complexity measured by RMSEσ² (a) decreases with aging and disease, and (b) gives significant discrimination between different physiological/pathological states, which may facilitate clinical application.
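    A rough sketch of the underlying construction (second-moment coarse-graining followed by sample entropy); this is the plain MSEσ² idea applied to simulated white noise, not the refined RMSEσ² estimator, and the tolerance r is set per coarse-grained series purely for brevity:

    ```python
    # Variance (second-moment) coarse-graining followed by sample entropy.
    import numpy as np

    def coarse_grain_variance(x, scale):
        """Variance of consecutive non-overlapping windows of length `scale`."""
        n = len(x) // scale
        return x[:n * scale].reshape(n, scale).var(axis=1)

    def sample_entropy(x, m=2, r=None):
        """Plain SampEn(m, r) with Chebyshev distance (r fixed here from this series)."""
        x = np.asarray(x, dtype=float)
        if r is None:
            r = 0.15 * x.std()
        def count_matches(dim):
            templates = np.array([x[i:i + dim] for i in range(len(x) - dim + 1)])
            count = 0
            for i in range(len(templates) - 1):
                dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                count += np.sum(dist <= r)
            return count
        b = count_matches(m)
        a = count_matches(m + 1)
        return np.inf if a == 0 or b == 0 else -np.log(a / b)

    rng = np.random.default_rng(1)
    signal = rng.normal(size=5000)              # white-noise example
    for scale in (2, 5, 10, 20):
        cg = coarse_grain_variance(signal, scale)
        print(scale, round(sample_entropy(cg), 3))
    ```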

  20. SNAP: A General Purpose Network Analysis and Graph Mining Library.

    Science.gov (United States)

    Leskovec, Jure; Sosič, Rok

    2016-10-01

    Large networks are becoming a widely used abstraction for studying complex systems in a broad set of disciplines, ranging from social network analysis to molecular biology and neuroscience. Despite an increasing need to analyze and manipulate large networks, only a limited number of tools are available for this task. Here, we describe Stanford Network Analysis Platform (SNAP), a general-purpose, high-performance system that provides easy to use, high-level operations for analysis and manipulation of large networks. We present SNAP functionality, describe its implementational details, and give performance benchmarks. SNAP has been developed for single big-memory machines and it balances the trade-off between maximum performance, compact in-memory graph representation, and the ability to handle dynamic graphs where nodes and edges are being added or removed over time. SNAP can process massive networks with hundreds of millions of nodes and billions of edges. SNAP offers over 140 different graph algorithms that can efficiently manipulate large graphs, calculate structural properties, generate regular and random graphs, and handle attributes and meta-data on nodes and edges. Besides being able to handle large graphs, an additional strength of SNAP is that networks and their attributes are fully dynamic, they can be modified during the computation at low cost. SNAP is provided as an open source library in C++ as well as a module in Python. We also describe the Stanford Large Network Dataset, a set of social and information real-world networks and datasets, which we make publicly available. The collection is a complementary resource to our SNAP software and is widely used for development and benchmarking of graph analytics algorithms.

  1. General tensor discriminant analysis and gabor features for gait recognition.

    Science.gov (United States)

    Tao, Dacheng; Li, Xuelong; Wu, Xindong; Maybank, Stephen J

    2007-10-01

    The traditional image representations are not suited to conventional classification methods, such as the linear discriminant analysis (LDA), because of the under-sample problem (USP): the dimensionality of the feature space is much higher than the number of training samples. Motivated by the successes of the two-dimensional LDA (2DLDA) for face recognition, we develop a general tensor discriminant analysis (GTDA) as a preprocessing step for LDA. The benefits of GTDA compared with existing preprocessing methods, e.g., principal component analysis (PCA) and 2DLDA, include 1) the USP is reduced in subsequent classification by, for example, LDA; 2) the discriminative information in the training tensors is preserved; and 3) GTDA provides stable recognition rates because the alternating projection optimization algorithm to obtain a solution of GTDA converges, while that of 2DLDA does not. We use human gait recognition to validate the proposed GTDA. The averaged gait images are utilized for gait representation. Given the popularity of Gabor function based image decompositions for image understanding and object recognition, we develop three different Gabor function based image representations: 1) the GaborD representation is the sum of Gabor filter responses over directions, 2) GaborS is the sum of Gabor filter responses over scales, and 3) GaborSD is the sum of Gabor filter responses over scales and directions. The GaborD, GaborS and GaborSD representations are applied to the problem of recognizing people from their averaged gait images. A large number of experiments were carried out to evaluate the effectiveness (recognition rate) of gait recognition based on first obtaining a Gabor, GaborD, GaborS or GaborSD image representation, then using GTDA to extract features and finally using LDA for classification. The proposed methods achieved good performance for gait recognition based on image sequences from the USF HumanID Database. Experimental comparisons are made with nine

  2. A sensory analysis of butter cookies: An application of generalized procrustes analysis

    DEFF Research Database (Denmark)

    Juhl, Hans Jørn

    1994-01-01

    Executive Summary: 1. A sensory analysis is one of the first steps in product development in the food industry. A thorough analysis of the results from such an analysis may give important input to the development process. 2. A sensory analysis on butter cookies is conducted in order to evaluate...... if some butter may be replaced by vegetable fat without a significant change in the sensory profile. The conclusion is that the replacement is possible without a considerable change in the sensory profile. 3. Generalized Procrustes Analysis is used to analyze the results. It is a relatively new technique...

  3. Creating Lasting Behavioral Change through the Generalization Analysis Worksheet

    Science.gov (United States)

    Brady, John; Kotkin, Ron

    2011-01-01

    The goal of any behavioral program is to facilitate lasting change. A significant criticism of behavioral programs is that they work in the clinical setting but do not generalize once the clinical program is stopped. The authors suggest that behavioral programs often do not generalize because clinicians fail to plan for generalization to occur…

  4. Sampling in forests for radionuclide analysis. General and practical guidance

    Energy Technology Data Exchange (ETDEWEB)

    Aro, Lasse (Finnish Forest Research Inst. (METLA) (Finland)); Plamboeck, Agneta H. (Swedish Defence Research Agency (FOI) (Sweden)); Rantavaara, Aino; Vetikko, Virve (Radiation and Nuclear Safety Authority (STUK) (Finland)); Straalberg, Elisabeth (Inst. Energy Technology (IFE) (Norway))

    2009-01-15

    The NKS project FOREST was established to prepare a guide for sampling in forest ecosystems for radionuclide analysis. The aim of this guide is to improve the reliability of datasets generated in future studies by promoting the use of consistent, recommended practices, thorough documentation of field sampling regimes and robust preparation of samples from the forest ecosystem. The guide covers general aims of sampling, the description of major compartments of the forest ecosystem and outlines key factors to consider when planning sampling campaigns for radioecological field studies in forests. Recommended and known sampling methods for various sample types are also compiled and presented. The guide focuses on sampling practices that are applicable in various types of boreal forests, robust descriptions of sampling sites, and documentation of the origin and details of individual samples. The guide is intended for scientists, students, forestry experts and technicians who appreciate the need to use sound sampling procedures in forest radioecological projects. The guide will hopefully encourage readers to participate in field studies and sampling campaigns, using robust techniques, thereby fostering competence in sampling. (au)

  5. Sampling in forests for radionuclide analysis. General and practical guidance

    International Nuclear Information System (INIS)

    Aro, Lasse; Plamboeck, Agneta H.; Rantavaara, Aino; Vetikko, Virve; Straelberg, Elisabeth

    2009-01-01

    The NKS project FOREST was established to prepare a guide for sampling in forest ecosystems for radionuclide analysis. The aim of this guide is to improve the reliability of datasets generated in future studies by promoting the use of consistent, recommended practices, thorough documentation of field sampling regimes and robust preparation of samples from the forest ecosystem. The guide covers general aims of sampling, the description of major compartments of the forest ecosystem and outlines key factors to consider when planning sampling campaigns for radioecological field studies in forests. Recommended and known sampling methods for various sample types are also compiled and presented. The guide focuses on sampling practices that are applicable in various types of boreal forests, robust descriptions of sampling sites, and documentation of the origin and details of individual samples. The guide is intended for scientists, students, forestry experts and technicians who appreciate the need to use sound sampling procedures in forest radioecological projects. The guide will hopefully encourage readers to participate in field studies and sampling campaigns, using robust techniques, thereby fostering competence in sampling. (au)

  6. Admixture analysis of age of onset in generalized anxiety disorder.

    Science.gov (United States)

    Rhebergen, Didi; Aderka, Idan M; van der Steenstraten, Ira M; van Balkom, Anton J L M; van Oppen, Patricia; Stek, Max L; Comijs, Hannie C; Batelaan, Neeltje M

    2017-08-01

    Age of onset is a marker of clinically relevant subtypes in various medical and psychiatric disorders. Past research has also reported that age of onset in generalized anxiety disorder (GAD) is clinically significant; but, in research to date, arbitrary cut-off ages have been used. In the present study, admixture analysis was used to determine the best fitting model for age of onset distribution in GAD. Data were derived from 459 adults with a diagnosis of GAD who took part in the Netherlands Study of Depression and Anxiety (NESDA). Associations between age of onset subtypes, identified by admixture analysis, and sociodemographic, clinical, and vulnerability factors were examined using univariate tests and multivariate logistic regression analyses. Two age of onset distributions were identified: an early-onset group (24 years of age and younger) and a late-onset group (greater than 24 years of age). Multivariate analysis revealed that early-onset GAD was associated with female gender (OR 2.1 (95%CI 1.4-3.2)), higher education (OR 1.1 (95%CI 1.0-1.2)), and higher neuroticism (OR 1.4 (95%CI 1.1-1.7)), while late-onset GAD was associated with physical illnesses (OR 1.3 (95%CI 1.1-1.7)). Study limitations include the possibility of recall bias given that age of onset was assessed retrospectively, and an inability to detect a possible very-late-onset GAD subtype. Collectively, the results of the study indicate that GAD is characterized by a bimodal age of onset distribution with an objectively determined early cut-off at 24 years of age. Early-onset GAD is associated with unique factors that may contribute to its aetiology; but, it does not constitute a more severe subtype compared to late-onset GAD. Future research should use 24 years of age as the cut-off for early-onset GAD to when examining the clinical relevance of age of onset for treatment efficacy and illness course. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Time spent in outdoor activities in relation to myopia prevention and control: a meta-analysis and systematic review.

    Science.gov (United States)

    Xiong, Shuyu; Sankaridurg, Padmaja; Naduvilath, Thomas; Zang, Jiajie; Zou, Haidong; Zhu, Jianfeng; Lv, Minzhi; He, Xiangui; Xu, Xun

    2017-09-01

    Outdoor time is considered to reduce the risk of developing myopia. The purpose is to evaluate the evidence for an association between time outdoors and (1) risk of onset of myopia (incident/prevalent myopia); (2) risk of a myopic shift in refractive error; and (3) risk of progression in myopes only. A systematic review followed by a meta-analysis and a dose-response analysis of relevant evidence from the literature was conducted. PubMed, EMBASE and the Cochrane Library were searched for relevant papers. Of the 51 articles with relevant data, 25 were included in the meta-analysis and dose-response analysis. Twenty-three of the 25 articles involved children. Risk ratios (RR) for binary variables and weighted mean differences (WMD) for continuous variables were calculated. A Mantel-Haenszel random-effects model was used to pool the data for meta-analysis. Statistical heterogeneity was assessed using the I² test, with I² ≥ 50% considered to indicate high heterogeneity. Additionally, subgroup analyses (based on participants' age, prevalence of myopia and study type) and sensitivity analyses were conducted. A significant protective effect of outdoor time was found for incident myopia (clinical trials: risk ratio (RR) = 0.536, 95% confidence interval (CI) = 0.338 to 0.850; longitudinal cohort studies: RR = 0.574, 95% CI = 0.395 to 0.834) and prevalent myopia (cross-sectional studies: OR = 0.964, 95% CI = 0.945 to 0.982). With dose-response analysis, an inverse nonlinear relationship was found, with increased time outdoors reducing the risk of incident myopia. Also, pooled results from clinical trials indicated that when outdoor time was used as an intervention, there was a reduced myopic shift of -0.30 D (in both myopes and nonmyopes) compared with the control group (WMD = -0.30, 95% CI = -0.18 to -0.41) after 3 years of follow-up. However, when only myopes were considered, dose-response analysis did not find a relationship between time outdoors and myopic
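    For the pooling step, a fixed-effect Mantel-Haenszel combination of risk ratios can be written in a few lines (sketch with invented 2x2 counts, not the review's data; the random-effects weighting used in the paper adds a between-study variance component on top of this):

    ```python
    # Mantel-Haenszel pooled risk ratio across studies (hypothetical counts).
    import numpy as np

    # columns: events_exposed, n_exposed, events_control, n_control
    studies = np.array([
        [12, 100, 25, 100],
        [ 8,  80, 15,  75],
        [20, 150, 31, 140],
    ], dtype=float)

    a, n1, c, n0 = studies.T
    N = n1 + n0
    rr_mh = np.sum(a * n0 / N) / np.sum(c * n1 / N)   # MH pooled risk ratio
    print(f"MH pooled RR = {rr_mh:.3f}")
    ```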

  8. Nonparametric tests for equality of psychometric functions.

    Science.gov (United States)

    García-Pérez, Miguel A; Núñez-Antón, Vicente

    2017-12-07

    Many empirical studies measure psychometric functions (curves describing how observers' performance varies with stimulus magnitude) because these functions capture the effects of experimental conditions. To assess these effects, parametric curves are often fitted to the data and comparisons are carried out by testing for equality of mean parameter estimates across conditions. This approach is parametric and, thus, vulnerable to violations of the implied assumptions. Furthermore, testing for equality of means of parameters may be misleading: Psychometric functions may vary meaningfully across conditions on an observer-by-observer basis with no effect on the mean values of the estimated parameters. Alternative approaches to assess equality of psychometric functions per se are thus needed. This paper compares three nonparametric tests that are applicable in all situations of interest: The existing generalized Mantel-Haenszel test, a generalization of the Berry-Mielke test that was developed here, and a split variant of the generalized Mantel-Haenszel test also developed here. Their statistical properties (accuracy and power) are studied via simulation and the results show that all tests are indistinguishable as to accuracy but they differ non-uniformly as to power. Empirical use of the tests is illustrated via analyses of published data sets and practical recommendations are given. The computer code in MATLAB and R to conduct these tests is available as Electronic Supplemental Material.
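    For reference, the classical 2x2xK Mantel-Haenszel test (the simplest special case of the generalized test discussed here) is readily available in Python; the counts below are invented, and this is not the authors' MATLAB/R code:

    ```python
    # Classical Mantel-Haenszel test for a common odds ratio across K strata.
    import numpy as np
    from statsmodels.stats.contingency_tables import StratifiedTable

    # One 2x2 table per stimulus level: rows = condition A/B, cols = correct/incorrect.
    tables = [np.array([[30, 20], [22, 28]]),
              np.array([[40, 10], [33, 17]]),
              np.array([[46,  4], [41,  9]])]

    st = StratifiedTable(tables)
    result = st.test_null_odds()     # MH chi-square test of a common odds ratio of 1
    print("pooled OR:", round(st.oddsratio_pooled, 3))
    print("statistic:", round(result.statistic, 3), "p-value:", round(result.pvalue, 4))
    ```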

  9. Analysis of Financial Ratio to Distinguish Indonesia Joint Venture General Insurance Company Performance using Discriminant Analysis

    Directory of Open Access Journals (Sweden)

    Subiakto Soekarno

    2012-01-01

    Full Text Available The insurance industry stands as a service business that plays a significant role in Indonesia's economic condition. The development of the insurance industry in Indonesia, both general insurance and life insurance, has increased very fast. The general insurance industry itself is divided into two major players, which are local private companies and Joint Venture Companies. Lately, the use of statistical techniques and financial ratio models to assess financial institutions such as insurance companies has become one of the appropriate combinations in predicting the performance of an industry. This research aims to distinguish between Joint Venture General Insurance Companies that have a good performance and those that are performing less well, using Discriminant Analysis. Further, the findings show that Discriminant Analysis is able to distinguish Joint Venture General Insurance Companies that have a good performance from those that are not performing well. There are also six ratios, which are RBC, Technical Reserve to Investment Ratio, Debt Ratio, Return on Equity, Loss Ratio, and Expense Ratio, that stand as the most influential ratios to distinguish the performance of joint venture general insurance companies. In addition, the results suggest business people be concerned with those six ratios to increase their companies' performance. Key words: general insurance, financial ratio, discriminant analysis

  10. Analysis of generalized Schwarz alternating procedure for domain decomposition

    Energy Technology Data Exchange (ETDEWEB)

    Engquist, B.; Zhao, Hongkai [Univ. of California, Los Angeles, CA (United States)

    1996-12-31

    The Schwarz alternating method (SAM) is the theoretical basis for domain decomposition, which itself is a powerful tool both for parallel computation and for computing in complicated domains. The convergence rate of the classical SAM is very sensitive to the size of the overlap between subdomains, which is not desirable for most applications. We propose a generalized SAM procedure which is an extension of the modified SAM proposed by P.-L. Lions. Instead of using only Dirichlet data at the artificial boundary between subdomains, we take a convex combination of u and ∂u/∂n, i.e. ∂u/∂n + Λu, where Λ is some "positive" operator. Convergence of the modified SAM without overlapping in a quite general setting has been proven by P.-L. Lions using delicate energy estimates. Important questions remain for the generalized SAM. (1) What is the most essential mechanism for convergence without overlapping? (2) Given the partial differential equation, what is the best choice for the positive operator Λ? (3) In the overlapping case, is the generalized SAM superior to the classical SAM? (4) What is the convergence rate and what does it depend on? (5) Numerically, can we obtain an easy-to-implement operator Λ such that the convergence is independent of the mesh size? To analyze the convergence of the generalized SAM we focus, for simplicity, on the Poisson equation for two typical geometries in the two-subdomain case.
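    For intuition, a minimal sketch of the classical (overlapping, Dirichlet-data) alternating procedure on the 1D Poisson problem is given below; it illustrates the overlap-driven convergence that the generalized SAM with ∂u/∂n + Λu transmission data aims to do without, and the grid, overlap and iteration count are arbitrary choices:

    ```python
    # Classical Schwarz alternating method for -u'' = 1 on (0,1), u(0) = u(1) = 0,
    # with two overlapping subdomains and Dirichlet data exchanged at the interfaces.
    import numpy as np

    n = 101                                  # global grid points
    h = 1.0 / (n - 1)
    x = np.linspace(0.0, 1.0, n)
    f = np.ones(n)                           # right-hand side f(x) = 1
    u = np.zeros(n)                          # current global iterate

    def solve_dirichlet(fv, left, right):
        """Solve -u'' = f on a subgrid with given Dirichlet end values (dense solve)."""
        m = len(fv)
        A = (np.diag(np.full(m - 2, 2.0)) +
             np.diag(np.full(m - 3, -1.0), 1) +
             np.diag(np.full(m - 3, -1.0), -1)) / h**2
        rhs = fv[1:-1].copy()
        rhs[0] += left / h**2
        rhs[-1] += right / h**2
        out = np.empty(m)
        out[0], out[-1] = left, right
        out[1:-1] = np.linalg.solve(A, rhs)
        return out

    i1, i2 = 60, 40                          # subdomain 1: [0, x[i1]]; subdomain 2: [x[i2], 1]
    for it in range(30):
        u[:i1 + 1] = solve_dirichlet(f[:i1 + 1], 0.0, u[i1])   # sweep on subdomain 1
        u[i2:] = solve_dirichlet(f[i2:], u[i2], 0.0)           # sweep on subdomain 2

    exact = 0.5 * x * (1.0 - x)              # analytic solution for f = 1
    print("max error after Schwarz iterations:", np.max(np.abs(u - exact)))
    ```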

  11. An applied general equilibrium model for Dutch agribusiness policy analysis

    NARCIS (Netherlands)

    Peerlings, J.

    1993-01-01

    The purpose of this thesis was to develop a basic static applied general equilibrium (AGE) model to analyse the effects of agricultural policy changes on Dutch agribusiness. In particular the effects on inter-industry transactions, factor demand, income, and trade are of

  12. Municipal solid waste management problems: an applied general equilibrium analysis

    NARCIS (Netherlands)

    Bartelings, H.

    2003-01-01

    Keywords: Environmental policy; General equilibrium modeling; Negishi format; Waste management policies; Market distortions.

    About 40% of the entire budget spent on environmental problems in the

  13. Specific Cooperative Analysis and Design in General Hypermedia Development

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Mogensen, Preben Holst

    1994-01-01

    activities. We demonstrate how these activities informed the general hypermedia framework and application design. Use scenarios and prototypes with example data from the users’ daily work were used as sources both to trigger design ideas and new insights regarding work practice. Mutual challenging...

  14. Analysis and design of generalized BICM-T system

    KAUST Repository

    Malik, Muhammad Talha; Hossain, Md Jahangir; Alouini, Mohamed-Slim

    2014-01-01

    -T). In this paper, we analyze a generalized BICM-T system that uses a nonequally spaced signal constellation in conjunction with a bit-level multiplexer in an additive white Gaussian noise (AWGN) channel. As such, one can exploit the full benefit of BICM

  15. Meta-analysis in a nutshell: Techniques and general findings

    DEFF Research Database (Denmark)

    Paldam, Martin

    2015-01-01

    The purpose of this article is to introduce the technique and main findings of meta-analysis to the reader, who is unfamiliar with the field and has the usual objections. A meta-analysis is a quantitative survey of a literature reporting estimates of the same parameter. The funnel showing...

  16. 40 CFR 265.13 - General waste analysis.

    Science.gov (United States)

    2010-07-01

    ... 265.13 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED... waste analysis requirements for specific waste management methods as specified in §§ 265.200, 265.225... analysis of test data; and, (iii) The annual removal of residues which are not delisted under § 260.22 -of...

  17. Scalable Kernel Methods and Algorithms for General Sequence Analysis

    Science.gov (United States)

    Kuksa, Pavel

    2011-01-01

    Analysis of large-scale sequential data has become an important task in machine learning and pattern recognition, inspired in part by numerous scientific and technological applications such as the document and text classification or the analysis of biological sequences. However, current computational methods for sequence comparison still lack…

  18. Practical likelihood analysis for spatial generalized linear mixed models

    DEFF Research Database (Denmark)

    Bonat, W. H.; Ribeiro, Paulo Justiniano

    2016-01-01

    We investigate an algorithm for maximum likelihood estimation of spatial generalized linear mixed models based on the Laplace approximation. We compare our algorithm with a set of alternative approaches for two datasets from the literature. The Rhizoctonia root rot and the Rongelap are......, respectively, examples of binomial and count datasets modeled by spatial generalized linear mixed models. Our results show that the Laplace approximation provides similar estimates to Markov Chain Monte Carlo likelihood, Monte Carlo expectation maximization, and modified Laplace approximation. Some advantages...... of Laplace approximation include the computation of the maximized log-likelihood value, which can be used for model selection and tests, and the possibility to obtain realistic confidence intervals for model parameters based on profile likelihoods. The Laplace approximation also avoids the tuning...

  19. [Robotics in general surgery: personal experience, critical analysis and prospectives].

    Science.gov (United States)

    Fracastoro, Gerolamo; Borzellino, Giuseppe; Castelli, Annalisa; Fiorini, Paolo

    2005-01-01

    Today mini invasive surgery has the chance to be enhanced with sophisticated informative systems (Computer Assisted Surgery, CAS) like robotics, tele-mentoring and tele-presence. ZEUS and da Vinci, present in more than 120 Centres in the world, have been used in many fields of surgery and have been tested in some general surgical procedures. Since the end of 2003, we have performed 70 experimental procedures and 24 operations of general surgery with ZEUS robotic system, after having properly trained 3 surgeons and the operating room staff. Apart from the robot set-up, the mean operative time of the robotic operations was similar to the laparoscopic ones; no complications due to robotic technique occurred. The Authors report benefits and disadvantages related to robots' utilization, problems still to be solved and the possibility to make use of them with tele-surgery, training and virtual surgery.

  20. World Oil Price and Biofuels : A General Equilibrium Analysis

    OpenAIRE

    Timilsina, Govinda R.; Mevel, Simon; Shrestha, Ashish

    2011-01-01

    The price of oil could play a significant role in influencing the expansion of biofuels. However, this issue has not been fully investigated yet in the literature. Using a global computable general equilibrium model, this study analyzes the impact of oil price on biofuel expansion, and subsequently, on food supply. The study shows that a 65 percent increase in oil price in 2020 from the 20...

  1. Generalized Aliasing as a Basis for Program Analysis Tools

    National Research Council Canada - National Science Library

    O'Callahan, Robert

    2000-01-01

    .... This dissertation describes the design of a system, Ajax, that addresses this problem by using semantics-based program analysis as the basis for a number of different tools to aid Java programmers...

  2. Funcionamiento diferencial del item en la evaluación internacional PISA. Detección y comprensión. [Differential Item Functioning in the PISA Project: Detection and Understanding

    Directory of Open Access Journals (Sweden)

    Paula Elosua

    2006-08-01

    Full Text Available This report analyses differential item functioning (DIF) in PISA 2000 (Programme for International Student Assessment). The items studied come from the Reading Comprehension Test. We analyzed the items released from that cycle because we wanted to combine the detection of DIF with the understanding of its causes. The reference group is the United Kingdom sample and the focal group is the Spanish sample. The detection procedures are Mantel-Haenszel, logistic regression and the standardized mean difference, together with their extensions for polytomous items. Two items were flagged, and the post-hoc analysis did not entirely explain the causes of the DIF. This study analyses differential item functioning (DIF) in the reading comprehension test of the PISA 2000 assessment between the United Kingdom and Spanish samples. The released items are studied in order to combine the DIF detection phase with the understanding of its causes. In the detection phase, the results of the Mantel-Haenszel, logistic regression and standardized mean difference procedures, in their versions for dichotomous and polytomous items, are compared. The results show that two items exhibit differential functioning, although the post-hoc study of their content could not pinpoint its causes.
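    As an illustration of one of the named detection procedures, a uniform-DIF screen by logistic regression regresses the item response on the matching criterion and a group indicator and inspects the group term; the simulated data and effect sizes below are hypothetical, not PISA data:

    ```python
    # Logistic-regression DIF screen on simulated item responses.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 1000
    group = rng.binomial(1, 0.5, n)                   # 0 = reference, 1 = focal
    ability = rng.normal(size=n)
    total = ability + rng.normal(scale=0.5, size=n)   # matching criterion (e.g., rest score)
    # The item is harder for the focal group at equal ability -> uniform DIF.
    p = 1.0 / (1.0 + np.exp(-(ability - 0.7 * group)))
    item = rng.binomial(1, p)

    X = sm.add_constant(np.column_stack([total, group]))
    fit = sm.Logit(item, X).fit(disp=0)
    print("group coefficient:", round(fit.params[2], 3),
          "p-value:", round(fit.pvalues[2], 4))
    ```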

  3. Chronic hypertension and the risk for adverse pregnancy outcome after superimposed pre-eclampsia.

    Science.gov (United States)

    Vanek, M; Sheiner, E; Levy, A; Mazor, M

    2004-07-01

    To determine the risk factors and pregnancy outcome of patients with chronic hypertension during pregnancy after controlling for superimposed preeclampsia. A comparison of all singleton term (>36 weeks) deliveries occurring between 1988 and 1999, with and without chronic hypertension, was performed. Stratified analyses, using the Mantel-Haenszel technique, and a multiple logistic regression model were performed to control for confounders. Chronic hypertension complicated 1.6% (n=1807) of all deliveries included in the study (n=113156). Using a multivariable analysis, the following factors were found to be independently associated with chronic hypertension: maternal age >40 years (OR=3.1; 95% CI 2.7-3.6), diabetes mellitus (OR=3.6; 95% CI 3.3-4.1), recurrent abortions (OR=1.5; 95% CI 1.3-1.8), infertility treatment (OR=2.9; 95% CI 2.3-3.7), and previous cesarean delivery (CD; OR=1.8; 95% CI 1.6-2.0). After adjustment for superimposed preeclampsia, using the Mantel-Haenszel technique, pregnancies complicated with chronic hypertension had higher rates of CD (OR=2.7; 95% CI 2.4-3.0), intrauterine growth restriction (OR=1.7; 95% CI 1.3-2.2), perinatal mortality (OR=1.6; 95% CI 1.01-2.6) and post-partum hemorrhage (OR=2.2; 95% CI 1.4-3.7). Chronic hypertension is associated with adverse pregnancy outcome, regardless of superimposed preeclampsia.

  4. Using general-purpose compression algorithms for music analysis

    DEFF Research Database (Denmark)

    Louboutin, Corentin; Meredith, David

    2016-01-01

    General-purpose compression algorithms encode files as dictionaries of substrings with the positions of these strings’ occurrences. We hypothesized that such algorithms could be used for pattern discovery in music. We compared LZ77, LZ78, Burrows–Wheeler and COSIATEC on classifying folk song...... in the input data, COSIATEC outperformed LZ77 with a mean F1 score of 0.123, compared with 0.053 for LZ77. However, when the music was processed a voice at a time, the F1 score for LZ77 more than doubled to 0.124. We also discovered a significant correlation between compression factor and F1 score for all...
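    A quick way to experiment with the underlying idea is the normalized compression distance computed with any general-purpose (Deflate/LZ77-style) compressor; the toy "melodies" below are invented strings and this is not the study's COSIATEC pipeline:

    ```python
    # Normalized compression distance between two byte strings using zlib.
    import zlib

    def csize(data: bytes) -> int:
        return len(zlib.compress(data, 9))

    def ncd(a: bytes, b: bytes) -> float:
        return (csize(a + b) - min(csize(a), csize(b))) / max(csize(a), csize(b))

    melody_a = b"C D E F G A B C " * 8
    melody_b = b"C D E F G A B C " * 7 + b"C D E G G A B C "
    melody_c = b"F# A C# E G# B D F " * 8

    print("a vs b:", round(ncd(melody_a, melody_b), 3))   # similar -> smaller distance
    print("a vs c:", round(ncd(melody_a, melody_c), 3))   # different -> larger distance
    ```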

  5. Probabilistic structural analysis using a general purpose finite element program

    Science.gov (United States)

    Riha, D. S.; Millwater, H. R.; Thacker, B. H.

    1992-07-01

    This paper presents an accurate and efficient method to predict the probabilistic response for structural response quantities, such as stress, displacement, natural frequencies, and buckling loads, by combining the capabilities of MSC/NASTRAN, including design sensitivity analysis and fast probability integration. Two probabilistic structural analysis examples have been performed and verified by comparison with Monte Carlo simulation of the analytical solution. The first example consists of a cantilevered plate with several point loads. The second example is a probabilistic buckling analysis of a simply supported composite plate under in-plane loading. The coupling of MSC/NASTRAN and fast probability integration is shown to be orders of magnitude more efficient than Monte Carlo simulation with excellent accuracy.
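    For contrast with the fast-probability-integration approach, the brute-force alternative is plain Monte Carlo over the random inputs; the cantilever stress model, distributions and limits below are entirely hypothetical:

    ```python
    # Monte Carlo estimate of P(max bending stress > yield) for a cantilever tip load.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 200_000
    P = rng.normal(9_000.0, 1_500.0, n)       # tip load [N], uncertain
    L = 2.0                                   # beam length [m]
    b = 0.05                                  # section width [m]
    h = rng.normal(0.10, 0.003, n)            # section height [m], uncertain
    sigma_yield = 250e6                       # yield stress [Pa]

    sigma = 6.0 * P * L / (b * h**2)          # max bending stress at the fixed end
    p_fail = np.mean(sigma > sigma_yield)
    print(f"estimated P(stress > yield) = {p_fail:.4f}")
    ```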

  6. General Nature of Multicollinearity in Multiple Regression Analysis.

    Science.gov (United States)

    Liu, Richard

    1981-01-01

    Discusses multiple regression, a very popular statistical technique in the field of education. One of the basic assumptions in regression analysis requires that independent variables in the equation should not be highly correlated. The problem of multicollinearity and some of the solutions to it are discussed. (Author)
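    A common way to screen for the problem is the variance inflation factor of each predictor (a VIF above roughly 5-10 is often taken as a warning sign); the simulated predictors below are illustrative:

    ```python
    # Variance inflation factors for a design with two highly correlated predictors.
    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    rng = np.random.default_rng(4)
    n = 500
    x1 = rng.normal(size=n)
    x2 = 0.95 * x1 + 0.3 * rng.normal(size=n)   # strongly correlated with x1
    x3 = rng.normal(size=n)                     # independent predictor

    X = sm.add_constant(np.column_stack([x1, x2, x3]))
    for i, name in zip(range(1, 4), ["x1", "x2", "x3"]):
        print(name, round(variance_inflation_factor(X, i), 2))
    ```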

  7. 40 CFR 264.13 - General waste analysis.

    Science.gov (United States)

    2010-07-01

    ... 264.13 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED... waste management methods as specified in §§ 264.17, 264.314, 264.341, 264.1034(d), 264.1063(d), 264.1083... analysis of test data; and, (iii) The annual removal of residues which are not delisted under § 260.22 of...

  8. Seismic risk analysis for General Electric Plutonium Facility, Pleasanton, California

    International Nuclear Information System (INIS)

    1978-01-01

    This report presents the results of a seismic risk analysis that focuses on all possible sources of seismic activity, with the exception of the postulated Verona Fault. The best estimate curve indicates that the Vallecitos facility will experience 30% g with a return period of roughly 130 years and 60% g with a return period of roughly 700 years
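    The quoted return periods translate directly into exceedance probabilities; the 50-year exposure time in this small sketch is an assumed figure, not one from the report:

    ```python
    # Return period vs. exceedance probability for the quoted ground-motion levels.
    return_periods = {"0.30 g": 130.0, "0.60 g": 700.0}   # years, from the abstract
    lifetime = 50.0                                       # assumed exposure time [years]

    for level, rp in return_periods.items():
        annual = 1.0 / rp
        over_lifetime = 1.0 - (1.0 - annual) ** lifetime
        print(f"{level}: annual P = {annual:.4f}, "
              f"P over {lifetime:.0f} yr = {over_lifetime:.2f}")
    ```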

  9. A GENERALIZATION OF TRADITIONAL KANO MODEL FOR CUSTOMER REQUIREMENTS ANALYSIS

    Directory of Open Access Journals (Sweden)

    Renáta Turisová

    2015-07-01

    Full Text Available Purpose: The theory of attractiveness determines the relationship between the technically achieved and the customer-perceived quality of product attributes. The most frequently used approach in the theory of attractiveness is the implementation of Kano's model. There exist many generalizations of that model which take into consideration various aspects and approaches focused on understanding customer preferences and identifying their priorities for a selling product. The aim of this article is to outline another possible generalization of Kano's model. Methodology/Approach: The traditional Kano model captures the nonlinear relationship between the achieved attributes of quality and customer requirements. The individual attributes of quality are divided into three main categories: must-be, one-dimensional and attractive quality, and into two side categories: indifferent and reverse quality. A well-selling product has to contain the must-be attributes and should contain as many one-dimensional attributes as possible. If there are also supplementary attractive attributes, the attractiveness of the entire product, from the viewpoint of the customer, rises sharply and nonlinearly, which has a direct positive impact on the decision of a potential customer when purchasing the product. In this article, we show that the assignment of individual quality attributes of a product to the mentioned categories depends, among other things, on the life-cycle costs of the product, or respectively on the market price of the product. Findings: In practice, we often encounter the assignment of products to different price categories: lower, middle and upper class. For a certain type of product the category is either directly declared by the producer (especially in the automotive industry), or is determined by the customer by means of an assessment of available market prices. To each of those groups of products different customer expectations can be assigned

  10. The English version of the four-dimensional symptom questionnaire (4DSQ) measures the same as the original Dutch questionnaire: a validation study.

    Science.gov (United States)

    Terluin, Berend; Smits, Niels; Miedema, Baukje

    2014-12-01

    Translations of questionnaires need to be carefully validated to assure that the translation measures the same construct(s) as the original questionnaire. The four-dimensional symptom questionnaire (4DSQ) is a Dutch self-report questionnaire measuring distress, depression, anxiety and somatization. The aim was to evaluate the equivalence of the English version of the 4DSQ. 4DSQ data of English- and Dutch-speaking general practice attendees were analysed and compared. The English-speaking group consisted of 205 general practice attendees, aged 18-64 years, in Canada, whereas the Dutch group consisted of 302 general practice attendees in the Netherlands. Differential item functioning (DIF) analysis was conducted using the Mantel-Haenszel method and ordinal logistic regression. Differential test functioning (DTF; i.e., the scale impact of DIF) was evaluated using linear regression analysis. DIF was detected in 2/16 distress items, 2/6 depression items, 2/12 anxiety items, and 1/16 somatization items. With respect to mean scale scores, the impact of DIF on the scale level was negligible for all scales. On the anxiety scale, DIF caused the English-speaking patients with moderate to severe anxiety to score about one point lower than Dutch patients with the same anxiety level. The English 4DSQ measures the same constructs as the original Dutch 4DSQ. The distress, depression and somatization scales can employ the same cut-off points as the corresponding Dutch scales. However, cut-off points of the English 4DSQ anxiety scale should be lowered by one point to retain the same meaning as the Dutch anxiety cut-off points.

  11. AN ANALYSIS OF MALIGNANCIES PRESENTING AS ACUTE GENERAL SURGICAL EMERGENCIES

    Directory of Open Access Journals (Sweden)

    Kannan Ross

    2017-02-01

    Full Text Available BACKGROUND Malignancies presenting in the setting of acute general surgical emergencies are rare. The common emergency presentations of malignancy to the general surgeon are perforation, obstruction, haemorrhage or urinary retention. Though their incidence is low when compared to benign conditions with the same clinical presentations, they should never be neglected. The general surgeon must be aware of such presentations and thereby decide the management and follow-up according to the malignancy encountered in the operating theatre. The management should aim at radical procedures and regular follow-up, if needed with chemotherapy or radiotherapy, and the surgeon should also be well informed of the morbidity and mortality following intervention, considering the malignancy grade, the age of the patient, the duration of presentation and co-morbid conditions. MATERIALS AND METHODS In this study, we consider all patients taken up for emergency operative procedures, study their findings in the operating theatre, correlate these with the biopsy report for any malignancy, follow them up during the immediate postoperative period (<30 days) and the late postoperative period beyond the procedure, and report the incidence, common modes of presentation, malignancies encountered, age and sex distribution, and the perioperative morbidity and mortality rates of those malignancies. RESULTS The incidence of malignancies presenting as acute abdominal emergencies in this study was found to be around 8.27%. The number of males who presented with such malignancies significantly outnumbered females, in the ratio 1.6:1. Among the malignancies, gastric (25%) and colonic malignancies (59.38%) were the most common. Perforation was the only presentation as an acute emergency in carcinoma of the stomach. The incidence of malignancy in gastric perforation was 57.14%, compared to the report by Emer Ergul et al that about 10-16% of all gastric perforations are caused by gastric carcinoma. 11 Perioperative

  12. Geal: A general program for the analysis of alpha spectra

    International Nuclear Information System (INIS)

    Garcia-Torano, E.; Acena Barrenechea, M.L.

    1978-01-01

    A computer program for the analysis and representation of alpha spectra obtained with surface barrier detectors is described. Several methods for fitting spectra are studied. A monoenergetic line or a doublet previously fitted has been used as a standard for the analysis of all kinds of spectra. Some examples of application, as well as a listing of the program, are shown. The program has been written in the Fortran V language. (author)

  13. Association between incision technique for hamstring tendon harvest in anterior cruciate ligament reconstruction and the risk of injury to the infra-patellar branch of the saphenous nerve: a meta-analysis.

    Science.gov (United States)

    Grassi, Alberto; Perdisa, Francesco; Samuelsson, Kristian; Svantesson, Eleonor; Romagnoli, Matteo; Raggi, Federico; Gaziano, Teide; Mosca, Massimiliano; Ayeni, Olufemi; Zaffagnini, Stefano

    2018-02-08

    To determine how the incision technique for hamstring tendon (HT) harvest in anterior cruciate ligament (ACL) reconstruction affects the risk of injury to the IPBSN and clinical outcome. A systematic literature search of the MEDLINE/Pubmed, Cochrane Central Register of Controlled Trials (CENTRAL) and EBSCOhost electronic databases and clinicaltrials.gov for unpublished studies was performed to identify comparative studies investigating injury to the IPBSN after HT ACL reconstruction by comparing at least two different incision techniques. Data were extracted for the number of patients with evidence of any neurologic deficit corresponding to injury to the IPBSN, area of sensory deficit, the Lysholm score and patient satisfaction. The mean difference (MD) in study outcome between incision groups was assessed. The relative risk (RR) and the number needed to treat (NNT) were calculated. The Chi-square and Higgins' I 2 tests were applied to test heterogeneity. Data were pooled using a Mantel-Haenszel random-effects model if the statistical heterogeneity was > 50% and a fixed-effects model if the statistical heterogeneity was < 50%. The risk of bias was evaluated according to the Cochrane Database questionnaire and the quality of evidence was graded according to the Grading of Recommendations Assessment, Development and Evaluation (GRADE) guidelines. A total of eight studies (three randomized controlled trials (RCTs) and five comparative studies) were included, of which six compared vertical and oblique incisions, one horizontal and vertical incisions, and one compared all three techniques. HT harvest was performed through a vertical incision in 329 patients, through an oblique incision in 195 patients and through a horizontal incision in 151 patients. Considering the meta-analysis of the RCTs, the performance of a vertical incision significantly increased the risk of causing IPBSN deficiency compared with both oblique and horizontal incision [RR 1.65 (CI 1
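    The heterogeneity statistics that drive the fixed- versus random-effects choice are straightforward to compute from study-level effects; the log risk ratios and standard errors below are invented for illustration, not the review's data:

    ```python
    # Cochran's Q and Higgins' I^2 from study-level log risk ratios.
    import numpy as np

    log_rr = np.array([0.62, 0.35, 0.51, 0.10, 0.44])   # hypothetical ln(RR) per study
    se = np.array([0.25, 0.20, 0.30, 0.22, 0.28])       # hypothetical standard errors

    w = 1.0 / se**2                                     # inverse-variance weights
    pooled = np.sum(w * log_rr) / np.sum(w)             # fixed-effect pooled ln(RR)
    Q = np.sum(w * (log_rr - pooled) ** 2)
    df = len(log_rr) - 1
    I2 = max(0.0, (Q - df) / Q) * 100.0

    print(f"pooled RR = {np.exp(pooled):.2f}, Q = {Q:.2f}, I^2 = {I2:.1f}%")
    ```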

  14. General gastroscopy of gastroesophageal reflux disease: Analysis of 4086 cases

    Directory of Open Access Journals (Sweden)

    Zhi-wei HU

    2018-04-01

    Full Text Available Objective To analyze the characteristics of gastroesophageal reflux disease (GERD) under general gastroscopy. Methods The detection rates of GERD-related abnormalities such as esophagitis, Barrett esophagus and hiatal hernia at the first gastroscopy of adult GERD patients from January 2013 to January 2017 in our center, and the statistical relationships between the abnormal findings, were analyzed retrospectively. Results A total of 4086 GERD patients, 2004 males and 2082 females, were included in this study, and the age was 18-89 (50.4±13.3) years. The detection rate of non-erosive GERD was 78.7%, esophagitis 21.3%; non-Barrett esophagus 87.7%, suspected Barrett esophagus 8.3%, Barrett esophagus 3.9%; generally normal cardia 61.4%, short segment hiatal hernia 20.4%, and long segment hiatal hernia 18.2%. The detection rates of esophagitis showed statistically significant differences (P<0.013). Comparing the three age groups of 18-39, 40-59 and ≥60 years old, the detection rate of hiatal hernia was significantly higher in the group of ≥60 years old than in the 18-39 and 40-59 years old groups (P=0.007), while there was no significant difference (P>0.013) between the 18-39 and 40-59 years old groups. The detection rate of esophagitis was significantly higher in the ≥60 years old group than in the 18-39 and 40-59 years old groups (P=0.004, P=0.008), while no statistically significant difference (P>0.013) was found between the latter two groups. Conclusions Gastroscopy can be used as a basic examination means for GERD; short segment hiatal hernia can be regarded as an early form of hiatal hernia, and is of important reference value for the diagnosis and treatment of GERD; more serious hiatal hernia and esophagitis could be found in elderly GERD patients. DOI: 10.11855/j.issn.0577-7402.2018.01.08

  15. CFD and thermal analysis applications at General Motors

    International Nuclear Information System (INIS)

    Johnson, J.P.

    2002-01-01

    The presentation will include a brief history of the growth of CFD and thermal analysis in GM's vehicle program divisions. Its relationship to the underlying computer infrastructure will be sketched. Application results will be presented for calculations in aerodynamics, flow through heat exchangers, engine compartment thermal studies, HVAC systems and others. Current technical challenges will be outlined including grid generation, turbulence modeling, heat transfer, and solution algorithms. The introduction of CFD and heat transfer results into Virtual Vehicle Reviews, and its potential impact on a company's CAE infrastructure will be noted. Finally, some broad comments will be made on the management of CFD and heat transfer technology across a global corporate enterprise. (author)

  16. Analysis and design of generalized BICM-T system

    KAUST Repository

    Malik, Muhammad Talha

    2014-09-01

    The performance of bit-interleaved coded modulation (BICM) using convolutional codes in nonfading channels can be significantly improved if the coded bits are not interleaved at all. This particular BICM system is referred to as BICM trivial (BICM-T). In this paper, we analyze a generalized BICM-T system that uses a nonequally spaced signal constellation in conjunction with a bit-level multiplexer in an additive white Gaussian noise (AWGN) channel. As such, one can exploit the full benefit of BICM-T by jointly optimizing different system modules to further improve its performance. We also investigate the performance of the considered BICM-T system in the Gaussian mixture noise (GMN) channel because of its practical importance. The presented numerical results show that an optimized BICM-T system can offer gains up to 1.5 dB over a non-optimized BICM-T system in the AWGN channel for a target bit error rate of $10^{-6}$. The presented results for the GMN channel interestingly reveal that if the strength of the impulsive noise component, i.e., the noise component due to some ambient phenomenon in the GMN, is below a certain threshold level, then the BICM-T system performs significantly better as compared to traditional BICM system.

  17. Quaternion analysis of generalized electromagnetic fields in chiral media

    International Nuclear Information System (INIS)

    Bisht, P. S. (E-mail: ps_bisht123@rediffmail.com)

    2007-01-01

    The time-dependent Maxwell's equations in the presence of electric and magnetic charges have been developed in chiral media and the solutions for the classical problem are obtained in a unique, simple and consistent manner. The quaternionic reformulation of generalized electromagnetic fields in chiral media has also been developed in a compact and consistent way.

    Simulation of neutron backscattering process applied to organic material detection. Forero Martinez, Nancy Carolina; Cristancho, Fernando (Nuclear Physics Group, Universidad Nacional de Colombia, Bogota D.C. (Colombia)). Abstract: Atomic and nuclear physics based sensors might offer new possibilities in de-mining. There is a particular interest in the possibility of using neutrons for the non-intrusive detection of hidden contraband, explosives or illicit drugs. The Neutron Backscattering Technique, based on the detection of the produced thermal neutrons, is known to be a useful tool to detect hidden explosives which present an elevated concentration of light elements (H, C, N, O). In this way we present the simulated results using the program package Geant4. Different variables were modified, including the soil composition and the studied materials. (Author)

  18. Analysis of general specifications for nuclear facilities environmental monitoring vehicles

    International Nuclear Information System (INIS)

    Xu Xiaowei

    2014-01-01

    At present, with the increasingly extensive application of nuclear energy, continuous and stable radiation monitoring has become a focus of public attention. The main purpose of an environmental monitoring vehicle, used for the continuous monitoring of the environmental radiation dose rate and the radionuclide concentrations in the media around nuclear facilities, is to measure the environmental radiation level and the activity of radioactive nuclides in environmental media, to determine the radioactive pollution levels, the contaminated area and the trend of pollution accumulation, and to observe the change trends of the pollution and interpret the monitoring results. The domestic demand for environmental monitoring of nuclear facilities is presented in this report. The changes and demands of routine environmental monitoring and nuclear emergency monitoring are investigated, and revision opinions for EJ/T 981-1995, General specifications for nuclear facilities environmental monitoring vehicles, are put forward. The purpose is to standardize the domestic technical criterion for environmental monitoring vehicles, so that it is better able to adapt to and serve the environmental monitoring of nuclear facilities, providing a technical guarantee for the environmental monitoring of nuclear facilities. (authors)

  19. Analysis of a convenient information bound for general quantum channels

    International Nuclear Information System (INIS)

    O'Loan, C J

    2007-01-01

    Open questions from Sarovar and Milburn (2006 J. Phys. A: Math. Gen. 39 8487) are answered. Sarovar and Milburn derived a convenient upper bound for the Fisher information of a one-parameter quantum channel. They showed that for quasi-classical models their bound is achievable and they gave a necessary and sufficient condition for positive operator-valued measures (POVMs) attaining this bound. They asked (i) whether their bound is attainable more generally and (ii) whether explicit expressions for optimal POVMs can be derived from the attainability condition. We show that the symmetric logarithmic derivative (SLD) quantum information is less than or equal to the SM bound, i.e., H(θ) ≤ C_Y(θ), and we find conditions for equality. As the Fisher information is less than or equal to the SLD quantum information, i.e., F_M(θ) ≤ H(θ), we can deduce when equality holds in F_M(θ) ≤ C_Y(θ). Equality does not hold for all channels. As a consequence, the attainability condition cannot be used to test for optimal POVMs for all channels. These results are extended to multi-parameter channels

  20. Analysis and optimization of bellows with general shape

    International Nuclear Information System (INIS)

    Koh, B.K.; Park, G.J.

    1998-01-01

    Bellows are commonly used in piping systems to absorb expansion and contraction in order to reduce stress. They have widespread applications which include industrial and chemical plants, fossil and nuclear power systems, heating and cooling systems, and vehicle exhaust systems. A bellows is a component in piping systems which absorbs mechanical deformation with flexibility. Its geometry is an axially symmetric shell which consists of two toroidal shells and one annular plate or conical shell. In order to analyze the bellows, this study presents a finite element analysis using a conical frustum shell element. A finite element analysis program is developed to analyze various bellows. The formula for calculating the natural frequency of bellows is derived from simple beam theory. The formula for fatigue life is also derived from experiments. A shape optimal design problem is formulated using multiple objective optimization. The multiple objective functions are transformed to a scalar function with weighting factors. The stiffness, strength, and specified stiffness are considered as the multiple objective functions. The formulation has inequality constraints imposed on the natural frequencies, the fatigue limit, and the manufacturing conditions. Geometric parameters of bellows are the design variables. The recursive quadratic programming algorithm is utilized to solve the problem.

  1. Regularized generalized eigen-decomposition with applications to sparse supervised feature extraction and sparse discriminant analysis

    DEFF Research Database (Denmark)

    Han, Xixuan; Clemmensen, Line Katrine Harder

    2015-01-01

    We propose a general technique for obtaining sparse solutions to generalized eigenvalue problems, and call it Regularized Generalized Eigen-Decomposition (RGED). For decades, Fisher's discriminant criterion has been applied in supervised feature extraction and discriminant analysis, and it is for...
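
    A minimal sketch, in Python, of the ridge-regularized generalized eigen-decomposition that Fisher-type discriminant criteria lead to; this is only the dense, non-sparse building block, not the authors' RGED algorithm, and the data are random placeholders.

      import numpy as np
      from scipy.linalg import eigh

      rng = np.random.default_rng(0)
      X = rng.normal(size=(100, 5))                 # 100 samples, 5 features (synthetic)
      y = rng.integers(0, 2, size=100)              # two classes

      # Between-class (Sb) and within-class (Sw) scatter matrices for Fisher's criterion.
      mu = X.mean(axis=0)
      Sb = np.zeros((5, 5))
      Sw = np.zeros((5, 5))
      for k in (0, 1):
          Xk = X[y == k]
          d = Xk.mean(axis=0) - mu
          Sb += len(Xk) * np.outer(d, d)
          Sw += (Xk - Xk.mean(axis=0)).T @ (Xk - Xk.mean(axis=0))

      lam = 1e-2                                    # regularization strength (assumed)
      # Generalized eigenvalue problem  Sb v = w (Sw + lam I) v
      evals, evecs = eigh(Sb, Sw + lam * np.eye(5))
      discriminant_direction = evecs[:, -1]         # eigenvector with the largest eigenvalue
      print(evals[-1], discriminant_direction)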

  2. General metabolism of Laribacter hongkongensis: a genome-wide analysis

    Directory of Open Access Journals (Sweden)

    Curreem Shirly O

    2011-04-01

    Full Text Available Abstract Background Laribacter hongkongensis is associated with community-acquired gastroenteritis and traveler's diarrhea. In this study, we performed an in-depth annotation of the genes and pathways of the general metabolism of L. hongkongensis and correlated them with its phenotypic characteristics. Results The L. hongkongensis genome possesses the pentose phosphate and gluconeogenesis pathways and the tricarboxylic acid and glyoxylate cycles, but incomplete Embden-Meyerhof-Parnas and Entner-Doudoroff pathways, in agreement with its asaccharolytic phenotype. It contains enzymes for the biosynthesis and β-oxidation of saturated fatty acids and for the biosynthesis of all 20 universal amino acids and selenocysteine, the latter not observed in Neisseria gonorrhoeae, Neisseria meningitidis and Chromobacterium violaceum. The genome contains a variety of dehydrogenases, enabling it to utilize different substrates as electron donors. It encodes three terminal cytochrome oxidases for respiration using oxygen as the electron acceptor under aerobic and microaerophilic conditions and four reductases for respiration with alternative electron acceptors under anaerobic conditions. The presence of a complete tetrathionate reductase operon may confer a survival advantage in the mammalian host in association with diarrhea. The genome contains CDSs for incorporating sulfur and nitrogen by sulfate assimilation, ammonia assimilation and nitrate reduction. The existence of both the glutamate dehydrogenase and the glutamine synthetase/glutamate synthase pathways suggests an importance of ammonia metabolism in the living environments that it may encounter. Conclusions The L. hongkongensis genome possesses a variety of genes and pathways for carbohydrate, amino acid and lipid metabolism, the respiratory chain and sulfur and nitrogen metabolism. These allow the bacterium to utilize various substrates for energy production and to survive in different environmental niches.

  3. Development of a general method for photovoltaic system analysis

    Energy Technology Data Exchange (ETDEWEB)

    Nolay, P

    1987-01-01

    The photovoltaic conversion for energetic applications is now widely used, but its development still needs the resolution of many problems for the sizing and for the real working of the installations. The precise analysis of the components and whole system behaviour has led to the development of accurate models for the simulation of such systems. From this modelling phase, a simulation code has been built. The validation of this software has been achieved from experimental test measurements. Since the quality of the software depends on the precision of the input data, an original method of determination of component characteristics, by means of model identification, has been developed. These tools permit the prediction of system behaviour and the dynamic simulation of systems under real conditions. Used for the study of photovoltaic system sizing, this software has allowed the definition of new concepts which will serve as a basis for the development of a sizing method.

  4. Generalized modal analysis for closed-loop piezoelectric devices

    International Nuclear Information System (INIS)

    Giraud-Audine, Christophe; Giraud, Frédéric; Amberg, Michel; Lemaire-Semail, Betty

    2015-01-01

    Stress in a piezoelectric material can be controlled by imposing an electrical field. Thanks to feedback, this electrical field can be a function of some strain-related measurement so as to confer on the piezoelectric device a closed-loop macroscopic behaviour. In this paper we address the modelling of such a system by extending the modal decomposition methods to account for the closed loop. To do so, the boundary conditions are modified to include the electrical feedback circuit, hence allowing a closed-loop modal analysis. A case study is used to illustrate the theory and to validate it. The main advantage of the method is that design issues such as the coupling factor of the device and closed-loop stability are simultaneously captured. (paper)

  5. A Generalized Lanczos-QR Technique for Structural Analysis

    DEFF Research Database (Denmark)

    Vissing, S.

    Within the field of solid mechanics, such as structural dynamics and linearized as well as non-linear stability, the eigenvalue problem plays an important role. In the class of finite element and finite difference discretized problems these engineering problems are characterized by large matrix systems with very special properties. Due to the finite discretization the matrices are sparse and a relatively large number of problems also has real and symmetric matrices. The matrix equation for an undamped vibration contains two matrices describing tangent stiffness and mass distributions. Alternatively, in a stability analysis, tangent stiffness and geometric stiffness matrices are introduced into an eigenvalue problem used to determine possible bifurcation points. The common basis for these types of problems is that the matrix equation describing the problem contains two real, symmetric...
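
    For a sense of the kind of problem such a solver targets, namely a large, sparse, real symmetric generalized eigenvalue problem K x = λ M x, the sketch below uses SciPy's Lanczos-based routine on toy stiffness and mass matrices. It stands in for, and is not, the Lanczos-QR technique developed in the work above.

      import numpy as np
      import scipy.sparse as sp
      from scipy.sparse.linalg import eigsh

      n = 200
      # Toy stiffness (K) and mass (M) matrices: sparse, real, symmetric, positive definite.
      K = sp.diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csc")
      M = sp.identity(n, format="csc") / n

      # Lowest few eigenpairs of K x = lambda M x via shift-invert Lanczos iteration.
      eigenvalues, eigenvectors = eigsh(K, k=4, M=M, sigma=0.0, which="LM")
      print(eigenvalues)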

  6. A Generalized Framework for Non-Stationary Extreme Value Analysis

    Science.gov (United States)

    Ragno, E.; Cheng, L.; Sadegh, M.; AghaKouchak, A.

    2017-12-01

    Empirical trends in climate variables including precipitation, temperature, and snow-water equivalent at regional to continental scales are evidence of changes in climate over time. The evolving climate conditions and human activity-related factors such as urbanization and population growth can exert further changes in weather and climate extremes. As a result, the scientific community faces an increasing demand for updated appraisal of the time-varying climate extremes. The purpose of this study is to offer a robust and flexible statistical tool for non-stationary extreme value analysis which can better characterize the severity and likelihood of extreme climatic variables. This is critical to ensure a more resilient environment in a changing climate. Following the positive feedback on the first version of the Non-Stationary Extreme Value Analysis (NEVA) toolbox by Cheng et al. (2014), we present an improved version, NEVA2.0. The upgraded version builds upon a newly developed hybrid evolution Markov Chain Monte Carlo (MCMC) approach for numerical parameter estimation and uncertainty assessment. This addition leads to more robust uncertainty estimates of return levels, return periods, and risks of climatic extremes under both stationary and non-stationary assumptions. Moreover, NEVA2.0 is flexible in incorporating any user-specified covariate other than the default time covariate (e.g., CO2 emissions, large-scale climatic oscillation patterns). The new feature will allow users to examine non-stationarity of extremes induced by physical conditions that underlie the extreme events (e.g., antecedent soil moisture deficit, large-scale climatic teleconnections, urbanization). In addition, the new version offers an option to generate stationary and/or non-stationary rainfall Intensity - Duration - Frequency (IDF) curves that are widely used for risk assessment and infrastructure design. Finally, a Graphical User Interface (GUI) of the package is provided, making NEVA
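
    A minimal stationary counterpart of the analysis NEVA generalizes: fitting a generalized extreme value (GEV) distribution to annual maxima and computing a return level. This uses SciPy's maximum-likelihood fit on synthetic data rather than NEVA's Bayesian MCMC machinery; note that SciPy's shape parameter c corresponds to -ξ in the convention common in the climate literature.

      import numpy as np
      from scipy.stats import genextreme

      rng = np.random.default_rng(1)
      annual_maxima = genextreme.rvs(c=-0.1, loc=50.0, scale=10.0, size=60, random_state=rng)

      c, loc, scale = genextreme.fit(annual_maxima)            # stationary MLE fit
      T = 100                                                   # return period (years)
      return_level = genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)
      print(c, loc, scale, return_level)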

  7. Generalized structured component analysis a component-based approach to structural equation modeling

    CERN Document Server

    Hwang, Heungsun

    2014-01-01

    Winner of the 2015 Sugiyama Meiko Award (Publication Award) of the Behaviormetric Society of Japan Developed by the authors, generalized structured component analysis is an alternative to two longstanding approaches to structural equation modeling: covariance structure analysis and partial least squares path modeling. Generalized structured component analysis allows researchers to evaluate the adequacy of a model as a whole, compare a model to alternative specifications, and conduct complex analyses in a straightforward manner. Generalized Structured Component Analysis: A Component-Based Approach to Structural Equation Modeling provides a detailed account of this novel statistical methodology and its various extensions. The authors present the theoretical underpinnings of generalized structured component analysis and demonstrate how it can be applied to various empirical examples. The book enables quantitative methodologists, applied researchers, and practitioners to grasp the basic concepts behind this new a...

  8. Algebraic structures in generalized Clifford analysis and applications to boundary value problems

    Directory of Open Access Journals (Sweden)

    José Játem

    2015-12-01

    Full Text Available The present article has a threefold purpose: First it is a survey of the algebraic structures of generalized Clifford-type algebras and shows the main results of the corresponding Clifford-type analysis and its application to boundary value problems known so far. Second it is aimed to implement algorithms to provide the fast and accurate computation of boundary value problems for inhomogeneous equations in the framework of the generalized Clifford analysis. Finally it is also aimed to encourage the development of a generalized discrete Clifford analysis.

  9. Non-monogamy: risk factor for STI transmission and acquisition and determinant of STI spread in populations.

    Science.gov (United States)

    Aral, Sevgi O; Leichliter, Jami S

    2010-12-01

    The concept of concurrent partnerships, while theoretically appealing, has been challenged at many levels. However, non-monogamy may be an important risk factor for the acquisition and transmission of sexually transmitted infections (STI). One's own non-monogamy is a risk factor for transmitting STI to others, partners' non-monogamy is a risk factor for acquiring STI and, most importantly, mutual non-monogamy is a population level determinant of increased STI spread. This study describes the levels, distribution and correlates of non-monogamy, partners' non-monogamy and mutual non-monogamy among adult men and women in the USA. Data from the National Survey of Family Growth (NSFG) Cycle 6 were used. NSFG is a national household survey of subjects aged 15-44 years in the USA. Cochran-Mantel-Haenszel tests and χ² tests were used in the analysis. Among sexually active adults, 17.6% of women and 23.0% of men (an estimated 19 million) reported non-monogamy over the past 12 months in 2002. An estimated 11 million Americans (1 in 10) reported partners' non-monogamy and an estimated 8.4 million (7% of women and 10.5% of men) reported mutual non-monogamy. All three types of non-monogamy were reported more frequently by men than women. Younger age, lower education, formerly or never married status, living below the poverty level and having spent time in jail were associated with all three types of non-monogamy in general. The three types of non-monogamy may be helpful in tailoring prevention messages and targeting prevention efforts to subgroups most likely to spread infection.
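
    For readers unfamiliar with the Cochran-Mantel-Haenszel test used above, a small Python illustration with statsmodels follows; the stratified 2×2 counts are invented for the example and are not NSFG data.

      import numpy as np
      from statsmodels.stats.contingency_tables import StratifiedTable

      # One 2x2 (exposure x outcome) table per stratum, e.g. per age group (counts invented).
      tables = [np.array([[30, 70], [20, 80]]),
                np.array([[45, 55], [25, 75]]),
                np.array([[12, 88], [10, 90]])]

      strat = StratifiedTable(tables)
      print(strat.oddsratio_pooled)      # Mantel-Haenszel pooled odds ratio
      print(strat.test_null_odds())      # CMH test of the null hypothesis of a common OR = 1
      print(strat.test_equal_odds())     # test of homogeneity of odds ratios across strata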

  10. Biostatistics series module 4: Comparing groups - categorical variables

    Directory of Open Access Journals (Sweden)

    Avijit Hazra

    2016-01-01

    Full Text Available Categorical variables are commonly represented as counts or frequencies. For analysis, such data are conveniently arranged in contingency tables. Conventionally, such tables are designated as r × c tables, with r denoting number of rows and c denoting number of columns. The Chi-square (χ²) probability distribution is particularly useful in analyzing categorical variables. A number of tests yield test statistics that fit, at least approximately, a χ² distribution and hence are referred to as χ² tests. Examples include Pearson's χ² test (or simply the χ² test), McNemar's χ² test, the Mantel-Haenszel χ² test and others. Pearson's χ² test is the most commonly used test for assessing difference in distribution of a categorical variable between two or more independent groups. If the groups are ordered in some manner, the χ² test for trend should be used. Fisher's exact probability test is a test of the independence between two dichotomous categorical variables. It provides a better alternative to the χ² statistic to assess the difference between two independent proportions when numbers are small, but cannot be applied to a contingency table larger than a two-dimensional one. McNemar's χ² test assesses the difference between paired proportions. It is used when the frequencies in a 2 × 2 table represent paired samples or observations. Cochran's Q test is a generalization of McNemar's test that compares more than two related proportions. The P value from the χ² test or its counterparts does not indicate the strength of the difference or association between the categorical variables involved. This information can be obtained from the relative risk or the odds ratio statistic, which are measures of dichotomous association obtained from 2 × 2 tables.
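
    Most of the tests listed above are available in standard Python libraries; a brief sketch with invented counts, for illustration only:

      import numpy as np
      from scipy.stats import chi2_contingency, fisher_exact
      from statsmodels.stats.contingency_tables import mcnemar

      table = np.array([[20, 30], [35, 15]])            # 2 x 2 counts for two independent groups (invented)
      chi2, p, dof, expected = chi2_contingency(table)  # Pearson's chi-square test
      print(chi2, p)

      odds_ratio, p_exact = fisher_exact(table)         # Fisher's exact test for small counts
      print(odds_ratio, p_exact)

      paired = np.array([[40, 12], [5, 43]])            # paired 2 x 2 table (invented)
      print(mcnemar(paired, exact=True))                # McNemar's test for paired proportions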

  11. An analysis of referrals received by a psychiatric unit in a general ...

    African Journals Online (AJOL)

    An analysis of referrals received by a psychiatric unit in a general hospital part 1: the need for and research design adopted to study referrals received by a psychiatric unit in a general hospital: research. M. Dor, V.J. Ehlers, M.M. Van der Merwe ...

  12. Extension problem for generalized multi-monogenic functions in Clifford analysis

    International Nuclear Information System (INIS)

    Tran Quyet Thang.

    1992-10-01

    The main purpose of this paper is to extend some properties of multi-monogenic functions, which is a generalization of monogenic functions in higher dimensions, for a class of functions satisfying Vekua-type generalized Cauchy-Riemann equations in Clifford Analysis. It is proved that the Hartogs theorem is valid for these functions. (author). 7 refs

  13. Occurence of internet addiction in a general population sample: A latent class analysis

    NARCIS (Netherlands)

    Rumpf, H.J.; Vermulst, A.A.; Bischof, A.; Kastirke, N.; Gürtler, D.; Bischof, G.; Meerkerk, G.J.; John, U.; Meyer, C.

    2014-01-01

    Background: Prevalence studies of Internet addiction in the general population are rare. In addition, a lack of approved criteria hampers estimation of its occurrence. Aims: This study conducted a latent class analysis (LCA) in a large general population sample to estimate prevalence. Methods: A

  14. Entropy-based derivation of generalized distributions for hydrometeorological frequency analysis

    Science.gov (United States)

    Chen, Lu; Singh, Vijay P.

    2018-02-01

    Frequency analysis of hydrometeorological and hydrological extremes is needed for the design of hydraulic and civil infrastructure facilities as well as water resources management. A multitude of distributions have been employed for frequency analysis of these extremes. However, no single distribution has been accepted as a global standard. Employing the entropy theory, this study derived five generalized distributions for frequency analysis that used different kinds of information encoded as constraints. These distributions were the generalized gamma (GG), the generalized beta distribution of the second kind (GB2), and the Halphen type A (Hal-A), Halphen type B (Hal-B) and Halphen type inverse B (Hal-IB) distributions, among which the GG and GB2 distributions were previously derived by Papalexiou and Koutsoyiannis (2012) and the Halphen family is first derived using entropy theory in this paper. The entropy theory allowed the parameters of the distributions to be estimated in terms of the constraints used for their derivation. The distributions were tested using extreme daily and hourly rainfall data. Results show that the root mean square error (RMSE) values were very small, which indicated that the five generalized distributions fitted the extreme rainfall data well. Among them, according to the Akaike information criterion (AIC) values, the GB2 and the Halphen family generally gave a better fit. Therefore, these generalized distributions are among the best choices for frequency analysis. The entropy-based derivation led to a new way for frequency analysis of hydrometeorological extremes.
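
    As an illustration of working with one member of this family, the sketch below fits a generalized gamma distribution to synthetic "extreme rainfall" data with SciPy and reports a quantile-based RMSE. It uses maximum likelihood rather than the entropy-based parameter estimation of the paper, and the data are simulated, not observed records.

      import numpy as np
      from scipy.stats import gengamma

      rng = np.random.default_rng(2)
      rainfall = gengamma.rvs(a=2.0, c=1.5, scale=12.0, size=500, random_state=rng)   # synthetic extremes

      a, c, loc, scale = gengamma.fit(rainfall, floc=0.0)   # location fixed at 0 for positive data
      fitted = gengamma(a, c, loc=loc, scale=scale)
      probs = (np.arange(1, rainfall.size + 1) - 0.5) / rainfall.size
      rmse = np.sqrt(np.mean((np.sort(rainfall) - fitted.ppf(probs)) ** 2))           # quantile RMSE
      print(a, c, scale, rmse)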

  15. The Social Profile of Students in Basic General Education in Ecuador: A Data Analysis

    Science.gov (United States)

    Buri, Olga Elizabeth Minchala; Stefos, Efstathios

    2017-01-01

    The objective of this study is to examine the social profile of students who are enrolled in Basic General Education in Ecuador. Both a descriptive and multidimensional statistical analysis was carried out based on the data provided by the National Survey of Employment, Unemployment and Underemployment in 2015. The descriptive analysis shows the…

  16. Linking Brief Functional Analysis to Intervention Design in General Education Settings

    Science.gov (United States)

    Ishuin, Tifanie

    2009-01-01

    This study focused on the utility and applicability of brief functional analysis in general education settings. The purpose of the study was to first identify the environmental variables maintaining noncompliance through a brief functional analysis, and then to design and implement a functionally equivalent intervention. The participant exhibited…

  17. Generalized linear models with random effects unified analysis via H-likelihood

    CERN Document Server

    Lee, Youngjo; Pawitan, Yudi

    2006-01-01

    Since their introduction in 1972, generalized linear models (GLMs) have proven useful in the generalization of classical normal models. Presenting methods for fitting GLMs with random effects to data, Generalized Linear Models with Random Effects: Unified Analysis via H-likelihood explores a wide range of applications, including combining information over trials (meta-analysis), analysis of frailty models for survival data, genetic epidemiology, and analysis of spatial and temporal models with correlated errors.Written by pioneering authorities in the field, this reference provides an introduction to various theories and examines likelihood inference and GLMs. The authors show how to extend the class of GLMs while retaining as much simplicity as possible. By maximizing and deriving other quantities from h-likelihood, they also demonstrate how to use a single algorithm for all members of the class, resulting in a faster algorithm as compared to existing alternatives. Complementing theory with examples, many of...

  18. IEEE guide for general principles of reliability analysis of nuclear power generating station protection systems

    International Nuclear Information System (INIS)

    Anon.

    1975-01-01

    Presented is the Institute of Electrical and Electronics Engineers, Inc. (IEEE) guide for general principles of reliability analysis of nuclear power generating station protection systems. The document has been prepared to provide the basic principles needed to conduct a reliability analysis of protection systems. Included is information on qualitative and quantitative analysis, guides for failure data acquisition and use, and guide for establishment of intervals

  19. Uncertainty Analysis of Few Group Cross Sections Based on Generalized Perturbation Theory

    International Nuclear Information System (INIS)

    Han, Tae Young; Lee, Hyun Chul; Noh, Jae Man

    2014-01-01

    In this paper, the methodology of the sensitivity and uncertainty analysis code based on GPT is described and preliminary verification calculations on the PMR200 pin cell problem are carried out. The results are in good agreement with those obtained by TSUNAMI. From this study, it is expected that the MUSAD code based on GPT can produce the uncertainty of the homogenized few group microscopic cross sections for a core simulator. For sensitivity and uncertainty analyses of general core responses, a two-step method is available: it utilizes the generalized perturbation theory (GPT) for homogenized few group cross sections in the first step and a stochastic sampling method for general core responses in the second step. The uncertainty analysis procedure based on GPT in the first step needs the generalized adjoint solution from a cell or lattice code. For this, the generalized adjoint solver was integrated into DeCART in our previous work. In this paper, the MUSAD (Modules of Uncertainty and Sensitivity Analysis for DeCART) code, previously based on classical perturbation theory, is extended to perform sensitivity and uncertainty analysis for few group cross sections based on GPT. First, the uncertainty analysis method based on GPT is described and, in the next section, the preliminary results of the verification calculation on a VHTR pin cell problem are compared with the results by TSUNAMI of SCALE 6.1.
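
    The GPT-based first step ultimately propagates cross-section covariances through sensitivity coefficients. A generic first-order ("sandwich rule") propagation is sketched below with toy numbers; it is not MUSAD output and the values are illustrative only.

      import numpy as np

      # Relative sensitivities of a response (e.g. a few-group cross section) to three data parameters.
      S = np.array([0.8, -0.3, 0.1])                       # toy sensitivity vector

      # Relative covariance matrix of those parameters (toy values).
      C = np.array([[4.0e-4, 1.0e-4, 0.0],
                    [1.0e-4, 9.0e-4, 0.0],
                    [0.0,    0.0,    1.0e-4]])

      rel_variance = S @ C @ S                             # sandwich rule: var(R)/R^2 = S C S^T
      print(np.sqrt(rel_variance) * 100, "% relative uncertainty")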

  20. Generalization of the Fourier Convergence Analysis in the Neutron Diffusion Eigenvalue Problem

    International Nuclear Information System (INIS)

    Lee, Hyun Chul; Noh, Jae Man; Joo, Hyung Kook

    2005-01-01

    Fourier error analysis has been a standard technique for the stability and convergence analysis of linear and nonlinear iterative methods. Lee et al. proposed new 2-D/1-D coupling methods and demonstrated several advantages of the new methods by performing a Fourier convergence analysis of the methods, as well as of two existing methods, for a fixed source problem. We demonstrated the Fourier convergence analysis of one of the 2-D/1-D coupling methods applied to a neutron diffusion eigenvalue problem. However, the technique cannot be used directly to analyze the convergence of the other 2-D/1-D coupling methods since some algorithm-specific features were used in our previous study. In this paper we generalize the Fourier convergence analysis technique previously proposed and analyze the convergence of the 2-D/1-D coupling methods applied to a neutron diffusion eigenvalue problem using the generalized technique.

  1. Idade materna como fator de risco: estudo com primigestas na faixa etária igual ou superior a 28 anos La edad materna como un factor de riesgo: estudio com primigestas en la facha etaria igual o superior a 28 años Maternal age as a risk factor: a study on first time pregnant women with age equal or higher than 28 years old

    Directory of Open Access Journals (Sweden)

    Cristina Maria Garcia de Lima Parada

    1999-10-01

    ...after controlling for the type of delivery, for the following perinatal complications: transient tachypnea of the newborn, generalized cyanosis at birth and neonatal infection. This is a transverse-type designed study with the aim of analysing maternal age as a risk factor, or not, through the verification of incidents during pregnancy, birth and puerperium of first-time pregnant women aged 28 years or older, as well as the birth conditions and discharge of their newborns, comparing them with a group of first-time pregnant women aged 20 to 27 years. The study was carried out in Botucatu, São Paulo, from January 1990 to June 1995. The statistical analysis, discussed at the 5% level of significance, was developed through the Mann-Whitney test, the Goodman test and the evaluation of relative risk and corrected relative risk through the Mantel-Haenszel technique. We concluded that maternal age equal to or higher than 28 years is not a pregnancy, puerperal or intrapartum risk factor, although it was a risk factor, even after controlling for the type of delivery, for the following perinatal incidents: newborn tachypnea, generalized cyanosis at birth and neonatal infection.

  2. A sensory analysis of butter cookies: An application of generalized procrustes analysis

    OpenAIRE

    Juhl, Hans Jørn

    1994-01-01

    Executive Summary: 1. A sensory analysis is one of the first steps in product development in the food industry. A thorough analysis of the results from such an analysis may give important input to the development process. 2. A sensory analysis on butter cookies is conducted in order to evaluate if some butter may be replaced by vegetable fat without a significant change in the sensory profile. The conclusion is that the replacement is possible without a considerable change in the sensory prof...

  3. Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data.

    Science.gov (United States)

    Tan, Qihua; Thomassen, Mads; Burton, Mark; Mose, Kristian Fredløv; Andersen, Klaus Ejner; Hjelmborg, Jacob; Kruse, Torben

    2017-06-06

    Modeling complex time-course patterns is a challenging issue in microarray studies due to the complex gene expression patterns in response to the time-course experiment. We introduce the generalized correlation coefficient and propose a combinatory approach for detecting, testing and clustering heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature of the generalized correlation analysis makes it a useful and efficient tool for analyzing microarray time-course data and for exploring the complex relationships in omics data for studying their association with disease and health.

  4. Dental treatment under general anesthesia for special-needs patients: analysis of the literature.

    Science.gov (United States)

    Mallineni, Sreekanth K; Yiu, Cynthia K Y

    2016-11-01

    The aim of the present review was to identify the studies published on dental treatment under general anesthesia for special-needs patients. A comprehensive search of the reported literature from January 1966 to May 2012 was conducted using PubMed, Medline, and Embase. Keywords used in the search were "dental treatment under general anesthesia", "special-needs patients", "medically-compromised patients", and "children", in various combinations. Only studies on dental treatment under general anesthesia published in English were included. Only 10 studies were available for final analysis. Ages ranged from 1 to 50 years, and restorative procedures were the most prevalent. Only two studies discussed repeated general anesthesia, with rates of 7.2% and 10.2%. Over time, the provision of general anesthesia for special-needs patients has shifted from dental clinics to general hospitals. The demand for dental treatment for special-needs patients under general anesthesia continues to increase. Currently, there are no generally accepted protocols for the provision of dental treatment under general anesthesia. © 2015 Wiley Publishing Asia Pty Ltd.

  5. Assessment of Random Assignment in Training and Test Sets using Generalized Cluster Analysis Technique

    Directory of Open Access Journals (Sweden)

    Sorana D. BOLBOACĂ

    2011-06-01

    Full Text Available Aim: The properness of the random assignment of compounds to training and validation sets was assessed using the generalized cluster technique. Material and Method: A quantitative Structure-Activity Relationship model using the Molecular Descriptors Family on Vertices was evaluated in terms of the assignment of carboquinone derivatives to training and test sets during the leave-many-out analysis. The assignment of compounds was investigated using five variables: observed anticancer activity and four structure descriptors. Generalized cluster analysis with the K-means algorithm was applied in order to investigate whether the assignment of compounds was proper. The Euclidean distance and maximization of the initial distance using a cross-validation with a v-fold of 10 were applied. Results: All five variables included in the analysis proved to have a statistically significant contribution to the identification of clusters. Three clusters were identified, each of them containing carboquinone derivatives belonging both to the training and to the test sets. The observed activity of the carboquinone derivatives proved to be normally distributed within every cluster. The presence of training and test set compounds in all clusters identified using generalized cluster analysis with the K-means algorithm, and the distribution of the observed activity within clusters, support a proper assignment of compounds to the training and test sets. Conclusion: Generalized cluster analysis using the K-means algorithm proved to be a valid method for assessing the random assignment of carboquinone derivatives to training and test sets.
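
    A compact sketch of the kind of check described above: cluster the pooled training and test compounds with K-means and inspect whether both subsets are represented in every cluster. It uses scikit-learn, with random numbers standing in for the activity and descriptor variables.

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(3)
      X = rng.normal(size=(103, 5))                  # activity + four descriptors (synthetic stand-in)
      is_test = rng.random(103) < 0.3                # a random training/test split to be assessed

      km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
      for c in range(3):
          members = km.labels_ == c
          print(f"cluster {c}: {np.sum(members & ~is_test)} training, {np.sum(members & is_test)} test")
      # A proper random split should leave both subsets represented in every cluster.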

  6. Use of multivariate extensions of generalized linear models in the analysis of data from clinical trials

    OpenAIRE

    ALONSO ABAD, Ariel; Rodriguez, O.; TIBALDI, Fabian; CORTINAS ABRAHANTES, Jose

    2002-01-01

    In medical studies categorical endpoints are quite common. Even though models for handling these multicategorical variables have nowadays been developed, their use is not common. This work shows an application of Multivariate Generalized Linear Models to the analysis of clinical trials data. After a theoretical introduction, models for ordinal and nominal responses are applied and the main results are discussed. multivariate analysis; multivariate logistic regression; multicategor...
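
    For nominal endpoints of the kind mentioned above, a multinomial logistic regression can be fitted with statsmodels; the short sketch below uses synthetic trial-like data and is not the model or data of the paper.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(4)
      n = 300
      treatment = rng.integers(0, 2, n).astype(float)         # 0 = control, 1 = active (synthetic)
      X = sm.add_constant(treatment)

      # Nominal three-category endpoint, e.g. worse / unchanged / improved (synthetic probabilities).
      probs = np.where(treatment[:, None] == 1.0, [0.2, 0.3, 0.5], [0.4, 0.4, 0.2])
      y = np.array([rng.choice(3, p=p) for p in probs])

      fit = sm.MNLogit(y, X).fit(disp=False)                  # multinomial (nominal) logistic regression
      print(fit.summary())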

  7. Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data

    DEFF Research Database (Denmark)

    Tan, Qihua; Thomassen, Mads; Burton, Mark

    2017-01-01

    the heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature in the generalized correlation analysis could be an useful and efficient tool for analyzing microarray...... time-course data and for exploring the complex relationships in the omics data for studying their association with disease and health....

  8. The use of office-based sedation and general anesthesia by board certified pediatric dentists practicing in the United States.

    Science.gov (United States)

    Olabi, Nassim F; Jones, James E; Saxen, Mark A; Sanders, Brian J; Walker, Laquia A; Weddell, James A; Schrader, Stuart M; Tomlin, Angela M

    2012-01-01

    The purpose of this study is to explore the use of office-based sedation by board-certified pediatric dentists practicing in the United States. Pediatric dentists have traditionally relied upon self-administered sedation techniques to provide office-based sedation. The use of dentist anesthesiologists to provide office-based sedation is an emerging trend. This study examines and compares these two models of office-based sedation. A survey evaluating office-based sedation of diplomates of the American Board of Pediatric Dentistry (ABPD) based on gender, age, years in practice, practice types, regions, and years as a diplomate of the ABPD was completed by 494 active members. The results were summarized using frequencies and percentages. Relationships of dentist age, gender, and number of years in practice with the use of intravenous (IV) sedation were examined using two-way contingency tables and Mantel-Haenszel tests for ordered categorical data. Relationships of office-based sedation use and the type of one's practice were examined using Pearson chi-square tests. Of the 1917 surveys e-mailed, 494 were completed, for a response rate of 26%. Over 70% of board-certified US pediatric dentists use some form of sedation in their offices. Less than 20% administer IV sedation, 20 to 40% use a dentist anesthesiologist, and 60 to 70% would use dentist anesthesiologists if one were available.

  9. Neutron activation analysis with k0-standardisation: general formalism and procedure

    Energy Technology Data Exchange (ETDEWEB)

    Pomme, S.; Hardeman, F. [Centre de l'Etude de l'Energie Nucleaire, Mol (Belgium); Robouch, P.; Etxebarria, N.; Arana, G. [European Commission, Joint Research Centre, Institute for Reference Materials and Measurements, Geel (Belgium)

    1997-09-01

    Instrumental neutron activation analysis (INAA) with k0-standardisation is a powerful tool for multi-element analysis at a broad range of trace element concentrations. An overview is given of the basic principles, fundamental equations, and general procedure of this method. Different aspects of the description of the neutron activation reaction rate are discussed, applying the Hogdahl convention. A general activation-decay formula is derived and its application to INAA is demonstrated. Relevant k0-definitions for different activation decay schemes are summarised and upgraded to cases of extremely high fluxes. The main standardisation techniques for INAA are discussed, emphasizing the k0-standardisation. Some general aspects of the basic equipment and its calibration are discussed, such as the characterisation of the neutron field and the tuning of the spectrometry part. A method for the prediction and optimisation of the analytical performance of INAA is presented.
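
    For a single-step activation under constant flux, the general activation-decay formula referred to above reduces to the familiar textbook expression, sketched numerically below; the nuclide data are illustrative round numbers, not evaluated constants, and self-shielding, burn-up and epithermal corrections are ignored.

      import numpy as np

      def induced_activity(n_target, sigma_cm2, flux, half_life_s, t_irr_s, t_decay_s):
          """A = N * sigma * phi * (1 - exp(-lambda*t_irr)) * exp(-lambda*t_decay)  [Bq]."""
          lam = np.log(2.0) / half_life_s
          return n_target * sigma_cm2 * flux * (1.0 - np.exp(-lam * t_irr_s)) * np.exp(-lam * t_decay_s)

      # Illustrative inputs: 1e18 target atoms, 1 barn cross section, 1e13 n cm^-2 s^-1 flux,
      # 15 h half-life, 6 h irradiation, 2 h decay before counting.
      A = induced_activity(1e18, 1e-24, 1e13, 15 * 3600, 6 * 3600, 2 * 3600)
      print(f"{A:.3e} Bq")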

  10. Cost analysis of injection laryngoplasty performed under local anaesthesia versus general anaesthesia: an Australian perspective.

    Science.gov (United States)

    Chandran, D; Woods, C M; Schar, M; Ma, N; Ooi, E H; Athanasiadis, T

    2018-02-01

    To conduct a cost analysis of injection laryngoplasty performed in the operating theatre under local anaesthesia and general anaesthesia. The retrospective study included patients who had undergone injection laryngoplasty as day cases between July 2013 and March 2016. Cost data were obtained, along with patient demographics, anaesthetic details, type of injectant, American Society of Anesthesiologists score, length of stay, total operating theatre time and surgeon procedure time. A total of 20 cases (general anaesthesia = 6, local anaesthesia = 14) were included in the cost analysis. The mean total cost under general anaesthesia (AU$2865.96 ± 756.29) was significantly higher than that under local anaesthesia (AU$1731.61 ± 290.29) (p < 0.05). Procedures performed under local anaesthesia in the operating theatre are associated with shorter operating theatre time and length of stay in the hospital, and provide significant cost savings. Further savings could be achieved if local anaesthesia procedures were performed in the office setting.

  11. Neutron activation analysis with k0-standardisation : general formalism and procedure

    International Nuclear Information System (INIS)

    Pomme, S.; Hardeman, F.; Robouch, P.; Etxebarria, N.; Arana, G.

    1997-09-01

    Instrumental neutron activation analysis (INAA) with k0-standardisation is a powerful tool for multi-element analysis at a broad range of trace element concentrations. An overview is given of the basic principles, fundamental equations, and general procedure of this method. Different aspects of the description of the neutron activation reaction rate are discussed, applying the Hogdahl convention. A general activation-decay formula is derived and its application to INAA is demonstrated. Relevant k0-definitions for different activation decay schemes are summarised and upgraded to cases of extremely high fluxes. The main standardisation techniques for INAA are discussed, emphasizing the k0-standardisation. Some general aspects of the basic equipment and its calibration are discussed, such as the characterisation of the neutron field and the tuning of the spectrometry part. A method for the prediction and optimisation of the analytical performance of INAA is presented.

  12. General Education Default and Student Benefit in Inclusive Learning Environments: An Analysis for School Leaders

    Science.gov (United States)

    Menard, Lauren A.

    2011-01-01

    A contextual analysis of the general education default and student benefit is presented from the perspective of school-based compliance with federal mandates from IDEIA [Individuals with Disabilities Education Improvement Act] of 2004. A goal was to inform school administrators striving to develop and maintain effective, inclusive learning…

  13. A General Model for Representing Arbitrary Unsymmetries in Various Types of Network Analysis

    DEFF Research Database (Denmark)

    Rønne-Hansen, Jan

    1997-01-01

    When dealing with unsymmetric faults various proposals have been put forward. In general they have been characterized by specific treatment of the single fault in accordance with the structure and impedances involved. The model presented is based on node equations and was originally developed for...... complicated fault situation which has not been treated before for traditional transient stability analysis...

  14. Regulating food law : risk analysis and the precautionary principle as general principles of EU food law

    NARCIS (Netherlands)

    Szajkowska, A.

    2012-01-01

    In food law scientific evidence occupies a central position. This study offers a legal insight into risk analysis and the precautionary principle, positioned in the EU as general principles applicable to all food safety measures, both national and EU. It develops a new method of looking at these

  15. Robust periodic steady state analysis of autonomous oscillators based on generalized eigenvalues

    NARCIS (Netherlands)

    Mirzavand, R.; Maten, ter E.J.W.; Beelen, T.G.J.; Schilders, W.H.A.; Abdipour, A.

    2011-01-01

    In this paper, we present a new gauge technique for the Newton Raphson method to solve the periodic steady state (PSS) analysis of free-running oscillators in the time domain. To find the frequency a new equation is added to the system of equations. Our equation combines a generalized eigenvector

  16. Robust periodic steady state analysis of autonomous oscillators based on generalized eigenvalues

    NARCIS (Netherlands)

    Mirzavand, R.; Maten, ter E.J.W.; Beelen, T.G.J.; Schilders, W.H.A.; Abdipour, A.; Michielsen, B.; Poirier, J.R.

    2012-01-01

    In this paper, we present a new gauge technique for the Newton Raphson method to solve the periodic steady state (PSS) analysis of free-running oscillators in the time domain. To find the frequency a new equation is added to the system of equations. Our equation combines a generalized eigenvector

  17. Generalized canonical analysis based on optimizing matrix correlations and a relation with IDIOSCAL

    NARCIS (Netherlands)

    Kiers, Henk A.L.; Cléroux, R.; Ten Berge, Jos M.F.

    1994-01-01

    Carroll's method for generalized canonical analysis of two or more sets of variables is shown to optimize the sum of squared inner-product matrix correlations between a consensus matrix and matrices with canonical variates for each set of variables. In addition, the method that analogously optimizes

  18. A general purpose program system for high energy physics experiment data acquisition and analysis

    International Nuclear Information System (INIS)

    Li Shuren; Xing Yuguo; Jin Bingnian

    1985-01-01

    This paper introduces the functions, structure and system generation of a general purpose program system (Fermilab MULTI) for high energy physics experiment data acquisition and analysis. Work concerning the reconstruction of MULTI system level 0.5, which can be run on the PDP-11/23 computer, is also introduced briefly.

  19. Generalized canonical correlation analysis of matrices with missing rows : A simulation study

    NARCIS (Netherlands)

    van de Velden, Michel; Bijmolt, Tammo H. A.

    A method is presented for generalized canonical correlation analysis of two or more matrices with missing rows. The method is a combination of Carroll's (1968) method and the missing data approach of the OVERALS technique (Van der Burg, 1988). In a simulation study we assess the performance of the

  20. Computer-assisted semen analysis parameters as predictors for fertility of men from the general population

    DEFF Research Database (Denmark)

    Larsen, L; Scheike, Thomas Harder; Jensen, Tina Kold

    2000-01-01

    The predictive value of sperm motility parameters obtained by computer-assisted semen analysis (CASA) was evaluated for the fertility of men from general population. In a prospective study with couples stopping use of contraception in order to try to conceive, CASA was performed on semen samples...

  1. An examination of generalized anxiety disorder and dysthymic disorder by latent class analysis

    NARCIS (Netherlands)

    Rhebergen, D.; van der Steenstraten, I.M.; Sunderland, M.; de Graaf, R.; ten Have, M.; Lamers, F.; Penninx, B.W.J.H.; Andrews, G.

    2014-01-01

    Background The nosological status of generalized anxiety disorder (GAD) versus dysthymic disorder (DD) has been questioned. The aim of this study was to examine qualitative differences within (co-morbid) GAD and DD symptomatology. Method Latent class analysis was applied to anxious and depressive

  2. A Note on McDonald's Generalization of Principal Components Analysis

    Science.gov (United States)

    Shine, Lester C., II

    1972-01-01

    It is shown that McDonald's generalization of Classical Principal Components Analysis to groups of variables maximally channels the total variance of the original variables through the groups of variables acting as groups. An equation is obtained for determining the vectors of correlations of the L2 components with the original variables.…

  3. Identifying At-Risk Students in General Chemistry via Cluster Analysis of Affective Characteristics

    Science.gov (United States)

    Chan, Julia Y. K.; Bauer, Christopher F.

    2014-01-01

    The purpose of this study is to identify academically at-risk students in first-semester general chemistry using affective characteristics via cluster analysis. Through the clustering of six preselected affective variables, three distinct affective groups were identified: low (at-risk), medium, and high. Students in the low affective group…

  4. A Content Analysis of General Chemistry Laboratory Manuals for Evidence of Higher-Order Cognitive Tasks

    Science.gov (United States)

    Domin, Daniel S.

    1999-01-01

    The science laboratory instructional environment is ideal for fostering the development of problem-solving, manipulative, and higher-order thinking skills: the skills needed by today's learner to compete in an ever increasing technology-based society. This paper reports the results of a content analysis of ten general chemistry laboratory manuals. Three experiments from each manual were examined for evidence of higher-order cognitive activities. Analysis was based upon the six major cognitive categories of Bloom's Taxonomy of Educational Objectives: knowledge, comprehension, application, analysis, synthesis, and evaluation. The results of this study show that the overwhelming majority of general chemistry laboratory manuals provide tasks that require the use of only the lower-order cognitive skills: knowledge, comprehension, and application. Two of the laboratory manuals were disparate in having activities that utilized higher-order cognition. I describe the instructional strategies used within these manuals to foster higher-order cognitive development.

  5. Generalized modeling of the fractional-order memcapacitor and its character analysis

    Science.gov (United States)

    Guo, Zhang; Si, Gangquan; Diao, Lijie; Jia, Lixin; Zhang, Yanbin

    2018-06-01

    Memcapacitor is a new type of memory device generalized from the memristor. This paper proposes a generalized fractional-order memcapacitor model by introducing fractional calculus into the model. The generalized formulas are studied and two fractional-order parameters α, β are introduced, where α mostly affects the fractional calculus value of the charge q within the generalized Ohm's law, and β generalizes the state equation, which simulates the physical mechanism of a memcapacitor, into the fractional sense. This model reduces to the conventional memcapacitor as α = 1, β = 0 and to the conventional memristor as α = 0, β = 1. Then the numerical analysis of the fractional-order memcapacitor is studied, and the characteristics and output behaviours of the fractional-order memcapacitor driven by a sinusoidal charge are derived. The analysis results show that there are four basic v-q and v-i curve patterns when the fractional orders α, β respectively equal 0 or 1; moreover, all v-q and v-i curves of the other fractional-order models are transition curves between the four basic patterns.
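
    The fractional-order generalization rests on replacing integer-order derivatives and integrals with fractional ones. A common numerical route is the Grünwald-Letnikov approximation, sketched below for a sinusoidal charge waveform; this is a generic building block under that approximation, not the paper's full memcapacitor model.

      import numpy as np

      def gl_fractional_derivative(f, alpha, h):
          """Grunwald-Letnikov approximation of the alpha-order derivative of samples f on a
          uniform grid with step h; a negative alpha gives the fractional integral."""
          n = len(f)
          w = np.ones(n)
          for j in range(1, n):
              w[j] = w[j - 1] * (1.0 - (alpha + 1.0) / j)    # recursive binomial weights
          out = np.empty(n)
          for k in range(n):
              out[k] = np.dot(w[:k + 1], f[k::-1]) / h ** alpha
          return out

      t = np.linspace(0.0, 2.0 * np.pi, 400)
      q = np.sin(t)                                          # sinusoidal charge, as in the analysis above
      dq_half = gl_fractional_derivative(q, alpha=0.5, h=t[1] - t[0])   # half-order derivative of q
      print(dq_half[-5:])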

  6. Cost-effectiveness analysis of HPV vaccination: comparing the general population with socially vulnerable individuals.

    Science.gov (United States)

    Han, Kyu-Tae; Kim, Sun Jung; Lee, Seo Yoon; Park, Eun-Cheol

    2014-01-01

    After the WHO recommended HPV vaccination of the general population in 2009, government support of HPV vaccination programs was increased in many countries. However, this policy was not implemented in Korea due to perceived low cost-effectiveness. Thus, the aim of this study was to analyze the cost-utility of HPV vaccination programs targeted to high-risk populations as compared to vaccination programs for the general population. Each study population was set to 100,000 people in a simulation study to determine the incremental cost-utility ratio (ICUR), then standard prevalence rates, cost, vaccination rates, vaccine efficacy, and Quality-Adjusted Life-Years (QALYs) were applied to the analysis. In addition, sensitivity analysis was performed by assuming a discounted vaccination cost. In the socially vulnerable population, the QALYs gained through HPV vaccination were higher than those in the general population (general population: 1,019; socially vulnerable population: 5,582). The ICUR results showed that the cost per QALY gained was higher for the general population than for the socially vulnerable population (general population: 52,279,255 KRW; socially vulnerable population: 9,547,347 KRW). Compared with the social threshold of 24 million KRW/QALY, vaccination of the general population was not cost-effective. In contrast, vaccination of the socially vulnerable population was strongly cost-effective. The results suggest the importance and necessity of government support of HPV vaccination programs targeted to socially vulnerable populations because a targeted approach is much more cost-effective. The implementation of government support for such vaccination programs is a critical strategy for decreasing the burden of HPV infection in Korea.
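
    The incremental cost-utility ratio itself is simple arithmetic; the sketch below shows the comparison against a willingness-to-pay threshold using placeholder inputs, not the study's actual cost and QALY figures.

      # ICUR = (cost_with_program - cost_without_program) / (QALYs_with - QALYs_without)
      def icur(cost_with, cost_without, qaly_with, qaly_without):
          return (cost_with - cost_without) / (qaly_with - qaly_without)

      threshold_krw_per_qaly = 24_000_000        # social threshold cited above

      # Placeholder inputs for a cohort of 100,000 people (illustrative only).
      value = icur(cost_with=15_000_000_000, cost_without=5_000_000_000,
                   qaly_with=101_000, qaly_without=100_000)
      verdict = "cost-effective" if value < threshold_krw_per_qaly else "not cost-effective"
      print(value, "KRW per QALY:", verdict)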

  7. A General Model for Thermal, Hydraulic and Electric Analysis of Superconducting Cables

    CERN Document Server

    Bottura, L; Rosso, C

    2000-01-01

    In this paper we describe a generic, multi-component and multi-channel model for the analysis of superconducting cables. The aim of the model is to treat in a general and consistent manner simultaneous thermal, electric and hydraulic transients in cables. The model is devised for most general situations, but reduces in limiting cases to most common approximations without loss of efficiency. We discuss here the governing equations, and we write them in a matrix form that is well adapted to numerical treatment. We finally demonstrate the model capability by comparison with published experimental data on current distribution in a two-strand cable.

  8. A unified MGF-based capacity analysis of diversity combiners over generalized fading channels

    KAUST Repository

    Yilmaz, Ferkan

    2012-03-01

    Unified exact ergodic capacity results for L-branch coherent diversity combiners including equal-gain combining (EGC) and maximal-ratio combining (MRC) are not known. This paper develops a novel generic framework for the capacity analysis of L-branch EGC/MRC over generalized fading channels. The framework is used to derive new results for the gamma-shadowed generalized Nakagami-m fading model which can be a suitable model for the fading environments encountered by high frequency (60 GHz and above) communications. The mathematical formalism is illustrated with some selected numerical and simulation results confirming the correctness of our newly proposed framework. © 2012 IEEE.
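
    For comparison with such closed-form MGF-based expressions, the ergodic capacity of an L-branch MRC combiner is easy to estimate by Monte Carlo simulation. The sketch below simulates plain Nakagami-m branches (not the gamma-shadowed generalized Nakagami-m model of the paper) purely to illustrate the quantity being analyzed.

      import numpy as np

      rng = np.random.default_rng(5)
      L, m, mean_snr, n_samples = 4, 2.0, 10.0, 200_000    # branches, Nakagami-m parameter, average SNR (linear)

      # Under Nakagami-m fading the per-branch instantaneous SNR is Gamma(shape=m, scale=mean_snr/m).
      branch_snr = rng.gamma(shape=m, scale=mean_snr / m, size=(n_samples, L))
      mrc_snr = branch_snr.sum(axis=1)                     # MRC adds branch SNRs

      ergodic_capacity = np.mean(np.log2(1.0 + mrc_snr))   # bits/s/Hz
      print(ergodic_capacity)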

  9. Dynamic analysis of a new chaotic system with fractional order and its generalized projective synchronization

    International Nuclear Information System (INIS)

    Niu Yu-Jun; Wang Xing-Yuan; Nian Fu-Zhong; Wang Ming-Jun

    2010-01-01

    Based on the stability theory of the fractional order system, the dynamic behaviours of a new fractional order system are investigated theoretically. The lowest order we found to have chaos in the new three-dimensional system is 2.46, and the period routes to chaos in the new fractional order system are also found. The effectiveness of our analysis results is further verified by numerical simulations and positive largest Lyapunov exponent. Furthermore, a nonlinear feedback controller is designed to achieve the generalized projective synchronization of the fractional order chaotic system, and its validity is proved by Laplace transformation theory. (general)

  10. The semi-Markov process. Generalizations and calculation rules for application in the analysis of systems

    International Nuclear Information System (INIS)

    Hirschmann, H.

    1983-06-01

    The consequences of the basic assumptions of the semi-Markov process as defined from a homogeneous renewal process with a stationary Markov condition are reviewed. The notion of the semi-Markov process is generalized by its redefinition from a nonstationary Markov renewal process. For both the nongeneralized and the generalized case a representation of the first order conditional state probabilities is derived in terms of the transition probabilities of the Markov renewal process. Some useful calculation rules (regeneration rules) are derived for the conditional state probabilities of the semi-Markov process. Compared to the semi-Markov process in its usual definition the generalized process allows the analysis of a larger class of systems. For instance systems with arbitrarily distributed lifetimes of their components can be described. There is also a chance to describe systems which are modified during time by forces or manipulations from outside. (Auth.)

  11. Application of generalized function to dynamic analysis of elasto-plastic thick plates

    International Nuclear Information System (INIS)

    Zheng, D.; Weng, Z.

    1987-01-01

    The elasto-plastic dynamic analysis of thick plates is of great significance for the research on and design of anti-seismic and anti-explosive structures. In this paper, the derivative of the δ-function is handled using the generalized function. The dynamic influence coefficient of thick plates is deduced. The dynamic response of elasto-plastic thick plates, whose material hardening behaviour is considered, is analysed using known elastic solutions. The general expressions for the dynamic response of elasto-plastic rectangular thick plates subjected to arbitrary loads are given. Detailed computations are performed for square plates of various height-span ratios. The results are compared with those obtained from the improved theory and the classical theory of plates. A modification of the classical deflection theory for plates is employed. Incremental analysis is used for the calculations. The yield function is considered as a function of in-plane and transverse shear stresses. (orig./GL)

  12. An analysis of radiological research publications in high impact general medical journals between 1996 and 2010.

    Science.gov (United States)

    Ku, You Jin; Yoon, Dae Young; Yun, Eun Joo; Baek, Sora; Lim, Kyoung Ja; Seo, Young Lan; Choi, Chul Soon; Bae, Sang Hoon

    2013-06-01

    To evaluate scientific papers published by radiologists in high impact general medical journals between 1996 and 2010. A MEDLINE search was performed in five high impact general medical journals (AIM, BMJ, JAMA, Lancet, and NEJM) for all articles of which a radiologist was the first author between 1996 and 2010. The following information was abstracted from the original articles: radiological subspecialty, imaging technique used, type of research, sample size, study design, statistical analysis, study outcome, declared funding, number of authors, collaboration, and country of the first author. Of the 216 (0.19%) articles published by radiologists in the five general medical journals between 1996 and 2010, 83 were original articles. Fifteen (18.1%) original articles were concerned with the field of vascular/interventional radiology, 24 (28.9%) used combined imaging techniques, 76 (91.6%) were clinical research, 63 (75.9%) had a sample size of >50, 65 (78.3%) were prospective, 78 (94.0%) performed statistical analysis, 83 (100%) showed positive study outcomes, 57 (68.7%) were funded, 49 (59.0%) had from four to seven authors, and 79 (95.2%) were collaborative studies. A very small proportion (0.19%) of the articles in the five high impact general medical journals was published by radiologists between 1996 and 2010. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  13. Aberration analysis for freeform surface terms overlay on general decentered and tilted optical surfaces.

    Science.gov (United States)

    Yang, Tong; Cheng, Dewen; Wang, Yongtian

    2018-03-19

    Aberration theory helps designers to better understand the nature of imaging systems. However, the existing aberration theory of freeform surfaces has many limitations. For example, it only works in the special case when the central area of the freeform surface is used. In addition, the light footprint is limited to a circle, which does not match the case of an elliptical footprint for general systems. In this paper, aberrations generated by freeform surface terms overlaid on general decentered and tilted optical surfaces are analyzed. For the case when the off-axis section of a freeform surface is used, the aberration equation for using stop and nonstop surfaces is discussed, and the aberrations generated by Zernike terms up to Z17/18 are analyzed in detail. To solve the problem of the elliptical light footprint for tilted freeform surfaces, the scaled pupil vector is used in the aberration analysis. The mechanism of aberration transformation is discovered, and the aberrations generated by different Zernike terms in this case are calculated. Finally, we propose aberration equations for freeform terms on general decentered and tilted freeform surfaces. The research result given in this paper offers an important reference for optical designers and engineers, and it is of great importance in developing analytical methods for general freeform system design, tolerance analysis, and system assembly.

  14. Bias formulas for sensitivity analysis of unmeasured confounding for general outcomes, treatments, and confounders.

    Science.gov (United States)

    Vanderweele, Tyler J; Arah, Onyebuchi A

    2011-01-01

    Uncontrolled confounding in observational studies gives rise to biased effect estimates. Sensitivity analysis techniques can be useful in assessing the magnitude of these biases. In this paper, we use the potential outcomes framework to derive a general class of sensitivity-analysis formulas for outcomes, treatments, and measured and unmeasured confounding variables that may be categorical or continuous. We give results for additive, risk-ratio and odds-ratio scales. We show that these results encompass a number of more specific sensitivity-analysis methods in the statistics and epidemiology literature. The applicability, usefulness, and limits of the bias-adjustment formulas are discussed. We illustrate the sensitivity-analysis techniques that follow from our results by applying them to 3 different studies. The bias formulas are particularly simple and easy to use in settings in which the unmeasured confounding variable is binary with constant effect on the outcome across treatment levels.
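
    One widely used special case of such bias formulas (a binary unmeasured confounder U whose effect on the outcome is assumed constant across treatment levels) reduces to simple arithmetic on the additive scale. The sketch below illustrates that simplified case only, with made-up numbers; it does not reproduce the paper's general results.

      # Additive-scale bias from a binary unmeasured confounder U (simplified special case):
      # bias = gamma * (P(U=1 | treated) - P(U=1 | untreated)), with gamma the effect of U on the outcome.
      def additive_bias(gamma, p_u_treated, p_u_untreated):
          return gamma * (p_u_treated - p_u_untreated)

      observed_risk_difference = 0.10              # hypothetical estimate from an observational study
      bias = additive_bias(gamma=0.15, p_u_treated=0.6, p_u_untreated=0.3)
      corrected = observed_risk_difference - bias  # bias-adjusted estimate under these assumptions
      print(bias, corrected)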

  15. A bibliometric analysis of Australian general practice publications from 1980 to 2007 using PubMed

    Directory of Open Access Journals (Sweden)

    Kumara Mendis

    2010-12-01

    Discussion Australian GP publications have shown an impressive growth from 1980 to 2007 with a 15- fold increase. This increase may be due in part to the actions of the Australian government over the past decade to financially support research in primary care, as well as the maturing of academic general practice. This analysis can assist governments, researchers, policy makers and others to target resources so that further developments can be encouraged, supported and monitored.

  16. Use of generalized ordered logistic regression for the analysis of multidrug resistance data.

    Science.gov (United States)

    Agga, Getahun E; Scott, H Morgan

    2015-10-01

    Statistical analysis of antimicrobial resistance data largely focuses on the binary outcome (susceptible or resistant) for individual antimicrobials. However, bacteria are becoming increasingly multidrug resistant (MDR). Statistical analysis of MDR data is mostly descriptive, often with tabular or graphical presentations. Here we report the applicability of the generalized ordinal logistic regression model for the analysis of MDR data. A total of 1,152 Escherichia coli, isolated from the feces of weaned pigs experimentally supplemented with chlortetracycline (CTC) and copper, were tested for susceptibilities against 15 antimicrobials and were classified as resistant or susceptible. The 15 antimicrobial agents tested were grouped into eight different antimicrobial classes. We defined MDR as the number of antimicrobial classes to which E. coli isolates were resistant, ranging from 0 to 8. The proportional odds assumption of the ordinal logistic regression model was violated only for the effect of treatment period (pre-treatment, during-treatment and post-treatment), but not for the effect of CTC or copper supplementation. Subsequently, a partially constrained generalized ordinal logistic model was built that allows the effect of treatment period to vary while constraining the effects of treatment (CTC and copper supplementation) to be constant across the levels of MDR classes. Copper (Proportional Odds Ratio [Prop OR]=1.03; 95% CI=0.73-1.47) and CTC (Prop OR=1.1; 95% CI=0.78-1.56) supplementation were not significantly associated with the level of MDR adjusted for the effect of treatment period. MDR generally declined over the trial period. In conclusion, generalized ordered logistic regression can be used for the analysis of ordinal data such as MDR data when the proportionality assumptions for ordered logistic regression are violated. Published by Elsevier B.V.
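    In practice a generalized ordered logit can be fitted as a series of cumulative binary logits, one per split of the ordinal MDR score, with coefficients left free to vary across splits (the partially constrained model in the paper additionally ties some coefficients together). The sketch below is a minimal illustration with simulated data, not the study's dataset or code:

        import numpy as np
        import statsmodels.api as sm

        def generalized_ordered_logit(y, X, levels):
            """Minimal sketch of a generalized (non-proportional) ordered logit:
            one binary logit per cumulative split P(y > j), with coefficients
            free to vary across splits.  y: ordinal scores (e.g. number of
            antimicrobial classes resisted), X: covariate matrix (hypothetical)."""
            Xc = sm.add_constant(X)
            fits = {}
            for j in levels[:-1]:                 # one model per cumulative split
                fits[j] = sm.Logit((y > j).astype(int), Xc).fit(disp=0)
            return fits   # inspect fits[j].params to see how effects vary with j

        # Toy usage with simulated data (illustration only)
        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 2))             # e.g. CTC and copper covariates
        y = np.clip(np.round(2 + X[:, 0] + rng.normal(size=500)), 0, 8).astype(int)
        models = generalized_ordered_logit(y, X, levels=sorted(np.unique(y)))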

  17. Implications of the Biofuels Boom for the Global Livestock Industry: A Computable General Equilibrium Analysis

    OpenAIRE

    Taheripour, Farzad; Hertel, Thomas W.; Tyner, Wallace E.

    2009-01-01

    In this paper, we offer a general equilibrium analysis of the impacts of US and EU biofuel mandates for the global livestock sector. Our simulation boosts biofuel production in the US and EU from 2006 levels to mandated 2015 levels. We show that mandates will encourage crop production in both biofuel and non biofuel producing regions, while reducing livestock and livestock production in most regions of the world. The non-ruminant industry curtails its production more than other livestock indu...

  18. Assimilation of tourism satellite accounts and applied general equilibrium models to inform tourism policy analysis

    OpenAIRE

    Rossouw, Riaan; Saayman, Melville

    2011-01-01

    Historically, tourism policy analysis in South Africa has posed challenges to accurate measurement. The primary reason for this is that tourism is not designated as an 'industry' in standard economic accounts. This paper therefore demonstrates the relevance and need for applied general equilibrium (AGE) models to be completed and extended through an integration with tourism satellite accounts (TSAs) as a tool for policy makers (especially tourism policy makers) in South Africa. The paper sets...

  19. General analysis of corrections to the standard seesaw formula in grand unified models

    International Nuclear Information System (INIS)

    Barr, S.M.; Kyae, Bumseok

    2004-01-01

    In realistic grand unified models there are typically extra vectorlike matter multiplets at the GUT-scale that are needed to explain the family hierarchy. These contain neutrinos that, when integrated out, can modify the usual neutrino seesaw formula. A general analysis is given. It is noted that such modifications can explain why the neutrinos do not exhibit a strong family hierarchy like the other types of fermions

  20. Generalized Population Analysis of Three-Center Two-Electron Bonding

    Czech Academy of Sciences Publication Activity Database

    Ponec, Robert; Cooper, D. L.

    2004-01-01

    Roč. 97, č. 6 (2004), s. 1002-1011 ISSN 0020-7608 R&D Projects: GA AV ČR IAA4072006; GA MŠk OC D9.20 Institutional research plan: CEZ:AV0Z4072921 Keywords: multicenter bonding * generalized population analysis * post-Hartree Fock wave functions Subject RIV: CF - Physical; Theoretical Chemistry Impact factor: 1.392, year: 2004

  1. Ultimate load analysis of prestressed concrete reactor pressure vessels considering a general material law

    International Nuclear Information System (INIS)

    Schimmelpfennig, K.

    1975-01-01

    A method of analysis is presented by which progressive fracture processes in axisymmetric prestressed concrete pressure vessels under increasing internal pressure can be evaluated by means of a continuum calculation considering a general material law. The formulations of material behaviour used in the analysis are derived on the one hand from appropriate results of tests on small concrete specimens, and are on the other hand obtained from parametric studies carried out to resolve remaining questions by recalculating fracture tests on concrete bodies with more complex states of stress. Due attention is focussed on investigating the behaviour of construction members subjected to high shear forces (end slabs). (Auth.)

  2. Mortality among workers exposed to external ionizing radiation at a nuclear facility in Ohio

    International Nuclear Information System (INIS)

    Wiggs, L.D.; Cox-DeVore, C.A.; Wilkinson, G.S.; Reyes, M.

    1991-01-01

    In a cohort mortality study of white men employed by the Mound Facility (1947 through 1979), observed deaths did not exceed those expected based on US death rates for the overall cohort or for the subcohort monitored for external ionizing radiation. Among the radiation-monitored subcohort, mortality for workers with cumulative radiation doses of at least 10 mSv was not significantly increased when compared with mortality for coworkers with cumulative doses of less than 10 mSv. A significant dose-response based on a Mantel-Haenszel test of trend was observed for all leukemias. However, when a death from chronic lymphatic leukemia, a type of leukemia generally not regarded as radiogenic, was removed from the analyses, the strength of the dose-response was reduced
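    The Mantel-Haenszel test of trend referred to here can be illustrated in a few lines. For a single 2 x K table the statistic is the linear-by-linear association chi-square, (N-1) times the squared Pearson correlation between the exposure score and the binary outcome; the stratified version used in occupational cohorts combines such contributions across strata. The dose categories and counts below are hypothetical, not the Mound data:

        import numpy as np
        from scipy.stats import chi2

        def mh_trend(scores, cases, totals):
            """Mantel-Haenszel (linear-by-linear) trend test for one 2 x K table.
            scores: exposure score per dose category, cases: deaths per category,
            totals: persons per category.  Returns chi-square (1 df) and p-value."""
            x = np.repeat(scores, totals)                      # per-subject score
            y = np.concatenate([np.r_[np.ones(c), np.zeros(t - c)]
                                for c, t in zip(cases, totals)])
            n = len(x)
            r = np.corrcoef(x, y)[0, 1]
            stat = (n - 1) * r**2
            return stat, chi2.sf(stat, df=1)

        # Hypothetical dose-category scores (mSv midpoints), deaths, person counts
        print(mh_trend(np.array([1, 15, 35, 75]),
                       np.array([2, 3, 4, 5]),
                       np.array([1000, 600, 300, 100])))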

  3. General health questionnaire-12: factor analysis in the general population of Bucaramanga, Colombia

    Directory of Open Access Journals (Sweden)

    Adalberto Campo Arias

    2007-03-01

    Full Text Available BACKGROUND: The 12-item General Health Questionnaire (GHQ-12) is an instrument designed to quantify emotional symptoms (common mental disorders) in different settings. However, the factor structure of this version in the Colombian population is not known. OBJECTIVE: To establish the factor structure and the best scoring method for the GHQ-12 in the general population of Bucaramanga, Colombia. METHOD: A probabilistic sample of the general population, comprising 2,496 adults aged 18 to 65 years, completed the GHQ-12. Internal consistency and factor structure were determined for the ordinal scoring method (0-1-2-3) and the binary methods (0-0-0-1, 0-0-1-1 and 0-1-1-1). RESULTS: The ordinal method showed an internal consistency of 0.779 and two factors that explained 42.5% of the variance. The binary method 0-1-1-1 showed an internal consistency of 0.774; the 0-0-1-1 method, 0.708; and the 0-0-0-1 method, 0.360. The binary methods 0-1-1-1 and 0-0-1-1 yielded three factors accounting for 50.6% and 46.3% of the variance, respectively. The 0-0-0-1 method did not produce an acceptable factor solution. CONCLUSIONS: In Bucaramanga, the ordinal scoring method shows better psychometric behaviour than the binary ones. All scoring methods show high internal consistency and an acceptable factor solution, except the 0-0-0-1 method.

  4. An analysis of radiological research publications in high impact general medical journals between 1996 and 2010

    International Nuclear Information System (INIS)

    Ku, You Jin; Yoon, Dae Young; Yun, Eun Joo; Baek, Sora; Lim, Kyoung Ja; Seo, Young Lan; Choi, Chul Soon; Bae, Sang Hoon

    2013-01-01

    Highlights: ► Radiologists published only 0.2% of articles in five general medical journals. ► Most original articles from radiologists were funded and were prospective studies. ► Radiology researchers from only 11 countries published at least one original article. -- Abstract: Objective: To evaluate scientific papers published by radiologists in high impact general medical journals between 1996 and 2010. Methods: A MEDLINE search was performed in five high impact general medical journals (AIM, BMJ, JAMA, Lancet, and NEJM) for all articles of which a radiologist was the first author between 1996 and 2010. The following information was abstracted from the original articles: radiological subspecialty, imaging technique used, type of research, sample size, study design, statistical analysis, study outcome, declared funding, number of authors, collaboration, and country of the first author. Results: Of the 216 (0.19%) articles published by radiologists in the five general medical journals between 1996 and 2010, 83 were original articles. Fifteen (18.1%) original articles were concerned with the field of vascular/interventional radiology, 24 (28.9%) used combined imaging techniques, 76 (91.6%) were clinical research, 63 (75.9%) had a sample size of >50, 65 (78.3%) were prospective, 78 (94.0%) performed statistical analysis, 83 (100%) showed positive study outcomes, 57 (68.7%) were funded, 49 (59.0%) had from four to seven authors, and 79 (95.2%) were collaborative studies. Conclusions: A very small number of articles (0.19%) in five high impact general medical journals was published by radiologists between 1996 and 2010

  5. Stability analysis on the free surface phenomena of a magnetic fluid for general use

    International Nuclear Information System (INIS)

    Mizuta, Yo

    2011-01-01

    This paper presents an analysis for elucidating a variety of physical processes on the interface (free surface) of a magnetic fluid. The present analysis is composed of the magnetic analysis and the fluid analysis, both of which have no limitations concerning the interface elevation or its profile. The magnetic analysis provides a rigorous interface magnetic field under arbitrary distributions of applied magnetic field. For the fluid analysis, the equation for interface motion includes all nonlinear effects. Physical quantities such as the interface magnetic field or the interface stresses, obtained first as wavenumber components, facilitate comparison with the results of conventional theoretical analyses. The nonlinear effect is formulated as the nonlinear mode coupling between the interface profile and the applied magnetic field. The stability of the horizontal interface profile is investigated by the dispersion relation, and summarized as the branch line. Furthermore, the balance among the spectral components of the interface stresses is shown within a sufficient range of the wavenumber space. - Research Highlights: → General, rigorous but compact analysis for free surface phenomena is shown. → Analysis is applied without limitations on the interface elevation or its profile. → Nonlinear effects are formulated as the nonlinear mode coupling. → Bifurcation of stability is summarized as the branch line. → Balance among the interface stresses is shown in the wavenumber space.

  6. Generalized solution of design optimization and failure analysis of composite drive shaft

    Energy Technology Data Exchange (ETDEWEB)

    Kollipalli, K.; Shivaramakrishna, K.V.S.; Prabhakaran, R.T.D. [Birla Institute of Technology and Science, Goa (India)

    2012-07-01

    Composites have an edge over conventional metals like steel and aluminum due to their higher stiffness-to-weight and strength-to-weight ratios. Due to these advantages, composites can bring about a revolutionary change in materials used in automotive engineering, as weight savings has positive impacts on other attributes like fuel economy and possibly noise, vibration and harshness (NVH). In this paper, the drive line system of an automobile is targeted for the use of composites, keeping in view constraints such as torque transmission, torsional buckling load and fundamental natural frequency. Composite drive shafts made of three different composites ('HM Carbon/HS Carbon/E-glass'-epoxy) were modeled using the Catia V5R16 CPD workbench, and a finite element analysis with boundary conditions, fiber orientation and stacking sequence was performed using the ANSYS Composite module. Results obtained were compared with theoretical results and were found to be accurate and within limits. This paper also addresses the generalization of drive shaft modeling and analysis, i.e., future changes in stacking sequence can be incorporated directly into the ANSYS model without remodeling it in Catia. Hence, the base model and analysis method established in this generalization, facilitated by CAD/CAE, can be used to carry out any composite shaft design optimization process. (Author)

  7. General practice-based clinical trials in Germany - a problem analysis

    Directory of Open Access Journals (Sweden)

    Hummers-Pradier Eva

    2012-11-01

    Full Text Available Abstract Background In Germany, clinical trials and comparative effectiveness studies in primary care are still very rare, while their usefulness has been recognised in many other countries. A network of researchers from German academic general practice has explored the reasons for this discrepancy. Methods Based on a comprehensive literature review and expert group discussions, problem analyses as well as structural and procedural prerequisites for a better implementation of clinical trials in German primary care are presented. Results In Germany, basic biomedical science and technology is more reputed than clinical or health services research. Clinical trials are funded by industry or a single national programme, which is highly competitive, specialist-dominated, exclusive of pilot studies, and usually favours innovation rather than comparative effectiveness studies. Academic general practice is still not fully implemented, and existing departments are small. Most general practitioners (GPs) work in a market-based, competitive setting of small private practices, with a high case load. They have no protected time or funding for research, and mostly no research training or experience. Good Clinical Practice (GCP) training is compulsory for participation in clinical trials. The group defined three work packages to be addressed regarding clinical trials in German general practice: (1) problem analysis, and definition of (2) structural prerequisites and (3) procedural prerequisites. Structural prerequisites comprise specific support facilities for general practice-based research networks that could provide practices with a point of contact. Procedural prerequisites consist, for example, of a summary of specific relevant key measures, for example on a web platform. The platform should contain standard operating procedures (SOPs), templates, checklists and other supporting materials for researchers. Conclusion All in all, our problem analyses revealed that

  8. A General Contact Force Analysis of an Under-Actuated Finger in Robot Hand Grasping

    Directory of Open Access Journals (Sweden)

    Xuan Vinh Ha

    2016-02-01

    Full Text Available This paper develops a mathematical analysis of contact forces for the under-actuated finger in a general under-actuated robotic hand during grasping. The concept of under-actuation in robotic grasping, with fewer actuators than degrees of freedom (DOF) through the use of springs and mechanical limits, allows the hand to adjust itself to an irregularly shaped object without complex control strategies and sensors. The main concern here is the contact forces, which are important elements in grasping tasks; their distributions for the n-DOF under-actuated finger are obtained from the proposed mathematical analysis. Simulation results for a 3-DOF finger built in the ADAMS model, compared with measured results, show the effectiveness of the mathematical analysis method. The system can find the magnitudes of the contact forces at the contact positions between the phalanges and the object.

  9. Application of discriminant analysis and generalized distance measures to uranium exploration

    International Nuclear Information System (INIS)

    Beauchamp, J.J.; Begovich, C.L.; Kane, V.E.; Wolf, D.A.

    1980-01-01

    The National Uranium Resource Evaluation (NURE) Program has as its goal the estimation of the nation's uranium resources. It is possible to use discriminant analysis methods on hydrogeochemical data collected in the NURE Program to aid in formulating geochemical models that can be used to identify the anomalous areas used in resource estimation. Discriminant analysis methods have been applied to data from the Plainview, Texas Quadrangle which has approximately 850 groundwater samples with more than 40 quantitative measurements per sample. Discriminant analysis topics involving estimation of misclassification probabilities, variable selection, and robust discrimination are applied. A method using generalized distance measures is given which enables the assignment of samples to a background population or a mineralized population whose parameters were estimated from separate studies. Each topic is related to its relevance in identifying areas of possible interest to uranium exploration. However, the methodology presented here is applicable to the identification of regions associated with other types of resources. 8 figures, 3 tables
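    As a rough illustration of the generalized-distance idea (a sketch only, not the NURE code), a sample can be assigned to whichever population, background or mineralized, lies at the smaller Mahalanobis distance, computed from that population's mean vector and covariance matrix estimated in separate studies; all names and numbers below are hypothetical:

        import numpy as np

        def mahalanobis_sq(x, mean, cov):
            """Squared generalized (Mahalanobis) distance of sample x to a population."""
            d = x - mean
            return float(d @ np.linalg.solve(cov, d))

        def classify(x, background, mineralized):
            """Assign a hydrogeochemical sample to the closer of two populations,
            each given as a (mean, covariance) pair estimated from separate studies."""
            d_bg = mahalanobis_sq(x, *background)
            d_mn = mahalanobis_sq(x, *mineralized)
            return ("mineralized", d_mn) if d_mn < d_bg else ("background", d_bg)

        # Hypothetical two-variable example (e.g. log U and log conductivity)
        bg  = (np.array([1.0, 2.0]), np.array([[0.4, 0.1], [0.1, 0.3]]))
        mnz = (np.array([2.5, 2.8]), np.array([[0.6, 0.2], [0.2, 0.5]]))
        print(classify(np.array([2.2, 2.6]), bg, mnz))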

  10. Application of discriminant analysis and generalized distance measures to uranium exploration

    International Nuclear Information System (INIS)

    Beauchamp, J.J.; Begovich, C.L.; Kane, V.E.; Wolf, D.A.

    1979-10-01

    The National Uranium Resource Evaluation (NURE) Project has as its goal estimation of the nation's uranium resources. It is possible to use discriminant analysis methods on hydrogeochemical data collected in the NURE Program to aid in formulating geochemical models which can be used to identify the anomalous regions necessary for resource estimation. Discriminant analysis methods have been applied to data from the Plainview, Texas Quadrangle which has approximately 850 groundwater samples with more than 40 quantitative measurements per sample. Discriminant analysis topics involving estimation of misclassification probabilities, variable selection, and robust discrimination are applied. A method using generalized distance measures is given which enables assigning samples to a background population or a mineralized population whose parameters were estimated from separate studies. Each topic is related to its relevance in identifying areas of possible interest to uranium exploration

  11. The ICVSIE: A General Purpose Integral Equation Method for Bio-Electromagnetic Analysis.

    Science.gov (United States)

    Gomez, Luis J; Yucel, Abdulkadir C; Michielssen, Eric

    2018-03-01

    An internally combined volume surface integral equation (ICVSIE) for analyzing electromagnetic (EM) interactions with biological tissue and wide ranging diagnostic, therapeutic, and research applications, is proposed. The ICVSIE is a system of integral equations in terms of volume and surface equivalent currents in biological tissue subject to fields produced by externally or internally positioned devices. The system is created by using equivalence principles and solved numerically; the resulting current values are used to evaluate scattered and total electric fields, specific absorption rates, and related quantities. The validity, applicability, and efficiency of the ICVSIE are demonstrated by EM analysis of transcranial magnetic stimulation, magnetic resonance imaging, and neuromuscular electrical stimulation. Unlike previous integral equations, the ICVSIE is stable regardless of the electric permittivities of the tissue or frequency of operation, providing an application-agnostic computational framework for EM-biomedical analysis. Use of the general purpose and robust ICVSIE permits streamlining the development, deployment, and safety analysis of EM-biomedical technologies.

  12. Use of generalized regression models for the analysis of stress-rupture data

    International Nuclear Information System (INIS)

    Booker, M.K.

    1978-01-01

    The design of components for operation in an elevated-temperature environment often requires a detailed consideration of the creep and creep-rupture properties of the construction materials involved. Techniques for the analysis and extrapolation of creep data have been widely discussed. The paper presents a generalized regression approach to the analysis of such data. This approach has been applied to multiple heat data sets for types 304 and 316 austenitic stainless steel, ferritic 2 1/4 Cr-1 Mo steel, and the high-nickel austenitic alloy 800H. Analyses of data for single heats of several materials are also presented. All results appear good. The techniques presented represent a simple yet flexible and powerful means for the analysis and extrapolation of creep and creep-rupture data
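    The paper's generalized regression model is not reproduced here, but the flavour of regression-based creep-rupture analysis can be sketched with one of the common time-temperature parameters (Larson-Miller), fitting log stress as a polynomial in P = T(C + log10 t_r); the constant C, the polynomial order and the data points below are illustrative assumptions only:

        import numpy as np

        def fit_larson_miller(temp_K, rupture_h, stress_MPa, C=20.0, order=2):
            """Fit log10(stress) as a polynomial in the Larson-Miller parameter
            P = T*(C + log10 t_r).  All inputs are 1-D arrays of equal length."""
            P = temp_K * (C + np.log10(rupture_h))
            return np.polyfit(P, np.log10(stress_MPa), order)

        def predict_stress(coeffs, temp_K, rupture_h, C=20.0):
            P = temp_K * (C + np.log10(rupture_h))
            return 10 ** np.polyval(coeffs, P)

        # Hypothetical data points (T in kelvin, rupture life in hours, stress in MPa)
        T  = np.array([823, 823, 873, 873, 923])
        tr = np.array([1e2, 1e4, 1e2, 1e4, 1e3])
        s  = np.array([260, 180, 190, 120, 110])
        coeffs = fit_larson_miller(T, tr, s)
        print(predict_stress(coeffs, 848.0, 1e5))   # extrapolated rupture stress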

  13. Fasting insulin, insulin resistance and risk of hypertension in the general population: A meta-analysis.

    Science.gov (United States)

    Wang, Feng; Han, Lili; Hu, Dayi

    2017-01-01

    Studies on the association of fasting insulin concentrations or insulin resistance with subsequent risk of hypertension have yielded conflicting results. To quantitatively assess the association of fasting insulin concentrations or homeostasis model assessment insulin resistance (HOMA-IR) with incident hypertension in a general population by performing a meta-analysis. We searched the PubMed and Embase databases until August 31, 2016 for prospective observational studies investigating elevated fasting insulin concentrations or HOMA-IR and the subsequent risk of hypertension in the general population. Pooled risk ratio (RR) and 95% confidence interval (CI) of hypertension were calculated for the highest versus the lowest category of fasting insulin or HOMA-IR. Eleven studies involving 10,230 hypertension cases were identified from 55,059 participants. Meta-analysis showed that the pooled adjusted RR of hypertension was 1.54 (95% CI 1.34-1.76) for fasting insulin concentrations and 1.43 (95% CI 1.27-1.62) for HOMA-IR comparing the highest to the lowest category. Subgroup analysis results showed that the association of fasting insulin concentrations with subsequent risk of hypertension seemed more pronounced in women (RR 2.07; 95% CI 1.19-3.60) than in men (RR 1.48; 95% CI 1.17-1.88). This meta-analysis suggests that elevated fasting insulin concentrations or insulin resistance as estimated by homeostasis model assessment is independently associated with an exacerbated risk of hypertension in the general population. Early intervention for hyperinsulinemia or insulin resistance may help clinicians to identify the population at high risk of hypertension. Copyright © 2016 Elsevier B.V. All rights reserved.
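    The pooling step described above can be sketched with a standard DerSimonian-Laird random-effects calculation on the log risk-ratio scale; the three studies below are hypothetical and the function is an illustration, not the software actually used for the meta-analysis:

        import numpy as np

        def pool_random_effects(rr, ci_low, ci_high):
            """DerSimonian-Laird random-effects pooling of study risk ratios.
            CIs are assumed to be 95% and symmetric on the log scale."""
            y = np.log(rr)
            se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
            w = 1 / se**2                                  # fixed-effect weights
            q = np.sum(w * (y - np.sum(w * y) / np.sum(w))**2)
            df = len(y) - 1
            c = np.sum(w) - np.sum(w**2) / np.sum(w)
            tau2 = max(0.0, (q - df) / c)                  # between-study variance
            w_star = 1 / (se**2 + tau2)
            mu = np.sum(w_star * y) / np.sum(w_star)
            se_mu = np.sqrt(1 / np.sum(w_star))
            return np.exp(mu), np.exp(mu - 1.96 * se_mu), np.exp(mu + 1.96 * se_mu)

        # Three hypothetical studies (RR with 95% CI), highest vs lowest insulin
        print(pool_random_effects(np.array([1.4, 1.7, 1.5]),
                                  np.array([1.1, 1.2, 0.9]),
                                  np.array([1.8, 2.4, 2.5])))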

  14. Mechanistic approach to generalized technical analysis of share prices and stock market indices

    Science.gov (United States)

    Ausloos, M.; Ivanova, K.

    2002-05-01

    Classical technical analysis methods of stock evolution are recalled, i.e. the notion of moving averages and momentum indicators. The moving averages lead to the definition of death and gold crosses, and of resistance and support lines. Momentum indicators lead the price trend, thus giving signals before the price trend turns over. The classical technical analysis investment strategy is thereby sketched. Next, we present a generalization of these tricks drawing on physical principles, i.e. taking into account not only the price of a stock but also the volume of transactions. The latter becomes a time-dependent generalized mass. The notions of pressure, acceleration and force are deduced. A generalized (kinetic) energy is easily defined. It is understood that the momentum indicators take into account the sign of the fluctuations, while the energy is geared toward the absolute value of the fluctuations. They have different patterns, which are checked by searching for the crossing points of their respective moving averages. The case of the IBM evolution over 1990-2000 is used for illustration.
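    The classical moving-average machinery recalled above is easy to reproduce. In the sketch below (synthetic price and volume series, not the IBM data), crossings of a short and a long moving average mark gold (bullish) and death (bearish) crosses, and a volume-weighted average hints at the "generalized mass" weighting proposed in the paper:

        import numpy as np

        def moving_average(x, w):
            """Trailing simple moving average of window length w."""
            return np.convolve(x, np.ones(w) / w, mode="valid")

        def crossings(short_ma, long_ma):
            """Indices (relative to the long average) where the short average crosses
            the long one: +1 = gold cross (bullish), -1 = death cross (bearish)."""
            s = np.sign(short_ma[-len(long_ma):] - long_ma)   # align on common end points
            change = np.diff(s)
            idx = np.flatnonzero(change != 0)
            return idx, np.sign(change[idx])

        # Synthetic daily closing prices and traded volumes
        rng = np.random.default_rng(1)
        price = 100 + np.cumsum(rng.normal(0, 1, 500))
        volume = rng.integers(1_000, 10_000, 500).astype(float)

        short_ma, long_ma = moving_average(price, 20), moving_average(price, 100)
        idx, direction = crossings(short_ma, long_ma)

        # Volume-weighted ("generalized mass") moving average over the short window
        vwma = (np.convolve(price * volume, np.ones(20), "valid")
                / np.convolve(volume, np.ones(20), "valid"))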

  15. Analysis of nonlocal neural fields for both general and gamma-distributed connectivities

    Science.gov (United States)

    Hutt, Axel; Atay, Fatihcan M.

    2005-04-01

    This work studies the stability of equilibria in spatially extended neuronal ensembles. We first derive the model equation from statistical properties of the neuron population. The obtained integro-differential equation includes synaptic and space-dependent transmission delay for both general and gamma-distributed synaptic connectivities. The latter connectivity type reveals infinite, finite, and vanishing self-connectivities. The work derives conditions for stationary and nonstationary instabilities for both kernel types. In addition, a nonlinear analysis for general kernels yields the order parameter equation of the Turing instability. To compare the results to findings for partial differential equations (PDEs), two typical PDE-types are derived from the examined model equation, namely the general reaction-diffusion equation and the Swift-Hohenberg equation. Hence, the discussed integro-differential equation generalizes these PDEs. In the case of the gamma-distributed kernels, the stability conditions are formulated in terms of the mean excitatory and inhibitory interaction ranges. As a novel finding, we obtain Turing instabilities in fields with local inhibition-lateral excitation, while wave instabilities occur in fields with local excitation and lateral inhibition. Numerical simulations support the analytical results.
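    A generic form of such a delayed neural field model (a standard Amari-type sketch with synaptic delay tau_s and space-dependent axonal delay, shown only for orientation and not copied from the paper) reads

        \tau \frac{\partial V(x,t)}{\partial t} = -V(x,t) + \int_{\Omega} K(|x-y|)\, S\!\left( V\!\left(y,\ t - \tau_s - \frac{|x-y|}{c}\right) \right) dy + E(x,t),

    where V is the mean membrane potential, K the synaptic connectivity kernel (for example a gamma-distributed dependence on distance), S a sigmoidal firing-rate function, c the axonal conduction speed, and E an external input; linearizing about a spatially homogeneous equilibrium and inserting plane waves e^{ikx + \lambda t} yields the dispersion relation from which the stationary (Turing) and non-stationary (wave) instability conditions discussed above follow.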

  16. A nonlinear generalization of the Savitzky-Golay filter and the quantitative analysis of saccades.

    Science.gov (United States)

    Dai, Weiwei; Selesnick, Ivan; Rizzo, John-Ross; Rucker, Janet; Hudson, Todd

    2017-08-01

    The Savitzky-Golay (SG) filter is widely used to smooth and differentiate time series, especially biomedical data. However, time series that exhibit abrupt departures from their typical trends, such as sharp waves or steps, which are of physiological interest, tend to be oversmoothed by the SG filter. Hence, the SG filter tends to systematically underestimate physiological parameters in certain situations. This article proposes a generalization of the SG filter to more accurately track abrupt deviations in time series, leading to more accurate parameter estimates (e.g., peak velocity of saccadic eye movements). The proposed filtering methodology models a time series as the sum of two component time series: a low-frequency time series for which the conventional SG filter is well suited, and a second time series that exhibits instantaneous deviations (e.g., sharp waves, steps, or more generally, discontinuities in a higher order derivative). The generalized SG filter is then applied to the quantitative analysis of saccadic eye movements. It is demonstrated that (a) the conventional SG filter underestimates the peak velocity of saccades, especially those of small amplitude, and (b) the generalized SG filter estimates peak saccadic velocity more accurately than the conventional filter.
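    The conventional SG filter that is being generalized is available in SciPy; the short sketch below (window length, polynomial order and the synthetic saccade are illustrative choices, not the paper's settings) smooths an eye-position trace and differentiates it to estimate velocity, which is exactly where oversmoothing of sharp saccadic peaks appears:

        import numpy as np
        from scipy.signal import savgol_filter

        fs = 1000.0                                      # sampling rate in Hz (assumed)
        t = np.arange(0, 1, 1 / fs)
        position = 10 / (1 + np.exp(-(t - 0.5) * 200))   # idealized 10-deg saccade
        position += np.random.default_rng(0).normal(0, 0.05, t.size)

        # Conventional SG smoothing and differentiation (larger windows smooth more
        # and bias the estimated peak velocity downward).
        smoothed = savgol_filter(position, window_length=31, polyorder=3)
        velocity = savgol_filter(position, window_length=31, polyorder=3,
                                 deriv=1, delta=1 / fs)  # deg/s
        print("estimated peak velocity:", velocity.max())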

  17. MGF Approach to the Analysis of Generalized Two-Ray Fading Models

    KAUST Repository

    Rao, Milind; Lopez-Martinez, F. Javier; Alouini, Mohamed-Slim; Goldsmith, Andrea

    2015-01-01

    We analyze a class of Generalized Two-Ray (GTR) fading channels that consist of two line of sight (LOS) components with random phase plus a diffuse component. We derive a closed-form expression for the moment generating function (MGF) of the signal-to-noise ratio (SNR) for this model, which greatly simplifies its analysis. This expression arises from the observation that the GTR fading model can be expressed in terms of a conditional underlying Rician distribution. We illustrate the approach to derive simple expressions for statistics and performance metrics of interest such as the amount of fading, the level crossing rate, the symbol error rate, and the ergodic capacity in GTR fading channels. We also show that the effect of considering a more general distribution for the phase difference between the LOS components has an impact on the average SNR.
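    The structure of the GTR model is easy to see by simulation: the received envelope is the magnitude of two constant-amplitude LOS phasors with independent random phases plus a complex Gaussian diffuse term. The Monte Carlo sketch below (amplitudes and diffuse variance are assumed values) estimates the average power and the amount of fading, quantities whose closed forms follow from the MGF derived in the paper:

        import numpy as np

        rng = np.random.default_rng(42)
        N = 200_000
        V1, V2 = 1.0, 0.8          # LOS component amplitudes (assumed values)
        sigma2 = 0.1               # variance of the diffuse component per dimension

        phi1 = rng.uniform(0, 2 * np.pi, N)
        phi2 = rng.uniform(0, 2 * np.pi, N)
        diffuse = rng.normal(0, np.sqrt(sigma2), N) + 1j * rng.normal(0, np.sqrt(sigma2), N)

        r2 = np.abs(V1 * np.exp(1j * phi1) + V2 * np.exp(1j * phi2) + diffuse) ** 2
        snr = r2 / r2.mean()                              # normalized instantaneous SNR

        amount_of_fading = snr.var() / snr.mean() ** 2    # AF = Var(SNR) / E[SNR]^2
        print(f"average power: {r2.mean():.3f}, amount of fading: {amount_of_fading:.3f}")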

  18. Interfacing a General Purpose Fluid Network Flow Program with the SINDA/G Thermal Analysis Program

    Science.gov (United States)

    Schallhorn, Paul; Popok, Daniel

    1999-01-01

    A general purpose, one dimensional fluid flow code is currently being interfaced with the thermal analysis program Systems Improved Numerical Differencing Analyzer/Gaski (SINDA/G). The flow code, Generalized Fluid System Simulation Program (GFSSP), is capable of analyzing steady state and transient flow in a complex network. The flow code is capable of modeling several physical phenomena including compressibility effects, phase changes, body forces (such as gravity and centrifugal) and mixture thermodynamics for multiple species. The addition of GFSSP to SINDA/G provides a significant improvement in convective heat transfer modeling for SINDA/G. The interface development is conducted in multiple phases. This paper describes the first phase of the interface which allows for steady and quasi-steady (unsteady solid, steady fluid) conjugate heat transfer modeling.

  19. The prevalence of insomnia in the general population in China: A meta-analysis.

    Directory of Open Access Journals (Sweden)

    Xiao-Lan Cao

    Full Text Available This is the first meta-analysis of the pooled prevalence of insomnia in the general population of China. A systematic literature search was conducted via the following databases: PubMed, PsycINFO, EMBASE and Chinese databases (China National Knowledge Infrastructure (CNKI), WanFang Data and SinoMed). Statistical analyses were performed using the Comprehensive Meta-Analysis program. A total of 17 studies with 115,988 participants met the inclusion criteria for the analysis. The pooled prevalence of insomnia in China was 15.0% (95% Confidence interval [CI]: 12.1%-18.5%). No significant difference was found in the prevalence between genders or across time periods. The pooled prevalence of insomnia in populations with a mean age of 43.7 years and older (11.6%; 95% CI: 7.5%-17.6%) was significantly lower than in those with a mean age younger than 43.7 years (20.4%; 95% CI: 14.2%-28.2%). The prevalence of insomnia was significantly affected by the type of assessment tool (Q = 14.1, P = 0.001). The general population prevalence of insomnia in China is lower than that reported in Western countries but similar to that in Asian countries. Younger Chinese adults appear to suffer from more insomnia than older adults. CRD 42016043620.

  20. Termination of a fe/male transsexual patient's analysis: an example of general validity.

    Science.gov (United States)

    Quinodoz, Danielle

    2002-08-01

    The author describes the termination of an analysis, which, while relating to the particular case of a male-to-female transsexual patient, may be relevant to all analysts, particularly those whose patients need to integrate disavowed and split-off parts of themselves. The patient had undergone sex-change surgery at the age of 20. Having lived as a woman thereafter, she had asked for analysis some twenty years later. The author, who discussed the first three years of that analysis in an earlier paper, as well as her hesitation about undertaking it, considers that its termination after seven years illustrates not only the specific problems posed by transsexuals but also the general ones presented by "heterogeneous patients". To the best of her knowledge, this is the first published case history of a transsexual patient who has undergone surgery. In the author's view, the patient has acquired a new sense of internal unity based on a notion of sex differentiation in which mutual respect between the sexes has replaced confusion and mutual hate, and her quality of life has improved. On the general level, this termination shows how the reduction of paranoid-schizoid anxieties and the reintegration of split-off parts of the personality lead, as the depressive position is worked through, to a better toleration of internal contradictions, a new sense of cohesion of the self and a diminution of the fear of madness.

  1. Comparative analysis of general characteristics of ischemic stroke of BAD and non-BAD CISS subtypes.

    Science.gov (United States)

    Mei, Bin; Liu, Guang-zhi; Yang, Yang; Liu, Yu-min; Cao, Jiang-hui; Zhang, Jun-jian

    2015-12-01

    Based on the recently proposed Chinese ischemic stroke subclassification (CISS) system, intracranial branch atheromatous disease (BAD) is divided into large artery atherosclerosis (LAA) and penetrating artery disease (PAD). In the current retrospective analysis, we compared the general characteristics of BAD-LAA with BAD-PAD, BAD-LAA with non-BAD-LAA and BAD-PAD with non-BAD-PAD. The study included a total of 80 cases, comprising 45 cases of BAD and 35 cases of non-BAD. Subjects were classified using the CISS system: BAD-LAA, BAD-PAD, non-BAD-LAA and non-BAD-PAD. In addition to the analysis of general characteristics, the correlation between the factors and the two subtypes of BAD was evaluated. The number of cases included in the analysis was: 32 cases of BAD-LAA, 13 cases of BAD-PAD, 21 cases of non-BAD-LAA, and 14 cases of non-BAD-PAD. Diabetes mellitus affected more non-BAD-LAA patients than BAD-LAA patients (P=0.035). In comparison with non-BAD-PAD, patients with BAD-PAD were younger (P=0.040) and had a higher initial NIHSS score. Within BAD, the PAD subtype was associated with smoking (OR=0.043; P=0.011), higher low-density lipoprotein (OR=5.339; P=0.029), ischemic heart disease (OR=9.383; P=0.047) and diabetes mellitus (OR=12.59; P=0.020). It was concluded that large artery atherosclerosis was the primary mechanism of BAD. The general characteristics showed no significant differences between the CISS subtypes of LAA and PAD within BAD, or between BAD and non-BAD within the LAA subtype. Several differences between the PAD subtypes of BAD and non-BAD were revealed.

  2. Stability and bifurcation analysis of a generalized scalar delay differential equation.

    Science.gov (United States)

    Bhalekar, Sachin

    2016-08-01

    This paper deals with the stability and bifurcation analysis of a general form of equation D^α x(t) = g(x(t), x(t-τ)) involving the derivative of order α ∈ (0, 1] and a constant delay τ ≥ 0. The stability of equilibrium points is presented in terms of the stability regions and critical surfaces. We also provide a necessary condition for the existence of chaos in the system. A wide range of delay differential equations involving a constant delay can be analyzed using the results proposed in this paper. Illustrative examples are provided to explain the theory.
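    To see where the stability regions and critical surfaces come from, one can linearize about an equilibrium x* (a standard sketch, not reproduced from the paper): writing u(t) = x(t) - x* and a = ∂g/∂x, b = ∂g/∂x_τ evaluated at x*, the linearized equation D^α u(t) = a u(t) + b u(t-τ) admits solutions u ∝ e^{λt} whenever

        \lambda^{\alpha} = a + b\, e^{-\lambda \tau}, \qquad \alpha \in (0, 1],\ \tau \ge 0,

    and the equilibrium is locally asymptotically stable when every root λ of this characteristic equation has a negative real part; the critical surfaces correspond to purely imaginary roots.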

  3. Lead optimization attrition analysis (LOAA): a novel and general methodology for medicinal chemistry.

    Science.gov (United States)

    Munson, Mark; Lieberman, Harvey; Tserlin, Elina; Rocnik, Jennifer; Ge, Jie; Fitzgerald, Maria; Patel, Vinod; Garcia-Echeverria, Carlos

    2015-08-01

    Herein, we report a novel and general method, lead optimization attrition analysis (LOAA), to benchmark two distinct small-molecule lead series using a relatively unbiased, simple technique and commercially available software. We illustrate this approach with data collected during lead optimization of two independent oncology programs as a case study. Easily generated graphics and attrition curves enabled us to calibrate progress and support go/no go decisions on each program. We believe that this data-driven technique could be used broadly by medicinal chemists and management to guide strategic decisions during drug discovery. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Development of a large-scale general purpose two-phase flow analysis code

    International Nuclear Information System (INIS)

    Terasaka, Haruo; Shimizu, Sensuke

    2001-01-01

    A general purpose three-dimensional two-phase flow analysis code has been developed for solving large-scale problems in industrial fields. The code uses a two-fluid model to describe the conservation equations for two-phase flow in order to be applicable to various phenomena. Complicated geometrical conditions are modeled by FAVOR method in structured grid systems, and the discretization equations are solved by a modified SIMPLEST scheme. To reduce computing time a matrix solver for the pressure correction equation is parallelized with OpenMP. Results of numerical examples show that the accurate solutions can be obtained efficiently and stably. (author)

  5. General structure of the GRAND program for analysis of the data from a neutrino detector

    International Nuclear Information System (INIS)

    Zhigunov, V.P.; Kulikov, V.A.; Mukhin, S.A.; Naumov, V.L.; Platonov, V.G.; Spiridonov, A.A.

    1985-01-01

    The general structure of the GRAND (Global Result Analysis for Neutrino Detector) program used for geometrical and kinematic reconstruction of events recorded by a neutrino detector is considered. The detector consists of a calorimeter-target, a shower electron and γ detector and a magnetic spectrometer. While developing the GRAND program, the multivariance (different types of computers used), the availability of various algorithms for solving the same problem, and the solution of separate particular problems within the framework of one program were taken into account. The KERNLIB library and the HBOOK, ZBOOK, EPIO and FFREAD subroutine packages are used as basic libraries while creating the program.

  6. A comparison between fault tree analysis and reliability graph with general gates

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Seong, Poong Hyun; Jung, Woo Sik

    2004-01-01

    Currently, level-1 probabilistic safety assessment (PSA) is performed on the basis of event tree analysis and fault tree analysis. Kim and Seong developed a new method for system reliability analysis named reliability graph with general gates (RGGG). The RGGG is an extension of the conventional reliability graph, and it utilizes the transformation of system structures to equivalent Bayesian networks for quantitative calculation. The RGGG is considered to be intuitive and easy to use while as powerful as fault tree analysis. As an example, Kim and Seong already showed that the Bayesian network model for the digital plant protection system (DPPS), which is transformed from the RGGG model for DPPS, can be shown in 1 page, while the fault tree model for DPPS consists of 64 pages of fault trees. Kim and Seong also insisted that the Bayesian network model for DPPS is more intuitive because a one-to-one matching between each node in the Bayesian network model and an actual component of DPPS is possible. In this paper, we are going to give a comparison between fault tree analysis and the RGGG method with two example systems. The two example systems are the recirculation system in Korean standard nuclear power plants (KSNP) and the fault tree model developed by Rauzy.

  7. Cumulative trauma, hyperarousal, and suicidality in the general population: a path analysis.

    Science.gov (United States)

    Briere, John; Godbout, Natacha; Dias, Colin

    2015-01-01

    Although trauma exposure and posttraumatic stress disorder (PTSD) both have been linked to suicidal thoughts and behavior, the underlying basis for this relationship is not clear. In a sample of 357 trauma-exposed individuals from the general population, younger participant age, cumulative trauma exposure, and all three Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, PTSD clusters (reexperiencing, avoidance, and hyperarousal) were correlated with clinical levels of suicidality. However, logistic regression analysis indicated that when all PTSD clusters were considered simultaneously, only hyperarousal continued to be predictive. A path analysis confirmed that posttraumatic hyperarousal (but not other components of PTSD) fully mediated the relationship between extent of trauma exposure and degree of suicidal thoughts and behaviors.

  8. General purpose dynamic Monte Carlo with continuous energy for transient analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sjenitzer, B. L.; Hoogenboom, J. E. [Delft Univ. of Technology, Dept. of Radiation, Radionuclide and Reactors, Mekelweg 15, 2629JB Delft (Netherlands)

    2012-07-01

    For safety assessments transient analysis is an important tool. It can predict maximum temperatures during regular reactor operation or during an accident scenario. Despite the fact that this kind of analysis is very important, the state of the art still uses rather crude methods, like diffusion theory and point-kinetics. For reference calculations it is preferable to use the Monte Carlo method. In this paper the dynamic Monte Carlo method is implemented in the general purpose Monte Carlo code Tripoli4. Also, the method is extended for use with continuous energy. The first results of Dynamic Tripoli demonstrate that this kind of calculation is indeed accurate and the results are achieved in a reasonable amount of time. With the method implemented in Tripoli it is now possible to do an exact transient calculation in arbitrary geometry. (authors)

  9. A general symplectic method for the response analysis of infinitely periodic structures subjected to random excitations

    Directory of Open Access Journals (Sweden)

    You-Wei Zhang

    Full Text Available A general symplectic method for the random response analysis of infinitely periodic structures subjected to stationary/non-stationary random excitations is developed using symplectic mathematics in conjunction with variable separation and the pseudo-excitation method (PEM). Starting from the equation of motion for a single loaded substructure, symplectic analysis is first used to eliminate the dependent degrees of freedom through condensation. A Fourier expansion of the condensed equation of motion is then applied to separate the variables of time and wave number, thus enabling the necessary recurrence scheme to be developed. The random response is finally determined by implementing PEM. The proposed method is justified by comparison with results available in the literature and is then applied to a more complicated time-dependent coupled system.

  10. Time series analysis of wind speed using VAR and the generalized impulse response technique

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, Bradley T. [Area of Information Systems and Quantitative Sciences, Rawls College of Business and Wind Science and Engineering Research Center, Texas Tech University, Lubbock, TX 79409-2101 (United States); Kruse, Jamie Brown [Center for Natural Hazard Research, East Carolina University, Greenville, NC (United States); Schroeder, John L. [Department of Geosciences and Wind Science and Engineering Research Center, Texas Tech University, Lubbock, TX (United States); Smith, Douglas A. [Department of Civil Engineering and Wind Science and Engineering Research Center, Texas Tech University, Lubbock, TX (United States)

    2007-03-15

    This research examines the interdependence in time series wind speed data measured in the same location at four different heights. A multiple-equation system known as a vector autoregression is proposed for characterizing the time series dynamics of wind. Additionally, the recently developed method of generalized impulse response analysis provides insight into the cross-effects of the wind series and their responses to shocks. Findings are based on analysis of contemporaneous wind speed time histories taken at 13, 33, 70 and 160 ft above ground level with a sampling rate of 10 Hz. The results indicate that the wind speed measured at 70 ft was the most variable. Further, the turbulence persisted longer at the 70-ft measurement than at the other heights. The greatest interdependence is observed at 13 ft. Gusts at 160 ft showed the greatest persistence in response to an 'own' shock and led to the greatest persistence in the responses of the other wind series. (author)
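    A minimal sketch of this kind of analysis with statsmodels is shown below; the synthetic series stand in for the four anemometer heights, and only the standard orthogonalized impulse responses are computed (the generalized, ordering-invariant responses of Pesaran and Shin would require an additional step):

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.api import VAR

        # Synthetic stand-in for wind speeds at four heights (10 Hz samples)
        rng = np.random.default_rng(0)
        base = np.cumsum(rng.normal(0, 0.1, 5000))
        data = pd.DataFrame({
            "ws_13ft":  base + rng.normal(0, 0.5, 5000),
            "ws_33ft":  base + rng.normal(0, 0.4, 5000),
            "ws_70ft":  base + rng.normal(0, 0.7, 5000),
            "ws_160ft": base + rng.normal(0, 0.3, 5000),
        }).diff().dropna()                       # difference to obtain stationary series

        model = VAR(data)
        results = model.fit(maxlags=20, ic="aic")    # lag order chosen by AIC
        irf = results.irf(50)                        # impulse responses, 50 steps ahead
        print(results.summary())
        irf.plot(orth=True)                          # orthogonalized responses to shocks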

  11. Analysis of general and specific combining abilities of popcorn populations, including selfed parents

    Directory of Open Access Journals (Sweden)

    José Marcelo Soriano Viana

    2003-12-01

    Full Text Available Estimation of general and specific combining ability effects in a diallel analysis of cross-pollinating populations, including the selfed parents, is presented in this work. The restrictions considered satisfy the parametric values of the GCA and SCA effects. The method is extended to self-pollinating populations (suitable for other species), without the selfed parents. The analysis of changes in population means due to inbreeding (sensitivity to inbreeding) also permits assessment of the predominant direction of dominance deviations and the relative genetic variability in each parent population. The methodology was used to select popcorn populations for intra- and inter-population breeding programs and for hybrid production, developed at the Federal University of Viçosa, MG, Brazil. Two yellow pearl grain popcorn populations were selected.

  12. Generalized sample entropy analysis for traffic signals based on similarity measure

    Science.gov (United States)

    Shang, Du; Xu, Mengjia; Shang, Pengjian

    2017-05-01

    Sample entropy is a prevailing method used to quantify the complexity of a time series. In this paper a modified method of generalized sample entropy and surrogate data analysis is proposed as a new measure to assess the complexity of a complex dynamical system such as traffic signals. The method based on similarity distance presents a different way of signals patterns match showing distinct behaviors of complexity. Simulations are conducted over synthetic data and traffic signals for providing the comparative study, which is provided to show the power of the new method. Compared with previous sample entropy and surrogate data analysis, the new method has two main advantages. The first one is that it overcomes the limitation about the relationship between the dimension parameter and the length of series. The second one is that the modified sample entropy functions can be used to quantitatively distinguish time series from different complex systems by the similar measure.
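    For reference, the conventional sample entropy that the proposed method modifies can be computed as below; this is a standard textbook implementation with Chebyshev-distance template matching (the paper replaces this matching step with a different similarity measure), and the parameters m and r are the usual defaults:

        import numpy as np

        def sample_entropy(x, m=2, r=0.2):
            """Conventional SampEn(m, r) of a 1-D series x, with tolerance r given
            as a fraction of the standard deviation and Chebyshev-distance matching."""
            x = np.asarray(x, dtype=float)
            tol = r * x.std()
            n = len(x)

            def count_matches(length):
                # one template starting at each of the first n - m points
                templates = np.array([x[i:i + length] for i in range(n - m)])
                count = 0
                for i in range(len(templates) - 1):
                    dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                    count += np.sum(dist <= tol)
                return count

            b = count_matches(m)        # pairs of length-m templates within tolerance
            a = count_matches(m + 1)    # pairs of length-(m+1) templates within tolerance
            return -np.log(a / b) if a > 0 and b > 0 else np.inf

        # Example: white noise is less regular (higher SampEn) than a smooth sine
        t = np.linspace(0, 10, 1000)
        print(sample_entropy(np.sin(2 * np.pi * t)),
              sample_entropy(np.random.default_rng(0).normal(size=1000)))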

  13. Awareness, Analysis, and Action: Curricular Alignment for Student Success in General Chemistry

    Science.gov (United States)

    2018-01-01

    This article examines the ways that a shared faculty experience across five partner institutions led to a deep awareness of the curriculum and pedagogy of general chemistry coursework, and ultimately, to a collaborative action plan for student success. The team identified key differences and similarities in course content and instructional experiences. The comparative analysis yielded many more similarities than differences, and therefore, the team shifted its focus from “gap analysis” to an exploration of common curricular challenges. To address these challenges, the team developed content for targeted instructional resources that promoted the success of all STEM students across institutions. This article contextualizes the interinstitutional collaboration and closely examines the interactive components (awareness, analysis, and action), critical tools, and productive attitudes that undergirded the curricular alignment process of the STEM Transfer Student Success Initiative (t-STEM). PMID:29657334

  14. Cost analysis of spinal and general anesthesia for the surgical treatment of lumbar spondylosis.

    Science.gov (United States)

    Walcott, Brian P; Khanna, Arjun; Yanamadala, Vijay; Coumans, Jean-Valery; Peterfreund, Robert A

    2015-03-01

    Lumbar spine surgery is typically performed under general anesthesia, although spinal anesthesia can also be used. Given the prevalence of lumbar spine surgery, small differences in cost between the two anesthetic techniques have the potential to make a large impact on overall healthcare costs. We sought to perform a cost comparison analysis of spinal versus general anesthesia for lumbar spine operations. Following Institutional Review Board approval, a retrospective cohort study was performed from 2009-2012 on consecutive patients undergoing non-instrumented, elective lumbar spine surgery for spondylosis by a single surgeon. Each patient was evaluated for both types of anesthesia, with the decision for anesthetic method being made based on a combination of physical status, anatomical considerations, and ultimately a consensus agreement between patient, surgeon, and anesthesiologist. Patient demographics and clinical characteristics were compared between the two groups. Operating room costs were calculated whilst blinded to clinical outcomes and reported in percentage difference. General anesthesia (n=319) and spinal anesthesia (n=81) patients had significantly different median operative times of 175 ± 39.08 and 158 ± 32.75 minutes, respectively. Spinal anesthesia appears to be a viable alternative to general anesthesia for non-instrumented, elective lumbar spine surgery. It has the potential to reduce operative times, costs, and possibly, complications. Further prospective evaluation will help to validate these findings. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Current contraceptive management in Australian general practice: an analysis of BEACH data.

    Science.gov (United States)

    Mazza, Danielle; Harrison, Christopher; Taft, Angela; Brijnath, Bianca; Britt, Helena; Hobbs, Melissa; Stewart, Kay; Hussainy, Safeera

    2012-07-16

    To determine current contraceptive management by general practitioners in Australia. Analysis of data from a random sample of 3910 Australian GPs who participated in the Bettering the Evaluation and Care of Health (BEACH) survey, a continuous cross-sectional survey of GP activity, between April 2007 and March 2011. Consultations with female patients aged 12-54 years that involved all forms of contraception were analysed. GP and patient characteristics associated with the management of contraception; types of contraception used; rates of encounters involving emergency contraception. Increased age, ethnicity, Indigenous status and holding a Commonwealth Health Care Card were significantly associated with low rates of encounters involving management of contraception. The combined oral contraceptive pill was the most frequently prescribed method of contraception, with moderate prescription of long-acting reversible contraception (LARC), especially among women aged 34-54 years. Rates of consultations concerned with emergency contraception were low, but involved high rates of counselling, advice or education (48%) compared with encounters for general contraception (> 20%). A shift towards prescribing LARC, as recommended in clinical guidelines, has yet to occur in Australian general practice. Better understanding of patient and GP perspectives on contraceptive choices could lead to more effective contraceptive use.

  16. Analysis of Loss of Control Parameters for Aircraft Maneuvering in General Aviation

    Directory of Open Access Journals (Sweden)

    Sameer Ud-Din

    2018-01-01

    Full Text Available A rapid increase in the occurrence of loss of control in general aviation has raised concern in recent years. Loss of control (LOC) pertains to unique characteristics in which external and internal events act in conjunction. The Federal Aviation Administration (FAA) has approved an Integrated Safety Assessment Model (ISAM) for evaluating safety in the National Airspace System (NAS). ISAM consists of an event sequence diagram (ESD) with fault trees containing numerous parameters, and is recognized as a causal risk model. In this paper, we outline an integrated risk assessment framework to model maneuvering through cross-examining external and internal events. Maneuvering is a critical flight phase with a high number of LOC occurrences in general aviation, in which highly trained and qualified pilots have failed to maintain aircraft control despite the preventable nature of the events. Various metrics are presented for evaluating the significance of these parameters and identifying the most important ones. The proposed sensitivity analysis considers the accident, fatality, and risk reduction frequencies that assist in the decision-making process and foresees future risks from a general aviation perspective.

  17. Extending Local Canonical Correlation Analysis to Handle General Linear Contrasts for fMRI Data

    Directory of Open Access Journals (Sweden)

    Mingwu Jin

    2012-01-01

    Full Text Available Local canonical correlation analysis (CCA) is a multivariate method that has been proposed to more accurately determine activation patterns in fMRI data. In its conventional formulation, CCA has several drawbacks that limit its usefulness in fMRI. A major drawback is that, unlike the general linear model (GLM), a test of general linear contrasts of the temporal regressors has not been incorporated into the CCA formalism. To overcome this drawback, a novel directional test statistic was derived using the equivalence of multivariate multiple regression (MVMR) and CCA. This extension will allow CCA to be used for inference of general linear contrasts in more complicated fMRI designs without reparameterization of the design matrix and without reestimating the CCA solutions for each particular contrast of interest. With the proper constraints on the spatial coefficients of CCA, this test statistic can yield a more powerful test on the inference of evoked brain regional activations from noisy fMRI data than the conventional t-test in the GLM. The quantitative results from simulated and pseudoreal data and activation maps from fMRI data were used to demonstrate the advantage of this novel test statistic.

  18. Feature extraction with deep neural networks by a generalized discriminant analysis.

    Science.gov (United States)

    Stuhlsatz, André; Lippel, Jens; Zielke, Thomas

    2012-04-01

    We present an approach to feature extraction that is a generalization of the classical linear discriminant analysis (LDA) on the basis of deep neural networks (DNNs). As for LDA, discriminative features generated from independent Gaussian class conditionals are assumed. This modeling has the advantages that the intrinsic dimensionality of the feature space is bounded by the number of classes and that the optimal discriminant function is linear. Unfortunately, linear transformations are insufficient to extract optimal discriminative features from arbitrarily distributed raw measurements. The generalized discriminant analysis (GerDA) proposed in this paper uses nonlinear transformations that are learnt by DNNs in a semisupervised fashion. We show that the feature extraction based on our approach displays excellent performance on real-world recognition and detection tasks, such as handwritten digit recognition and face detection. In a series of experiments, we evaluate GerDA features with respect to dimensionality reduction, visualization, classification, and detection. Moreover, we show that GerDA DNNs can preprocess truly high-dimensional input data to low-dimensional representations that facilitate accurate predictions even if simple linear predictors or measures of similarity are used.

  19. A comparative analysis of exposure doses between the radiation workers in dental and general hospital

    International Nuclear Information System (INIS)

    Yang, Nam Hee; Chung, Woon Kwan; Dong, Kyung Rae; Ju, Yong Jin; Song, Ha Jin; Choi, Eun Jin

    2015-01-01

    With growing interest in the radiation exposure of dental hospital staff, research on the exposure doses of radiation workers in dental hospitals is needed. Accordingly, this study aimed to minimize radiation exposure by following up the individual exposure doses of radiation workers, analyzing the status of individual radiation exposure management, predicting radiation-related disability risk levels, and alerting workers to the danger of radiation exposure, with particular attention to changes in radiation safety awareness in dental hospitals. Analyses compared general and dental hospitals and each occupation, using 116,220 quarterly and yearly exposure dose records of 5,811 subjects at general and dental hospitals across South Korea from January 1, 2008 through December 31, 2012. The following results were obtained from the yearly and quarterly averages. By hospital type, average doses were significantly higher in general hospitals than in dental ones. By occupation, average doses were higher in radiological technologists than in other workers; in particular, the difference between radiological technologists and dentists was statistically significant. These results indicate that radiation workers were exposed to radiation over the past 5 years only to an extent not exceeding the dose limit (maximum 50 mSv y⁻¹). Limitations of this study are that radiation workers before 2008 were excluded and that objective evaluation standards could not be applied to the working circumstances or conditions of each hospital. It is therefore necessary to develop analysis criteria that can serve as an objective evaluation standard, to study radiation exposure more precisely on that basis in the future, and to strive to minimize the individual radiation doses of workers.

  20. A comparative analysis of exposure doses between the radiation workers in dental and general hospital

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Nam Hee; Chung, Woon Kwan; Dong, Kyung Rae; Ju, Yong Jin; Song, Ha Jin [Dept. of Nuclear Engineering, Chosun University, Gwangju (Korea, Republic of); Choi, Eun Jin [Dept. of Public Health and Medicine, Dongshin University, Naju (Korea, Republic of)

    2015-02-15

    With growing interest in the radiation exposure of dental hospital staff, research on the exposure doses of radiation workers in dental hospitals is needed. Accordingly, this study aimed to minimize radiation exposure by following up the individual exposure doses of radiation workers, analyzing the status of individual radiation exposure management, predicting radiation-related disability risk levels, and alerting workers to the danger of radiation exposure, with particular attention to changes in radiation safety awareness in dental hospitals. Analyses compared general and dental hospitals and each occupation, using 116,220 quarterly and yearly exposure dose records of 5,811 subjects at general and dental hospitals across South Korea from January 1, 2008 through December 31, 2012. The following results were obtained from the yearly and quarterly averages. By hospital type, average doses were significantly higher in general hospitals than in dental ones. By occupation, average doses were higher in radiological technologists than in other workers; in particular, the difference between radiological technologists and dentists was statistically significant. These results indicate that radiation workers were exposed to radiation over the past 5 years only to an extent not exceeding the dose limit (maximum 50 mSv y⁻¹). Limitations of this study are that radiation workers before 2008 were excluded and that objective evaluation standards could not be applied to the working circumstances or conditions of each hospital. It is therefore necessary to develop analysis criteria that can serve as an objective evaluation standard, to study radiation exposure more precisely on that basis in the future, and to strive to minimize the individual radiation doses of workers.

  1. All-sky analysis of the general relativistic galaxy power spectrum

    Science.gov (United States)

    Yoo, Jaiyul; Desjacques, Vincent

    2013-07-01

    We perform an all-sky analysis of the general relativistic galaxy power spectrum using the well-developed spherical Fourier decomposition. Spherical Fourier analysis expresses the observed galaxy fluctuation in terms of the spherical harmonics and spherical Bessel functions that are angular and radial eigenfunctions of the Helmholtz equation, providing a natural orthogonal basis for all-sky analysis of the large-scale mode measurements. Accounting for all the relativistic effects in galaxy clustering, we compute the spherical power spectrum and its covariance matrix and compare it to the standard three-dimensional power spectrum to establish a connection. The spherical power spectrum recovers the three-dimensional power spectrum at each wave number k with its angular dependence μk encoded in angular multipole l, and the contributions of the line-of-sight projection to galaxy clustering such as the gravitational lensing effect can be readily accommodated in the spherical Fourier analysis. A complete list of formulas for computing the relativistic spherical galaxy power spectrum is also presented.
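
    As a small illustration of the basis functions referred to above (not the authors' pipeline), the sketch below evaluates a single spherical-harmonic/spherical-Bessel mode with SciPy; the multipole, wavenumber, and distance are arbitrary illustrative values.

```python
# Sketch: evaluate one basis function of the spherical Fourier decomposition,
# Y_lm(theta, phi) * j_l(k * r), with SciPy (illustrative values only).
import numpy as np
from scipy.special import sph_harm, spherical_jn

l, m = 2, 1
k = 0.05                # wavenumber (illustrative units)
r = 150.0               # radial distance (illustrative units)
theta, phi = 0.8, 1.3   # colatitude and azimuth in radians

# SciPy's sph_harm takes (m, l, azimuthal angle, polar angle).
Y_lm = sph_harm(m, l, phi, theta)
j_l = spherical_jn(l, k * r)
print("Y_lm =", Y_lm, " j_l(kr) =", j_l, " basis value =", Y_lm * j_l)
```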

  2. Occurrence of internet addiction in a general population sample: a latent class analysis.

    Science.gov (United States)

    Rumpf, Hans-Jürgen; Vermulst, Ad A; Bischof, Anja; Kastirke, Nadin; Gürtler, Diana; Bischof, Gallus; Meerkerk, Gert-Jan; John, Ulrich; Meyer, Christian

    2014-01-01

    Prevalence studies of Internet addiction in the general population are rare. In addition, a lack of approved criteria hampers estimation of its occurrence. This study conducted a latent class analysis (LCA) in a large general population sample to estimate prevalence. A telephone survey was conducted based on a random digit dialling procedure including landline telephone (n=14,022) and cell phone numbers (n=1,001) in participants aged 14-64. The Compulsive Internet Use Scale (CIUS) served as the basis for a LCA used to look for subgroups representing participants with Internet addiction or at-risk use. CIUS was given to participants reporting to use the Internet for private purposes at least 1 h on a typical weekday or at least 1 h on a day at the weekend (n=8,130). A 6-class model showed best model fit and included two groups likely to represent Internet addiction and at-risk Internet use. Both groups showed less social participation and the Internet addiction group less general trust in other people. Proportions of probable Internet addiction were 1.0% (CI 0.9-1.2) among the entire sample, 2.4% (CI 1.9-3.1) in the age group 14-24, and 4.0% (CI 2.7-5.7) in the age group 14-16. No difference in estimated proportions between males and females was found. Unemployment (OR 3.13; CI 1.74-5.65) and migration background (OR 3.04; CI 2.12-4.36) were related to Internet addiction. This LCA-based study differentiated groups likely to have Internet addiction and at-risk use in the general population and provides characteristics to further define this rather new disorder. © 2013 S. Karger AG, Basel.

  3. [A totally implantable venous access device. Implantation in general or local anaesthesia? A retrospective cost analysis].

    Science.gov (United States)

    Schuld, J; Richter, S; Moussavian, M R; Kollmar, O; Schilling, M K

    2009-08-01

    Implantation of venous access port systems can be performed under local or general anesthesia. In spite of the increasing rate of interventionally implanted systems, the surgical cut-down represents a safe alternative. Given the increasing health-economic pressure, the question thus arises whether open implantation under general anesthesia is still a feasible alternative to implantation under local anesthesia with regard to OR efficiency and costs. In a retrospective analysis, 993 patients receiving a totally implantable venous access device between 2001 and 2007 were evaluated regarding OR utilization, turnover times, intraoperative data and costs. Implantations under local (LA) and general anesthesia (GA) were compared. GA was used in 762 cases (76.6 %) and LA in 231 patients (23.3 %). Mean operation time was similar in both groups (LA 47.27 +/- 1.40 min vs. GA 45.41 +/- 0.75 min, p = 0.244). Patients receiving local anesthesia had a significantly shorter stay in the OR unit (LA 95.9 +/- 1.78 min vs. GA 105.92 +/- 0.92 min) and a significantly shorter interval from entering the OR to skin incision (LA 39.57 +/- 0.69 min vs. GA 50.46 +/- 0.52 min). Material costs were significantly lower in the LA group compared with the GA group (LA: 400.72 +/- 8.25 euro vs. GA: 482.86 +/- 6.23 euro). Implantation of port systems under local anesthesia is thus superior to implantation under general anesthesia regarding procedural times in the OR unit and costs. With the same operation duration but less personnel and material expenditure, implantation under local anesthesia offers a potential economic advantage by permitting faster changeover times. Implantation under GA should only be performed at the special request of the patient or in difficult venous conditions. Georg Thieme Verlag Stuttgart, New York.

  4. Work-related health problems among resident immigrant workers in Italy and Spain

    Directory of Open Access Journals (Sweden)

    Aldo Rosano

    2012-09-01

    Full Text Available

    Background: In both Spain and Italy the number of immigrants has increased strongly in the last 20 years, currently representing more than 10% of the workforce in each country. The segregation of immigrants into unskilled or risky jobs brings negative consequences for their health. The objective of this study is to compare the prevalence of work-related health problems between immigrant and native workers in Italy and Spain.

    Methods: Data come from the Italian Labour Force Survey (n=65 779) and the Spanish Working Conditions Survey (n=11 019), both conducted in 2007. We analyzed merged datasets to evaluate whether interviewees, both natives and migrants, judged their health to be affected by their work conditions and, if so, by which specific diseases. For migrants, we considered those coming from countries with a value of the Human Development Index lower than 0.85. Logistic regression models were used, including gender, age, and education as adjusting factors.

    Results: Migrants reported skin diseases (Mantel-Haenszel pooled OR=1.49; 95%CI: 0.59-3.74) and musculoskeletal problems among those employed in the agricultural sector (Mantel-Haenszel pooled OR=1.16; 95%CI: 0.69-1.96) more frequently than natives; country-specific analysis showed higher risks of musculoskeletal problems among migrants compared to the non-migrant population in Italy (OR=1.17; 95% CI: 0.48-1.59) and of respiratory problems in Spain (OR=2.02; 95%CI: 1.02-4.0). In both countries the risk of psychological stress was predominant among national workers.

    Conclusions: This collaborative study helps to strengthen the evidence concerning the health of migrant workers in Southern European countries.
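
    To make the pooled estimates above concrete, the sketch below shows how a Mantel-Haenszel pooled odds ratio is computed across country strata; the 2x2 tables are made-up illustrative counts, not the survey data reported in this study.

```python
# Sketch of a Mantel-Haenszel pooled odds ratio across two country strata
# (illustrative counts only).
import numpy as np

# Each stratum: [[exposed cases, exposed non-cases],
#                [unexposed cases, unexposed non-cases]]
strata = {
    "Italy": np.array([[30, 970], [250, 11750]]),
    "Spain": np.array([[12, 388], [90, 4510]]),
}

num = den = 0.0
for table in strata.values():
    a, b = table[0]
    c, d = table[1]
    n = table.sum()
    num += a * d / n
    den += b * c / n

print(f"Mantel-Haenszel pooled OR = {num / den:.2f}")
```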

  5. Pin-wise Reactor Analysis Based on the Generalized Equivalence Theory

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Hwan Yeal; Heo, Woong; Kim, Yong Hee [KAIST, Daejeon (Korea, Republic of)

    2016-05-15

    In this paper, a pin-wise reactor analysis is performed based on the generalized equivalence theory (GET). From conventional fuel assembly lattice calculations, pin-wise 2-group cross sections and pin discontinuity factors (DFs) are generated. Based on the numerical results for a small PWR benchmark, it is observed that the pin-wise core analysis provides quite accurate predictions of the effective multiplication factor, and the peak pin power error is bounded by about 3% in peripheral fuel assemblies facing the baffle-reflector. It was also found that relatively large pin power errors occur along the interface between clearly different fuel assemblies. It is expected that the GET-based pin-by-pin core calculation can be further developed into an advanced method for reactor analysis by improving the group constants and discontinuity factors. Recently, high-fidelity multi-dimensional analysis tools have been gaining attention because of their accurate prediction of local parameters for core design and safety assessment. In terms of accuracy, direct whole-core transport is quite promising; however, it is still very costly in terms of computing time and memory requirements. Another possible solution is pin-by-pin core analysis, in which only the small fuel pins are homogenized and the 3-D core analysis is still performed using a low-order operator such as the diffusion theory. In this paper, a pin-by-pin core analysis is performed using the hybrid CMFD (HCMFD) method. Hybrid CMFD is a new global-local iteration method that has been developed for efficient parallel calculation of pin-by-pin heterogeneous core analysis. In the HCMFD method, the one-node CMFD scheme is combined with a local two-node CMFD method in a non-linear way. Since the SPH method is iterative and SPH factors are not direction dependent, the SPH method takes more computing cost and cannot take into account the different heterogeneity and transport effects at each pin interface. Unlike the SPH

  6. Python tools for rapid development, calibration, and analysis of generalized groundwater-flow models

    Science.gov (United States)

    Starn, J. J.; Belitz, K.

    2014-12-01

    National-scale water-quality data sets for the United States have been available for several decades; however, groundwater models to interpret these data are available for only a small percentage of the country. Generalized models may be adequate to explain and project groundwater-quality trends at the national scale by using regional-scale models (defined as watersheds at or between the HUC-6 and HUC-8 levels). Coast-to-coast data such as the National Hydrologic Dataset Plus (NHD+) make it possible to extract the basic building blocks for a model anywhere in the country. IPython notebooks have been developed to automate the creation of generalized groundwater-flow models from the NHD+. The notebook format allows rapid testing of methods for model creation, calibration, and analysis. Capabilities within the Python ecosystem greatly speed up the development and testing of algorithms. GeoPandas is used for very efficient geospatial processing. Raster processing includes the Geospatial Data Abstraction Library and image processing tools. Model creation is made possible through Flopy, a versatile input and output writer for several MODFLOW-based flow and transport model codes. Interpolation, integration, and map plotting included in the standard Python tool stack also are used, making the notebook a comprehensive platform within which to build and evaluate general models. Models with alternative boundary conditions, number of layers, and cell spacing can be tested against one another and evaluated by using water-quality data. Novel calibration criteria were developed by comparing modeled heads to land-surface and surface-water elevations. Information, such as predicted age distributions, can be extracted from general models and tested for its ability to explain water-quality trends. Groundwater ages then can be correlated with horizontal and vertical hydrologic position, a relation that can be used for statistical assessment of likely groundwater-quality conditions.
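
    As an illustration of the kind of scripted model construction these notebooks automate, here is a minimal Flopy sketch; the grid dimensions, hydraulic properties, and model name are placeholders rather than values from the study, and the script only writes MODFLOW-2005 input files (running them would require a MODFLOW executable).

```python
# Minimal Flopy sketch (placeholder grid and properties): scripted construction
# of a small steady-state MODFLOW-2005 model; writes input files only.
import numpy as np
import flopy

mf = flopy.modflow.Modflow(modelname="general_model", exe_name="mf2005")

nlay, nrow, ncol = 1, 20, 20
dis = flopy.modflow.ModflowDis(mf, nlay=nlay, nrow=nrow, ncol=ncol,
                               delr=250.0, delc=250.0, top=50.0, botm=0.0)

ibound = np.ones((nlay, nrow, ncol), dtype=int)
ibound[:, :, 0] = -1                            # constant-head cells on one edge
strt = np.full((nlay, nrow, ncol), 45.0)        # starting heads
bas = flopy.modflow.ModflowBas(mf, ibound=ibound, strt=strt)

lpf = flopy.modflow.ModflowLpf(mf, hk=10.0)     # horizontal hydraulic conductivity
rch = flopy.modflow.ModflowRch(mf, rech=0.001)  # areal recharge
pcg = flopy.modflow.ModflowPcg(mf)              # solver
oc = flopy.modflow.ModflowOc(mf)                # output control

mf.write_input()    # running requires a MODFLOW-2005 executable on the path
```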

  7. Generalized functions and Fourier analysis dedicated to Stevan Pilipović on the occasion of his 65th birthday

    CERN Document Server

    Toft, Joachim; Vindas, Jasson; Wahlberg, Patrik

    2017-01-01

    This book gives an excellent and up-to-date overview on the convergence and joint progress in the fields of Generalized Functions and Fourier Analysis, notably in the core disciplines of pseudodifferential operators, microlocal analysis and time-frequency analysis. The volume is a collection of chapters addressing these fields, their interaction, their unifying concepts and their applications and is based on scientific activities related to the International Association for Generalized Functions (IAGF) and the ISAAC interest groups on Pseudo-Differential Operators (IGPDO) and on Generalized Functions (IGGF), notably on the longstanding collaboration of these groups within ISAAC.

  8. General Dynamics Convair Division approach to structural analysis of large superconducting coils

    International Nuclear Information System (INIS)

    Baldi, R.W.

    1979-01-01

    This paper describes the overall integrated analysis approach and highlights the results obtained. Most of the procedures and techniques described were developed over the past three years. Starting in late 1976, development began on high-accuracy computer codes for electromagnetic field and force analysis. This effort resulted in completion of a family of computer programs called MAGIC (MAGnetic Integration Calculation). Included in this group of programs is a post-processor called POSTMAGIC that links MAGIC to GDSAP (General Dynamics Structural Analysis Program) by automatically transferring force data. Integrating these computer programs afforded us the capability to readily analyze several different conditions that are anticipated to occur during tokamak operation. During 1977 we initiated the development of the CONVERT program that effectively links our THERMAL ANALYZER program to GDSAP by automatically transferring temperature data. The CONVERT program allowed us the capability to readily predict thermal stresses at several different time phases during the computer-simulated cooldown and warmup cycle. This feature aided us in determining the most crucial time phases and to adjust recommended operating procedure to minimize risk. (orig.)

  9. General purpose nonlinear analysis program FINAS for elevated temperature design of FBR components

    International Nuclear Information System (INIS)

    Iwata, K.; Atsumo, H.; Kano, T.; Takeda, H.

    1982-01-01

    This paper presents currently available capabilities of a general purpose finite element nonlinear analysis program FINAS (FBR Inelastic Structural Analysis System) which has been developed at Power Reactor and Nuclear Fuel Development Corporation (PNC) since 1976 to support structural design of fast breeder reactor (FBR) components in Japan. This program is capable of treating inelastic responses of arbitrary complex structures subjected to static and dynamic load histories. Various types of finite element covering rods, beams, pipes, axisymmetric, two and three dimensional solids, plates and shells, are implemented in the program. The thermal elastic-plastic creep analysis is possible for each element type, with primary emphasis on the application to FBR components subjected to sustained or cyclic loads at elevated temperature. The program permits large deformation, buckling, fracture mechanics, and dynamic analyses for some of the element types and provides a number of options for automatic mesh generation and computer graphics. Some examples including elevated temperature effects are shown to demonstrate the accuracy and the efficiency of the program

  10. Textual Analysis of General Surgery Residency Personal Statements: Topics and Gender Differences.

    Science.gov (United States)

    Ostapenko, Laura; Schonhardt-Bailey, Cheryl; Sublette, Jessica Walling; Smink, Douglas S; Osman, Nora Y

    2017-10-25

    Applicants to US general surgery residency training programs submit standardized applications. Applicants use the personal statement to express their individual rationale for a career in surgery. Our research explores common topics and gender differences within the personal statements of general surgery applicants. We analyzed the Electronic Residency Application Service personal statements of 578 applicants (containing 382,405 words) from Liaison Committee on Medical Education-accredited medical schools to a single ACGME-accredited general surgery program using an automated textual analysis program to identify common topics and gender differences. Using a recursive algorithm, the program identified common words and clusters, grouping them into topic classes, which are internally validated. We identified and labeled 8 statistically significant topic classes through independent review: "my story," "the art of surgery," "clinical vignettes," "why I love surgery," "residency program characteristics," "working as a team," "academics and research," and "global health and policy." Although some classes were common to all applications, we also identified gender-specific differences. Notably, women were significantly more likely than men to be represented within the class of "working as a team." We found differences between the statements of men and women: women were more likely to discuss surgery as a team endeavor, while men were more likely to focus on the details of their surgical experiences. Our work mirrors what has been found in social psychology research on gender-based differences in how men and women communicate their career goals and aspirations in other competitive professional situations. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  11. Responses to clinical uncertainty in Australian general practice trainees: a cross-sectional analysis.

    Science.gov (United States)

    Cooke, Georga; Tapley, Amanda; Holliday, Elizabeth; Morgan, Simon; Henderson, Kim; Ball, Jean; van Driel, Mieke; Spike, Neil; Kerr, Rohan; Magin, Parker

    2017-12-01

    Tolerance for ambiguity is essential for optimal learning and professional competence. General practice trainees must be, or must learn to be, adept at managing clinical uncertainty. However, few studies have examined associations of intolerance of uncertainty in this group. The aim of this study was to establish levels of tolerance of uncertainty in Australian general practice trainees and associations of uncertainty with demographic, educational and training practice factors. A cross-sectional analysis was performed on the Registrar Clinical Encounters in Training (ReCEnT) project, an ongoing multi-site cohort study. Scores on three of the four independent subscales of the Physicians' Reaction to Uncertainty (PRU) instrument were analysed as outcome variables in linear regression models with trainee and practice factors as independent variables. A total of 594 trainees contributed data on a total of 1209 occasions. Trainees in earlier training terms had higher scores for 'Anxiety due to uncertainty', 'Concern about bad outcomes' and 'Reluctance to disclose diagnosis/treatment uncertainty to patients'. Beyond this, findings suggest two distinct sets of associations regarding reaction to uncertainty. Firstly, affective aspects of uncertainty (the 'Anxiety' and 'Concern' subscales) were associated with female gender, less experience in hospital prior to commencing general practice training, and graduation overseas. Secondly, a maladaptive response to uncertainty (the 'Reluctance to disclose' subscale) was associated with urban practice, health qualifications prior to studying medicine, practice in an area of higher socio-economic status, and being Australian-trained. This study has established levels of three measures of trainees' responses to uncertainty and associations with these responses. The current findings suggest differing 'phenotypes' of trainees with high 'affective' responses to uncertainty and those reluctant to disclose uncertainty to patients. More

  12. Generalized MHD for numerical stability analysis of high-performance plasmas in tokamaks

    International Nuclear Information System (INIS)

    Mikhailovskii, A.B.

    1998-01-01

    provide a basis for development of generalized MHD codes for numerical stability analysis of high-performance plasmas in tokamaks. (author)

  13. CyNC - towards a General Tool for Performance Analysis of Complex Distributed Real Time Systems

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Jessen, Jan Jakob; Nielsen, Jens F. Dalsgaard

    2005-01-01

    The paper addresses the current state and the ongoing activities of a tool for performance analysis of complex real time systems. The tool named CyNC is based on network calculus allowing for the computation of backlogs and delays in a system from specified lower and upper bounds of external workflow and computational resources. The current version of the tool implements an extension to previous work in that it allows for general workflow and resource bounds and provides optimal solutions even to systems with cyclic dependencies. Despite the virtues of the current tool, improvements and extensions still remain, which are in focus of ongoing activities. Improvements include accounting for phase information to improve bounds, whereas the tool awaits extension to include flow control models, which both depend on the possibility of accounting for propagation delay. Since the current version...
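
    CyNC itself is a network-calculus tool; as background for the bounds it computes, the sketch below evaluates the textbook delay and backlog bounds for a flow with a token-bucket arrival curve crossing a rate-latency server. The numbers and units are illustrative assumptions.

```python
# Textbook network-calculus bounds: a flow with arrival curve
# alpha(t) = sigma + rho * t served by a rate-latency server
# beta(t) = R * max(t - T, 0) has (for rho <= R)
#   delay bound   D = T + sigma / R
#   backlog bound B = sigma + rho * T
def delay_bound(sigma, rho, R, T):
    assert rho <= R, "sustained flow rate must not exceed service rate"
    return T + sigma / R

def backlog_bound(sigma, rho, R, T):
    assert rho <= R, "sustained flow rate must not exceed service rate"
    return sigma + rho * T

sigma, rho = 4.0, 2.0   # burst (kbit) and sustained rate (kbit/ms)
R, T = 5.0, 1.5         # service rate (kbit/ms) and latency (ms)
print("delay bound  :", delay_bound(sigma, rho, R, T), "ms")
print("backlog bound:", backlog_bound(sigma, rho, R, T), "kbit")
```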

  14. Bayesian analysis of general failure data from an ageing distribution: advances in numerical methods

    International Nuclear Information System (INIS)

    Procaccia, H.; Villain, B.; Clarotti, C.A.

    1996-01-01

    EDF and ENEA carried out a joint research program for developing the numerical methods and computer codes needed for Bayesian analysis of component lives in the case of ageing. Early results of this study were presented at ESREL'94. Since then the following further steps have been taken: input data have been generalized to the case where observed lives are censored both on the right and on the left; allowable life distributions are Weibull and gamma, whose parameters are both unknown and can be statistically dependent; allowable priors are histograms relative to different parametrizations of the life distribution of concern; and first- and second-order moments of the posterior distributions can be computed. In particular, the covariance gives important information about the degree of statistical dependence between the parameters of interest. An application of the code to the appearance of stress corrosion cracking in a tube of the PWR steam generator system is presented. (authors)
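
    The EDF/ENEA codes themselves are not shown in the abstract; as a rough illustration of the ingredients it describes (right-censored lives, a Weibull life distribution, a histogram-style prior, and first- and second-order posterior moments), the sketch below evaluates a grid-based posterior on made-up data with a uniform prior.

```python
# Grid-based sketch (made-up data): posterior over Weibull (shape, scale) from
# failure times and right-censored lives, uniform prior over the grid.
import numpy as np

failures = np.array([8.0, 12.5, 15.2, 20.1, 25.0])   # observed failure times (yr)
censored = np.array([30.0, 30.0, 30.0])              # units still alive at 30 yr

shapes = np.linspace(0.5, 4.0, 80)
scales = np.linspace(5.0, 60.0, 80)

def log_lik(beta, eta):
    # Weibull log-density for failures, log-survival for censored units.
    z_f, z_c = failures / eta, censored / eta
    ll = np.sum(np.log(beta / eta) + (beta - 1.0) * np.log(z_f) - z_f ** beta)
    return ll - np.sum(z_c ** beta)

logpost = np.array([[log_lik(b, e) for e in scales] for b in shapes])
post = np.exp(logpost - logpost.max())
post /= post.sum()

# First- and second-order moments of the posterior, including the covariance.
B, E = np.meshgrid(shapes, scales, indexing="ij")
mean_beta, mean_eta = (post * B).sum(), (post * E).sum()
cov = (post * (B - mean_beta) * (E - mean_eta)).sum()
print(f"E[shape] = {mean_beta:.2f}, E[scale] = {mean_eta:.1f}, Cov = {cov:.3f}")
```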

  15. How to Measure Quality of Service Using Unstructured Data Analysis: A General Method Design

    Directory of Open Access Journals (Sweden)

    Lucie Sperková,

    2015-10-01

    Full Text Available The aim of the paper is to design a general method for measuring the quality of a service from the customer's point of view with the help of content analytics. A large amount of unstructured data is created by customers of the service; this data can provide valuable feedback on service usage, since customers talk among themselves about their experiences and feelings from consuming the service. The design of the method is based on a systematic literature review in the areas of service quality and unstructured data analysis. Analytics and quality measurement models are collected and critically evaluated regarding their potential use for measuring IT service quality. The method can be used by an IT service provider to measure and monitor service quality based on word-of-mouth in order to support continual service improvement.

  16. Normal Mode Analysis to a Poroelastic Half-Space Problem under Generalized Thermoelasticity

    Directory of Open Access Journals (Sweden)

    Chunbao Xiong

    Full Text Available The thermo-hydro-mechanical problems associated with a poroelastic half-space soil medium with variable properties under generalized thermoelasticity theory were investigated in this study. Remaining faithful to Biot's theory of dynamic poroelasticity, we idealized the foundation material as a uniform, fully saturated, poroelastic half-space medium. We first subjected this medium to time-harmonic loads consisting of normal or thermal loads, then investigated the differences between the coupled thermo-hydro-mechanical dynamic models and the thermo-elastic dynamic models. We used normal mode analysis to solve the resulting non-dimensional coupled equations, and then investigated how the non-dimensional vertical displacement, excess pore water pressure, vertical stress, and temperature distribution behave in the poroelastic half-space medium, representing the results graphically.

  17. Higher moments method for generalized Pareto distribution in flood frequency analysis

    Science.gov (United States)

    Zhou, C. R.; Chen, Y. F.; Huang, Q.; Gu, S. H.

    2017-08-01

    The generalized Pareto distribution (GPD) has proven to be the ideal distribution for fitting peak-over-threshold series in flood frequency analysis. Several moments-based estimators are applied to estimating the parameters of the GPD. Higher linear moments (LH moments) and higher probability weighted moments (HPWM) are linear combinations of probability weighted moments (PWM). In this study, the relationship between them is explored. A series of statistical experiments and a case study are used to compare their performances. The results show that if the same PWM are used in the LH-moments and HPWM methods, the parameter estimated by these two methods is unbiased. In particular, when the same PWM are used, the PWM method (or the HPWM method when the order equals 0) gives results identical to those of the linear moments (L-moments) method in parameter estimation. Additionally, this phenomenon is also significant when r ≥ 1, provided the same-order PWM are used in the HPWM and LH-moments methods.
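
    As a concrete illustration of the moment-based estimators discussed, the sketch below fits a two-parameter GPD (zero threshold) from sample probability weighted moments / L-moments; the Hosking-style parameterization and the simulated excesses are assumptions of the example, not taken from the paper.

```python
# Sketch: two-parameter GPD (threshold 0) estimated from sample probability
# weighted moments / L-moments, using the parameterization
# F(x) = 1 - (1 - k*x/alpha)**(1/k); k > 0 corresponds to a bounded upper tail.
import numpy as np

def gpd_lmoments(excesses):
    x = np.sort(np.asarray(excesses))
    n = len(x)
    j = np.arange(1, n + 1)
    b0 = x.mean()                               # zeroth sample PWM
    b1 = np.sum((j - 1) / (n - 1) * x) / n      # first-order sample PWM
    lam1, lam2 = b0, 2.0 * b1 - b0              # first two L-moments
    k = lam1 / lam2 - 2.0
    alpha = (1.0 + k) * lam1
    return k, alpha

rng = np.random.default_rng(1)
# Simulated peak-over-threshold excesses (exponential = GPD with k = 0).
sample = rng.exponential(scale=12.0, size=500)
k_hat, alpha_hat = gpd_lmoments(sample)
print(f"shape k = {k_hat:.3f}, scale alpha = {alpha_hat:.2f}")
```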

  18. Generalized Tensor Analysis Model for Multi-Subcarrier Analog Optical Systems

    DEFF Research Database (Denmark)

    Zhao, Ying; Yu, Xianbin; Zheng, Xiaoping

    2011-01-01

    We propose and develop a general tensor analysis framework for a subcarrier-multiplexed analog optical fiber link for applications in microwave photonics. The goal of this work is to construct a uniform method to address nonlinear distortions of a discrete-frequency transmission system. We employ ... In addition, it is demonstrated that each corresponding tensor is formally determined by device structures, which allows device combinations to be studied in a more systematic, synthesized way. For implementing numerical methods, the practical significance of the tensor model is that it simplifies the derivation details compared with series-based approaches by hiding the underlying multi-fold summation and index operations. The integrity of the proposed methodology is validated by investigating the classical intensity-modulated system. Furthermore, to give an application model of the tensor formalism, we make ...

  19. A fully general and adaptive inverse analysis method for cementitious materials

    DEFF Research Database (Denmark)

    Jepsen, Michael S.; Damkilde, Lars; Lövgren, Ingemar

    2016-01-01

    The paper presents an adaptive method for inverse determination of the tensile σ-w relationship, the direct tensile strength and Young's modulus of cementitious materials. The method facilitates an inverse analysis with a multi-linear σ-w function. Usually, simple bi- or tri-linear functions are applied when modeling the fracture mechanisms in cementitious materials, but the vast development of pseudo-strain-hardening, fiber-reinforced cementitious materials requires inverse methods capable of treating multi-linear σ-w functions. The proposed method is fully general in the sense that it relies ... It is evaluated on test data of notched specimens and simulated data from a nonlinear hinge model. The paper shows that the results obtained by means of the proposed method are independent of the initial shape of the σ-w function and the initial guess of the tensile strength. The method provides very accurate fits, and the increased ...

  20. Implementation of the dynamic Monte Carlo method for transient analysis in the general purpose code Tripoli

    Energy Technology Data Exchange (ETDEWEB)

    Sjenitzer, Bart L.; Hoogenboom, J. Eduard, E-mail: B.L.Sjenitzer@TUDelft.nl, E-mail: J.E.Hoogenboom@TUDelft.nl [Delft University of Technology (Netherlands)

    2011-07-01

    A new Dynamic Monte Carlo method is implemented in the general purpose Monte Carlo code Tripoli 4.6.1. With this new method incorporated, a general purpose code can be used for safety transient analysis, such as the movement of a control rod or in an accident scenario. To make the Tripoli code ready for calculating on dynamic systems, the Tripoli scheme had to be altered to incorporate time steps, to include the simulation of delayed neutron precursors and to simulate prompt neutron chains. The modified Tripoli code is tested on two sample cases, a steady-state system and a subcritical system and the resulting neutron fluxes behave just as expected. The steady-state calculation has a constant neutron flux over time and this result shows the stability of the calculation. The neutron flux stays constant with acceptable variance. This also shows that the starting conditions are determined correctly. The sub-critical case shows that the code can also handle dynamic systems with a varying neutron flux. (author)

  1. Implementation of the dynamic Monte Carlo method for transient analysis in the general purpose code Tripoli

    International Nuclear Information System (INIS)

    Sjenitzer, Bart L.; Hoogenboom, J. Eduard

    2011-01-01

    A new Dynamic Monte Carlo method is implemented in the general purpose Monte Carlo code Tripoli 4.6.1. With this new method incorporated, a general purpose code can be used for safety transient analysis, such as the movement of a control rod or in an accident scenario. To make the Tripoli code ready for calculating on dynamic systems, the Tripoli scheme had to be altered to incorporate time steps, to include the simulation of delayed neutron precursors and to simulate prompt neutron chains. The modified Tripoli code is tested on two sample cases, a steady-state system and a subcritical system and the resulting neutron fluxes behave just as expected. The steady-state calculation has a constant neutron flux over time and this result shows the stability of the calculation. The neutron flux stays constant with acceptable variance. This also shows that the starting conditions are determined correctly. The sub-critical case shows that the code can also handle dynamic systems with a varying neutron flux. (author)

  2. General partial wave analysis of the decay of a hyperon of spin 1/2

    International Nuclear Information System (INIS)

    Lee, T.D.; Yang, C.N.

    1983-01-01

    This note is to consider the general problem of the decay of a hyperon of spin 1/2 into a pion and a nucleon under the general assumption of possible violations of parity conservation, charge-conjugation invariance, and time-reversal invariance. The discussion is in essence a partial wave analysis of the decay phenomena and is independent of the dynamics of the decay. Nonrelativistic approximations are not made on either of the decay products. In the reference system in which the hyperon is at rest there are two possible final states of the pion-nucleon system: s1/2 and p1/2. Denoting the amplitudes of these two states by A and B, one observes that the decay is physically characterized by three real constants specifying the magnitudes and the relative phase between these amplitudes. One of these constants can be taken to be |A|^2 + |B|^2, and is evidently proportional to the decay probability per unit time. The other two constants are best defined in terms of experimentally measurable quantities. They discuss three types of experiments: (a) The angular distribution of the decay pion from a completely polarized hyperon at rest. (b) The longitudinal polarization of the nucleon emitted in the decay of unpolarized hyperons at rest. (c) Transverse polarization of the nucleon emitted in a given direction in the decay of a polarized hyperon
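
    For reference, the measurable quantities built from these amplitudes can be summarized in the standard parameterization below (a LaTeX sketch; the symbols alpha and beta for the decay parameters are the conventional names and are not used in the abstract itself).

```latex
% Decay rate and pion angular distribution for a polarized spin-1/2 hyperon,
% in terms of the s- and p-wave amplitudes A and B:
\begin{align}
  \Gamma &\propto |A|^{2} + |B|^{2}, \\
  \frac{dN}{d\Omega} &\propto 1 + \alpha\, \vec{P}\cdot\hat{n},
  \qquad
  \alpha = \frac{2\,\mathrm{Re}(A^{*}B)}{|A|^{2}+|B|^{2}},
  \qquad
  \beta = \frac{2\,\mathrm{Im}(A^{*}B)}{|A|^{2}+|B|^{2}},
\end{align}
% where \vec{P} is the hyperon polarization and \hat{n} the pion direction.
% The constants alpha and beta encode the relative magnitude and phase of
% A and B; absent final-state interactions, a nonzero beta would signal
% time-reversal violation.
```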

  3. Patterns of stigma toward schizophrenia among the general population: a latent profile analysis.

    Science.gov (United States)

    Loch, Alexandre A; Wang, Yuan-Pang; Guarniero, Francisco B; Lawson, Fabio L; Hengartner, Michael P; Rössler, Wulf; Gattaz, Wagner F

    2014-09-01

    Our purpose was to assess stigma toward schizophrenia in a representative sample of the Brazilian general population. The sample consisted of 1015 individuals interviewed by telephone. A vignette describing someone with schizophrenia was read, and four stigma aspects regarding this hypothetical individual were assessed: stereotypes, restrictions, perceived prejudice and social distance. Latent profile analysis searched for stigma profiles among the sample. Multinomial logistic regression was used to find correlates of each class. Four stigma profiles were found; 'no stigma' individuals (n = 251) mostly displayed positive opinions. 'Labelers' (n = 222) scored high on social distance; they more often had familial contact with mental illness and more often labeled the vignette's disorder as schizophrenia. 'Discriminators', the group with the majority of individuals (n = 302), showed high levels of stigmatizing beliefs in all dimensions; discriminators were significantly older. 'Unobtrusive stigma' individuals (n = 240) seemed to demonstrate uncertainty or low commitment since they mostly answered items with the middle/impartial option. Some findings from the international literature were replicated; however, familial contact increased stigma, possibly denoting a locally modulated determinant. Hereby, our study also adds important cross-cultural data by showing that stigma toward schizophrenia is high in a Latin-American setting. We highlight the importance of analyzing the general population as a heterogeneous group, aiming to better elaborate anti-stigma campaigns. © The Author(s) 2013.

  4. Perioperative factors predicting poor outcome in elderly patients following emergency general surgery: a multivariate regression analysis

    Science.gov (United States)

    Lees, Mackenzie C.; Merani, Shaheed; Tauh, Keerit; Khadaroo, Rachel G.

    2015-01-01

    Background Older adults (≥ 65 yr) are the fastest growing population and are presenting in increasing numbers for acute surgical care. Emergency surgery is frequently life threatening for older patients. Our objective was to identify predictors of mortality and poor outcome among elderly patients undergoing emergency general surgery. Methods We conducted a retrospective cohort study of patients aged 65–80 years undergoing emergency general surgery between 2009 and 2010 at a tertiary care centre. Demographics, comorbidities, in-hospital complications, mortality and disposition characteristics of patients were collected. Logistic regression analysis was used to identify covariate-adjusted predictors of in-hospital mortality and discharge of patients home. Results Our analysis included 257 patients with a mean age of 72 years; 52% were men. In-hospital mortality was 12%. Mortality was associated with patients who had higher American Society of Anesthesiologists (ASA) class (odds ratio [OR] 3.85, 95% confidence interval [CI] 1.43–10.33, p = 0.008) and in-hospital complications (OR 1.93, 95% CI 1.32–2.83, p = 0.001). Nearly two-thirds of patients discharged home were younger (OR 0.92, 95% CI 0.85–0.99, p = 0.036), had lower ASA class (OR 0.45, 95% CI 0.27–0.74, p = 0.002) and fewer in-hospital complications (OR 0.69, 95% CI 0.53–0.90, p = 0.007). Conclusion American Society of Anesthesiologists class and in-hospital complications are perioperative predictors of mortality and disposition in the older surgical population. Understanding the predictors of poor outcome and the importance of preventing in-hospital complications in older patients will have important clinical utility in terms of preoperative counselling, improving health care and discharging patients home. PMID:26204143

  5. A bibliometric analysis of Australian general practice publications from 1980 to 2007 using PubMed.

    Science.gov (United States)

    Mendis, Kumara; Kidd, Michael R; Schattner, Peter; Canalese, Joseph

    2010-01-01

    We analysed Australian general practice (GP) publications in PubMed from 1980 to 2007 to determine journals, authors, publication types, national health priority areas (NHPA) and compared the results with those from three specialties (public health, cardiology and medical informatics) and two countries (the UK and New Zealand). Australian GP publications were downloaded in MEDLINE format using PubMed queries and were written to a Microsoft Access database using a software application. Search Query Language and online PubMed queries were used for further analysis. There were 4777 publications from 1980 to 2007. Australian Family Physician (38.1%) and the Medical Journal of Australia (17.6%) contributed 55.7% of publications. Reviews (12.7%), letters (6.6%), clinical trials (6.5%) and systematic reviews (5%) were the main PubMed publication types. Thirty five percent of publications addressed National Health Priority Areas with material on mental health (13.7%), neoplasms (6.5%) and cardiovascular conditions (5.9%). The comparable numbers of publications for the three specialties were: public health - 80 911, cardiology - 15 130 and medical informatics - 3338; total country GP comparisons were: UK - 14 658 and New Zealand - 1111. Australian GP publications have shown an impressive growth from 1980 to 2007 with a 15-fold increase. This increase may be due in part to the actions of the Australian government over the past decade to financially support research in primary care, as well as the maturing of academic general practice. This analysis can assist governments, researchers, policy makers and others to target resources so that further developments can be encouraged, supported and monitored.

  6. Matrix precipitation: a general strategy to eliminate matrix interference for pharmaceutical toxic impurities analysis.

    Science.gov (United States)

    Yang, Xiaojing; Xiong, Xuewu; Cao, Ji; Luan, Baolei; Liu, Yongjun; Liu, Guozhu; Zhang, Lei

    2015-01-30

    Matrix interference, which can lead to false positive/negative results, contamination of the injector or separation column, incompatibility between the sample solution and the selected analytical instrument, and response inhibition or even quenching, is a common problem in the analysis of trace-level toxic impurities in drug substances. In this study, a simple matrix precipitation strategy is proposed to eliminate or minimize the matrix interference problems stated above. Generally, a sample of active pharmaceutical ingredient (API) is dissolved in an appropriate solvent to achieve the desired high concentration, and then an anti-solvent is added to precipitate the matrix substance. As a result, the target analyte is extracted into the mixed solution with very little residual API. This strategy has the characteristics of simple manipulation, high recovery and excellent anti-interference capability. It was found that the precipitation ratio (R, representing the ability to remove the matrix substance) and the proportion of the solvent used to dissolve the API in the final solution (P, which affects R and also the method sensitivity) are two important factors of the precipitation process. The correlation between R and P was investigated by performing precipitation with various APIs in different solvent/anti-solvent systems. After a detailed mathematical reasoning process, P=20% was shown to be an effective and robust condition for the precipitation strategy. The precipitation method with P=20% can therefore be used as a general strategy for toxic impurity analysis in APIs. Finally, several typical examples are described in this article in which challenging matrix interference issues have been resolved successfully. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. A Rasch and confirmatory factor analysis of the General Health Questionnaire (GHQ-12)

    Directory of Open Access Journals (Sweden)

    Velikova Galina

    2010-04-01

    Full Text Available Abstract Background The General Health Questionnaire (GHQ-12) was designed as a short questionnaire to assess psychiatric morbidity. Despite the fact that studies have suggested a number of competing multidimensional factor structures, it continues to be largely used as a unidimensional instrument. This may have an impact on the identification of psychiatric morbidity in target populations. The aim of this study was to explore the dimensionality of the GHQ-12 and to evaluate a number of alternative models for the instrument. Methods The data were drawn from a large heterogeneous sample of cancer patients. The Partial Credit Model (Rasch) was applied to the 12-item GHQ. Item misfit (infit mean square ≥ 1.3) was identified, misfitting items removed and unidimensionality and differential item functioning (age, gender, and treatment aims) were assessed. The factor structures of the various alternative models proposed in the literature were explored and optimum model fit evaluated using Confirmatory Factor Analysis. Results The Rasch analysis of the 12-item GHQ identified six misfitting items. Removal of these items produced a six-item instrument which was not unidimensional. The Rasch analysis of an 8-item GHQ demonstrated two unidimensional structures corresponding to Anxiety/Depression and Social Dysfunction. No significant differential item functioning was observed by age, gender and treatment aims for the six- and eight-item GHQ. Two models competed for best fit from the confirmatory factor analysis, namely the GHQ-8 and Hankin's (2008) unidimensional model; however, the GHQ-8 produced the best overall fit statistics. Conclusions The results are consistent with the evidence that the GHQ-12 is a multi-dimensional instrument. Use of the summated scores for the GHQ-12 could potentially lead to an incorrect assessment of patients' psychiatric morbidity. Further evaluation of the GHQ-12 with different target populations is warranted.

  8. Analysis of Students’ Missed Organic Chemistry Quiz Questions that Stress the Importance of Prior General Chemistry Knowledge

    OpenAIRE

    Julie Ealy

    2018-01-01

    A concern about students’ conceptual difficulties in organic chemistry prompted this study. It was found that prior knowledge from general chemistry was critical in organic chemistry, but what were some of the concepts that comprised that prior knowledge? Therefore an analysis of four years of organic chemistry quiz data was undertaken. Multiple general chemistry concepts were revealed that are essential prior knowledge in organic chemistry. The general chemistry concepts that were foun...

  9. Impact of General Chemistry on Student Achievement and Progression to Subsequent Chemistry Courses: A Regression Discontinuity Analysis

    Science.gov (United States)

    Shultz, Ginger V.; Gottfried, Amy C.; Winschel, Grace A.

    2015-01-01

    General chemistry is a gateway course that impacts the STEM trajectory of tens of thousands of students each year, and its role in the introductory curriculum as well as its pedagogical design are the center of an ongoing debate. To investigate the role of general chemistry in the curriculum, we report the results of a posthoc analysis of 10 years…

  10. A general framework for the regression analysis of pooled biomarker assessments.

    Science.gov (United States)

    Liu, Yan; McMahan, Christopher; Gallagher, Colin

    2017-07-10

    As a cost-efficient data collection mechanism, the process of assaying pooled biospecimens is becoming increasingly common in epidemiological research; for example, pooling has been proposed for the purpose of evaluating the diagnostic efficacy of biological markers (biomarkers). To this end, several authors have proposed techniques that allow for the analysis of continuous pooled biomarker assessments. Regrettably, most of these techniques proceed under restrictive assumptions, are unable to account for the effects of measurement error, and fail to control for confounding variables. These limitations are understandably attributable to the complex structure that is inherent to measurements taken on pooled specimens. Consequently, in order to provide practitioners with the tools necessary to accurately and efficiently analyze pooled biomarker assessments, herein, a general Monte Carlo maximum likelihood-based procedure is presented. The proposed approach allows for the regression analysis of pooled data under practically all parametric models and can be used to directly account for the effects of measurement error. Through simulation, it is shown that the proposed approach can accurately and efficiently estimate all unknown parameters and is more computationally efficient than existing techniques. This new methodology is further illustrated using monocyte chemotactic protein-1 data collected by the Collaborative Perinatal Project in an effort to assess the relationship between this chemokine and the risk of miscarriage. Copyright © 2017 John Wiley & Sons, Ltd.

  11. s-core network decomposition: A generalization of k-core analysis to weighted networks

    Science.gov (United States)

    Eidsaa, Marius; Almaas, Eivind

    2013-12-01

    A broad range of systems spanning biology, technology, and social phenomena may be represented and analyzed as complex networks. Recent studies of such networks using k-core decomposition have uncovered groups of nodes that play important roles. Here, we present s-core analysis, a generalization of k-core (or k-shell) analysis to complex networks where the links have different strengths or weights. We demonstrate the s-core decomposition approach on two random networks (ER and configuration model with scale-free degree distribution) where the link weights are (i) random, (ii) correlated, and (iii) anticorrelated with the node degrees. Finally, we apply the s-core decomposition approach to the protein-interaction network of the yeast Saccharomyces cerevisiae in the context of two gene-expression experiments: oxidative stress in response to cumene hydroperoxide (CHP), and fermentation stress response (FSR). We find that the innermost s-cores are (i) different from innermost k-cores, (ii) different for the two stress conditions CHP and FSR, and (iii) enriched with proteins whose biological functions give insight into how yeast manages these specific stresses.
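
    A minimal sketch of the s-core idea (iteratively pruning nodes whose strength, i.e. the sum of incident edge weights, falls below s) is given below using networkx; the toy graph and threshold are illustrative and not taken from the paper.

```python
# s-core pruning on a weighted graph: repeatedly remove nodes whose strength
# (sum of incident edge weights) is below the threshold s.
import networkx as nx

def s_core(G, s):
    H = G.copy()
    while True:
        weak = [n for n, strength in H.degree(weight="weight") if strength < s]
        if not weak:
            return H
        H.remove_nodes_from(weak)

G = nx.Graph()
G.add_weighted_edges_from([
    ("a", "b", 1.0), ("b", "c", 2.0), ("c", "a", 1.5),
    ("c", "d", 0.3), ("d", "e", 0.2),
])
core = s_core(G, s=2.0)
print("Nodes in the s = 2.0 core:", sorted(core.nodes()))
```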

  12. A general model for preload calculation and stiffness analysis for combined angular contact ball bearings

    Science.gov (United States)

    Zhang, Jinhua; Fang, Bin; Hong, Jun; Wan, Shaoke; Zhu, Yongsheng

    2017-12-01

    Combined angular contact ball bearings are widely used in automation, aerospace and machine tool applications, but little research on combined angular contact ball bearings has been reported. It is shown that the preload and stiffness of combined bearings are mutually influenced rather than being simply the superposition of multiple single bearings; therefore, the characteristics of combined bearings are calculated by coupling the load and deformation analyses of the single bearings. In this paper, based on the Jones quasi-static model and an analytical stiffness model, a new iterative algorithm and model are proposed for the calculation of combined bearing preload and stiffness, in which dynamic effects including centrifugal force and gyroscopic moment are taken into account. It is demonstrated that the new method has general applicability: the preload factors of combined bearings are calculated for different design preloads, and the static and dynamic stiffness for various arrangements of combined bearings are comparatively studied and analyzed, with the influences of the design preload magnitude, axial load and rotating speed discussed in detail. In addition, the variation of the dynamic contact angles of combined bearings with rotating speed is also discussed. The results show that the bearing arrangement mode, rotating speed and design preload magnitude have a significant influence on the preload and stiffness of combined bearings. The proposed formulation provides a useful tool for the dynamic analysis of complex bearing-rotor systems.

  13. Bonding analysis of planar hypercoordinate atoms via the generalized BLW-LOL.

    Science.gov (United States)

    Bomble, Laetitia; Steinmann, Stephan N; Perez-Peralta, Nancy; Merino, Gabriel; Corminboeuf, Clemence

    2013-10-05

    The multicenter bonding pattern of the intriguing hexa-, hepta-, and octacoordinate boron wheel series (e.g., CB6^2-, CB7^-, B8^2-, and SiB8, as well as the experimentally detected CB7^- isomer) is revised using the block-localized wave function analyzed by the localized orbital locator (BLW-LOL). The more general implementation of BLW combined with the LOL scalar field is not restricted to the analysis of the out-of-plane π-system but can also provide an intuitive picture of the σ-radial delocalization and of the role of the central atom. The results confirm the presence of a π-ring current pattern similar to that of benzene. In addition, the LOLπ isosurfaces along with the maximum intensity in the ΔLOL profiles located above and below the ring suggest that the central atom plays a minor role in the π-delocalized bonding pattern. Finally, the analysis of the σ-framework in these boron wheels is in line with a moderated inner cyclic rather than disk-type delocalization. Copyright © 2013 Wiley Periodicals, Inc.

  14. Generalized modeling of multi-component vaporization/condensation phenomena for multi-phase-flow analysis

    International Nuclear Information System (INIS)

    Morita, K.; Fukuda, K.; Tobita, Y.; Kondo, Sa.; Suzuki, T.; Maschek, W.

    2003-01-01

    A new multi-component vaporization/condensation (V/C) model was developed to provide a generalized model for safety analysis codes of liquid metal cooled reactors (LMRs). These codes simulate thermal-hydraulic phenomena of multi-phase, multi-component flows, which is essential to investigate core disruptive accidents of LMRs such as fast breeder reactors and accelerator driven systems. The developed model characterizes the V/C processes associated with phase transition by employing heat transfer and mass-diffusion limited models for analyses of relatively short-time-scale multi-phase, multi-component hydraulic problems, among which vaporization and condensation, or simultaneous heat and mass transfer, play an important role. The heat transfer limited model describes the non-equilibrium phase transition processes occurring at interfaces, while the mass-diffusion limited model is employed to represent effects of non-condensable gases and multi-component mixture on V/C processes. Verification of the model and method employed in the multi-component V/C model of a multi-phase flow code was performed successfully by analyzing a series of multi-bubble condensation experiments. The applicability of the model to the accident analysis of LMRs is also discussed by comparison between steam and metallic vapor systems. (orig.)

  15. Interactive general-purpose function minimization for the analysis of neutron scattering data

    International Nuclear Information System (INIS)

    Abel, W.

    1981-12-01

    An on-line graphic display facility has been employed mainly for the peak analysis of time-of-flight spectra measured by inelastic scattering of thermal neutrons, but it is also useful for the analysis of spectra measured with triple-axis spectrometers and of diffraction patterns. The spectral lines may be fitted by the following analytical shape functions: (i) a Gaussian, (ii) a Lorentzian, or (iii) a convolution of a Lorentzian with a Gaussian, plus a background continuum. Data reduction or correction may be invoked optionally. For more general applications in the analysis of numerical data, the analytical shape functions may also be defined by the user. Three different minimization methods are available, which may be used alone or in combination. The parameters of the shape functions may be kept fixed or variable during the minimization steps, and the width of variation may be restricted. Global correlation coefficients, parameter errors and the chi-squared value are displayed to inform the user about the quality of the fit. A detailed description of the program operations is given. The programs are written in FORTRAN IV and use an IBM/2250-1 graphic display unit. (orig.)
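
    The program itself is FORTRAN IV driving an IBM 2250 display; purely as a modern illustration of the same kind of fit, the sketch below least-squares fits a Gaussian line on a constant background to a simulated spectrum with SciPy and reports parameter errors (all values illustrative).

```python
# Modern sketch of the fit the program performs: Gaussian peak plus constant
# background, least-squares fitted to a simulated counting spectrum.
import numpy as np
from scipy.optimize import curve_fit

def gaussian_bg(x, amp, centre, sigma, bg):
    return amp * np.exp(-0.5 * ((x - centre) / sigma) ** 2) + bg

channels = np.arange(0, 200, dtype=float)
rng = np.random.default_rng(2)
truth = gaussian_bg(channels, amp=120.0, centre=95.0, sigma=6.0, bg=20.0)
counts = rng.poisson(truth).astype(float)

p0 = [100.0, 90.0, 5.0, 10.0]                  # initial parameter guesses
popt, pcov = curve_fit(gaussian_bg, channels, counts, p0=p0)
perr = np.sqrt(np.diag(pcov))                  # parameter errors
for name, val, err in zip(["amp", "centre", "sigma", "bg"], popt, perr):
    print(f"{name:6s} = {val:8.2f} +/- {err:.2f}")
```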

  16. Multiple Criteria Decision Making by Generalized Data Envelopment Analysis Introducing Aspiration Level Method

    International Nuclear Information System (INIS)

    Yun, Yeboon; Arakawa, Masao; Hiroshi, Ishikawa; Nakayama, Hirotaka

    2002-01-01

    It has been proved in problems with 2-objective functions that genetic algorithms (GAs) are well utilized for generating Pareto optimal solutions, and then decision making can be easily performed on the basis of visualized Pareto optimal solutions. However, it is difficult to visualize Pareto optimal solutions with GAs in cases in which the number of objective functions is more than 4. Hence, it is troublesome to grasp the trade-off among many objective functions, and decision makers hesitate to choose a final solution from a large number of Pareto optimal solutions. In order to solve these problems, we suggest an aspiration level approach to the method using the generalized data envelopment analysis and GAs. We show that the proposed method supports decision makers in choosing their desired solution from many Pareto optimal solutions. Furthermore, it will be seen that engineering design can be effectively performed with the proposed method, which makes it easy to generate several Pareto optimal solutions close to the aspiration level and to carry out trade-off analysis.

  17. Systematic review and meta-analysis of dropout rates in individual psychotherapy for generalized anxiety disorder.

    Science.gov (United States)

    Gersh, Elon; Hallford, David J; Rice, Simon M; Kazantzis, Nikolaos; Gersh, Hannah; Gersh, Benji; McCarty, Carolyn A

    2017-12-01

    Despite being a relatively prevalent and debilitating disorder, Generalized Anxiety Disorder (GAD) is the second least studied anxiety disorder and among the most difficult to treat. Dropout from psychotherapy is concerning as it is associated with poorer outcomes, leads to service inefficiencies and can disproportionately affect disadvantaged populations. No study to date has calculated a weighted mean dropout rate for GAD and explored associated correlates. A systematic review was conducted using PsycINFO, Medline and Embase databases, identifying studies investigating individual psychotherapies for adults with GAD. Forty-five studies, involving 2224 participants, were identified for meta-analysis. The weighted mean dropout rate was 16.99% (95% confidence interval 14.42%-19.91%). The Q-statistic indicated significant heterogeneity among studies. Moderator analysis and meta-regressions indicated no statistically significant effect of client age, sex, symptom severity, comorbidity, treatment type, study type (randomized trial or not), study quality, number of sessions or therapist experience. In research investigating psychotherapy for GAD, approximately one in six clients can be expected to drop out of treatment. Dropout rate was not significantly moderated by the client, therapist or treatment variables investigated. Future research should specify the definition of dropout, reasons for dropout and associated correlates to assist the field's progression. Copyright © 2017 Elsevier Ltd. All rights reserved.
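
    A pooled dropout rate of this kind is typically obtained by combining study-level proportions. The hedged Python sketch below pools invented dropout counts on the logit scale with inverse-variance (fixed-effect) weights; the review's exact weighting scheme may differ:

```python
# Hedged sketch: pool study-level dropout proportions via a logit transform with
# inverse-variance weights. Counts are invented and not taken from the review.
import numpy as np

dropouts = np.array([5, 12, 3, 9])        # hypothetical dropouts per study
totals = np.array([40, 80, 25, 60])       # hypothetical participants per study

p = dropouts / totals
logit = np.log(p / (1 - p))
var = 1.0 / dropouts + 1.0 / (totals - dropouts)   # variance of each logit proportion
w = 1.0 / var

pooled = np.sum(w * logit) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
inv_logit = lambda x: 1.0 / (1.0 + np.exp(-x))

print(f"pooled dropout rate: {inv_logit(pooled):.1%} "
      f"(95% CI {inv_logit(pooled - 1.96 * se):.1%} to {inv_logit(pooled + 1.96 * se):.1%})")
```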

  18. General MACOS Interface for Modeling and Analysis for Controlled Optical Systems

    Science.gov (United States)

    Sigrist, Norbert; Basinger, Scott A.; Redding, David C.

    2012-01-01

    The General MACOS Interface (GMI) for Modeling and Analysis for Controlled Optical Systems (MACOS) enables the use of MATLAB as a front-end for JPL's critical optical modeling package, MACOS. MACOS is JPL's in-house optical modeling software, which has proven to be a superb tool for advanced systems engineering of optical systems. GMI, coupled with MACOS, allows for seamless interfacing with modeling tools from other disciplines to make possible integration of dynamics, structures, and thermal models with the addition of control systems for deformable optics and other actuated optics. This software package is designed as a tool for analysts to quickly and easily use MACOS without needing to be an expert at programming MACOS. The strength of MACOS is its ability to interface with various modeling/development platforms, allowing evaluation of system performance with thermal, mechanical, and optical modeling parameter variations. GMI provides an improved means for accessing selected key MACOS functionalities. The main objective of GMI is to marry the vast mathematical and graphical capabilities of MATLAB with the powerful optical analysis engine of MACOS, thereby providing a useful tool to anyone who can program in MATLAB. GMI also improves modeling efficiency by eliminating the need to write an interface function for each task/project, reducing error sources, speeding up user/modeling tasks, and making MACOS well suited for fast prototyping.

  19. Revisiting of Multiscale Static Analysis of Notched Laminates Using the Generalized Method of Cells

    Science.gov (United States)

    Naghipour Ghezeljeh, Paria; Arnold, Steven M.; Pineda, Evan J.

    2016-01-01

    Composite material systems generally exhibit a range of behavior on different length scales (from constituent level to macro); therefore, a multiscale framework is beneficial for the design and engineering of these material systems. The complex nature of the observed composite failure during experiments suggests the need for a three-dimensional (3D) multiscale model to attain a reliable prediction. However, the size of a multiscale three-dimensional finite element model can become prohibitively large and computationally costly. Two-dimensional (2D) models are preferred due to computational efficiency, especially if many different configurations have to be analyzed for an in-depth damage tolerance and durability design study. In this study, various 2D and 3D multiscale analyses will be employed to conduct a detailed investigation into the tensile failure of a given multidirectional, notched carbon fiber reinforced polymer laminate. Three-dimensional finite element analysis is typically considered more accurate than a 2D finite element model, as compared with experiments. Nevertheless, in the absence of adequate mesh refinement, large differences may be observed between a 2D and 3D analysis, especially for a shear-dominated layup. This observed difference has not been widely addressed in previous literature and is the main focus of this paper.

  20. Transition towards a low carbon economy: A computable general equilibrium analysis for Poland

    International Nuclear Information System (INIS)

    Böhringer, Christoph; Rutherford, Thomas F.

    2013-01-01

    In the transition to sustainable economic structures the European Union assumes a leading role with its climate and energy package which sets ambitious greenhouse gas emission reduction targets by 2020. Among EU Member States, Poland, with its heavy reliance on coal in the energy system, is particularly worried about the pending trade-offs between emission regulation and economic growth. In our computable general equilibrium analysis of the EU climate and energy package we show that economic adjustment costs for Poland hinge crucially on restrictions to where-flexibility of emission abatement, revenue recycling, and technological options in the power system. We conclude that more comprehensive flexibility provisions at the EU level and a diligent policy implementation at the national level could achieve the transition towards a low carbon economy at little cost, thereby broadening societal support. - Highlights: ► Economic impact assessment of the EU climate and energy package for Poland. ► Sensitivity analysis on where-flexibility, revenue recycling and technology choice. ► Application of a hybrid bottom-up, top-down CGE model

  1. Psychometric analysis of the Swedish version of the General Medical Council's multi source feedback questionnaires.

    Science.gov (United States)

    Olsson, Jan-Eric; Wallentin, Fan Yang; Toth-Pal, Eva; Ekblad, Solvig; Bertilson, Bo Christer

    2017-07-10

    To determine the internal consistency and the underlying components of our translated and adapted Swedish version of the General Medical Council's multisource feedback questionnaires (GMC questionnaires) for physicians and to confirm which aspects of good medical practice the latent variable structure reflected. From October 2015 to March 2016, residents in family medicine in Sweden were invited to participate in the study and to use the Swedish version to perform self-evaluations and acquire feedback from both their patients and colleagues. The validation focused on internal consistency and construct validity. Main outcome measures were Cronbach's alpha coefficients, Principal Component Analysis, and Confirmatory Factor Analysis indices. A total of 752 completed questionnaires from patients, colleagues, and residents were analysed. Of these, 213 comprised resident self-evaluations, 336 were feedback from residents' patients, and 203 were feedback from residents' colleagues. Cronbach's alpha coefficients of the scores were 0.88 from patients, 0.93 from colleagues, and 0.84 in the self-evaluations. The Confirmatory Factor Analysis validated two models that fit the data reasonably well and reflected important aspects of good medical practice. The first model had two latent factors for patient-related items concerning empathy and consultation management, and the second model had five latent factors for colleague-related items, including knowledge and skills, attitude and approach, reflection and development, teaching, and trust. The current Swedish version seems to be a reliable and valid tool for formative assessment for resident physicians and their supervisors. This needs to be verified in larger samples.
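
    For readers unfamiliar with the internal-consistency statistic reported above, the short Python sketch below computes Cronbach's alpha from an items-by-respondents score matrix; the toy data are invented and the snippet is not part of the study's own analysis pipeline:

```python
# Minimal sketch of Cronbach's alpha; rows are respondents, columns are items.
import numpy as np

def cronbach_alpha(scores):
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)        # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of the sum score
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

toy = np.array([[4, 5, 4, 5],
                [3, 3, 4, 3],
                [5, 5, 5, 4],
                [2, 3, 2, 3],
                [4, 4, 5, 5]])
print(f"Cronbach's alpha = {cronbach_alpha(toy):.2f}")
```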

  2. Resting-state theta band connectivity and graph analysis in generalized social anxiety disorder.

    Science.gov (United States)

    Xing, Mengqi; Tadayonnejad, Reza; MacNamara, Annmarie; Ajilore, Olusola; DiGangi, Julia; Phan, K Luan; Leow, Alex; Klumpp, Heide

    2017-01-01

    Functional magnetic resonance imaging (fMRI) resting-state studies show generalized social anxiety disorder (gSAD) is associated with disturbances in networks involved in emotion regulation, emotion processing, and perceptual functions, suggesting a network framework is integral to elucidating the pathophysiology of gSAD. However, fMRI does not measure the fast dynamic interconnections of functional networks. Therefore, we examined whole-brain functional connectomics with electroencephalogram (EEG) during resting-state. Resting-state EEG data were recorded for 32 patients with gSAD and 32 demographically-matched healthy controls (HC). Sensor-level connectivity analysis was applied to the EEG data using the weighted phase lag index (WPLI), and WPLI-based graph analysis was used to determine the clustering coefficient and characteristic path length as measures of local and global network organization. WPLI results showed increased oscillatory midline coherence in the theta frequency band indicating higher connectivity in the gSAD relative to HC group during rest. Additionally, WPLI values positively correlated with state anxiety levels within the gSAD group but not the HC group. Our graph theory based connectomics analysis demonstrated increased clustering coefficient and decreased characteristic path length in theta-based whole brain functional organization in subjects with gSAD compared to HC. Theta-dependent interconnectivity was associated with state anxiety in gSAD and an increase in information processing efficiency in gSAD (compared to controls). Results may represent enhanced baseline self-focused attention, which is consistent with cognitive models of gSAD and fMRI studies implicating emotion dysregulation and disturbances in task negative networks (e.g., default mode network) in gSAD.
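
    The connectivity measure used above can be illustrated with a small simulation. The hedged Python sketch below estimates the WPLI between two synthetic channels from the imaginary part of epoch-wise cross-spectra, WPLI(f) = |E[Im Sxy(f)]| / E[|Im Sxy(f)|]; the study's actual pipeline (theta-band analysis over all sensor pairs followed by graph construction) is more involved:

```python
# Hedged sketch of the weighted phase lag index (WPLI) for two synthetic channels.
import numpy as np

rng = np.random.default_rng(1)
fs, n_epochs, n_samp = 250, 60, 500
t = np.arange(n_samp) / fs

num = np.zeros(n_samp // 2 + 1)
den = np.zeros(n_samp // 2 + 1)
for _ in range(n_epochs):
    phase = rng.uniform(0, 2 * np.pi)
    x = np.sin(2 * np.pi * 6 * t + phase) + rng.normal(0, 1, n_samp)        # 6 Hz (theta)
    y = np.sin(2 * np.pi * 6 * t + phase + 0.5) + rng.normal(0, 1, n_samp)  # lagged copy
    sxy = np.fft.rfft(x) * np.conj(np.fft.rfft(y))   # cross-spectrum of this epoch
    num += np.imag(sxy)
    den += np.abs(np.imag(sxy))

wpli = np.abs(num) / np.maximum(den, 1e-12)
freqs = np.fft.rfftfreq(n_samp, d=1 / fs)
theta = (freqs >= 4) & (freqs <= 8)
print(f"mean theta-band WPLI: {wpli[theta].mean():.2f}")
```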

  3. Node-Splitting Generalized Linear Mixed Models for Evaluation of Inconsistency in Network Meta-Analysis.

    Science.gov (United States)

    Yu-Kang, Tu

    2016-12-01

    Network meta-analysis for multiple treatment comparisons has been a major development in evidence synthesis methodology. The validity of a network meta-analysis, however, can be threatened by inconsistency in evidence within the network. One particular issue of inconsistency is how to directly evaluate the inconsistency between direct and indirect evidence with regard to the effects difference between two treatments. A Bayesian node-splitting model was first proposed and a similar frequentist side-splitting model has been put forward recently. Yet, assigning the inconsistency parameter to one or the other of the two treatments or splitting the parameter symmetrically between the two treatments can yield different results when multi-arm trials are involved in the evaluation. We aimed to show that a side-splitting model can be viewed as a special case of design-by-treatment interaction model, and different parameterizations correspond to different design-by-treatment interactions. We demonstrated how to evaluate the side-splitting model using the arm-based generalized linear mixed model, and an example data set was used to compare results from the arm-based models with those from the contrast-based models. The three parameterizations of side-splitting make slightly different assumptions: the symmetrical method assumes that both treatments in a treatment contrast contribute to inconsistency between direct and indirect evidence, whereas the other two parameterizations assume that only one of the two treatments contributes to this inconsistency. With this understanding in mind, meta-analysts can then make a choice about how to implement the side-splitting method for their analysis. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  4. A clinimetric analysis of the Hopkins Symptom Checklist (SCL-90-R) in general population studies (Denmark, Norway, and Italy)

    DEFF Research Database (Denmark)

    Carrozzino, Danilo; Vassend, Olav; Bjørndal, Flemming

    2016-01-01

    the factor structure. The scalability of the traditional SCL-90-R subscales (somatization, hostility, and interpersonal sensitivity) as well as the affective subscales (depression and anxiety and ADHD) were tested by Mokken’s item response theory model. Results: Across the three general population studies...... the traditional scaled SCL-90-R factor including 83 items was identified by PCA. The Mokken analysis accepted the scalability of both the general factor and the clinical SCL-90-R subscales under examination. Conclusion: The traditional, scaled, general 83 item SCL-90-R scale is a valid measure of general...

  5. A Flexible Nonlinear Modelling Framework for Nonstationary Generalized Extreme Value Analysis in Hydrology and Climatology

    Science.gov (United States)

    Cannon, A. J.

    2009-12-01

    Parameters in a Generalized Extreme Value (GEV) distribution are specified as a function of covariates using a conditional density network (CDN), which is a probabilistic extension of the multilayer perceptron neural network. If the covariate is time, or is dependent on time, then the GEV-CDN model can be used to perform nonlinear, nonstationary GEV analysis of hydrological or climatological time series. Due to the flexibility of the neural network architecture, the model is capable of representing a wide range of nonstationary relationships. Model parameters are estimated by generalized maximum likelihood, an approach that is tailored to the estimation of GEV parameters from geophysical time series. Model complexity is identified using the Bayesian information criterion and the Akaike information criterion with small sample size correction. Monte Carlo simulations are used to validate GEV-CDN performance on four simple synthetic problems. The model is then demonstrated on precipitation data from southern California, a series that exhibits nonstationarity due to interannual/interdecadal climatic variability. A hierarchy of models can be defined by adjusting three aspects of the GEV-CDN model architecture: (i) by specifying either a linear or a nonlinear hidden-layer activation function; (ii) by adjusting the number of hidden-layer nodes; or (iii) by disconnecting weights leading to output-layer nodes. To illustrate, five GEV-CDN models are shown here in order of increasing complexity for the case of a single covariate, which, in this case, is assumed to be time. The shape parameter is assumed to be constant in all models, although this is not a requirement of the GEV-CDN framework.
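
    The simplest member of the hierarchy described above can be sketched without the neural network machinery. The hedged Python example below fits a GEV whose location parameter is linear in time by ordinary maximum likelihood on synthetic annual maxima; the actual GEV-CDN model replaces this linear link with a conditional density network and uses generalized maximum likelihood:

```python
# Hedged sketch: nonstationary GEV with mu(t) = b0 + b1*t, fitted by maximum likelihood.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

years = np.arange(50)
true_mu = 100.0 + 0.8 * years
x = genextreme.rvs(c=-0.1, loc=true_mu, scale=15.0, random_state=2)  # synthetic maxima

def nll(params):
    b0, b1, log_scale, c = params
    mu = b0 + b1 * years
    # scipy's shape parameter c equals minus the usual GEV shape parameter xi
    return -np.sum(genextreme.logpdf(x, c=c, loc=mu, scale=np.exp(log_scale)))

res = minimize(nll, x0=[np.mean(x), 0.0, np.log(np.std(x)), -0.1], method="Nelder-Mead")
b0, b1, log_scale, c = res.x
print(f"fitted trend in location: {b1:.2f} per year, "
      f"scale: {np.exp(log_scale):.1f}, shape c: {c:.2f}")
```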

  6. General and specialized brain correlates for analogical reasoning: A meta-analysis of functional imaging studies.

    Science.gov (United States)

    Hobeika, Lucie; Diard-Detoeuf, Capucine; Garcin, Béatrice; Levy, Richard; Volle, Emmanuelle

    2016-05-01

    Reasoning by analogy allows us to link distinct domains of knowledge and to transfer solutions from one domain to another. Analogical reasoning has been studied using various tasks that have generally required the consideration of the relationships between objects and their integration to infer an analogy schema. However, these tasks varied in terms of the level and the nature of the relationships to consider (e.g., semantic, visuospatial). The aim of this study was to identify the cerebral network involved in analogical reasoning and its specialization based on the domains of information and task specificity. We conducted a coordinate-based meta-analysis of 27 experiments that used analogical reasoning tasks. The left rostrolateral prefrontal cortex was one of the regions most consistently activated across the studies. A comparison between semantic and visuospatial analogy tasks showed both domain-oriented regions in the inferior and middle frontal gyri and a domain-general region, the left rostrolateral prefrontal cortex, which was specialized for analogy tasks. A comparison of visuospatial analogy to matrix problem tasks revealed that these two relational reasoning tasks engage, at least in part, distinct right and left cerebral networks, particularly separate areas within the left rostrolateral prefrontal cortex. These findings highlight several cognitive and cerebral differences between relational reasoning tasks that can allow us to make predictions about the respective roles of distinct brain regions or networks. These results also provide new, testable anatomical hypotheses about reasoning disorders that are induced by brain damage. Hum Brain Mapp 37:1953-1969, 2016. © 2016 Wiley Periodicals, Inc.

  7. Composite and Cascaded Generalized-K Fading Channel Modeling and Their Diversity and Performance Analysis

    KAUST Repository

    Ansari, Imran Shafique

    2010-12-01

    The introduction of new schemes that are based on the communication among nodes has motivated the use of composite fading models due to the fact that the nodes experience different multipath fading and shadowing statistics, which subsequently determines the required statistics for the performance analysis of different transceivers. The end-to-end signal-to-noise ratio (SNR) statistics play an essential role in the determination of the performance of cascaded digital communication systems. In this thesis, a closed-form expression for the probability density function (PDF) of the end-to-end SNR for independent but not necessarily identically distributed (i.n.i.d.) cascaded generalized-K (GK) composite fading channels is derived. The developed PDF expression in terms of the Meijer-G function allows the derivation of subsequent performance metrics, applicable to different modulation schemes, including outage probability, bit error rate for coherent as well as non-coherent systems, and average channel capacity that provides insights into the performance of a digital communication system operating in an N cascaded GK composite fading environment. Another line of research that was motivated by the introduction of composite fading channels is the error performance. Error performance is one of the main performance measures and derivation of its closed-form expression has proved to be quite involved for certain systems. Hence, in this thesis, a unified closed-form expression, applicable to different binary modulation schemes, for the bit error rate of dual-branch selection diversity based systems undergoing i.n.i.d. GK fading is derived in terms of the extended generalized bivariate Meijer G-function.
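
    The composite model referred to above can also be explored numerically. In the hedged Monte Carlo sketch below, each hop's generalized-K (GK) fading gain is drawn as the product of two unit-mean gamma factors (multipath and shadowing), the end-to-end gain of N cascaded hops is their product, and the outage probability is estimated by simulation rather than through the closed-form Meijer G-function expressions derived in the thesis; all parameter values are invented:

```python
# Hedged Monte Carlo sketch of N cascaded generalized-K (gamma-gamma) fading hops.
import numpy as np

rng = np.random.default_rng(3)

def gk_gain(m, k, size):
    """Unit-mean GK fading gain: gamma(m) multipath times gamma(k) shadowing."""
    return rng.gamma(m, 1.0 / m, size) * rng.gamma(k, 1.0 / k, size)

n_samples, n_hops, avg_snr = 200_000, 3, 10.0
gain = np.ones(n_samples)
for hop in range(n_hops):                        # i.n.i.d. hops: different m per hop
    gain *= gk_gain(m=2.0 + hop, k=1.5, size=n_samples)

snr = avg_snr * gain
threshold = 2.0                                  # outage threshold on end-to-end SNR
print(f"estimated outage probability: {np.mean(snr < threshold):.3f}")
```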

  8. The mineral sector and economic development in Ghana: A computable general equilibrium analysis

    Science.gov (United States)

    Addy, Samuel N.

    A computable general equilibrium (CGE) model is formulated for conducting mineral policy analysis in the context of national economic development for Ghana. The model, called GHANAMIN, places strong emphasis on production, trade, and investment. It can be used to examine both micro and macro economic impacts of policies associated with mineral investment, taxation, and terms of trade changes, as well as mineral sector performance impacts due to technological change or the discovery of new deposits. Its economywide structure enables the study of broader development policy with a focus on individual or multiple sectors, simultaneously. After going through a period of contraction for about two decades, mining in Ghana has rebounded significantly and is currently the main foreign exchange earner. Gold alone contributed 44.7 percent of 1994 total export earnings. GHANAMIN is used to investigate the economywide impacts of mineral tax policies, world market mineral price changes, mining investment, and increased mineral exports. It is also used for identifying key sectors for economic development. Various simulations were undertaken with the following results: Recently implemented mineral tax policies are welfare increasing, but have an accompanying decrease in the output of other export sectors. World mineral price rises stimulate an increase in real GDP; however, this increase is less than the real GDP decreases associated with price declines. Investment in the non-gold mining sector increases real GDP more than investment in gold mining, because of the former's stronger linkages to the rest of the economy. Increased mineral exports are very beneficial to the overall economy. Foreign direct investment (FDI) in mining increases welfare more so than domestic capital, which is very limited. Mining investment and the increased mineral exports since 1986 have contributed significantly to the country's economic recovery, with gold mining accounting for 95 percent of the

  9. Latent class analysis of comorbidity patterns among women with generalized and localized vulvodynia: preliminary findings

    Directory of Open Access Journals (Sweden)

    Nguyen RHN

    2013-04-01

    Full Text Available Ruby HN Nguyen,1 Christin Veasley,2 Derek Smolenski1,3 1Division of Epidemiology and Community Health, School of Public Health, University of Minnesota, Minneapolis, MN, 2National Vulvodynia Association, Silver Spring, MD, 3National Center for Telehealth and Technology, Defense Centers of Excellence, Department of Defense, Tacoma, WA, USA Background: The pattern and extent of clustering of comorbid pain conditions with vulvodynia is largely unknown. However, elucidating such patterns may improve our understanding of the underlying mechanisms involved in these common causes of chronic pain. We sought to describe the pattern of comorbid pain clustering in a population-based sample of women with diagnosed vulvodynia. Methods: A total of 1457 women with diagnosed vulvodynia self-reported their type of vulvar pain as localized, generalized, or both. Respondents were also surveyed about the presence of comorbid pain conditions, including temporomandibular joint and muscle disorders, interstitial cystitis, fibromyalgia, chronic fatigue syndrome, irritable bowel syndrome, endometriosis, and chronic headache. Age-adjusted latent class analysis modeled extant patterns of comorbidity by vulvar pain type, and a multigroup model was used to test for the equality of comorbidity patterns using a comparison of prevalence. A two-class model (no/single comorbidity versus multiple comorbidities) had the best fit in individual and multigroup models. Results: For the no/single comorbidity class, the posterior probability prevalence of item endorsement ranged from 0.9% to 24.4%, indicating a low probability of presence. Conversely, the multiple comorbidity class showed that at least two comorbid conditions were likely to be endorsed by at least 50% of women in that class, and irritable bowel syndrome and fibromyalgia were the most common comorbidities regardless of type of vulvar pain. Prevalence of the multiple comorbidity class differed by type of vulvar pain: both

  10. Human Error and General Aviation Accidents: A Comprehensive, Fine-Grained Analysis Using HFACS

    National Research Council Canada - National Science Library

    Wiegmann, Douglas; Faaborg, Troy; Boquet, Albert; Detwiler, Cristy; Holcomb, Kali; Shappell, Scott

    2005-01-01

    ... of both commercial and general aviation (GA) accidents. These analyses have helped to identify general trends in the types of human factors issues and aircrew errors that have contributed to civil aviation accidents...

  11. On the stability analysis of a general discrete-time population model involving predation and Allee effects

    International Nuclear Information System (INIS)

    Merdan, H.; Duman, O.

    2009-01-01

    This paper presents the stability analysis of the equilibrium points of a general discrete-time population dynamics model involving predation, with and without Allee effects, which occur at low population density. The mathematical analysis and numerical simulations show that the Allee effect has a stabilizing role in the local stability of the positive equilibrium points of this model.
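
    As a purely illustrative companion to this record (not the paper's specific model), the Python sketch below iterates a Ricker-type discrete map with an Allee threshold A and carrying capacity K, so that populations starting below A decline while those starting between A and K grow toward K:

```python
# Illustrative Allee-effect map: N_{t+1} = N_t * exp(r*(1 - N_t/K)*(N_t/A - 1)).
import numpy as np

def simulate(n0, r=0.5, K=100.0, A=20.0, steps=60):
    n = n0
    for _ in range(steps):
        n = n * np.exp(r * (1.0 - n / K) * (n / A - 1.0))
    return n

for n0 in (10.0, 30.0, 150.0):       # below the Allee threshold, above it, above K
    print(f"N0 = {n0:6.1f}  ->  N after 60 steps = {simulate(n0):8.2f}")
```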

  12. [Schizotypal Personality Questionnaire-Brief - Likert format: Factor structure analysis in general population in France].

    Science.gov (United States)

    Ferchiou, A; Todorov, L; Lajnef, M; Baudin, G; Pignon, B; Richard, J-R; Leboyer, M; Szöke, A; Schürhoff, F

    2017-12-01

    The main objective of the study was to explore the factorial structure of the French version of the Schizotypal Personality Questionnaire-Brief (SPQ-B) in a Likert format, in a representative sample of the general population. In addition, differences in the dimensional scores of schizotypy according to gender and age were analyzed. As the study in the general population of schizotypal traits and its determinants has been recently proposed as a way toward the understanding of aetiology and pathophysiology of schizophrenia, consistent self-report tools are crucial to measure psychometric schizotypy. A shorter version of the widely used Schizotypal Personality Questionnaire (SPQ-Brief) has been extensively investigated in different countries, particularly in samples of students or clinical adolescents, and more recently, a few studies used a Likert-type scale format which allows partial endorsement of items and reduces the risk of defensive answers. A sample of 233 subjects representative of the adult population from an urban area near Paris (Créteil) was recruited using the "itinerary method". They completed the French version of the SPQ-B with a 5-point Likert-type response format (1=completely disagree; 5=completely agree). We examined the dimensional structure of the French version of the SPQ-B with a Principal Components Analysis (PCA) followed by a promax rotation. Factor selection was based on Eigenvalues over 1.0 (Kaiser's criterion), Cattell's Scree-plot test, and interpretability of the factors. Items with loadings greater than 0.4 were retained for each dimension. The internal consistency estimate of the dimensions was calculated with Cronbach's α. In order to study the influence of age and gender, we carried out a simple linear regression with the subscales as dependent variables. Our sample was composed of 131 women (mean age=52.5±18.2 years) and 102 men (mean age=53±18.1 years). SPQ-B Likert total scores ranged from 22 to 84 points (mean=43.6

  13. Application of generalized perturbation theory to sensitivity analysis in boron neutron capture therapy

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, Vanessa S. [Universidade Federal Fluminense (EEIMVR/UFF-RJ), Volta Redonda, RJ (Brazil). Escola de Engenharia Industrial e Metalurgica. Programa de Pos-Graduacao em Modelagem Computacional em Ciencia e Tecnologia; Silva, Fernando C.; Silva, Ademir X., E-mail: fernando@con.ufrj.b, E-mail: ademir@con.ufrj.b [Coordenacao dos Programas de Pos-Graduacao de Engenharia (PEN/COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear; Alvarez, Gustavo B. [Universidade Federal Fluminense (EEIMVR/UFF-RJ), Volta Redonda, RJ (Brazil). Escola de Engenharia Industrial e Metalurgica. Dept. de Ciencias Exatas

    2011-07-01

    Boron neutron capture therapy - BNCT - is a binary cancer treatment used in brain tumors. The tumor is loaded with a boron compound and subsequently irradiated by thermal neutrons. The therapy is based on the 10B (n, α) 7Li nuclear reaction, which emits two types of high-energy particles, α particle and the 7Li nuclei. The total kinetic energy released in this nuclear reaction, when deposited in the tumor region, destroys the cancer cells. Since the success of the BNCT is linked to the different selectivity between the tumor and healthy tissue, it is necessary to carry out a sensitivity analysis to determinate the boron concentration. Computational simulations are very important in this context because they help in the treatment planning by calculating the lowest effective absorbed dose rate to reduce the damage to healthy tissue. The objective of this paper is to present a deterministic method based on generalized perturbation theory (GPT) to perform sensitivity analysis with respect to the 10B concentration and to estimate the absorbed dose rate by patients undergoing this therapy. The advantage of the method is a significant reduction in computational time required to perform these calculations. To simulate the neutron flux in all brain regions, the method relies on a two-dimensional neutron transport equation whose spatial, angular and energy variables are discretized by the diamond difference method, the discrete ordinate method and multigroup formulation, respectively. The results obtained through GPT are consistent with those obtained using other methods, demonstrating the efficacy of the proposed method. (author)

  14. Application of generalized perturbation theory to sensitivity analysis in boron neutron capture therapy

    International Nuclear Information System (INIS)

    Garcia, Vanessa S.; Silva, Fernando C.; Silva, Ademir X.; Alvarez, Gustavo B.

    2011-01-01

    Boron neutron capture therapy - BNCT - is a binary cancer treatment used in brain tumors. The tumor is loaded with a boron compound and subsequently irradiated by thermal neutrons. The therapy is based on the 10 B (n, α) 7 Li nuclear reaction, which emits two types of high-energy particles, α particle and the 7 Li nuclei. The total kinetic energy released in this nuclear reaction, when deposited in the tumor region, destroys the cancer cells. Since the success of the BNCT is linked to the different selectivity between the tumor and healthy tissue, it is necessary to carry out a sensitivity analysis to determinate the boron concentration. Computational simulations are very important in this context because they help in the treatment planning by calculating the lowest effective absorbed dose rate to reduce the damage to healthy tissue. The objective of this paper is to present a deterministic method based on generalized perturbation theory (GPT) to perform sensitivity analysis with respect to the 10 B concentration and to estimate the absorbed dose rate by patients undergoing this therapy. The advantage of the method is a significant reduction in computational time required to perform these calculations. To simulate the neutron flux in all brain regions, the method relies on a two-dimensional neutron transport equation whose spatial, angular and energy variables are discretized by the diamond difference method, the discrete ordinate method and multigroup formulation, respectively. The results obtained through GPT are consistent with those obtained using other methods, demonstrating the efficacy of the proposed method. (author)

  15. Multiscale Static Analysis of Notched and Unnotched Laminates Using the Generalized Method of Cells

    Science.gov (United States)

    Naghipour Ghezeljeh, Paria; Arnold, Steven M.; Pineda, Evan J.; Stier, Bertram; Hansen, Lucas; Bednarcyk, Brett A.; Waas, Anthony M.

    2016-01-01

    The generalized method of cells (GMC) is demonstrated to be a viable micromechanics tool for predicting the deformation and failure response of laminated composites, with and without notches, subjected to tensile and compressive static loading. Given the axial [0], transverse [90], and shear [+45/-45] response of a carbon/epoxy (IM7/977-3) system, the unnotched and notched behavior of three multidirectional layups (Layup 1: [0,45,90,-45](sub 2S), Layup 2: [0,60,0](sub 3S), and Layup 3: [30,60,90,-30, -60](sub 2S)) are predicted under both tensile and compressive static loading. Matrix nonlinearity is modeled in two ways. The first assumes all nonlinearity is due to anisotropic progressive damage of the matrix only, which is modeled, using the multiaxial mixed-mode continuum damage model (MMCDM) within GMC. The second utilizes matrix plasticity coupled with brittle final failure based on the maximum principle strain criteria to account for matrix nonlinearity and failure within the Finite Element Analysis--Micromechanics Analysis Code (FEAMAC) software multiscale framework. Both MMCDM and plasticity models incorporate brittle strain- and stress-based failure criteria for the fiber. Upon satisfaction of these criteria, the fiber properties are immediately reduced to a nominal value. The constitutive response for each constituent (fiber and matrix) is characterized using a combination of vendor data and the axial, transverse, and shear responses of unnotched laminates. Then, the capability of the multiscale methodology is assessed by performing blind predictions of the mentioned notched and unnotched composite laminates response under tensile and compressive loading. Tabulated data along with the detailed results (i.e., stress-strain curves as well as damage evolution states at various ratios of strain to failure) for all laminates are presented.

  16. Statistical analysis of latent generalized correlation matrix estimation in transelliptical distribution.

    Science.gov (United States)

    Han, Fang; Liu, Han

    2017-02-01

    Correlation matrix plays a key role in many multivariate methods (e.g., graphical model estimation and factor analysis). The current state-of-the-art in estimating large correlation matrices focuses on the use of Pearson's sample correlation matrix. Although Pearson's sample correlation matrix enjoys various good properties under Gaussian models, it is not an effective estimator when facing heavy-tail distributions with possible outliers. As a robust alternative, Han and Liu (2013b) advocated the use of a transformed version of the Kendall's tau sample correlation matrix in estimating high dimensional latent generalized correlation matrix under the transelliptical distribution family (or elliptical copula). The transelliptical family assumes that after unspecified marginal monotone transformations, the data follow an elliptical distribution. In this paper, we study the theoretical properties of the Kendall's tau sample correlation matrix and its transformed version proposed in Han and Liu (2013b) for estimating the population Kendall's tau correlation matrix and the latent Pearson's correlation matrix under both spectral and restricted spectral norms. With regard to the spectral norm, we highlight the role of "effective rank" in quantifying the rate of convergence. With regard to the restricted spectral norm, we present, for the first time, a "sign subgaussian condition" which is sufficient to guarantee that the rank-based correlation matrix estimator attains the optimal rate of convergence. In both cases, we do not need any moment condition.
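
    The transformed estimator discussed above is easy to sketch. In the hedged Python example below, pairwise Kendall's tau is converted to a latent Pearson correlation estimate through the sine transform R_jk = sin(pi*tau_jk/2); the data are synthetic, with monotone marginal transformations applied to mimic the transelliptical setting:

```python
# Sketch of the rank-based latent correlation estimator: Kendall's tau plus sine transform.
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(4)
true_corr = np.array([[1.0, 0.6, 0.3],
                      [0.6, 1.0, 0.5],
                      [0.3, 0.5, 1.0]])
z = rng.multivariate_normal(np.zeros(3), true_corr, size=500)
x = np.column_stack([np.exp(z[:, 0]), z[:, 1] ** 3, np.tanh(z[:, 2])])  # monotone margins

d = x.shape[1]
R_hat = np.eye(d)
for j in range(d):
    for k in range(j + 1, d):
        tau, _ = kendalltau(x[:, j], x[:, k])
        R_hat[j, k] = R_hat[k, j] = np.sin(np.pi * tau / 2.0)

print(np.round(R_hat, 2))   # close to true_corr despite the transformed margins
```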

  17. Generalized moment analysis of magnetic field correlations for accumulations of spherical and cylindrical magnetic perturbers

    Directory of Open Access Journals (Sweden)

    Felix Tobias Kurz

    2016-12-01

    Full Text Available In biological tissue, an accumulation of similarly shaped objects with a susceptibility difference to the surrounding tissue generates a local distortion of the external magnetic field in magnetic resonance imaging. It induces stochastic field fluctuations that characteristically influence proton spin diffusion in the vicinity of these magnetic perturbers. The magnetic field correlation that is associated with such local magnetic field inhomogeneities can be expressed in the form of a dynamic frequency autocorrelation function that is related to the time evolution of the measured magnetization. Here, an eigenfunction expansion for two simple magnetic perturber shapes, that of spheres and cylinders, is considered for restricted spin diffusion in a simple model geometry. Then, the concept of generalized moment analysis, an approximation technique that is applied in the study of (non-)reactive processes that involve Brownian motion, makes it possible to provide analytical expressions for the correlation function for different exponential decay forms. Results for the biexponential decay for both spherical and cylindrical magnetized objects are derived and compared with the frequently used (less accurate) monoexponential decay forms. They are in asymptotic agreement with the numerically exact value of the correlation function for long and short times.

  18. Analysis of the General Electric Company swell tests with RELAP4/MOD7

    International Nuclear Information System (INIS)

    Fischer, S.R.; Hendrix, C.E.

    1979-01-01

    The RELAP4/MOD7 nuclear reactor transient analysis code, presently being developed by EG and G Idaho, Inc., will incorporate several significant improvements over earlier versions of RELAP4. As part of the development of RELAP4/MOD7, a thorough assessment of the capability of the code to simulate water reactor LOCA phenomena is being made. This assessment is accomplished in part by comparing results from code calculations with test data from experimental facilities. Simulations of the General Electric Company (GE) level swell tests were performed as part of the code checkout. In these tests, a pressurized vessel partially filled with nearly saturated water was blown down through a simulated break located near the top of the vessel. Comparison of RELAP4 calculations with data from these experiments indicates that the code has the capability to model the unequal phase velocity flow and resulting density gradients that might occur in a BWR steam line break transient. Comparisons of RELAP4 calculations with data from two level swell experiments are presented

  19. An analysis of the Rayleigh–Stokes problem for a generalized second-grade fluid

    KAUST Repository

    Bazhlekova, Emilia

    2014-11-26

    © 2014, The Author(s). We study the Rayleigh–Stokes problem for a generalized second-grade fluid which involves a Riemann–Liouville fractional derivative in time, and present an analysis of the problem in the continuous, space semidiscrete and fully discrete formulations. We establish the Sobolev regularity of the homogeneous problem for both smooth and nonsmooth initial data v, including v∈L2(Ω). A space semidiscrete Galerkin scheme using continuous piecewise linear finite elements is developed, and optimal with respect to initial data regularity error estimates for the finite element approximations are derived. Further, two fully discrete schemes based on the backward Euler method and second-order backward difference method and the related convolution quadrature are developed, and optimal error estimates are derived for the fully discrete approximations for both smooth and nonsmooth initial data. Numerical results for one- and two-dimensional examples with smooth and nonsmooth initial data are presented to illustrate the efficiency of the method, and to verify the convergence theory.
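
    One building block of the fully discrete schemes mentioned above is the convolution quadrature generated by the backward Euler method, whose weights are the coefficients of (1 - z)^alpha. The hedged Python sketch below computes these weights only; it is not the paper's Galerkin finite element solver:

```python
# Hedged sketch: backward Euler convolution quadrature weights for a
# Riemann-Liouville fractional derivative of order alpha,
#   D^alpha u(t_n) ~ tau^(-alpha) * sum_{j=0}^{n} w_j * u(t_{n-j}).
import numpy as np

def cq_weights_backward_euler(alpha, n):
    """First n+1 coefficients of (1 - z)^alpha via the standard recursion."""
    w = np.empty(n + 1)
    w[0] = 1.0
    for j in range(1, n + 1):
        w[j] = w[j - 1] * (j - 1 - alpha) / j
    return w

print(np.round(cq_weights_backward_euler(alpha=0.5, n=5), 6))
# [ 1.       -0.5      -0.125    -0.0625   -0.039063 -0.027344]
```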

  20. An analysis of the Rayleigh–Stokes problem for a generalized second-grade fluid

    KAUST Repository

    Bazhlekova, Emilia; Jin, Bangti; Lazarov, Raytcho; Zhou, Zhi

    2014-01-01

    © 2014, The Author(s). We study the Rayleigh–Stokes problem for a generalized second-grade fluid which involves a Riemann–Liouville fractional derivative in time, and present an analysis of the problem in the continuous, space semidiscrete and fully discrete formulations. We establish the Sobolev regularity of the homogeneous problem for both smooth and nonsmooth initial data v, including v∈L2(Ω). A space semidiscrete Galerkin scheme using continuous piecewise linear finite elements is developed, and optimal with respect to initial data regularity error estimates for the finite element approximations are derived. Further, two fully discrete schemes based on the backward Euler method and second-order backward difference method and the related convolution quadrature are developed, and optimal error estimates are derived for the fully discrete approximations for both smooth and nonsmooth initial data. Numerical results for one- and two-dimensional examples with smooth and nonsmooth initial data are presented to illustrate the efficiency of the method, and to verify the convergence theory.

  1. Dark matter annihilations into two light fermions and one gauge boson. General analysis and antiproton constraints

    International Nuclear Information System (INIS)

    Garny, Mathias; Ibarra, Alejandro; Vogl, Stefan

    2011-12-01

    We study in this paper the scenario where the dark matter is constituted by Majorana particles which couple to a light Standard Model fermion and an extra scalar via a Yukawa coupling. In this scenario, the annihilation rate into the light fermions with the mediation of the scalar particle is strongly suppressed by the mass of the fermion. Nevertheless, the helicity suppression is lifted by the associated emission of a gauge boson, yielding annihilation rates which could be large enough to allow the indirect detection of the dark matter particles. We perform a general analysis of this scenario, calculating the annihilation cross section of the processes χχ → f anti fV when the dark matter particle is a SU(2) L singlet or doublet, f is a lepton or a quark, and V is a photon, a weak gauge boson or a gluon. We point out that the annihilation rate is particularly enhanced when the dark matter particle is degenerate in mass to the intermediate scalar particle, which is a scenario barely constrained by collider searches of exotic charged or colored particles. Lastly, we derive upper limits on the relevant cross sections from the non-observation of an excess in the cosmic antiproton-to-proton ratio measured by PAMELA. (orig.)

  2. Reliability analysis of a phasor measurement unit using a generalized fuzzy lambda-tau (GFLT) technique.

    Science.gov (United States)

    Komal

    2018-05-01

    Power consumption is increasing day by day. To fulfil the requirement of failure-free power, planning and implementation of an effective and reliable power management system is essential. The phasor measurement unit (PMU) is one of the key devices in wide area measurement and control systems. Reliable performance of the PMU helps assure a failure-free power supply for any power system. The purpose of the present study is therefore to analyse the reliability of a PMU used for controllability and observability of power systems, utilizing available uncertain data. In this paper, a generalized fuzzy lambda-tau (GFLT) technique has been proposed for this purpose. In GFLT, system components' uncertain failure and repair rates are fuzzified using fuzzy numbers having different shapes such as triangular, normal, Cauchy, sharp gamma and trapezoidal. To select a suitable fuzzy number for quantifying data uncertainty, system experts' opinions have been considered. The GFLT technique applies fault-tree analysis, the lambda-tau method, data fuzzified with different membership functions, and alpha-cut based fuzzy arithmetic operations to compute some important reliability indices. Furthermore, in this study a ranking of the critical components of the system using the RAM-Index and a sensitivity analysis have also been performed. The developed technique may help to improve system performance significantly and can be applied to analyse the fuzzy reliability of other engineering systems. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
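
    One ingredient of the lambda-tau computation described above can be sketched directly. The hedged Python example below propagates two triangular fuzzy failure/repair-rate pairs through an OR gate using alpha-cut interval arithmetic and the classical lambda-tau expressions; the component numbers are invented, and the full GFLT additionally handles AND gates and the other membership-function shapes mentioned in the abstract:

```python
# Hedged sketch: alpha-cut propagation of triangular fuzzy failure data through an
# OR gate, using lambda_OR = sum(lambda_i) and tau_OR = sum(lambda_i*tau_i)/sum(lambda_i).
import numpy as np

def tri_cut(a, b, c, alpha):
    """Alpha-cut interval [lo, hi] of a triangular fuzzy number (a, b, c)."""
    return np.array([a + alpha * (b - a), c - alpha * (c - b)])

lam = [(1.0e-4, 1.5e-4, 2.0e-4), (2.0e-4, 2.5e-4, 3.0e-4)]   # fuzzy failure rates (/h)
tau = [(2.0, 3.0, 4.0), (4.0, 5.0, 6.0)]                     # fuzzy repair times (h)

for alpha in (0.0, 0.5, 1.0):
    lam_cuts = [tri_cut(*l, alpha) for l in lam]
    tau_cuts = [tri_cut(*t, alpha) for t in tau]
    lam_or = sum(lam_cuts)                                   # interval sum of failure rates
    num = sum(l * t for l, t in zip(lam_cuts, tau_cuts))     # interval sum of lambda_i*tau_i
    tau_or = np.array([num[0] / lam_or[1], num[1] / lam_or[0]])  # interval division (positive)
    print(f"alpha={alpha:.1f}  lambda_OR={lam_or}  tau_OR={np.round(tau_or, 3)}")
```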

  3. Oblique rotation in canonical correlation analysis reformulated as maximizing the generalized coefficient of determination.

    Science.gov (United States)

    Satomura, Hironori; Adachi, Kohei

    2013-07-01

    To facilitate the interpretation of canonical correlation analysis (CCA) solutions, procedures have been proposed in which CCA solutions are orthogonally rotated to a simple structure. In this paper, we consider oblique rotation for CCA to provide solutions that are much easier to interpret, though only orthogonal rotation is allowed in the existing formulations of CCA. Our task is thus to reformulate CCA so that its solutions have the freedom of oblique rotation. Such a task can be achieved using Yanai's (Jpn. J. Behaviormetrics 1:46-54, 1974; J. Jpn. Stat. Soc. 11:43-53, 1981) generalized coefficient of determination for the objective function to be maximized in CCA. The resulting solutions are proved to include the existing orthogonal ones as special cases and to be rotated obliquely without affecting the objective function value, where ten Berge's (Psychometrika 48:519-523, 1983) theorems on suborthonormal matrices are used. A real data example demonstrates that the proposed oblique rotation can provide simple, easily interpreted CCA solutions.

  4. Dark matter annihilations into two light fermions and one gauge boson. General analysis and antiproton constraints

    Energy Technology Data Exchange (ETDEWEB)

    Garny, Mathias [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Ibarra, Alejandro; Vogl, Stefan [Technische Univ. Muenchen, Garching (Germany). Physik-Department

    2011-12-15

    We study in this paper the scenario where the dark matter is constituted by Majorana particles which couple to a light Standard Model fermion and an extra scalar via a Yukawa coupling. In this scenario, the annihilation rate into the light fermions with the mediation of the scalar particle is strongly suppressed by the mass of the fermion. Nevertheless, the helicity suppression is lifted by the associated emission of a gauge boson, yielding annihilation rates which could be large enough to allow the indirect detection of the dark matter particles. We perform a general analysis of this scenario, calculating the annihilation cross section of the processes χχ → f anti fV when the dark matter particle is a SU(2) L singlet or doublet, f is a lepton or a quark, and V is a photon, a weak gauge boson or a gluon. We point out that the annihilation rate is particularly enhanced when the dark matter particle is degenerate in mass to the intermediate scalar particle, which is a scenario barely constrained by collider searches of exotic charged or colored particles. Lastly, we derive upper limits on the relevant cross sections from the non-observation of an excess in the cosmic antiproton-to-proton ratio measured by PAMELA. (orig.)

  5. Analysis on difference of risk perception between people engaged in nuclear business and general public

    International Nuclear Information System (INIS)

    Terado, M.; Yoshikawa, H.; Sugiman, T.; Hibino, A.; Akimoto, M.

    2004-01-01

    A new research project has started to develop two kinds of web-based communication systems aimed at effective communication of the social risks of nuclear energy. One is a mutual communication system for fostering safety culture among workers in the nuclear industry, while the other is intended to inform the general public about the risk issues surrounding final disposal of high-level radioactive waste. Prior to the development of the web-based systems, social investigations were conducted on risk perception of nuclear power among both nuclear experts and women in the metropolitan area, in order to determine what should be considered for effective risk communication methods. Statistical analysis of the results of the social investigation showed that the majority of people in the nuclear field take business risk seriously, that a fraction of them are concerned about the present practice of nuclear power operation, and that women in the metropolitan area are uniformly afraid of radioactive risk. The results of the social investigation gave useful insight for developing the two kinds of risk communication systems and for the related field study on enhancing safety culture in the nuclear industry. (authors)

  6. Diagnostics for generalized linear hierarchical models in network meta-analysis.

    Science.gov (United States)

    Zhao, Hong; Hodges, James S; Carlin, Bradley P

    2017-09-01

    Network meta-analysis (NMA) combines direct and indirect evidence comparing more than 2 treatments. Inconsistency arises when these 2 information sources differ. Previous work focuses on inconsistency detection, but little has been done on how to proceed after identifying inconsistency. The key issue is whether inconsistency changes an NMA's substantive conclusions. In this paper, we examine such discrepancies from a diagnostic point of view. Our methods seek to detect influential and outlying observations in NMA at a trial-by-arm level. These observations may have a large effect on the parameter estimates in NMA, or they may deviate markedly from other observations. We develop formal diagnostics for a Bayesian hierarchical model to check the effect of deleting any observation. Diagnostics are specified for generalized linear hierarchical NMA models and investigated for both published and simulated datasets. Results from our example dataset using either contrast- or arm-based models and from the simulated datasets indicate that the sources of inconsistency in NMA tend not to be influential, though results from the example dataset suggest that they are likely to be outliers. This mimics a familiar result from linear model theory, in which outliers with low leverage are not influential. Future extensions include incorporating baseline covariates and individual-level patient data. Copyright © 2017 John Wiley & Sons, Ltd.

  7. Dynamical analysis of bounded and unbounded orbits in a generalized Hénon-Heiles system

    Science.gov (United States)

    Dubeibe, F. L.; Riaño-Doncel, A.; Zotos, Euaggelos E.

    2018-04-01

    The Hénon-Heiles potential was first proposed as a simplified version of the gravitational potential experienced by a star in the presence of a galactic center. Currently, this system is considered a paradigm in dynamical systems because, despite its simplicity, it exhibits very complex dynamical behavior. In the present paper, we perform a series expansion up to the fifth order of a potential with axial and reflection symmetries, which, after some transformations, leads to a generalized Hénon-Heiles potential. This new system is analyzed qualitatively in both regimes of bounded and unbounded motion via the Poincaré sections method and by plotting the exit basins. The quantitative analysis is performed through the Lyapunov exponents and the basin entropy, respectively. We find that in both regimes the chaoticity of the system decreases as the test particle energy moves away from the critical energy. Additionally, we may conclude that despite the inclusion of higher order terms in the series expansion, the new system shows wider zones of regularity (islands) than the ones present in the Hénon-Heiles system.
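
    For orientation, the classic cubic Hénon-Heiles system (not the generalized fifth-order potential studied in the paper) can be integrated directly. The hedged Python sketch below propagates one bounded orbit below the escape energy E = 1/6 and checks energy conservation:

```python
# Sketch of the classic Hénon-Heiles system:
#   H = (px^2 + py^2)/2 + (x^2 + y^2)/2 + x^2*y - y^3/3.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, s):
    x, y, px, py = s
    return [px, py, -x - 2.0 * x * y, -y - x**2 + y**2]

def energy(s):
    x, y, px, py = s
    return 0.5 * (px**2 + py**2) + 0.5 * (x**2 + y**2) + x**2 * y - y**3 / 3.0

s0 = [0.0, 0.1, 0.35, 0.0]                      # bounded orbit: E(s0) < 1/6
sol = solve_ivp(rhs, (0.0, 200.0), s0, rtol=1e-10, atol=1e-12)

drift = abs(energy(sol.y[:, -1]) - energy(s0)) / abs(energy(s0))
print(f"E0 = {energy(s0):.6f}, relative energy drift = {drift:.2e}")
```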

  8. Essays on environmental policy analysis: Computable general equilibrium approaches applied to Sweden

    International Nuclear Information System (INIS)

    Hill, M.

    2001-01-01

    This thesis consists of three essays within the field of applied environmental economics, with the common basic aim of analyzing effects of Swedish environmental policy. Starting out from Swedish environmental goals, the thesis assesses a range of policy-related questions. The objective is to quantify policy outcomes by constructing and applying numerical models especially designed for environmental policy analysis. Static and dynamic multi-sectoral computable general equilibrium models are developed in order to analyze the following issues. The costs and benefits of a domestic carbon dioxide (CO2) tax reform. Special attention is given to how these costs and benefits depend on the structure of the tax system and, furthermore, how they depend on policy-induced changes in 'secondary' pollutants. The effects of allowing for emission permit trading through time when the domestic long-term environmental goal is specified in CO2 stock terms. The effects on long-term projected economic growth and welfare that are due to damages from emission flow and accumulation of 'local' pollutants (nitrogen oxides and sulfur dioxide), as well as the outcome of environmental policy when costs and benefits are considered in an integrated environmental-economic framework

  9. Open access to MRI for general practitioners: 12 years' experience at one institution -- a retrospective analysis.

    Science.gov (United States)

    Gough-Palmer, A L; Burnett, C; Gedroyc, W M

    2009-08-01

    The aim of this study was to evaluate 12 years of general practitioner (GP) use of open access MRI services at a single London teaching hospital. A retrospective analysis of reports from all GP requests for MRI scans between 1994 and 2005 was performed. The date, scanned body part, and requester details from 1798 scans requested by 209 individual GPs over a continuous 12-year period were recorded. All scans were then graded into four categories based on the severity of reported findings from normal to gross abnormality. Over the study period, GP requests as a percentage of the total (MRI) department workload remained low at approximately 2.6%. Spine, knee and brain requests constituted 86% (n = 1546) of requested scans. 48% (n = 868) of scans were reported as normal or minor degenerative changes only. 26% (n = 466) of scans demonstrated serious pathology that was likely to warrant hospital consultant referral. There was a wide range of scans requested per requester, from 1 to 240 over the period, with an average of 8.5 scans per GP. In conclusion, any department wishing to set up open access to MRI services for GPs could cover the majority of requests by offering spine, knee and brain imaging. The percentage of normal report rates for GP requests is comparable with previous studies of outpatient referrals. A large variation in requesting patterns between GPs suggests the need for increased communication between GPs and imaging departments to optimise use of the service.

  10. Impacts Of External Price Shocks On Malaysian Macro Economy-An Applied General Equilibrium Analysis

    Directory of Open Access Journals (Sweden)

    Abul Quasem Al-Amin

    2008-10-01

    Full Text Available This paper examines the impacts of external price shocks on the Malaysian economy. Three simulations are carried out with different degrees of external shock using a Malaysian Social Accounting Matrix (SAM) and Computable General Equilibrium (CGE) analysis. The model results indicate that an import price shock (better known as an external price shock) of 15% decreases domestic production in the building and construction sector by 25.87%, the hotels, restaurants and entertainment sector by 12.04%, the industry sector by 12.02%, the agriculture sector by 11.01%, and the electricity and gas sector by 9.55% from the baseline. On the import side, the simulation results illustrate that, as a result of the 15% import price shock, imports decrease significantly in all sectors from the base level. Among the scenarios, the largest negative impact falls on the industry sector at 29.67%, followed by the building and construction sector at 22.42%, the hotels, restaurants and entertainment sector at 19.45%, the electricity and gas sector at 13%, the agriculture sector at 12.63% and other service sectors at 11.17%. Significant negative impacts also fall on investment and fixed capital investment. The shock further reduces household income, household consumption and household savings, and increases the cost of living in the economy, resulting in lower social welfare.

  11. The 2013 general elections in Malaysia: An analysis of online news portals

    Directory of Open Access Journals (Sweden)

    Azahar Kasim

    2016-05-01

    Full Text Available This research analyzed the coverage of online news portals during the election campaign for Malaysia's 13th General Election on 5th May 2013. Two types of news portals were chosen for this research: 1) the mainstream online news portals, namely The Star Online, Berita Harian Online, Bernama Online and Utusan Online; and 2) the alternative news portals, consisting of political parties' publications (the Harakah Daily, Roketkini and Keadilan Daily) and the independent news portals The Malaysian Insider and Malaysiakini. This study was conducted from the nomination day on 20th April 2013 until the polling day on 5th May 2013. Results were based on the frequencies of articles covering the ruling Barisan Nasional (BN) party and the opposition Pakatan Rakyat (PR) party. Each article was coded and labeled as positive, negative, or neutral coverage for each political party. The content analysis method was applied: the researchers chose and analyzed each election article and placed it in one of five categories, +BN (positive report), -BN (negative report), +PR (positive report), -PR (negative report) and N (neutral). The results showed that the four mainstream online news portals favored the BN with their coverage. However, the party-owned news portals of the PR alliance showed the completely opposite pattern, with bias toward their owners. The two independent news portals seemed to give more balanced coverage to both sides.
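
    The five-category coding scheme above lends itself to a simple frequency tally per portal. A minimal Python sketch is given below; the portal names and coded entries are illustrative placeholders, not the study's data.

        from collections import Counter

        # Hypothetical coded data: (portal, code) pairs, where each code is one of
        # the study's five categories: "+BN", "-BN", "+PR", "-PR", "N".
        coded_articles = [
            ("The Star Online", "+BN"),
            ("The Star Online", "-PR"),
            ("Malaysiakini", "N"),
            ("Harakah Daily", "+PR"),
        ]

        # Tally the frequency of each coverage category per news portal.
        tallies = {}
        for portal, code in coded_articles:
            tallies.setdefault(portal, Counter())[code] += 1

        for portal, counts in tallies.items():
            row = {c: counts.get(c, 0) for c in ("+BN", "-BN", "+PR", "-PR", "N")}
            print(portal, row, "total:", sum(counts.values()))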

  12. A spatial analysis of the expanding roles of nurses in general practice

    Directory of Open Access Journals (Sweden)

    Pearce Christopher

    2012-08-01

    Full Text Available Background: Changes to the workforce and organisation of general practice are occurring rapidly in response to the Australian health care reform agenda, and the changing nature of the medical profession. In particular, the last five years have seen the rapid introduction and expansion of a nursing workforce in Australian general practices. This potentially creates pressures on current infrastructure in general practice. Method: This study used a mixed methods, ‘rapid appraisal’ approach involving observation, photographs, and interviews. Results: Nurses utilise space differently to GPs, and this is part of the diversity they bring to the general practice environment. At the same time their roles are partly shaped by the ways space is constructed in general practices. Conclusion: The fluidity of nursing roles in general practice suggests that nurses require a versatile space in which to maximize their role and contribution to the general practice team.

  13. ANALYSIS OF MODELS OF EARLY DEBT REPAYMENT IN THE Generalized CREDIT TRANSACTIONS

    Directory of Open Access Journals (Sweden)

    2016-01-01

    Full Text Available This paper analyzes patterns of early repayment in multi-period credit transactions. It considers one of the most common ways of converting unpaid interest on early repayment, the so-called rule of 78. The relationship of this rule to a linear approximation of the exact value of the redeemable debt is determined. The maximum excess interest payment under the rule of 78 is analyzed, and it is shown how the interest payment under the rule depends on the time of early repayment. Early repayment of debt is an agreement under which the borrower pays the lender an amount of money equal to the current balance of the loan account. Further regular payments then cease and the contract terminates. However, the amount of outstanding debt is determined by the structure of the scheduled payments. In uniform consumer-credit repayment schemes, each payment contains the same share of the principal amount and of the total interest. In the case of early repayment the bank therefore loses a significant fraction of the expected interest payments. For this reason, so-called accelerated schemes of interest payment are often used in practice. One of them is the rule of 78. Using the rule of 78 is simple and straightforward. The name of the rule comes from the fact that the sum of the numbers of 12 monthly payments (1 + 2 + ... + 12) is 78. In consumer loan schemes with a term of one year, the interest payment for the current month is equal to m/78 of the total amount of interest payments, where m is the number of remaining payments. The rule's name is retained in the more general case with an arbitrary number of payments. In general, the interest payment is determined by the relative weight of the total interest in each payment. In uniform schemes this weight is constant; in accelerated schemes it decreases at a particular rate. Therefore, the additional cash expense under the rule of 78 may be considered an additional penalty for early repayment of the debt. This article shows how this penalty depends on the time before maturity. It is shown that
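
    A minimal Python sketch of the rule-of-78 allocation described above; the loan figures are illustrative. Payment k of n carries m/S of the total interest, where m is the number of remaining payments (including the current one) and S = n(n+1)/2, i.e. 78 for a 12-month loan.

        def rule_of_78_interest_schedule(total_interest, n_payments):
            """Allocate total loan interest across payments by the rule of 78.

            Payment k (k = 1..n) carries a share m / S of the total interest,
            where m = n - k + 1 is the number of remaining payments and
            S = n(n+1)/2 is the sum 1 + 2 + ... + n (78 for n = 12).
            """
            s = n_payments * (n_payments + 1) // 2
            return [total_interest * (n_payments - k + 1) / s
                    for k in range(1, n_payments + 1)]

        # Illustrative one-year consumer loan with 780 units of total interest.
        schedule = rule_of_78_interest_schedule(780.0, 12)
        print(schedule[0])    # 120.0 -> first month carries 12/78 of the interest
        print(schedule[-1])   # 10.0  -> last month carries 1/78
        print(sum(schedule))  # 780.0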

  14. A Generalized Orthotropic Elasto-Plastic Material Model for Impact Analysis

    Science.gov (United States)

    Hoffarth, Canio

    Composite materials are now beginning to provide uses hitherto reserved for metals in structural systems such as airframes and engine containment systems, wraps for repair and rehabilitation, and ballistic/blast mitigation systems. These structural systems are often subjected to impact loads and there is a pressing need for accurate prediction of deformation, damage and failure. There are numerous material models that have been developed to analyze the dynamic impact response of polymer matrix composites. However, there are key features that are missing in those models that prevent them from providing accurate predictive capabilities. In this dissertation, a general purpose orthotropic elasto-plastic computational constitutive material model has been developed to predict the response of composites subjected to high velocity impacts. The constitutive model is divided into three components - deformation model, damage model and failure model, with failure to be added at a later date. The deformation model generalizes the Tsai-Wu failure criterion and extends it using a strain-hardening-based orthotropic yield function with a non-associative flow rule. A strain equivalent formulation is utilized in the damage model that permits plastic and damage calculations to be uncoupled and capture the nonlinear unloading and local softening of the stress-strain response. A diagonal damage tensor is defined to account for the directionally dependent variation of damage. However, in composites it has been found that loading in one direction can lead to damage in multiple coordinate directions. To account for this phenomenon, the terms in the damage matrix are semi-coupled such that the damage in a particular coordinate direction is a function of the stresses and plastic strains in all of the coordinate directions. The overall framework is driven by experimental tabulated temperature and rate-dependent stress-strain data as well as data that characterizes the damage matrix and failure
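
    The classical Tsai-Wu criterion that the deformation model generalizes can be written, for plane stress, as a quadratic form in the in-plane stresses. A hedged Python sketch of that static criterion follows; the strength values are illustrative, and the dissertation's strain-hardening yield function and damage coupling are not reproduced here.

        import math

        def tsai_wu_index(s1, s2, t12, Xt, Xc, Yt, Yc, S):
            """Classical Tsai-Wu index for a plane-stress state (static criterion only).

            s1, s2, t12 : in-plane normal and shear stresses.
            Xt, Xc, Yt, Yc : tensile/compressive strength magnitudes in the
            1- and 2-directions; S : in-plane shear strength.
            Failure is predicted when the returned index reaches 1.
            """
            F1, F2 = 1.0 / Xt - 1.0 / Xc, 1.0 / Yt - 1.0 / Yc
            F11, F22, F66 = 1.0 / (Xt * Xc), 1.0 / (Yt * Yc), 1.0 / S ** 2
            F12 = -0.5 * math.sqrt(F11 * F22)  # common default for the interaction term
            return (F1 * s1 + F2 * s2 + F11 * s1 ** 2 + F22 * s2 ** 2
                    + F66 * t12 ** 2 + 2.0 * F12 * s1 * s2)

        # Illustrative unidirectional-lamina strengths in MPa.
        print(tsai_wu_index(s1=900.0, s2=20.0, t12=30.0,
                            Xt=1500.0, Xc=1200.0, Yt=50.0, Yc=200.0, S=70.0))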

  15. Seroprevalence of Toxoplasma gondii in the Iranian general population: a systematic review and meta-analysis.

    Science.gov (United States)

    Daryani, Ahmad; Sarvi, Shahabeddin; Aarabi, Mohsen; Mizani, Azadeh; Ahmadpour, Ehsan; Shokri, Azar; Rahimi, Mohammad-Taghi; Sharif, Mehdi

    2014-09-01

    Toxoplasma gondii is one of the most common protozoan parasites, with widespread distribution globally. It is the causative agent of Toxoplasma infection, which is prevalent in humans and other warm-blooded vertebrates. While T. gondii infection in healthy people is usually asymptomatic, it can lead to serious pathological effects in congenital cases and immunodeficient patients. We sought to identify the seroprevalence rate of Toxoplasma infection in the Iranian general population to develop a comprehensive description of the disease condition in Iran for future use. Electronic databases (PubMed, Google Scholar, Science Direct, and Scopus) and Persian language databases (Magiran, Scientific Information Database [SID], Iran Medex, and Iran Doc) were searched. Furthermore, graduate student dissertations and proceedings of national parasitology congresses were searched manually. Our search resulted in a total of 35 reports published from 1978 to 2012. These include 22 published articles, 1 unpublished study, 8 proceedings from the Iranian conference of parasitology, and 4 graduate student dissertations, covering 52,294 individuals and 23,385 IgG-seropositive cases. A random effects model was used for this meta-analysis. The results show that the overall seroprevalence rate of toxoplasmosis among the general population in Iran was 39.3% (95% CI=33.0%-45.7%). There was no significant difference in the seroprevalence rate between male and female patients. A significant linear trend of increasing overall prevalence by age was noted (P<0.0001). In addition, the data indicate high seroprevalence in groups who have direct contact with cats or consume uncooked meat and raw fruits or vegetables, in farmers and housewives, and in individuals who have a low level of education or live in rural areas. To the best of our knowledge, this is the first systematic review of T. gondii infection seroprevalence in Iran, which shows a high prevalence of Toxoplasma infection
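
    The pooling step in such a prevalence meta-analysis can be sketched with a logit-transformed random-effects (DerSimonian-Laird) model. A minimal Python sketch follows; the study counts are illustrative placeholders, not the review's data.

        import math

        def pooled_prevalence_dl(events, totals):
            """DerSimonian-Laird random-effects pooling of prevalences on the logit scale."""
            y, v = [], []
            for e, n in zip(events, totals):
                p = (e + 0.5) / (n + 1.0)                        # continuity-corrected proportion
                y.append(math.log(p / (1.0 - p)))                # logit transform
                v.append(1.0 / (e + 0.5) + 1.0 / (n - e + 0.5))  # approximate variance
            w = [1.0 / vi for vi in v]
            y_fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
            q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, y))
            c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
            tau2 = max(0.0, (q - (len(y) - 1)) / c)              # between-study variance
            w_re = [1.0 / (vi + tau2) for vi in v]
            y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
            se = math.sqrt(1.0 / sum(w_re))
            back = lambda x: 1.0 / (1.0 + math.exp(-x))          # back-transform to a proportion
            return back(y_re), (back(y_re - 1.96 * se), back(y_re + 1.96 * se))

        # Illustrative seropositive counts and sample sizes from three hypothetical studies.
        print(pooled_prevalence_dl(events=[120, 300, 45], totals=[400, 650, 150]))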

  16. National data analysis of general radiography projection method in medical imaging

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jung Su; Seo, Deok Nam; Choi, In Seok [Dept. of Bio-Convergence Engineering, Korea University Graduate School, Seoul (Korea, Republic of); and others

    2014-09-15

    According to the database of medical institutions of the Health Insurance Review and Assessment Service in 2013, 1,118 hospitals and clinics in Korea have a department of radiology, equipped with CT, fluoroscopic and general radiographic equipment. Above all, general radiographic equipment is the most commonly used in the radiology department. Most general radiographic equipment is now being changed from film-screen radiography systems to digital radiography systems; however, most digital radiography departments still follow techniques developed for film-screen systems. Therefore, in this study, we surveyed the current technical conditions of general radiography used in hospitals and investigated general radiographic techniques in domestic medical institutions. We analyzed 26 radiography projection methods, including chest, skull, spine and pelvis, which are commonly used in the radiography department.

  17. A philosophical analysis of the general methodology of qualitative research: a critical rationalist perspective.

    Science.gov (United States)

    Rudnick, Abraham

    2014-09-01

    Philosophical discussion of the general methodology of qualitative research, such as that used in some health research, has been inductivist or relativist to date, ignoring critical rationalism as a philosophical approach with which to discuss the general methodology of qualitative research. This paper presents a discussion of the general methodology of qualitative research from a critical rationalist perspective (inspired by Popper), using as an example mental health research. The widespread endorsement of induction in qualitative research is positivist and is suspect, if not false, particularly in relation to the context of justification (or rather theory testing) as compared to the context of discovery (or rather theory generation). Relativism is riddled with philosophical weaknesses and hence it is suspect if not false too. Theory testing is compatible with qualitative research, contrary to much writing about and in qualitative research, as theory testing involves learning from trial and error, which is part of qualitative research, and which may be the form of learning most conducive to generalization. Generalization involves comparison, which is a fundamental methodological requirement of any type of research (qualitative or other); hence the traditional grounding of quantitative and experimental research in generalization. Comparison--rather than generalization--is necessary for, and hence compatible with, qualitative research; hence, the common opposition to generalization in qualitative research is misdirected, disregarding whether this opposition's claims are true or false. In conclusion, qualitative research, similar to quantitative and experimental research, assumes comparison as a general methodological requirement, which is necessary for health research.

  18. Full Spectrum Operations: An Analysis of Course Content at the Command and General Staff College

    National Research Council Canada - National Science Library

    Turner, II, Frank L

    2008-01-01

    .... This monograph examined the Intermediate Level Education, the Advanced Military Studies Program, and the Tactical Commanders Development Program curricula at the Command and General Staff College...

  19. Stability analysis of black holes via a catastrophe theory and black hole thermodynamics in generalized theories of gravity

    International Nuclear Information System (INIS)

    Tamaki, Takashi; Torii, Takashi; Maeda, Kei-ichi

    2003-01-01

    We perform a linear perturbation analysis for black hole solutions with a 'massive' Yang-Mills field (the Proca field) in Brans-Dicke theory and find that the results are quite consistent with those via catastrophe theory where thermodynamic variables play an intrinsic role. Based on this observation, we show the general relation between these two methods in generalized theories of gravity which are conformally related to the Einstein-Hilbert action

  20. Analysis of time to event outcomes in randomized controlled trials by generalized additive models.

    Directory of Open Access Journals (Sweden)

    Christos Argyropoulos

    Full Text Available Randomized controlled trials almost invariably utilize the hazard ratio (HR) calculated with a Cox proportional hazards model as a treatment efficacy measure. Despite the widespread adoption of HRs, these provide a limited understanding of the treatment effect and may even provide a biased estimate when the assumption of proportional hazards in the Cox model is not verified by the trial data. Additional treatment effect measures on the survival probability or the time scale may be used to supplement HRs, but a framework for the simultaneous generation of these measures is lacking. By splitting follow-up time at the nodes of a Gauss-Lobatto numerical quadrature rule, techniques for Poisson Generalized Additive Models (PGAM) can be adopted for flexible hazard modeling. Straightforward simulation post-estimation transforms PGAM estimates for the log hazard into estimates of the survival function. These in turn were used to calculate relative and absolute risks, or even differences in restricted mean survival time, between treatment arms. We illustrate our approach with extensive simulations and in two trials: IPASS (in which the proportionality of hazards was violated) and HEMO (a long-duration study conducted under evolving standards of care on a heterogeneous patient population). PGAM can generate estimates of the survival function and the hazard ratio that are essentially identical to those obtained by Kaplan-Meier curve analysis and the Cox model. PGAMs can simultaneously provide multiple measures of treatment efficacy after a single data pass. Furthermore, they supported not only unadjusted (overall) treatment effects but also subgroup and adjusted analyses, while incorporating multiple time scales and accounting for non-proportional hazards in survival data. By augmenting the HR conventionally reported, PGAMs have the potential to support the inferential goals of multiple stakeholders involved in the evaluation and appraisal of clinical trial results under proportional and
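
    A reduced version of the idea, with a piecewise-constant hazard in place of the spline-smoothed PGAM and fixed cut points in place of Gauss-Lobatto nodes, can be sketched as a Poisson GLM with a log person-time offset. numpy, pandas and statsmodels are assumed available; the data are simulated, not trial data.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        # Illustrative data: follow-up time, event indicator, treatment arm (not trial data).
        rng = np.random.default_rng(0)
        n = 500
        arm = rng.integers(0, 2, n)
        time = rng.exponential(scale=np.where(arm == 1, 12.0, 8.0))
        event = (time < 24.0).astype(int)
        time = np.minimum(time, 24.0)                      # administrative censoring at 24

        # Split each subject's follow-up at fixed cut points (the PGAM of the paper
        # instead splits at Gauss-Lobatto quadrature nodes and smooths the log hazard).
        cuts = [0.0, 6.0, 12.0, 18.0, 24.0]
        rows = []
        for t, d, a in zip(time, event, arm):
            for lo, hi in zip(cuts[:-1], cuts[1:]):
                if t <= lo:
                    break
                rows.append({"interval": f"{lo}-{hi}", "arm": a,
                             "exposure": min(t, hi) - lo,
                             "died": int(d == 1 and t <= hi)})
        df = pd.DataFrame(rows)

        # Poisson model: piecewise-constant baseline hazard + treatment effect, with
        # log person-time as offset; exp(coef of arm) approximates the hazard ratio.
        X = pd.get_dummies(df[["interval"]], drop_first=True).astype(float)
        X["arm"] = df["arm"].astype(float)
        X = sm.add_constant(X)
        fit = sm.GLM(df["died"], X, family=sm.families.Poisson(),
                     offset=np.log(df["exposure"])).fit()
        print(np.exp(fit.params["arm"]))                   # approximate hazard ratio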

  1. A meta-analysis of the relationship between general mental ability and nontask performance.

    Science.gov (United States)

    Gonzalez-Mulé, Erik; Mount, Michael K; Oh, In-Sue

    2014-11-01

    Although one of the most well-established research findings in industrial-organizational psychology is that general mental ability (GMA) is a strong and generalizable predictor of job performance, this meta-analytically derived conclusion is based largely on measures of task or overall performance. The primary purpose of this study is to address a void in the research literature by conducting a meta-analysis to determine the direction and magnitude of the correlation of GMA with 2 dimensions of nontask performance: counterproductive work behaviors (CWB) and organizational citizenship behaviors (OCB). Overall, the results show that the true-score correlation between GMA and CWB is essentially 0 (-.02, k = 35), although rating source of CWB moderates this relationship. The true-score correlation between GMA and OCB is positive but modest in magnitude (.23, k = 43). The 2nd purpose of this study is to conduct meta-analytic relative weight analyses to determine the relative importance of GMA and the five-factor model (FFM) of personality traits in predicting nontask and task performance criteria. Results indicate that, collectively, the FFM traits are substantially more important for CWB than GMA, that the FFM traits are roughly equal in importance to GMA for OCB, and that GMA is substantially more important for task and overall job performance than the FFM traits. Implications of these findings for the development of optimal selection systems and the development of comprehensive theories of job performance are discussed along with study limitation and future research directions. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  2. Sensitivity Analysis of Oxide Scale Influence on General Carbon Steels during Hot Forging

    Directory of Open Access Journals (Sweden)

    Bernd-Arno Behrens

    2018-02-01

    Full Text Available Increasing product requirements have made numerical simulation a vital tool for time- and cost-efficient process design. In order to accurately model hot forging processes with finite element-based numerical methods, reliable models are required which take the material behaviour, the surface phenomena of die and workpiece, and the machine kinematics into account. In hot forging processes, the surface properties are strongly affected by the growth of oxide scale, which influences the material flow, friction, and product quality of the finished component. The influence of different carbon contents on material behaviour is investigated by considering three different steel grades (C15, C45, and C60). For a general description of the material behaviour, an empirical approach is used to implement mathematical functions expressing the relationship between flow stress and dominant influence variables such as alloying elements, initial microstructure, and reheating mode. The deformation behaviour of oxide scale is modelled separately for each component with parameterized flow curves. The main focus of this work lies in the consideration of different materials as well as the calculation and assignment of their material properties in dependence on the current process parameters by means of subroutines. The validated model is used to investigate the influence of various oxide scale parameters, such as scale thickness and composition, on the hot forging process. To this end, selected parameters have been varied within a numerical sensitivity analysis. The results show a strong influence of oxide scale on the friction behaviour as well as on the material flow during hot forging.

  3. EEG analysis of seizure patterns using visibility graphs for detection of generalized seizures.

    Science.gov (United States)

    Wang, Lei; Long, Xi; Arends, Johan B A M; Aarts, Ronald M

    2017-10-01

    The traditional EEG features in the time and frequency domain show limited seizure detection performance in the epileptic population with intellectual disability (ID). In addition, the influence of EEG seizure patterns on detection performance has been less studied. A single-channel EEG signal can be mapped into visibility graphs (VGS), including the basic visibility graph (VG), horizontal VG (HVG), and difference VG (DVG). These graphs were used to characterize different EEG seizure patterns. To demonstrate its effectiveness in identifying EEG seizure patterns and detecting generalized seizures, EEG recordings of 615 h on one EEG channel from 29 epileptic patients with ID were analyzed. A novel feature set with discriminative power for seizure detection was obtained by using the VGS method. The degree distributions (DDs) of the DVG can clearly distinguish the EEG of each seizure pattern. The degree entropy and power-law degree power in the DVG were proposed here for the first time, and they show a significant difference between seizure and non-seizure EEG. The connecting structure measured by the HVG can better distinguish seizure EEG from background than that measured by the VG and DVG. A traditional EEG feature set based on frequency analysis was used here as a benchmark feature set. With a support vector machine (SVM) classifier, the seizure detection performance of the benchmark feature set (sensitivity of 24%, FDt/h of 1.8 s) could be improved by combining our proposed VGS features extracted from one EEG channel (sensitivity of 38%, FDt/h of 1.4 s). The proposed VGS-based features can help improve seizure detection for ID patients. Copyright © 2017 Elsevier B.V. All rights reserved.
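
    The horizontal visibility graph construction at the core of the VGS features can be sketched in a few lines of Python; the brute-force construction and the toy epoch below are for illustration only, and the degree entropy is computed as the Shannon entropy of the empirical degree distribution.

        import math
        from collections import Counter

        def hvg_degrees(x):
            """Degree sequence of the horizontal visibility graph of a 1-D signal.

            Samples x[i] and x[j] (i < j) are connected if every sample strictly
            between them is lower than both: x[k] < min(x[i], x[j]) for i < k < j.
            Brute-force construction; adequate for short EEG epochs.
            """
            n = len(x)
            deg = [0] * n
            for i in range(n):
                for j in range(i + 1, n):
                    if all(x[k] < min(x[i], x[j]) for k in range(i + 1, j)):
                        deg[i] += 1
                        deg[j] += 1
            return deg

        def degree_entropy(deg):
            """Shannon entropy of the empirical degree distribution."""
            counts = Counter(deg)
            total = sum(counts.values())
            return -sum((c / total) * math.log(c / total) for c in counts.values())

        epoch = [0.1, 0.5, 0.3, 0.9, 0.2, 0.4, 0.8, 0.1]  # toy stand-in for an EEG epoch
        d = hvg_degrees(epoch)
        print(d, degree_entropy(d))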

  4. Black carbon ageing in the Canadian Centre for Climate modelling and analysis atmospheric general circulation model

    Directory of Open Access Journals (Sweden)

    B. Croft

    2005-01-01

    Full Text Available Black carbon (BC) particles in the atmosphere have important impacts on climate. The amount of BC in the atmosphere must be carefully quantified to allow evaluation of the climate effects of this type of aerosol. In this study, we present the treatment of BC aerosol in the developmental version of the 4th generation Canadian Centre for Climate modelling and analysis (CCCma) atmospheric general circulation model (AGCM). The focus of this work is on the conversion of insoluble BC to soluble/mixed BC by physical and chemical ageing. Physical processes include the condensation of sulphuric and nitric acid onto the BC aerosol, and coagulation with more soluble aerosols such as sulphates and nitrates. Chemical processes that may age the BC aerosol include the oxidation of organic coatings by ozone. Four separate parameterizations of the ageing process are compared to a control simulation that assumes no ageing occurs. These simulations use (1) an exponential decay with a fixed 24 h half-life, (2) a condensation and coagulation scheme, (3) an oxidative scheme, and (4) a linear combination of the latter two ageing treatments. Global BC burdens are 2.15, 0.15, 0.11, 0.21, and 0.11 Tg C for the control run and the four ageing schemes, respectively. The BC lifetimes are 98.1, 6.6, 5.0, 9.5, and 4.9 days, respectively. The sensitivity of modelled BC burdens and concentrations to the factor-of-two uncertainty in the emissions inventory is shown to be greater than the sensitivity to the parameterization used to represent the BC ageing, except for the oxidation-based parameterization. A computationally efficient parameterization that represents the processes of condensation, coagulation, and oxidation is shown to simulate BC ageing well in the CCCma AGCM. As opposed to a globally fixed ageing time scale, this treatment of BC ageing is responsive to varying atmospheric composition.
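
    The first of the four parameterizations, exponential decay of the insoluble BC reservoir with a fixed 24 h half-life, can be sketched as a first-order conversion applied each model time step; the burden values below are illustrative.

        import numpy as np

        def age_bc_exponential(bc_insoluble, bc_soluble, dt_hours, half_life_hours=24.0):
            """Convert hydrophobic (insoluble) BC to hydrophilic (soluble/mixed) BC
            assuming first-order decay with a fixed half-life (scheme 1 of the four
            parameterizations compared above)."""
            k = np.log(2.0) / half_life_hours            # first-order rate constant
            converted = bc_insoluble * (1.0 - np.exp(-k * dt_hours))
            return bc_insoluble - converted, bc_soluble + converted

        # One 0.5 h model time step applied to an illustrative 1.0 Tg insoluble BC burden.
        print(age_bc_exponential(1.0, 0.0, dt_hours=0.5))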

  5. Generalized cost-effectiveness analysis for national-level priority-setting in the health sector

    Directory of Open Access Journals (Sweden)

    Edejer Tessa

    2003-12-01

    Full Text Available Cost-effectiveness analysis (CEA) is potentially an important aid to public health decision-making but, with some notable exceptions, its use and impact at the level of individual countries is limited. A number of potential reasons may account for this, among them technical shortcomings associated with the generation of current economic evidence, political expediency, social preferences and systemic barriers to implementation. As a form of sectoral CEA, Generalized CEA sets out to overcome a number of these barriers to the appropriate use of cost-effectiveness information at the regional and country level. Its application via WHO-CHOICE provides a new economic evidence base, as well as underlying methodological developments, concerning the cost-effectiveness of a range of health interventions for leading causes of, and risk factors for, disease. The estimated sub-regional costs and effects of different interventions provided by WHO-CHOICE can readily be tailored to the specific context of individual countries, for example by adjustment to the quantity and unit prices of intervention inputs (costs) or the coverage, efficacy and adherence rates of interventions (effectiveness). The potential usefulness of this information for health policy and planning is in assessing if current intervention strategies represent an efficient use of scarce resources, and which of the potential additional interventions that are not yet implemented, or not implemented fully, should be given priority on the grounds of cost-effectiveness. Health policy-makers and programme managers can use results from WHO-CHOICE as a valuable input into the planning and prioritization of services at national level, as well as a starting point for additional analyses of the trade-off between the efficiency of interventions in producing health and their impact on other key outcomes such as reducing inequalities and improving the health of the poor.

  6. Quantitative analysis by laser-induced breakdown spectroscopy based on generalized curves of growth

    Energy Technology Data Exchange (ETDEWEB)

    Aragón, C., E-mail: carlos.aragon@unavarra.es; Aguilera, J.A.

    2015-08-01

    A method for quantitative elemental analysis by laser-induced breakdown spectroscopy (LIBS) is proposed. The method (Cσ-LIBS) is based on Cσ graphs, generalized curves of growth which allow including several lines of various elements at different concentrations. A so-called homogeneous double (HD) model of the laser-induced plasma is used, defined by an integration over a single region of the radiative transfer equation, combined with a separate treatment of neutral atoms (z = 0) and singly-charged ions (z = 1) in Cσ graphs and characteristic parameters. The procedure includes a criterion, based on a model limit, for eliminating data which, due to a high line intensity or concentration, are not well described by the HD model. An initial procedure provides a set of parameters (βA)^z, (ηNl)^z, T^z and N_e^z (z = 0, 1) which characterize the plasma and the LIBS system. After characterization, two different analytical procedures, resulting in relative and absolute concentrations, may be applied. To test the method, fused glass samples prepared from certified slags and pure compounds are analyzed. We determine concentrations of Ca, Mn, Mg, V, Ti, Si and Al relative to Fe in three samples prepared from slags, and absolute concentrations of Fe, Ca and Mn in three samples prepared from Fe2O3, CaCO3 and Mn2O3. The accuracy obtained is 3.2% on average for relative concentrations and 9.2% for absolute concentrations. - Highlights: • Method for quantitative analysis by LIBS, based on Cσ graphs • Conventional calibration is replaced with characterization of the LIBS system. • All elements are determined from measurement of one or two Cσ graphs. • The method is tested with fused glass disks prepared from slags and pure compounds. • Accurate results for relative (3.2%) and absolute concentrations (9.2%)

  7. The current situation of inorganic elements in marine turtles: A general review and meta-analysis.

    Science.gov (United States)

    Cortés-Gómez, Adriana A; Romero, Diego; Girondot, Marc

    2017-10-01

    Inorganic elements (Pb, Cd, Hg, Al, As, Cr, Cu, Fe, Mn, Ni, Se and Zn) are present globally in aquatic systems and their potential transfer to marine turtles can be a serious threat to their health status. The environmental fate of these contaminants may be traced by the analysis of turtle tissues. Loggerhead turtles (Caretta caretta) are the most frequently investigated of all the sea turtle species with regard to inorganic elements, followed by Green turtles (Chelonia mydas); all the other species have considerably fewer studies. The literature shows that blood, liver, kidney and muscle are the tissues most frequently used for the quantification of inorganic elements, with Pb, Cd, Cu and Zn being the most studied elements. Chelonia mydas showed the highest concentrations of Cr in muscle (4.8 ± 0.12), Cu in liver (37 ± 7) and Mg in kidney (17 μg g-1 ww), the Cr and Cu from the Gulf of Mexico and the Mg from Japanese coasts; Lepidochelys olivacea presented the highest concentrations of Pb in blood (4.46 5) and Cd in kidney (150 ± 110 μg g-1 ww), both from the Mexican Pacific; Caretta caretta from the Mediterranean Egyptian coast had the highest reported Hg in blood (0.66 ± 0.13 μg g-1 ww); and Eretmochelys imbricata from Japan had the highest concentration of As in muscle (30 ± 13 μg g-1 ww). The meta-analysis allows us to examine some features that were not visible when data were analyzed alone. For instance, Leatherbacks show a unique pattern of concentration compared to other species. Additionally, contamination of different tissues shows some tendencies independent of the species, with liver and kidney on one side, and bone on the other, being different from other tissues. This review provides a general perspective on the accumulation and distribution of these inorganic elements alongside existing information for the 7 sea turtle species. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Ideal observer estimation and generalized ROC analysis for computer-aided diagnosis

    International Nuclear Information System (INIS)

    Edwards, Darrin C.

    2004-01-01

    The research presented in this dissertation represents an innovative application of computer-aided diagnosis and signal detection theory to the specific task of early detection of breast cancer in the context of screening mammography. A number of automated schemes have been developed in our laboratory to detect masses and clustered microcalcifications in digitized mammograms, on the one hand, and to classify known lesions as malignant or benign, on the other. The development of fully automated classification schemes is difficult, because the output of a detection scheme will contain false-positive detections in addition to detected malignant and benign lesions, resulting in a three-class classification task. Researchers have so far been unable to extend successful tools for analyzing two-class classification tasks, such as receiver operating characteristic (ROC) analysis, to three-class classification tasks. The goals of our research were to use Bayesian artificial neural networks to estimate ideal observer decision variables to both detect and classify clustered microcalcifications and mass lesions in mammograms, and to derive substantial theoretical results indicating potential avenues of approach toward the three-class classification task. Specifically, we have shown that an ideal observer in an N-class classification task achieves an optimal ROC hypersurface, just as the two-class ideal observer achieves an optimal ROC curve; and that an obvious generalization of a well-known two-class performance metric, the area under the ROC curve, is not useful as a performance metric in classification tasks with more than two classes. This work is significant for three reasons. First, it involves the explicit estimation of feature-based (as opposed to image-based) ideal observer decision variables in the tasks of detecting and classifying mammographic lesions. Second, it directly addresses the three-class classification task of distinguishing malignant lesions, benign

  9. Personnel planning in general practices: development and testing of a skill mix analysis method.

    NARCIS (Netherlands)

    Eitzen-Strassel, J. von; Vrijhoef, H.J.M.; Derckx, E.W.C.C.; Bakker, D.H. de

    2014-01-01

    Background: General practitioners (GPs) have to match patients’ demands with the mix of their practice staff’s competencies. However, apart from some general principles, there is little guidance on recruiting new staff. The purpose of this study was to develop and test a method which would allow GPs

  10. Personnel planning in general practices : Development and testing of a skill mix analysis method

    NARCIS (Netherlands)

    von Eitzen-Strassel, J.; Vrijhoef, H.J.M.; Derckx, E.W.C.C.; de Bakker, D.H.

    2014-01-01

    Background General practitioners (GPs) have to match patients’ demands with the mix of their practice staff’s competencies. However, apart from some general principles, there is little guidance on recruiting new staff. The purpose of this study was to develop and test a method which would allow GPs

  11. General Education Today. A Critical Analysis of Controversies, Practices, and Reforms.

    Science.gov (United States)

    Gaff, Jerry G.

    The range of controversies and changes emerging from the current revival of general education are examined, and many ideas, examples, and recommendations for achieving realistic and successful curricular reform are offered. Instead of either offering an apology for general education or advocating any particular approach, the book draws on solid…

  12. Determinants of the range of drugs prescribed in general practice: a cross-sectional analysis.

    NARCIS (Netherlands)

    Bakker, D.H. de; Coffie, D.S.V.; Heerdink, E.R.; Dijk, L. van; Groenewegen, P.P.

    2007-01-01

    BACKGROUND: Current health policies assume that prescribing is more efficient and rational when general practitioners (GPs) work with a formulary or restricted drugs lists and thus with a limited range of drugs. Therefore we studied determinants of the range of drugs prescribed by general

  13. Interpreted consultations as ‘business as usual’? : An analysis of organizational routines in general practices

    NARCIS (Netherlands)

    Greenhagh, T.; Voisey, C.J.; Robb, N.

    2007-01-01

    UK general practices operate in an environment of high linguistic diversity, because of recent large-scale immigration and of the NHS's commitment to provide a professional interpreter to any patient if needed. Much activity in general practice is co-ordinated and patterned into organisational

  14. Studying emotion theories through connectivity analysis: Evidence from generalized psychophysiological interactions and graph theory.

    Science.gov (United States)

    Huang, Yun-An; Jastorff, Jan; Van den Stock, Jan; Van de Vliet, Laura; Dupont, Patrick; Vandenbulcke, Mathieu

    2018-05-15

    Psychological construction models of emotion state that emotions are variable concepts constructed by fundamental psychological processes, whereas according to basic emotion theory, emotions cannot be divided into more fundamental units and each basic emotion is represented by a unique and innate neural circuitry. In a previous study, we found evidence for the psychological construction account by showing that several brain regions were commonly activated when perceiving different emotions (i.e. a general emotion network). Moreover, this set of brain regions included areas associated with core affect, conceptualization and executive control, as predicted by psychological construction models. Here we investigate directed functional brain connectivity in the same dataset to address two questions: 1) is there a common pathway within the general emotion network for the perception of different emotions and 2) if so, does this common pathway contain information to distinguish between different emotions? We used generalized psychophysiological interactions and information flow indices to examine the connectivity within the general emotion network. The results revealed a general emotion pathway that connects neural nodes involved in core affect, conceptualization, language and executive control. Perception of different emotions could not be accurately classified based on the connectivity patterns from the nodes of the general emotion pathway. Successful classification was achieved when connections outside the general emotion pathway were included. We propose that the general emotion pathway functions as a common pathway within the general emotion network and is involved in shared basic psychological processes across emotions. However, additional connections within the general emotion network are required to classify different emotions, consistent with a constructionist account. Copyright © 2018 Elsevier Inc. All rights reserved.
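
    Stripped of HRF deconvolution and nuisance regressors, the psychophysiological-interaction idea behind gPPI amounts to regressing a target region's signal on the seed signal, one psychological regressor per condition, and their products. A simplified sketch under those assumptions follows, using synthetic signals; it is not the study's pipeline.

        import numpy as np

        def gppi_betas(target, seed, condition_labels, n_conditions):
            """Simplified gPPI-style regression (no HRF deconvolution or nuisance terms).

            target, seed     : 1-D BOLD time series of the target and seed regions.
            condition_labels : integer array, one condition label per time point (0 = rest).
            Returns one interaction beta per condition; real gPPI works on the
            deconvolved neural signal and reconvolves the products with the HRF.
            """
            t = len(target)
            cols = [np.ones(t), seed]
            for c in range(1, n_conditions + 1):
                box = (condition_labels == c).astype(float)
                cols.append(box)                  # psychological regressor
                cols.append(box * seed)           # psychophysiological interaction
            X = np.column_stack(cols)
            beta, *_ = np.linalg.lstsq(X, target, rcond=None)
            return beta[3::2]                     # betas of the interaction columns

        # Toy example with two conditions and synthetic signals.
        rng = np.random.default_rng(1)
        seed = rng.standard_normal(200)
        labels = np.repeat([0, 1, 0, 2], 50)
        target = 0.5 * seed + 0.8 * (labels == 1) * seed + rng.standard_normal(200)
        print(gppi_betas(target, seed, labels, n_conditions=2))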

  15. Impact of a carbon tax on the Chilean economy: A computable general equilibrium analysis

    International Nuclear Information System (INIS)

    García Benavente, José Miguel

    2016-01-01

    In 2009, the government of Chile announced its official commitment to reduce national greenhouse gas emissions by 20% below a business-as-usual projection by 2020. Because an effective way to reduce emissions is to implement a national carbon tax, the goal of this article is to quantify the value of a carbon tax that will allow the achievement of the emission reduction target and to assess its impact on the economy. The approach used in this work is to compare the economy before and after the implementation of the carbon tax by creating a static computable general equilibrium model of the Chilean economy. The model developed here disaggregates the economy into 23 industries and 23 commodities, and it uses four consumer agents (households, government, investment, and the rest of the world). By setting specific production and consumption functions, the model can assess the variation in commodity prices, industrial production, and agent consumption, allowing a cross-sectoral analysis of the impact of the carbon tax. The benchmark of the economy, upon which the analysis is based, came from a social accounting matrix specially constructed for this model, based on the year 2010. The carbon tax was modeled as an ad valorem tax under two scenarios: a tax on emissions from fossil fuels burned only by producers, and a tax on emissions from fossil fuels burned by producers and households. The abatement cost curve has shown that it is more cost-effective to tax only producers, rather than to tax both producers and households. This is due to the fact that, when compared to the emission level observed in 2010, a 20% emission reduction will cause a loss in GDP of 2% and 2.3%, respectively. Under the two scenarios, the tax value that could lead to that emission reduction is around 26 US dollars per ton of CO2-equivalent. The most affected productive sectors are oil refinery, transport, and electricity, with contractions of between 7% and 9%. Analyzing the electricity

  16. Generalized Michailov plot analysis of inband E2 transitions of deformed nuclei

    International Nuclear Information System (INIS)

    Long, G.L.; Zhang, W.L.; Ji, H.Y.; Gao, J.F.

    1998-01-01

    Intraband E2 transitions of some 30 deformed nuclei are analysed using a generalized Michailov plot, based on an E2 transition formula in the SU(3) limit of the sdg interacting boson model. The general E2 transition formula in the sdg-IBM has an L(L+3) term in addition to the usual SU(3) model result. It is found that the general E2 formula can describe the inband transitions well. Comparisons with other models are made. The implications of the results are also discussed. (author)

  17. Trade liberalization, the Mercosur integration process and the agriculture-industry transfers: a general equilibrium analysis

    Directory of Open Access Journals (Sweden)

    Joaquim Bento de Souza Ferreira Filho

    1999-12-01

    Full Text Available This paper deals with the effects of trade liberalization and the Mercosur integration process upon the Brazilian economy, with emphasis on the agricultural and agroindustrial production sectors, under the hypothesis that those phenomena could be another step in the rural-urban transfer process in Brazil. The analysis is conducted through an applied general equilibrium model. Results suggest that trade liberalization would hardly generate a widespread process of rural-urban transfers, although Brazilian agriculture shows up as a loser in the process. Notwithstanding that fact, there are transfers inside the agricultural sectors, where, besides the losses in the value added of the grain production sectors, there would be gains for the livestock and for the 'other crops' sectors. The agroindustry, in contrast, seems to gain both in Brazil and Argentina. Model results further suggest that Brazilian society would benefit as a whole from the integration, despite the losses in the agricultural sector.

  18. [Teaching basic life support to the general population. Alumni intervention analysis].

    Science.gov (United States)

    Díaz-Castellanos, M A; Fernández-Carmona, A; Díaz-Redondo, A; Cárdenas-Cruz, A; García-del Moral, R; Martín-Lopez, J; Díaz-Redondo, T

    2014-12-01

    The aim of this study was to investigate the rate at which the alumni of basic life support courses witnessed and intervened in out-of-hospital emergency situations, and to identify the variables characterizing those alumni associated with a greater number of witnessing events and interventions. An analysis of the efficiency of the courses was also carried out. A descriptive, cross-sectional study was made. A district in the province of Almería (Spain). Alumni of a mass basic life support training program targeted to the general population «Plan Salvavidas» conducted between 2003-2009. In 2010 the alumni were administered a telephone survey asking whether they had witnessed an emergency situation since attending the program, with the collection of information related to this emergency situation. Rate of out-of-hospital emergencies witnessed by the alumni. Rate of intervention of the alumni in emergency situations. Variables characterizing alumni with a greater likelihood of witnessing an emergency situation. A total of 3,864 trained alumni were contacted by telephone. Of 1,098 respondents, 63.9% were women, and the mean age was 26.61±10.6 years. Of these alumni, 11.75% had witnessed emergency situations, an average of three years after completing the course. Of these emergencies, 23.3% were identified as cardiac arrest. The alumni intervened in 98% of the possible cases. In 63% of the cases, there was no connection between the alumni and the victim. The majority of the emergency situations occurred in the street and in public spaces. A greater likelihood of witnessing an emergency situation was associated with being a healthcare worker and with being over 18 years of age. The rate of out-of-hospital emergencies witnessed by these alumni after the course was 11.75%. The level of intervention among the alumni was high. The most efficient target population consisted of healthcare workers. Copyright © 2013 Elsevier España, S.L.U. y SEMICYUC. All rights reserved.

  19. Regional versus General Anesthesia for Percutaneous Nephrolithotomy: A Meta-Analysis.

    Directory of Open Access Journals (Sweden)

    Henglong Hu

    Full Text Available To compare the effectiveness and safety of regional anesthesia (RA) and general anesthesia (GA) for percutaneous nephrolithotomy (PNL). PubMed, EMBASE, The Cochrane Library, and the Web of Knowledge databases were systematically searched to identify relevant studies. After literature screening and data extraction, a meta-analysis was performed using the RevMan 5.3 software. Eight randomized controlled trials (RCTs) and six non-randomized controlled trials (nRCTs) involving 2270 patients were included. Patients receiving RA were associated with shorter operative time (-6.22 min; 95% CI, -9.70 to -2.75; p = 0.0005), lower visual analgesic scores on the first and third postoperative days (WMD, -2.62; 95% CI, -3.04 to -2.19; p < 0.00001, and WMD, -0.38; 95% CI, -0.58 to -0.18; p = 0.0002), lower analgesic requirements (WMD, -59.40 mg; 95% CI, -78.39 to -40.40; p < 0.00001), shorter hospitalization (WMD, -0.36 d; 95% CI, -0.66 to -0.05; p = 0.02), less blood transfusion (RR, 0.61; 95% CI, 0.41 to 0.93; p = 0.02), fewer modified Clavien-Dindo Grade II (RR, 0.56; 95% CI, 0.37 to 0.83; p = 0.005) and Grade III or above postoperative complications (RR, 0.51; 95% CI, 0.33 to 0.77; p = 0.001), and potential benefits of less fever (RR, 0.79; 95% CI, 0.61 to 1.02; p = 0.07) and nausea or vomiting (RR, 0.54; 95% CI, 0.20 to 1.46; p = 0.23), but more intraoperative hypotension (RR, 3.13; 95% CI, 1.76 to 5.59; p = 0.0001) when compared with patients receiving GA. When nRCTs were excluded, most of the results were stable, but the significant differences were no longer detectable for blood transfusion and for Grade II and more severe complications. No significant differences in total postoperative complications or stone-free rate were found. Current evidence suggests that both RA and GA can provide safe and effective anesthesia for PNL in carefully evaluated and selected patients. Each anesthesia technique has its own advantages, but some aspects remain unclear and need to be explored in future studies.

  20. Mortality Analysis of Trauma Patients in General Intensive Care Unit of a State Hospital

    Directory of Open Access Journals (Sweden)

    İskender Kara

    2015-08-01

    Full Text Available Objective: The aim of this study was to determine the mortality rate and the factors affecting mortality among trauma patients in the general intensive care unit (ICU) of a state hospital. Material and Method: Data of trauma patients hospitalized between January 2012 and March 2013 in the ICU of Konya Numune Hospital were retrospectively analyzed. Demographic characteristics and clinical data of patients were recorded. Patients were divided into two groups, survivors and dead, and the mortality rate and factors affecting mortality were examined. Results: A total of 108 trauma patients were included in the study. The mortality rate of the overall group was 19.4%. The median age of the patients was 44.5 years and 75.9% of them were males. The median Glasgow Coma Scale score of the death group was lower (5 (3-8) vs. 15 (13-15), p<0.0001), the median APACHE II score was higher (20 (15-26) vs. 10 (8-13), p<0.0001) and the median duration of ICU stay was longer (27 (5-62.5) vs. 2 (1-5) days, p<0.0001) than in the survival group. The most common etiology of trauma was traffic accidents (47.2%), and 52.7% of patients had head trauma. The rate of patients with any fracture was significantly higher in the survival group (66.7% vs. 33.3%, p=0.007). The rates of erythrocyte suspension, fresh frozen plasma, thrombocyte suspension and albumin use were 38.9%, 27.8%, 0.9% and 8.3%, respectively, in the whole group. The proportion of patients receiving invasive mechanical ventilation was 27.8%, and the median length of stay of these patients was 5 (1.75-33.5) days. The rate of operated patients was 42.6%. The rates of tracheostomy, renal replacement therapy, bronchoscopy and percutaneous endoscopic gastrostomy procedures were higher in the death group. Advanced age (p=0.016, OR: 1.054; 95% CI: 1.010-1.100) and low GCS (p<0.0001, OR: 0.583; 95% CI: 0.456-0.745) were found to be independent risk factors for ICU mortality of trauma patients in the logistic regression analysis. Conclusion: We believe that the determination of these risk factors affecting

  1. Productive whole-class discussions: A qualitative analysis of peer leader behaviors in general chemistry

    Science.gov (United States)

    Eckart, Teresa Mcclain

    The intention of this research was to describe behaviors and characteristics of General Chemistry I peer leaders using a pedagogical reform method referred to as Peer-led Guided Inquiry (PLGI), and to discuss the ways in which these peer leaders created productive whole-class discussions. This reform technique engaged students to work on guided inquiry activities while working cooperatively in small groups, led by undergraduate peer leaders. These sessions were video recorded and transcribed. The data was evaluated using grounded theory methods of analysis. This study examined the dialog between students and peer leaders, paying specific attention to question types and observed patterns of interactions. The research took shape by examining the kinds of questions asked by peer leaders and the purposes these questions served. In addition to looking at questions, different kinds of behaviors displayed by peer leaders during their small group sessions were also observed. A close examination of peer leader questions and behaviors aided in developing an answer to the overall research question regarding what factors are associated with productive whole-class discussions. Five major categories of peer leader behaviors evolved from the data and provided a means to compare and contrast productive whole-class discussions. While no category single-handedly determined if a discussion was good or bad, there was a tendency for peer leaders who exhibited positive traits in at least three of the following categories to have consistently better whole-class discussions: Procedural Practices, Supervisory Qualities, Questioning Techniques, Feedback/Responses, and Interpersonal Skills. Furthermore, each of the major categories is tied directly to Interpersonal, Communication, and Leadership skills and their interactions with each other. This study also addressed applications that each of these categories has on instructional practices and their need in peer leader training. In addition

  2. Dual-Tracer PET Using Generalized Factor Analysis of Dynamic Sequences

    Science.gov (United States)

    Fakhri, Georges El; Trott, Cathryn M.; Sitek, Arkadiusz; Bonab, Ali; Alpert, Nathaniel M.

    2013-01-01

    Purpose: With single-photon emission computed tomography, simultaneous imaging of two physiological processes relies on discrimination of the energy of the emitted gamma rays, whereas the application of dual-tracer imaging to positron emission tomography (PET) imaging has been limited by the characteristic 511-keV emissions. Procedures: To address this limitation, we developed a novel approach based on generalized factor analysis of dynamic sequences (GFADS) that exploits spatio-temporal differences between radiotracers and applied it to near-simultaneous imaging of 2-deoxy-2-[18F]fluoro-D-glucose (FDG) (brain metabolism) and 11C-raclopride (D2) with simulated human data and experimental rhesus monkey data. We show theoretically and verify by simulation and measurement that GFADS can separate FDG and raclopride measurements that are made nearly simultaneously. Results: The theoretical development shows that GFADS can decompose the studies at several levels: (1) it decomposes the FDG and raclopride studies so that they can be analyzed as though they were obtained separately; (2) if additional physiologic/anatomic constraints can be imposed, further decomposition is possible; (3) for the example of raclopride, specific and nonspecific binding can be determined on a pixel-by-pixel basis. We found good agreement between the estimated GFADS factors and the simulated ground truth time activity curves (TACs), and between the GFADS factor images and the corresponding ground truth activity distributions, with errors less than 7.3±1.3 %. Biases in estimation of specific D2 binding and relative metabolism activity were within 5.9±3.6 % compared to the ground truth values. We also evaluated our approach in simultaneous dual-isotope brain PET studies in a rhesus monkey and obtained accuracy of better than 6 % in a mid-striatal volume for striatal activity estimation. Conclusions: Dynamic image sequences acquired following near-simultaneous injection of two PET radiopharmaceuticals
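
    As a rough stand-in for the generalized factor analysis, a plain non-negative matrix factorization of the pixel-by-frame matrix illustrates how a dynamic sequence can be separated into factor images and factor time-activity curves; GFADS adds further constraints (for example, penalizing overlap between factor images) that are omitted here. scikit-learn is assumed available and the kinetics are synthetic, not FDG/raclopride data.

        import numpy as np
        from sklearn.decomposition import NMF

        # Synthetic dynamic sequence: 500 pixels x 30 frames built from two
        # tracer-like time-activity curves (illustrative kinetics only).
        rng = np.random.default_rng(2)
        frames = np.arange(30, dtype=float)
        tac_a = np.exp(-frames / 20.0)                 # slowly clearing tracer
        tac_b = frames * np.exp(-frames / 5.0)         # uptake-then-washout tracer
        mix = rng.random((500, 2))                     # per-pixel contribution of each factor
        dyn = mix @ np.vstack([tac_a, tac_b]) + 0.01 * rng.random((500, 30))

        # Factor the sequence into non-negative factor images (W) and factor TACs (H).
        model = NMF(n_components=2, init="nndsvda", max_iter=1000)
        W = model.fit_transform(dyn)                   # factor images, 500 x 2
        H = model.components_                          # factor time-activity curves, 2 x 30
        print(W.shape, H.shape)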

  3. Pilot Designed Aircraft Displays in General Aviation: An Exploratory Study and Analysis

    Science.gov (United States)

    Conaway, Cody R.

    From 2001-2011, the General Aviation (GA) fatal accident rate remained unchanged (Duquette & Dorr, 2014), with an overall stagnant accident rate between 2004 and 2013. The leading cause was loss of control in flight (NTSB, 2015b & 2015c), due to pilot inability to recognize approach to stall/spin conditions (NTSB, 2015b & 2016b). In 2013, there were 1,224 GA accidents in the U.S., accounting for 94% of all U.S. aviation accidents and 90% of all U.S. aviation fatalities that year (NTSB, 2015c). Aviation entails multiple challenges for pilots related to task management, procedural errors, perceptual distortions, and cognitive discrepancies. While machine errors in airplanes have continued to decrease over the years, human error still has not (NTSB, 2013). A preliminary analysis of a PC-based, Garmin G1000 flight deck was conducted with 3 professional pilots. Analyses revealed increased task load, opportunities for distraction, confusing perceptual cues, and hindered cognitive performance. Complex usage problems were deeply ingrained in the functionality of the system, forcing pilots to use fallible workarounds, add unnecessary steps, and memorize knob turns or button pushes. Modern computing now has the potential to free GA cockpit designs from knobs, soft keys, or limited display options. Dynamic digital displays might include changes in instrumentation or menu structuring depending on the phase of flight. Airspeed indicators could increase in size to become more salient during landing, simultaneously highlighting pitch angle on attitude indicators and automatically decluttering unnecessary information for landing. Likewise, angle-of-attack indicators demonstrate a great safety and performance advantage for pilots (Duquette & Dorr, 2014; NTSB, 2015b & 2016b), an instrument typically found in military platforms and now in the Icon A5 light-sport aircraft (Icon, 2016). How does the design of pilots' environment, the cockpit, further influence their efficiency and

  4. Analysis of General Accounting Office Bid Protest Decisions on A-76 Studies

    National Research Council Canada - National Science Library

    Russial, Paul

    2003-01-01

    .... Historically, industry has successfully protested a high percentage of A-76 procurements. This thesis examines General Accounting Office A-76 bid protest decisions issued between 5 February 1996 and 23 December 2002...

  5. Generalized Frequency-Domain Correlator for Software GPS Receiver: Preliminary Test Results and Analysis (PREPRINT)

    National Research Council Canada - National Science Library

    Yang, Chun; Miller, Mikel; Nguyen, Thao; Akos, Dennis

    2006-01-01

    .... The use of a GFDC can offer several advantages. First, as a generalization of the FFT-implemented correlation with a block repetitive processing capability, it enables fast acquisition through simultaneous code delay and Doppler frequency search...

  6. The Prussian and American General Staffs: An Analysis of Cross-Cultural Imitation, Innovation, and Adaptation

    Science.gov (United States)

    1981-03-30

    Weimar period, but the Prussian general staff would never again control the military destinies of the German people. Before its defeat and dissolution... Lieutenant (later Brigadier General) Arthur L. Wagner was assigned to the school as an instructor in the Department of Military Art. Wagner, a 1875

  7. Health sciences libraries' subscriptions to journals: expectations of general practice departments and collection-based analysis.

    Science.gov (United States)

    Barreau, David; Bouton, Céline; Renard, Vincent; Fournier, Jean-Pascal

    2018-04-01

    The aims of this study were to (i) assess the expectations of general practice departments regarding health sciences libraries' subscriptions to journals and (ii) describe the current general practice journal collections of health sciences libraries. A cross-sectional survey was distributed electronically to the thirty-five university general practice departments in France. General practice departments were asked to list ten journals to which they expected access via the subscriptions of their health sciences libraries. A ranked reference list of journals was then developed. Access to these journals was assessed through a survey sent to all health sciences libraries in France. Adequacy ratios (access/need) were calculated for each journal. All general practice departments completed the survey. The total reference list included 44 journals. This list was heterogeneous in terms of indexation/impact factor, language of publication, and scope (e.g., patient care, research, or medical education). Among the first 10 journals listed, La Revue Prescrire (96.6%), La Revue du Praticien-Médecine Générale (90.9%), the British Medical Journal (85.0%), Pédagogie Médicale (70.0%), Exercer (69.7%), and the Cochrane Database of Systematic Reviews (62.5%) had the highest adequacy ratios, whereas Family Practice (4.2%), the British Journal of General Practice (16.7%), Médecine (29.4%), and the European Journal of General Practice (33.3%) had the lowest adequacy ratios. General practice departments have heterogeneous expectations in terms of health sciences libraries' subscriptions to journals. It is important for librarians to understand the heterogeneity of these expectations, as well as local priorities, so that journal access meets users' needs.

  8. A General Analysis of the Impact of Digitization in Microwave Correlation Radiometers

    Directory of Open Access Journals (Sweden)

    Hyuk Park

    2011-06-01

    Full Text Available This study provides a general framework to analyze the effects on correlation radiometers of a generic quantization scheme and sampling process. It reviews, unifies and expands several previous works that focused on these effects separately. In addition, it provides a general theoretical background that allows analyzing any digitization scheme including any number of quantization levels, irregular quantization steps, gain compression, clipping, jitter and skew effects of the sampling period.

  9. General analysis for experimental studies of time-reversal-violating effects in slow neutron propagation through polarized matter

    International Nuclear Information System (INIS)

    Lamoreaux, S.K.; Golub, R.

    1994-01-01

    A general technique is developed for the analysis of proposed experimental studies of possible P,T-violating effects in the neutron-nucleus interaction based on low-energy neutron transmission through polarized matter. The analysis is applied to proposed experimental schemes and we determine the levels at which the absolute neutron polarization, magnetic fields, and target polarization must be controlled in order for these experiments to obtain a given sensitivity to P,T-violating effects

  10. Semiquantitative analysis of interictal glucose metabolism between generalized epilepsy and localization related epilepsy

    International Nuclear Information System (INIS)

    Hikima, Akio; Mochizuki, Hiroyuki; Oriuchi, Noboru; Endo, Keigo; Morikawa, Akihiro

    2004-01-01

    Positron emission tomography (PET) with [18F]fluoro-D-deoxyglucose (FDG) has been used to detect seizure foci and to evaluate surgical resection in localization-related epilepsies. However, few investigations have focused on generalized epilepsy in children. To clarify the pathophysiology of generalized epilepsy, we studied 11 patients with generalized epilepsy (excluding West syndrome) and 11 patients with localization-related epilepsy without organic disease. FDG PET was performed by simultaneous emission and transmission scanning. We placed regions of interest (ROI) on the bilateral frontal lobe, parietal lobe, occipital lobe, temporal lobe, basal ganglia, thalamus and cerebellum. Standardized uptake value (SUV) was measured and normalized to the SUV of the ipsilateral cerebellum. We then compared the data from generalized epilepsy with those from localization-related epilepsy. FDG PET revealed significant interictal glucose hypometabolism in the bilateral basal ganglia in generalized epilepsy compared with localization-related epilepsy (right side: p=0.0095, left side: p=0.0256, Mann-Whitney test). No other region showed a significant difference (p>0.05) between the two groups. These findings indicate that the basal ganglia are involved in the generation of generalized seizures or are affected secondarily by the epileptogenicity itself. (author)

  11. Accounting for center in the Early External Cephalic Version trials: an empirical comparison of statistical methods to adjust for center in a multicenter trial with binary outcomes.

    Science.gov (United States)

    Reitsma, Angela; Chu, Rong; Thorpe, Julia; McDonald, Sarah; Thabane, Lehana; Hutton, Eileen

    2014-09-26

    Clustering of outcomes at centers involved in multicenter trials is a type of center effect. The Consolidated Standards of Reporting Trials Statement recommends that multicenter randomized controlled trials (RCTs) should account for center effects in their analysis; however, most do not. The Early External Cephalic Version (EECV) trials published in 2003 and 2011 stratified by center at randomization but did not account for center in the analyses and, due to the nature of the intervention and the number of centers, may have been prone to center effects. Using data from the EECV trials, we undertook an empirical study to compare various statistical approaches to account for center effect while estimating the impact of external cephalic version timing (early or delayed) on the outcomes of cesarean section, preterm birth, and non-cephalic presentation at the time of birth. The data from the EECV pilot trial and the EECV2 trial were merged into one dataset. Fisher's exact method was used to test the overall effect of external cephalic version timing unadjusted for center effects. Seven statistical models that accounted for center effects were applied to the data: i) the Mantel-Haenszel test, ii) logistic regression with fixed center effect and fixed treatment effect, iii) center-size-weighted and iv) unweighted logistic regression with fixed center effect and fixed treatment-by-center interaction, v) logistic regression with random center effect and fixed treatment effect, vi) logistic regression with random center effect and random treatment-by-center interaction, and vii) generalized estimating equations. For each of the three outcomes of interest, the approaches used to account for center effect did not alter the overall findings of the trial. The results were similar for the majority of the methods used to adjust for center, illustrating the robustness of the findings. Despite literature that suggests center effect can change the estimate of effect in
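    As an illustration of the first of these approaches, the sketch below runs a Cochran-Mantel-Haenszel analysis of a binary outcome stratified by center. It is a generic example, not the EECV analysis itself: the 2x2 counts are invented, and it assumes the StratifiedTable interface of the Python statsmodels package (oddsratio_pooled, test_null_odds, test_equal_odds).

    # Hedged sketch: Mantel-Haenszel analysis of a binary outcome across centers.
    # The counts are hypothetical; the statsmodels StratifiedTable API is assumed.
    import numpy as np
    from statsmodels.stats.contingency_tables import StratifiedTable

    # One 2x2 table per center: rows = early vs. delayed ECV,
    # columns = cesarean vs. no cesarean (made-up counts).
    tables = [
        np.array([[12, 38], [18, 32]]),  # center A
        np.array([[ 7, 23], [ 9, 21]]),  # center B
        np.array([[15, 45], [22, 38]]),  # center C
    ]

    st = StratifiedTable(tables)
    print("Pooled (Mantel-Haenszel) odds ratio:", st.oddsratio_pooled)

    cmh = st.test_null_odds(correction=True)   # Cochran-Mantel-Haenszel test
    print("CMH statistic:", cmh.statistic, "p-value:", cmh.pvalue)

    bd = st.test_equal_odds()                  # homogeneity of odds ratios across centers
    print("Breslow-Day statistic:", bd.statistic, "p-value:", bd.pvalue)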

  12. The Study About Physical Activity for Subjects With Prevention of Benign Prostate Hyperplasia

    Directory of Open Access Journals (Sweden)

    Ho Won Lee

    2014-09-01

    Full Text Available Purpose: The number of benign prostatic hyperplasia (BPH) subjects has been increasing worldwide, and many studies have been conducted to determine treatments that can delay drug therapy or surgery. Most of these studies involved physical activity (PA) and associated factors. Therefore, we aimed to determine factors associated with BPH prevalence based on a review of past and present studies and to investigate the effect of a healthy lifestyle as a protective factor against BPH occurrence. Methods: We selected 582 subjects aged ≥40 years from an initial 779 subjects recruited from Gyeonggi, Yangpyeong, South Korea, from August 2009 to August 2011. Trained investigators administered the International Prostate Symptom Score and a demographic questionnaire, including PA and lifestyle items, during face-to-face interviews; they also performed digital rectal examination and rectal ultrasonography and measured prostate-specific antigen levels. The statistical association between PA and BPH was analyzed by logistic regression using multivariable models, with categorical variables assessed by the Cochran-Mantel-Haenszel test and continuous variables by the general linear model. Results: Seven statistically significant variables for PA were selected. Regular exercise, frequency of exercise, sedentary time, nonsedentary time, and leisure-time PA (metabolic equivalent, hr/wk) were not statistically associated with prostate volume, but sedentary time (hr/day) was the only factor that showed a significant association in the multivariable model, including a linear effect relationship. Subjects with lower levels of sedentary time (4.5-7.0 hr/day) had a significantly lower risk of BPH (odds ratio [OR], 0.93; 95% confidence interval [CI], 0.52-1.67) than those with a higher sedentary time (>7 hr/day) (OR, 1.72; 95% CI, 0.96-3.09) (P for trend=0.05). Conclusions: Our study showed that reducing sedentary time could have a protective effect and reduce the

  13. A novel ergodic capacity analysis of diversity combining and multihop transmission systems over generalized composite fading channels

    KAUST Repository

    Yilmaz, Ferkan

    2012-06-01

    Ergodic capacity is an important performance measure associated with reliable communication at the highest rate at which information can be sent over the channel with a negligible probability of error. In the shadow of this definition, diversity receivers (such as selection combining, equal-gain combining and maximal-ratio combining) and transmission techniques (such as cascaded fading channels, amplify-and-forward multihop transmission) are deployed to mitigate various performance-impairing effects such as fading and shadowing in digital radio communication links. However, the exact analysis of ergodic capacity is in general not always possible for all of these forms of diversity receivers and transmission techniques over generalized composite fading environments due to its mathematical intractability. Published papers concerning the exact analysis of ergodic capacity have therefore been scarce in the literature (i.e., only [1] and [2]) compared with those concerning the exact analysis of average symbol error probability. In addition, they essentially target the ergodic capacity of maximal-ratio combining diversity receivers and are not readily applicable to the capacity analysis of other diversity combiners / transmission techniques. In this paper, we propose a novel moment generating function-based approach for the exact ergodic capacity analysis of both diversity receivers and transmission techniques over generalized composite fading environments. As such, we demonstrate how to simultaneously treat the ergodic capacity analysis of all forms of both diversity receivers and multihop transmission techniques. © 2012 IEEE.
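    The core idea behind such an MGF-based treatment can be sketched as follows (a generic outline, not the paper's exact derivation): writing the ergodic capacity as an expectation over the instantaneous SNR and applying a Frullani-type integral identity turns the expectation into an integral of the SNR's MGF.

    % Generic sketch of an MGF-based ergodic capacity expression (illustrative only).
    % With instantaneous SNR \gamma at the combiner output, C = \mathbb{E}[\log_2(1+\gamma)].
    % Using \ln(1+x) = \int_0^\infty \frac{e^{-t} - e^{-(1+x)t}}{t}\,dt and
    % M_\gamma(s) = \mathbb{E}[e^{s\gamma}], this becomes
    %   C = \frac{1}{\ln 2}\int_{0}^{\infty}\frac{e^{-t}}{t}\left(1 - M_{\gamma}(-t)\right)dt,
    % so a tractable MGF of the combiner-output SNR immediately yields the ergodic capacity.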

  14. A more general model for the analysis of the rock slope stability

    Indian Academy of Sciences (India)

    slope stability analysis, the joint surfaces are assumed to be continuous along the potential ... of rock slope stability has many applications in the design of rock slopes, roofs and walls of .... cases the wedge failure analysis can be applied.

  15. Analysis of Cine-Psychometric Visual Memory Data by the Tucker Generalized Learning Curve Method: Final Report.

    Science.gov (United States)

    Reid, J. C.; Seibert, Warren F.

    The analysis of previously obtained data concerning short-term visual memory and cognition by a method suggested by Tucker is proposed. Although interesting individual differences undoubtedly exist in people's ability and capacity to process short-term visual information, studies have not generally examined these differences. In fact, conventional…

  16. The General Factor of Personality: A meta-analysis of Big Five intercorrelations and a criterion-related validity study

    NARCIS (Netherlands)

    van der Linden, D.; te Nijenhuis, J.; Bakker, A.B.

    2010-01-01

    Recently, it has been proposed that a General Factor of Personality (GFP) occupies the top of the hierarchical personality structure. We present a meta-analysis (K = 212, total N = 144,117) on the intercorrelations among the Big Five personality factors (Openness, Conscientiousness, Extraversion,

  17. Engagement and Disengagement between Special and General Educators: An Application of Miles and Huberman's Cross-Case Analysis.

    Science.gov (United States)

    Gersten, Russell; Marks, Susan Unok

    1998-01-01

    This cross-case analysis examined general education elementary-level teacher engagement with and effectiveness of "coaching," or expert consultation by special educators on effective teaching strategies. Factors resulting in high levels of impact on teaching and high levels of engagement were identified, including emphasis on conceptual…

  18. The Essences of Culinary Arts Students' Lived Experience of General Education Online Learning: An Interpretive Phenomenological Analysis

    Science.gov (United States)

    Keovilay, Sisavath

    2015-01-01

    This phenomenological research study explored the lived experiences of culinary arts students learning general education online while enrolled in a face-to-face (f2f) culinary arts class. This research used Interpretative Phenomenological Analysis (IPA) to analyze how culinary arts students, in a not-for-profit Florida University, made sense of…

  19. Full Wave Analysis of Passive Microwave Monolithic Integrated Circuit Devices Using a Generalized Finite Difference Time Domain (GFDTD) Algorithm

    Science.gov (United States)

    Lansing, Faiza S.; Rascoe, Daniel L.

    1993-01-01

    This paper presents a modified Finite-Difference Time-Domain (FDTD) technique using a generalized conformed orthogonal grid. The use of the Conformed Orthogonal Grid, Finite Difference Time Domain (GFDTD) enables the designer to match all the circuit dimensions, hence eliminating a major source of error in the analysis.

  20. GENOVA: a generalized perturbation theory program for various applications to CANDU core physics analysis (II) - a user's manual

    International Nuclear Information System (INIS)

    Kim, Do Heon; Choi, Hang Bok

    2001-03-01

    A user's guide was prepared for GENOVA, a GENeralized perturbation theory (GPT)-based Optimization and uncertainty analysis program for Canada deuterium uranium (CANDU) physics VAriables. The program was developed within the framework of the CANDU physics design and analysis code RFSP. The generalized perturbation method was implemented in GENOVA to estimate the zone controller unit (ZCU) level upon refueling and to calculate various sensitivity coefficients for fuel management studies and uncertainty analyses. This documentation contains descriptions of and directions for the four major modules of GENOVA (ADJOINT, GADJINT, PERTURB, and PERTXS) so that it can be used as a practical guide for GENOVA users. It also includes sample inputs for ZCU level estimation and sensitivity coefficient calculation, which are the main applications of GENOVA. GENOVA can be used as a supplementary tool to the current CANDU physics design code for advanced CANDU core analysis and fuel development

  1. A General Finite Element Scheme for Limit State Analysis and Optimization

    DEFF Research Database (Denmark)

    Damkilde, Lars

    1999-01-01

    Limit state analysis, which is based on perfect material behaviour, is used in many different applications, primarily within structural engineering and geotechnics. The calculation methods have not reached the same level of automation as finite element analysis for elastic structures. The computer-based systems are more ad hoc and are typically not well integrated with the pre- and postprocessors well known from commercial finite element codes. A finite-element-based formulation of limit state analysis is presented which allows easy integration with standard finite element codes for elastic analysis. In this way the user is able to perform a limit state analysis on the same model used for elastic analysis, only adding data for the yield surface. The method is based on the lower-bound theorem and uses stress-based elements with a linearized yield surface. The mathematical problem
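    The lower-bound formulation mentioned above typically leads to a linear programming problem of the following generic form; the notation here is illustrative and not taken from the paper.

    % Generic lower-bound limit analysis problem (illustrative notation):
    %   maximize    \lambda
    %   subject to  H \sigma = \lambda R + R_0      (discrete equilibrium, load factor \lambda)
    %               N^T \sigma \le k                (linearized yield conditions)
    % where \sigma collects the element stress parameters, H is the equilibrium matrix,
    % R is the reference load vector, R_0 collects fixed loads, and each row of N^T
    % represents one plane of the linearized yield surface.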

  2. Psychometrical Assessment and Item Analysis of the General Health Questionnaire in Victims of Terrorism

    Science.gov (United States)

    Delgado-Gomez, David; Lopez-Castroman, Jorge; de Leon-Martinez, Victoria; Baca-Garcia, Enrique; Cabanas-Arrate, Maria Luisa; Sanchez-Gonzalez, Antonio; Aguado, David

    2013-01-01

    There is a need to assess the psychiatric morbidity that appears as a consequence of terrorist attacks. The General Health Questionnaire (GHQ) has been used to this end, but its psychometric properties have never been evaluated in a population affected by terrorism. A sample of 891 participants included 162 direct victims of terrorist attacks and…

  3. A Molecular Analysis of Training Multiple versus Single Manipulations to Establish a Generalized Manipulative Imitation Repertoire

    Science.gov (United States)

    Hartley, Breanne K.

    2009-01-01

    This study evaluates the necessity of training multiple versus single manipulative-imitations per object in order to establish generalized manipulative-imitation. Training took place in Croyden Avenue School's Early Childhood Developmental Delay preschool classroom in Kalamazoo, MI. Two groups of 3 children each were trained to imitate in order to…

  4. Modeling containment of large wildfires using generalized linear mixed-model analysis

    Science.gov (United States)

    Mark Finney; Isaac C. Grenfell; Charles W. McHugh

    2009-01-01

    Billions of dollars are spent annually in the United States to contain large wildland fires, but the factors contributing to suppression success remain poorly understood. We used a regression model (generalized linear mixed-model) to model containment probability of individual fires, assuming that containment was a repeated-measures problem (fixed effect) and...

  5. Replacing Maladaptive Speech with Verbal Labeling Responses: An Analysis of Generalized Responding.

    Science.gov (United States)

    Foxx, R. M.; And Others

    1988-01-01

    Three mentally handicapped students (aged 13, 36, and 40) with maladaptive speech received training to answer questions with verbal labels. The results of their cues-pause-point training showed that the students replaced their maladaptive speech with correct labels (answers) to questions in the training setting and three generalization settings.…

  6. Parent Ratings of ADHD Symptoms: Generalized Partial Credit Model Analysis of Differential Item Functioning across Gender

    Science.gov (United States)

    Gomez, Rapson

    2012-01-01

    Objective: Generalized partial credit model, which is based on item response theory (IRT), was used to test differential item functioning (DIF) for the "Diagnostic and Statistical Manual of Mental Disorders" (4th ed.), inattention (IA), and hyperactivity/impulsivity (HI) symptoms across boys and girls. Method: To accomplish this, parents completed…

  7. Integrating Biology into the General Chemistry Laboratory: Fluorometric Analysis of Chlorophyll "a"

    Science.gov (United States)

    Wesolowski, Meredith C.

    2014-01-01

    A laboratory experiment that introduces fluorometry of chlorophyll "a" at the general chemistry level is described. The use of thin-layer chromatography to isolate chlorophyll "a" from spirulina and leaf matter enables quantification of small amounts of chlorophyll "a" via fluorometry. Student results were reasonably…

  8. Semi-analytical stochastic analysis of the generalized van der Pol system

    Czech Academy of Sciences Publication Activity Database

    Náprstek, Jiří; Fischer, Cyril

    (2018) ISSN 1802-680X R&D Projects: GA ČR(CZ) GA15-01035S Institutional support: RVO:68378297 Keywords: stochastic stability * generalized van der Pol system * stochastic averaging * limit cycles Subject RIV: JM - Building Engineering OBOR OECD: Construction engineering, Municipal and structural engineering https://www.kme.zcu.cz/acm/acm/article/view/407

  9. Regional disaster impact analysis: comparing Input-Output and Computable General Equilibrium models

    NARCIS (Netherlands)

    Koks, E.E.; Carrera, L.; Jonkeren, O.; Aerts, J.C.J.H.; Husby, T.G.; Thissen, M.; Standardi, G.; Mysiak, J.

    2016-01-01

    A variety of models have been applied to assess the economic losses of disasters, of which the most common ones are input-output (IO) and computable general equilibrium (CGE) models. In addition, an increasing number of scholars have developed hybrid approaches: one that combines both or either of

  10. Stability analysis of a general age-dependent vaccination model of a vertically transmitted disease

    International Nuclear Information System (INIS)

    El Doma, M.

    1995-07-01

    An SIR epidemic model of general age-dependent vaccination for a vertically as well as horizontally transmitted disease is investigated when the population is in steady state and the fertility, mortality and removal rates depend on age. We determine the steady states and examine their stabilities. (author). 24 refs

  11. An Analysis of Undergraduate General Chemistry Students' Misconceptions of the Submicroscopic Level of Precipitation Reactions

    Science.gov (United States)

    Kelly, Resa M.; Barrera, Juliet H.; Mohamed, Saheed C.

    2010-01-01

    This study examined how 21 college-level general chemistry students, who had received instruction that emphasized the symbolic level of ionic equations, explained their submicroscopic-level understanding of precipitation reactions. Students' explanations expressed through drawings and semistructured interviews revealed the nature of the…

  12. Awareness, Analysis, and Action: Curricular Alignment for Student Success in General Chemistry

    Science.gov (United States)

    Jewitt, Sarah; Sutphin, Kathy; Gierasch, Tiffany; Hamilton, Pauline; Lilly, Kathleen; Miller, Kristine; Newlin, Donald; Pires, Richard; Sherer, Maureen; LaCourse, William R.

    2018-01-01

    This article examines the ways that a shared faculty experience across five partner institutions led to a deep awareness of the curriculum and pedagogy of general chemistry coursework, and ultimately, to a collaborative action plan for student success. The team identified key differences and similarities in course content and instructional…

  13. Presentation and analysis of a general algorithm for risk-assessment on secondary poisoning

    NARCIS (Netherlands)

    Romijn CAFM; Luttik R; van de Meent D; Slooff W; Canton JH

    1991-01-01

    The study in this report was carried out in the frame of the project "Evaluation system for new chemical substances". The aim of the study was to present a general algorithm for risk-assessment on secondary poisoning of birds and mammals. Risk-assessment on secondary poisoning can be an

  14. Usability of teleshopping systems by young and older adults : General performance, task analysis and subjective evaluation

    NARCIS (Netherlands)

    de Raad, KJE; Dekker, MR; Sikken, J.A.; den Brinker, P.B.L.M.; Beek, PJ; Brand, AN; Maarse, FJ; Mulder, LJM

    1999-01-01

    Older people generally experience more difficulty learning to work with new information technologies than younger people. This may be partly due to age-related impairments of memory and information processing. To determine which aspects of user interfaces pose too high demands on older users, an

  15. Lifestyle counseling in hypertension-related visits: analysis of video-taped general practice visits.

    NARCIS (Netherlands)

    Milder, I.E.J.; Blokstra, A.; Groot, J. de; Dulmen, S. van; Bemelmans, W.J.E.

    2008-01-01

    BACKGROUND: The general practitioner (GP) can play an important role in promoting a healthy lifestyle, which is especially relevant in people with an elevated risk of cardiovascular diseases due to hypertension. Therefore, the aim of this study was to determine the frequency and content of lifestyle

  16. A parametric analysis of prospect theory’s functionals for the general population

    NARCIS (Netherlands)

    Booij, A.S.; van Praag, B.M.S.; van de Kuilen, G.

    2010-01-01

    This article presents the results of an experiment that completely measures the utility function and probability weighting function for different positive and negative monetary outcomes, using a representative sample of N = 1,935 from the general public. The results confirm earlier findings in the

  17. An analysis of acute admissions to a general hospital psychiatric unit

    African Journals Online (AJOL)

    Rapid turnover of patients in a general hospital psychiatric unit demands stabilization and discharge as soon as possible. It is likely that patients are being prematurely discharged because of this pressure. Aim: The study sought to analyse admissions to an acute psychiatric unit with a view to determining the demographic ...

  18. Feasibility of bioelectrical impedance analysis in children with a severe generalized cerebral palsy

    NARCIS (Netherlands)

    R.J.G. Veugelers (Rebekka); C. Penning (Corine); L. van Gulik (Laura); D. Tibboel (Dick); H.M. Evenhuis (Heleen)

    2006-01-01

    Objective: The need is strong for an accurate and easy-to-perform test to evaluate the nutritional state of children who have a severe generalized cerebral palsy, defined as a severe motor handicap and an intellectual disability. For that purpose, we determined the feasibility of

  19. A Measurement Invariance Analysis of the General Self-Efficacy Scale on Two Different Cultures

    Science.gov (United States)

    Teo, Timothy; Kam, Chester

    2014-01-01

    The 10-item General Self-Efficacy Scale (GSES) was developed to assess an individual's beliefs to cope with a variety of situations in life. Despite the GSES being used in numerous research from researchers in different countries and presented in different languages, little is known about the use of its validity in an Asian culture. The aim of the…

  20. Analysis of the System of Racial Quotas in Brazil as Affirmative Action Combined with the General Right to Equality

    Directory of Open Access Journals (Sweden)

    Alisson Magela Moreira Damasceno

    2016-10-01

    Full Text Available This article discusses the system of racial quotas in Brazil, considering them as unequal treatment that aims to promote equality. In light of the thought of Alexy, it proposes to analyze the General Equality Law and then to justify the reasons for unequal treatment by the state. Such unequal treatment is addressed from the perspective of affirmative action: measures promoted in order to bring about the social redress of segments of society that have historically been segregated. Thus, this study proposes an analysis of the general right to equality in the construction and application of the law.

  1. International Comparisons through Simultaneous and Conjunct Analysis: A Search for General Relationships across Countries.

    Science.gov (United States)

    Lietz, Petra

    1996-01-01

    The six chapters of this theme issue explore data collected by the International Association for the Evaluation of Educational Achievement to investigate crucial issues in reading comprehension. Simultaneous analysis and conjunct analysis are used to examine models as structured combinations of factors in the search for relationships among…

  2. Interpreted consultations as 'business as usual'? An analysis of organisational routines in general practices.

    Science.gov (United States)

    Greenhalgh, Trisha; Voisey, Christopher; Robb, Nadia

    2007-09-01

    UK general practices operate in an environment of high linguistic diversity, because of recent large-scale immigration and of the NHS's commitment to provide a professional interpreter to any patient if needed. Much activity in general practice is co-ordinated and patterned into organisational routines (defined as repeated patterns of interdependent actions, involving multiple actors, bound by rules and customs) that tend to be stable and to persist. If we want to understand how general practices are responding to pressures to develop new routines, such as interpreted consultations, we need to understand how existing organisational routines change. This will then help us to address a second question, which is how the interpreted consultation itself is being enacted and changing as it becomes routinised (or not) in everyday general practice. In seeking answers to these two questions, we undertook a qualitative study of narratives of interpreted primary care consultations in three London boroughs with large minority ethnic populations. In 69 individual interviews and two focus groups, we sought accounts of interpreted consultations from service users, professional interpreters, family member interpreters, general practitioners, practice nurses, receptionists, and practice managers. We asked participants to tell us both positive and negative stories of their experiences. We analysed these data by searching for instances of concepts relating to the organisational routine, the meaning of the interpreted consultation to the practice, and the sociology of medical work. Our findings identified a number of general properties of the interpreted consultation as an organisational routine, including the wide variation in the form of adoption, the stability of the routine, the adaptability of the routine, and the strength of the routine. Our second key finding was that this variation could be partly explained by characteristics of the practice as an organisation, especially

  3. The evolution of nursing in Australian general practice: a comparative analysis of workforce surveys ten years on.

    Science.gov (United States)

    Halcomb, Elizabeth J; Salamonson, Yenna; Davidson, Patricia M; Kaur, Rajneesh; Young, Samantha Am

    2014-03-25

    Nursing in Australian general practice has grown rapidly over the last decade in response to government initiatives to strengthen primary care. There are limited data about how this expansion has impacted on the nursing role, scope of practice and workforce characteristics. This study aimed to describe the current demographic and employment characteristics of Australian nurses working in general practice and explore trends in their role over time. In the nascence of the expansion of the role of nurses in Australian general practice (2003-2004) a national survey was undertaken to describe nurse demographics, clinical roles and competencies. This survey was repeated in 2009-2010 and comparative analysis of the datasets undertaken to explore workforce changes over time. Two hundred eighty four nurses employed in general practice completed the first survey (2003/04) and 235 completed the second survey (2009/10). Significantly more participants in Study 2 were undertaking follow-up of pathology results, physical assessment and disease specific health education. There was also a statistically significant increase in the participants who felt that further education/training would augment their confidence in all clinical tasks. Although reported barriers to the nurses' role in general practice decreased between the two time points, more participants perceived lack of space, job descriptions, confidence to negotiate with general practitioners and personal desire to enhance their role as barriers. Access to education and training as a facilitator to nursing role expansion increased between the two studies. The level of optimism of participants for the future of the nurses' role in general practice was slightly decreased over time. This study has identified that some of the structural barriers to nursing in Australian general practice have been addressed over time. However, it also identifies continuing barriers that impact practice nurse role development. Understanding and addressing these issues is vital

  4. The response analysis of fractional-order stochastic system via generalized cell mapping method.

    Science.gov (United States)

    Wang, Liang; Xue, Lili; Sun, Chunyan; Yue, Xiaole; Xu, Wei

    2018-01-01

    This paper is concerned with the response of a fractional-order stochastic system. The short memory principle is introduced to ensure that the response of the system is a Markov process. The generalized cell mapping method is applied to display the global dynamics of the noise-free system, such as attractors, basins of attraction, basin boundary, saddle, and invariant manifolds. The stochastic generalized cell mapping method is employed to obtain the evolutionary process of probability density functions of the response. The fractional-order ϕ^6 oscillator and the fractional-order smooth and discontinuous oscillator are taken as examples to give the implementations of our strategies. Studies have shown that the evolutionary direction of the probability density function of the fractional-order stochastic system is consistent with the unstable manifold. The effectiveness of the method is confirmed using Monte Carlo results.

  5. Symmetry Analysis of Gauge-Invariant Field Equations via a Generalized Harrison-Estabrook Formalism.

    Science.gov (United States)

    Papachristou, Costas J.

    The Harrison-Estabrook formalism for the study of invariance groups of partial differential equations is generalized and extended to equations that define, through their solutions, sections on vector bundles of various kinds. Applications include the Dirac, Yang-Mills, and self-dual Yang-Mills (SDYM) equations. The latter case exhibits interesting connections between the internal symmetries of SDYM and the existence of integrability characteristics such as a linear ("inverse scattering") system and Backlund transformations (BT's). By "verticalizing" the generators of coordinate point transformations of SDYM, nine nonlocal, generalized (as opposed to local, point) symmetries are constructed. The observation is made that the prolongations of these symmetries are parametric BT's for SDYM. It is thus concluded that the entire point group of SDYM contributes, upon verticalization, BT's to the system.

  6. A general framework for global asymptotic stability analysis of delayed neural networks based on LMI approach

    International Nuclear Information System (INIS)

    Cao Jinde; Ho, Daniel W.C.

    2005-01-01

    In this paper, global asymptotic stability is discussed for neural networks with time-varying delay. Several new criteria in matrix inequality form are given to ascertain the uniqueness and global asymptotic stability of the equilibrium point for neural networks with time-varying delay, based on the Lyapunov method and the Linear Matrix Inequality (LMI) technique. The proposed LMI approach has the advantage of accounting for the difference between neuronal excitatory and inhibitory effects, and it is computationally efficient as it can be solved numerically using recently developed interior-point algorithms. In addition, the proposed results generalize and improve on previous works. The obtained criteria also combine two existing conditions into one generalized condition in matrix form. An illustrative example is given to demonstrate the effectiveness of the proposed results
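    To give a flavour of how such LMI criteria arise, the block below sketches a standard textbook analogue for a linear delay system; it is not one of the paper's criteria for delayed neural networks, and the notation is illustrative.

    % Textbook analogue (not the paper's criterion): for the linear delay system
    %   \dot{x}(t) = A x(t) + A_d x(t - \tau),
    % the Lyapunov-Krasovskii functional
    %   V(x_t) = x(t)^T P x(t) + \int_{t-\tau}^{t} x(s)^T Q x(s)\, ds
    % satisfies \dot{V} < 0 along nonzero trajectories whenever P > 0, Q > 0 and
    %   [ A^T P + P A + Q    P A_d ]
    %   [ A_d^T P              -Q  ]  < 0   (negative definite as a block matrix),
    % a delay-independent LMI of exactly the kind interior-point solvers check numerically.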

  7. Has the general two-Higgs-doublet model unnatural FCNC suppression? An RGE analysis

    International Nuclear Information System (INIS)

    Cvetic, G.; Hwang, S.S.; Kim, C.S.

    1997-01-01

    There is a widespread belief that the general two-Higgs-doublet model (G2HDM) behaves unnaturally with respect to evolution of the flavor-changing neutral Yukawa coupling parameters (FCNYCP's) - i.e., that the latter, although being suppressed at low energies of probes, in general increase by a large factor as the energy of probes increases. We investigate this, by evolving Yukawa parameters by one-loop renormalization group equations and neglecting contributions of the first quark generation. For patterns of FCNYCP suppression at low energies suggested by existing quark mass hierarchies, FCNYCP's remain remarkably stable (suppressed) up to energies very close to the Landau pole. This indicates that G2HDM preserves FCNYCP suppression, for reasonably chosen patterns of that suppression at low energies. (author)

  8. The Generalized Roy Model and the Cost-Benefit Analysis of Social Programs*

    Science.gov (United States)

    Eisenhauer, Philipp; Heckman, James J.; Vytlacil, Edward

    2015-01-01

    The literature on treatment effects focuses on gross benefits from program participation. We extend this literature by developing conditions under which it is possible to identify parameters measuring the cost and net surplus from program participation. Using the generalized Roy model, we nonparametrically identify the cost, benefit, and net surplus of selection into treatment without requiring the analyst to have direct information on the cost. We apply our methodology to estimate the gross benefit and net surplus of attending college. PMID:26709315

  9. The Generalized Roy Model and the Cost-Benefit Analysis of Social Programs.

    Science.gov (United States)

    Eisenhauer, Philipp; Heckman, James J; Vytlacil, Edward

    2015-04-01

    The literature on treatment effects focuses on gross benefits from program participation. We extend this literature by developing conditions under which it is possible to identify parameters measuring the cost and net surplus from program participation. Using the generalized Roy model, we nonparametrically identify the cost, benefit, and net surplus of selection into treatment without requiring the analyst to have direct information on the cost. We apply our methodology to estimate the gross benefit and net surplus of attending college.

  10. Reforming the Canadian Sales Tax System: A Regional General Equilibrium Analysis

    OpenAIRE

    CHUN-YAN KUO; BOB HAMILTON

    1991-01-01

    The paper develops a regional general equilibrium model of the Canadian economy to analyze the sectoral and regional impacts of the major changes to the Canadian sales tax system. The results indicate that replacing the federal sales tax with the goods and services tax increases real output in Canada in the long run by 1.4 percent. If the provincial sales taxes are also integrated, real output increases by a further 0.8 percent.

  11. A NSQIP Analysis of MELD and Perioperative Outcomes in General Surgery.

    Science.gov (United States)

    Zielsdorf, Shannon M; Kubasiak, John C; Janssen, Imke; Myers, Jonathan A; Luu, Minh B

    2015-08-01

    It is well known that liver disease has an adverse effect on postoperative outcomes. However, what is still unknown is how to appropriately risk stratify this patient population based on the degree of liver failure. Because data are limited, specifically in general surgery practice, we analyzed the Model for End-Stage Liver Disease (MELD) in terms of predicting postoperative complications after one of three general surgery operations: inguinal hernia repair (IHR), umbilical hernia repair (UHR), and colon resection (CRXN). National Surgical Quality Improvement Program data on 17,812 total patients undergoing one of three general surgery operations from 2008 to 2012 were analyzed retrospectively. There were 7402 patients undergoing IHR; 5014 patients undergoing UHR; 5396 patients undergoing CRXN. The MELD score was calculated using international normalized ratio, total bilirubin, and creatinine. The primary end point was any postoperative complication. The statistical method used was logistic regression. For IHR, UHR, and CRXN, the overall complication rates were 3.4, 6.4, and 45.9 per cent, respectively. The mean MELD scores were 8.6, 8.5, and 8.5, respectively. For every 1-point increase above the mean MELD score, there was a 7.8, 13.8, and 11.6 per cent increase in any postoperative complication. The overall 30-day mortality rate was 0.9 per cent. In conclusion, the MELD score continuum adequately predicts patients' increased risk of postoperative complications after IHR, UHR, and CRXN. Therefore, MELD could be used for preoperative risk stratification and to guide clinical decision making for general surgery in the cirrhotic patient.
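    For reference, the sketch below implements the classic (non-sodium) MELD formula computed from the same three laboratory values; the clamping conventions shown are the commonly used ones and may differ in detail from the study's exact implementation.

    # Sketch of the standard (pre-2016, non-sodium) MELD formula; additional UNOS
    # conventions (e.g., dialysis rules) used in practice are not shown here.
    import math

    def meld_score(bilirubin_mg_dl: float, inr: float, creatinine_mg_dl: float) -> float:
        """Model for End-Stage Liver Disease score from three laboratory values."""
        # Values below 1.0 are raised to 1.0 so the logarithms stay non-negative,
        # and creatinine is commonly capped at 4.0 mg/dL.
        bili = max(bilirubin_mg_dl, 1.0)
        inr_c = max(inr, 1.0)
        crea = min(max(creatinine_mg_dl, 1.0), 4.0)
        return (3.78 * math.log(bili)
                + 11.2 * math.log(inr_c)
                + 9.57 * math.log(crea)
                + 6.43)

    # Example: roughly normal labs give a score of about 7.5, in the vicinity of the
    # cohort means (~8.5) reported above.
    print(round(meld_score(1.0, 1.1, 1.0), 1))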

  12. Aspects of analysis of small-sample right censored data using generalized Wilcoxon rank tests

    OpenAIRE

    Öhman, Marie-Louise

    1994-01-01

    In a Monte Carlo simulation study, we compare the estimated bias and variance of commonly applied and jackknife variance estimators, and the observed significance level and power of the standardised generalized Wilcoxon linear rank sum tests of Gehan and Prentice. The variance estimators are the permutational, the conditional permutational and the jackknife variance estimators of the test statistic of Gehan, and the asymptotic and the jackknife variance estimator...

  13. Impacts of climate change for Swiss winter and summer tourism: a general equilibrium analysis

    OpenAIRE

    Thurm, Boris; Vielle, Marc; Vöhringer, Frank

    2017-01-01

    Tourism could be greatly affected by climate change due to its strong dependence on weather. In Switzerland, the sector represents an appreciable share of the economy. Thus, studying climate effects on tourism is necessary for developing adequate adaptation strategies. While most of the studies focused on winter tourism, we investigate the climate change impacts on both winter and summer tourism in Switzerland. Using a computable general equilibrium (CGE) model, we simulate the impacts of tem...

  14. Discharge Fee Policy Analysis: A Computable General Equilibrium (CGE) Model of Water Resources and Water Environments

    OpenAIRE

    Guohua Fang; Ting Wang; Xinyi Si; Xin Wen; Yu Liu

    2016-01-01

    To alleviate increasingly serious water pollution and shortages in developing countries, various kinds of policies have been implemented by local governments. It is vital to quantify and evaluate the performance and potential economic impacts of these policies. This study develops a Computable General Equilibrium (CGE) model to simulate the regional economic and environmental effects of discharge fees. Firstly, water resources and water environment factors are separated from the input and out...

  15. Domestic Environmental Policy and International Factor Mobility: A General Equilibrium Analysis

    OpenAIRE

    Stefan Felder; Reto Schleiniger

    1995-01-01

    This paper studies the conditions under which a green tax reform not only benefits the environment but also enhances the efficiency of the tax system. The focus is on the consequences of international factor mobility for the scope of a double dividend. The investigation of the double-dividend claim is based on a general equilibrium model of a stylised small open economy. The simulations of equal-yield tax reform scenarios indicate that an environmental tax on consumption yields a double divid...

  16. A sketch of feeling generalization : a cognitive-existential analysis of psychology

    OpenAIRE

    Zubriki, Tadeus Andrew

    2010-01-01

    Following a review of the literature regarding autobiographical memories, retrieval-induced forgetting, and emotion relative to memory, a theory is devised to address the questions: How do we conceptualize; what does it mean to conceptualize; and how is memory retrieval possible? Feeling generalization is a universal system of thinking which postulates that that which we conceive and retrieve is feeling, upon which we conceptualize into conceptions in accord with the moment of arousal...

  17. Analysis on the Implementation of Nutrition Services in Tugurejo General Hospital Semarang

    OpenAIRE

    Dewi, Emy Shinta; Kartasurya, Martha Irene; Sriatmi, Ayun

    2015-01-01

    Nutrition is an important factor in patient care and recovery. Results of an evaluation by the nutritional research and development unit of Tugurejo district general hospital (RSUD) in 2011 indicated that patients' food remains were still below the minimum standard of service. The objective of this study was to analyze the implementation of nutritional services in RSUD Tugurejo Semarang. This was a qualitative study with 4 nutritionists, 8 cook assistants, and 8 waitresses as main informants. Triangulat...

  18. Analysis of a general age-dependent vaccination model for a vertically transmitted disease

    International Nuclear Information System (INIS)

    El Doma, M.

    1995-05-01

    An SIR epidemic model of general age-dependent vaccination for a vertically as well as horizontally transmitted disease is investigated when the total population is time dependent and fertility, mortality and removal rates depend on age. We establish the existence and uniqueness of the solution and obtain its asymptotic behaviour. For the steady-state solution, a critical vaccination coverage which will eventually eradicate the disease is determined. (author). 18 refs

  19. The 2013 general elections in Malaysia: An analysis of online news portals

    OpenAIRE

    Kasim, Azahar; Mohd Sani, Mohd Azizuddin

    2016-01-01

    This research analyzed the coverage of online news portals during the election campaign in Malaysia's 13th General Election on 5th May 2013. There were two types of news portals chosen for this research: 1) the mainstream online news portals, namely The Star Online, Berita Harian Online, Bernama Online and Utusan Online; and 2) the alternative news portals consisting of political parties' publications: the Harakah Daily, Roketkini and Keadilan Daily; and the independent news portals of The Ma...

  20. Message Embedded Chaotic Masking Synchronization Scheme Based on the Generalized Lorenz System and Its Security Analysis

    Czech Academy of Sciences Publication Activity Database

    Čelikovský, Sergej; Lynnyk, Volodymyr

    2016-01-01

    Roč. 26, č. 8 (2016), 1650140-1-1650140-15 ISSN 0218-1274 R&D Projects: GA ČR GA13-20433S Institutional support: RVO:67985556 Keywords: Chaotic masking * generalized Lorenz system * message embedded synchronization Subject RIV: BC - Control Systems Theory Impact factor: 1.329, year: 2016 http://library.utia.cas.cz/separaty/2016/TR/celikovsky-0461536.pdf

  1. Statistical power analysis a simple and general model for traditional and modern hypothesis tests

    CERN Document Server

    Murphy, Kevin R; Wolach, Allen

    2014-01-01

    Noted for its accessible approach, this text applies the latest approaches of power analysis to both null hypothesis and minimum-effect testing using the same basic unified model. Through the use of a few simple procedures and examples, the authors show readers with little expertise in statistical analysis how to obtain the values needed to carry out the power analysis for their research. Illustrations of how these analyses work and how they can be used to choose the appropriate criterion for defining statistically significant outcomes are sprinkled throughout. The book presents a simple and g

  2. ABOUT THE SYSTEM ANALYSIS OF UNEMPLOYMENT OF YOUTH: GENERAL TASKS AND PRIVATE MODELS OF MARKET INDICATORS

    Directory of Open Access Journals (Sweden)

    Natalia V. Kontsevaya

    2016-01-01

    Full Text Available In this work, an attempt at a systems approach to the analysis of the youth labor market is made: the place and role of the youth labor exchange are defined, opportunities and methods of state regulation are outlined, and contradictions in the analysis of the main market indicators are identified. Within this systems approach to the analysis of the dynamics of market processes, the modeling of the main labor market indicators at the regional scale is shown. This approach can be useful when developing effective and economically sound mechanisms for youth employment, both at the level of regional employment services and at the national scale

  3. Cause analysis, prevention, and treatment of postoperative restlessness after general anesthesia in children with cleft palate.

    Science.gov (United States)

    Xu, Hao; Mei, Xiao-Peng; Xu, Li-Xian

    2017-03-01

    Cleft palate is one of the most common congenital malformations of the oral and maxillofacial region, with an incidence rate of around 0.1%. Early surgical repair is the only method for treatment of a cleft lip and palate. However, because of the use of inhalation anesthesia in children and the physiological characteristics of the cleft palate itself combined with the particularities of cleft palate surgery, the incidence rate of postoperative emergence agitation (EA) in cleft palate surgery is significantly higher than in other types of interventions. The exact mechanism of EA is still unclear. Although restlessness after general anesthesia in children with cleft palate is self-limiting, its effects should be considered by clinicians. In this paper, the related literature on restlessness after surgery involving general anesthesia in recent years is summarized. This paper focuses on induction factors as well as prevention and treatment of postoperative restlessness in children with cleft palate after general anesthesia. The corresponding countermeasures to guide clinical practice are also presented in this paper.

  4. Analysis of dental caries using generalized linear and count regression models

    Directory of Open Access Journals (Sweden)

    Javali M. Phil

    2013-11-01

    Full Text Available Generalized linear models (GLM) are generalizations of linear regression models that allow regression models to be fitted to response data following a general exponential family, and they are widely used across the sciences, especially the medical and dental sciences. They form a flexible class of models that can accommodate a variety of response variables. Count data are frequently characterized by overdispersion and excess zeros. Zero-inflated count models provide a parsimonious yet powerful way to model this type of situation. Such models assume that the data are a mixture of two separate data-generating processes: one generates only zeros, and the other is either a Poisson or a negative binomial data-generating process. Zero-inflated count regression models such as the zero-inflated Poisson (ZIP) and zero-inflated negative binomial (ZINB) regression models have been used to handle dental caries count data with many zeros. We present an evaluation framework for the suitability of applying the GLM, Poisson, NB, ZIP and ZINB models to a dental caries data set where the count data may exhibit many zeros and over-dispersion. Estimation of the model parameters by the method of maximum likelihood is provided. Based on the Vuong test statistic and the goodness-of-fit measure for the dental caries data, the NB and ZINB regression models perform better than the other count regression models.
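    For completeness, the zero-inflated Poisson mixture described above has the following standard probability mass function, with \pi the zero-inflation probability and \lambda the Poisson mean:

    % Zero-inflated Poisson (ZIP):
    %   P(Y = 0) = \pi + (1 - \pi)\, e^{-\lambda}
    %   P(Y = y) = (1 - \pi)\, \frac{e^{-\lambda} \lambda^{y}}{y!},   y = 1, 2, \dots
    % In the regression setting, \log \lambda and \operatorname{logit}(\pi) are modeled as
    % linear functions of covariates; the ZINB model replaces the Poisson component with a
    % negative binomial to allow additional overdispersion.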

  5. Prevalence of traumatic brain injury in the general adult population: a meta-analysis.

    Science.gov (United States)

    Frost, R Brock; Farrer, Thomas J; Primosch, Mark; Hedges, Dawson W

    2013-01-01

    Traumatic brain injury (TBI) is a significant public-health concern. To understand the extent of TBI, it is important to assess the prevalence of TBI in the general population. However, the prevalence of TBI in the general population can be difficult to measure because of differing definitions of TBI, differing TBI severity levels, and underreporting of sport-related TBI. Additionally, prevalence reports vary from study to study. In the present study, we used meta-analytic methods to estimate the prevalence of TBI in the adult general population. Across 15 studies, all originating from developed countries, which included 25,134 adults, 12% had a history of TBI. Men had more than twice the odds of having had a TBI than did women, suggesting that male gender is a risk factor for TBI. The adverse behavioral, cognitive and psychiatric effects associated with TBI coupled with the high prevalence of TBI identified in this study indicate that TBI is a considerable public and personal-health problem. Copyright © 2012 S. Karger AG, Basel.

  6. Quasi-likelihood generalized linear regression analysis of fatality risk data.

    Science.gov (United States)

    2009-01-01

    Transportation-related fatality risk is a function of many interacting human, vehicle, and environmental factors. Statistically valid analysis of such data is challenged both by the complexity of plausible structural models relating fatality rates t...

  7. A Bivariate Generalized Linear Item Response Theory Modeling Framework to the Analysis of Responses and Response Times.

    Science.gov (United States)

    Molenaar, Dylan; Tuerlinckx, Francis; van der Maas, Han L J

    2015-01-01

    A generalized linear modeling framework for the analysis of responses and response times is outlined. In this framework, referred to as bivariate generalized linear item response theory (B-GLIRT), separate generalized linear measurement models are specified for the responses and the response times that are subsequently linked by cross-relations. The cross-relations can take various forms. Here, we focus on cross-relations with a linear or interaction term for ability tests, and cross-relations with a curvilinear term for personality tests. In addition, we discuss how popular existing models from the psychometric literature are special cases in the B-GLIRT framework depending on restrictions in the cross-relation. This allows us to compare existing models conceptually and empirically. We discuss various extensions of the traditional models motivated by practical problems. We also illustrate the applicability of our approach using various real data examples, including data on personality and cognitive ability.
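    A minimal sketch of the kind of specification this framework describes is given below; the particular parameterization and symbols are illustrative assumptions, not necessarily those used by the authors.

    % Illustrative B-GLIRT-style specification (notation assumed, not the authors'):
    %   responses:       \operatorname{logit} P(X_{pi} = 1 \mid \theta_p) = \alpha_i (\theta_p - \beta_i)
    %   response times:  \log T_{pi} = \nu_i - \tau_p + f(\theta_p) + \varepsilon_{pi},
    %                    \varepsilon_{pi} \sim N(0, \sigma_i^2)
    %   cross-relation:  f(\theta_p) = \rho\, \theta_p                      (linear, ability tests)
    %                    f(\theta_p) = \rho_1 \theta_p + \rho_2 \theta_p^2  (curvilinear, personality tests)
    % The cross-relation f is what links the response and response-time measurement models.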

  8. The surgical experience of general surgery residents: an analysis of the applicability of the specialty program in General and Digestive Surgery.

    Science.gov (United States)

    Targarona Soler, Eduardo Ma; Jover Navalon, Jose Ma; Gutierrez Saiz, Javier; Turrado Rodríguez, Víctor; Parrilla Paricio, Pascual

    2015-03-01

    Residents in our country have achieved a homogenous surgical training by following a structured residency program. This is due to the existence of specific training programs for each specialty. The current program, approved in 2007, has a detailed list of procedures that a surgeon should have performed in order to complete training. The aim of this study is to analyze the applicability of the program with regard to the number of procedures performed during the residency period. A data collection form was designed that included the list of procedures from the program of the specialty; it was sent in April 2014 to all hospitals with accredited residency programs. In September 2014 the forms were analysed, and a general descriptive study was performed; a subanalysis according to the resident's sex and Autonomous region was also performed. The number of procedures performed according to the number of residents in the different centers was also analyzed. The survey was sent to 117 hospitals with accredited programs, which included 190 resident places. A total of 91 hospitals responded (53%). The training offered adapts in general to the specialty program. The total number of procedures performed in the different sub-areas, in laparoscopic and emergency surgery is correct or above the number recommended by the program, with the exception of esophageal-gastric and hepatobiliary surgery. The sub-analysis according to Autonomous region did not show any significant differences in the total number of procedures, however, there were significant differences in endocrine surgery (P=.001) and breast surgery (P=.042). A total of 55% of residents are female, with no significant differences in distribution in Autonomous regions. However, female surgeons operate more than their male counterparts during the residency period (512±226 vs. 625±244; P<.01). The number of residents in the hospital correlates with the number of procedures performed; the residents with more procedures

  9. DATE analysis: A general theory of biological change applied to microarray data.

    Science.gov (United States)

    Rasnick, David

    2009-01-01

    In contrast to conventional data mining, which searches for specific subsets of genes (extensive variables) to correlate with specific phenotypes, DATE analysis correlates intensive state variables calculated from the same datasets. At the heart of DATE analysis are two biological equations of state not dependent on genetic pathways. This result distinguishes DATE analysis from other bioinformatics approaches. The dimensionless state variable F quantifies the relative overall cellular activity of test cells compared to well-chosen reference cells. The variable pi(i) is the fold-change in the expression of the ith gene of test cells relative to reference. It is the fraction phi of the genome undergoing differential expression, not the magnitude pi, that controls biological change. The state variable phi is equivalent to the control strength of metabolic control analysis. For tractability, DATE analysis assumes a linear system of enzyme-connected networks and exploits the small average contribution of each cellular component. This approach was validated by reproducible values of the state variables F, RNA index, and phi calculated from random subsets of transcript microarray data. Using published microarray data, F, RNA index, and phi were correlated with: (1) the blood-feeding cycle of the malaria parasite, (2) embryonic development of the fruit fly, (3) temperature adaptation of killifish, (4) exponential growth of cultured S. pneumoniae, and (5) human cancers. DATE analysis was applied to aCGH data from the great apes. A good example of the power of DATE analysis is its application to genomically unstable cancers, which have been refractory to data mining strategies. 2009 American Institute of Chemical Engineers Biotechnol.
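
    The abstract defines phi only loosely as the fraction of the genome undergoing differential expression. Purely for orientation, the sketch below computes that single quantity from fold-changes using an assumed cutoff; the cutoff value and the omission of the other state variables (F and the RNA index) are our simplifications, not part of the published method.

    ```python
    import numpy as np

    def fraction_differentially_expressed(test, reference, cutoff=1.5):
        """phi: fraction of genes whose fold-change relative to the reference
        exceeds an (assumed) cutoff in either direction."""
        fold_change = np.asarray(test, dtype=float) / np.asarray(reference, dtype=float)  # pi(i)
        changed = (fold_change >= cutoff) | (fold_change <= 1.0 / cutoff)
        return changed.mean()

    # Toy example with five genes measured against a flat reference
    print(fraction_differentially_expressed([2.0, 1.1, 0.4, 1.0, 3.2], [1.0] * 5))  # 0.6
    ```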

  10. The 100 most-cited papers in general thoracic surgery: A bibliography analysis.

    Science.gov (United States)

    Ding, Hongdou; Song, Xiao; Chen, Linsong; Zheng, Xinlin; Jiang, Gening

    2018-05-01

    The status of citations can reflect the impact of a paper and its contribution to surgical practice. The aim of our study was to identify and review the 100 most-cited papers in general thoracic surgery. Relevant papers on general thoracic surgery were searched through Thomson Reuters Web of Science in the last week of November 2017. Results were returned in descending order of total citations, and their titles and abstracts were reviewed independently by two thoracic surgeons to identify whether they met our inclusion criteria. Characteristics of the first 100 papers, including title, journal name, country, first author, year of publication, total citations, citations in the latest 5 years and average citations per year (ACY), were extracted and analyzed. The mean number of citations of the 100 papers was 322, with a range from 184 to 921. The papers appeared in 19 journals between 1956 and 2012: Annals of Surgery had the largest number (29), followed by the Journal of Thoracic and Cardiovascular Surgery (22) and the Annals of Thoracic Surgery (21). The majority of the papers were published in the 2000s (48) and originated from the United States of America (62). There were 65 retrospective studies, 13 RCTs and 11 prospective studies. Orringer MB and Grillo HC each contributed 4 first-author articles. There were 53 papers on the esophagus, 36 on the lung, 6 on the pleura and 5 on the trachea. Our study identified the most-cited papers of the past several decades and offers insight into the development and advances of general thoracic surgery; it can help us understand the evidential basis of clinical decision-making in the area today. Copyright © 2018. Published by Elsevier Ltd.

  11. Development of a sensitivity analysis system for nuclear reactors through first-order generalized perturbation theory in 2D geometries

    International Nuclear Information System (INIS)

    Garcia, Juan Matias

    2005-01-01

    Perturbation methods are a powerful tool for sensitivity analysis and have many applications in nuclear engineering. As an introduction to this kind of analysis, we developed a program that applies the Generalized Perturbation Theory (GPT) method to two-dimensional systems of rectangular geometry. We first consider a homogeneous system of non-multiplying material and then a heterogeneous system with a region of multiplying material, with the intention of making concrete applications of the perturbation method to nuclear engineering problems. The program, called Pert, determines neutron fluxes and importance functions by applying multigroup diffusion theory, and also solves the integrals required to calculate sensitivity coefficients. Using these perturbation methods we verified the low computational cost required by this kind of analysis and the simplicity of the equation systems involved, which allows elaborate sensitivity analyses of the responses of interest.
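
    As a worked reminder of the quantities such a code evaluates, the standard first-order GPT expression for a fixed-source problem can be written as follows; the notation is generic and not necessarily that of this report. For a response $R = \langle \Sigma_d, \varphi \rangle$ with flux equation $A\varphi = S$ and generalized importance function $\Gamma^{\dagger}$ satisfying $A^{\dagger}\Gamma^{\dagger} = \Sigma_d$, a perturbation of a parameter $p$ gives, to first order,

    \[
      \frac{dR}{dp}
      = \Big\langle \frac{\partial \Sigma_d}{\partial p},\, \varphi \Big\rangle
      + \Big\langle \Gamma^{\dagger},\, \frac{\partial S}{\partial p} - \frac{\partial A}{\partial p}\,\varphi \Big\rangle ,
    \]

    so the program needs the flux, the importance function, and the bracketed phase-space integrals, which are exactly the ingredients listed in the abstract.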

  12. Analysis of positron lifetime spectra using quantified maximum entropy and a general linear filter

    International Nuclear Information System (INIS)

    Shukla, A.; Peter, M.; Hoffmann, L.

    1993-01-01

    Two new approaches are used to analyze positron annihilation lifetime spectra. A general linear filter is designed to filter the noise from lifetime data. The quantified maximum entropy method is used to solve the inverse problem of finding the lifetimes and intensities present in data. We determine optimal values of parameters needed for fitting using Bayesian methods. Estimates of errors are provided. We present results on simulated and experimental data with extensive tests to show the utility of this method and compare it with other existing methods. (orig.)
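
    To make the inverse problem concrete, the sketch below simulates an idealized two-component lifetime spectrum and recovers the lifetimes and intensities by nonlinear least squares. It is a simplified stand-in only: the instrument resolution function, the noise filter and the quantified maximum entropy machinery of the paper are deliberately omitted.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def two_component_spectrum(t, i1, tau1, i2, tau2):
        """Idealized lifetime spectrum: two exponential decay components,
        ignoring the instrument resolution function and background."""
        return i1 / tau1 * np.exp(-t / tau1) + i2 / tau2 * np.exp(-t / tau2)

    # Simulate a noisy spectrum and recover the lifetimes and intensities
    t = np.linspace(0.0, 5.0, 400)              # time axis in ns
    rng = np.random.default_rng(0)
    truth = (0.7, 0.2, 0.3, 1.8)                # intensities sum to 1, lifetimes in ns
    counts = two_component_spectrum(t, *truth) + rng.normal(0.0, 0.01, t.size)

    popt, _ = curve_fit(two_component_spectrum, t, counts, p0=(0.5, 0.3, 0.5, 1.0))
    print(popt)   # should be close to the simulated intensities and lifetimes
    ```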

  13. MGF approach to the capacity analysis of Generalized Two-Ray fading models

    KAUST Repository

    Rao, Milind

    2015-09-11

    We propose a class of Generalized Two-Ray (GTR) fading channels that consists of two line of sight (LOS) components with random phase and a diffuse component. Observing that the GTR fading model can be expressed in terms of the underlying Rician distribution, we derive a closed-form expression for the moment generating function (MGF) of the signal-to-noise ratio (SNR) of this model. We then employ this approach to compute the ergodic capacity with receiver side information. The impact of the underlying phase difference between the LOS components on the average SNR of the signal received is also illustrated. © 2015 IEEE.
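
    The step from an SNR MGF to ergodic capacity can be illustrated numerically. The sketch below uses the standard identity E[ln(1+SNR)] = int_0^inf exp(-z) * (1 - M(z)) / z dz with M(z) = E[exp(-z*SNR)], and, purely as a placeholder, plugs in the Rayleigh-fading MGF rather than the GTR expression derived in the paper.

    ```python
    import numpy as np
    from scipy.integrate import quad

    def ergodic_capacity_from_mgf(mgf, tol=1e-10):
        """Ergodic capacity in bit/s/Hz from the SNR MGF M(z) = E[exp(-z*SNR)],
        via E[ln(1+SNR)] = int_0^inf exp(-z) * (1 - M(z)) / z dz."""
        integrand = lambda z: np.exp(-z) * (1.0 - mgf(z)) / z
        nats, _ = quad(integrand, 0.0, np.inf, epsabs=tol, epsrel=tol)
        return nats / np.log(2.0)

    # Placeholder MGF: Rayleigh fading with an average SNR of 10 dB
    avg_snr = 10.0 ** (10.0 / 10.0)
    rayleigh_mgf = lambda z: 1.0 / (1.0 + avg_snr * z)
    print(ergodic_capacity_from_mgf(rayleigh_mgf))   # about 2.9 bit/s/Hz at 10 dB
    ```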

  14. Analysis of random point images with the use of symbolic computation codes and generalized Catalan numbers

    Science.gov (United States)

    Reznik, A. L.; Tuzikov, A. V.; Solov'ev, A. A.; Torgov, A. V.

    2016-11-01

    Original codes and combinatorial-geometrical computational schemes are presented, which are developed and applied for finding exact analytical formulas that describe the probability of errorless readout of random point images recorded by a scanning aperture with a limited number of threshold levels. Combinatorial problems encountered in the course of the study and associated with the new generalization of Catalan numbers are formulated and solved. An attempt is made to find the explicit analytical form of these numbers, which is, on the one hand, a necessary stage of solving the basic research problem and, on the other hand, an independent self-consistent problem.
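
    The abstract does not spell out the particular generalization of Catalan numbers involved. For orientation only, the classical Catalan numbers and the convolution recurrence that such generalizations typically extend can be computed as follows.

    ```python
    def catalan(n_max):
        """Classical Catalan numbers via C_0 = 1 and C_{n+1} = sum_i C_i * C_{n-i}."""
        c = [1]
        for n in range(n_max):
            c.append(sum(c[i] * c[n - i] for i in range(n + 1)))
        return c

    print(catalan(8))   # [1, 1, 2, 5, 14, 42, 132, 429, 1430]
    ```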

  15. Cross-channel analysis of quark and gluon generalized parton distributions with helicity flip

    International Nuclear Information System (INIS)

    Pire, B.; Semenov-Tian-Shansky, K.; Szymanowski, L.; Wallon, S.

    2014-01-01

    Quark and gluon helicity flip generalized parton distributions (GPDs) address the transversity quark and gluon structure of the nucleon. In order to construct a theoretically consistent parametrization of these hadronic matrix elements, we work out the set of combinations of those GPDs suitable for the SO(3) partial wave (PW) expansion in the cross-channel. This universal result will help to build up a flexible parametrization of these important hadronic non-perturbative quantities, using, for instance, the approaches based on the conformal PW expansion of GPDs such as the Mellin-Barnes integral or the dual parametrization techniques. (orig.)

  16. Cross-channel analysis of quark and gluon generalized parton distributions with helicity flip

    Energy Technology Data Exchange (ETDEWEB)

    Pire, B. [CNRS, CPhT, Ecole Polytechnique, Palaiseau (France); Semenov-Tian-Shansky, K. [Universite de Liege, IFPA, Departement AGO, Liege (Belgium); Szymanowski, L. [National Centre for Nuclear Research (NCBJ), Warsaw (Poland); Wallon, S. [Universite de Paris-Sud, CNRS, LPT, Orsay (France); Universite Paris 06, Faculte de Physique, UPMC, Paris (France)

    2014-05-15

    Quark and gluon helicity flip generalized parton distributions (GPDs) address the transversity quark and gluon structure of the nucleon. In order to construct a theoretically consistent parametrization of these hadronic matrix elements, we work out the set of combinations of those GPDs suitable for the SO(3) partial wave (PW) expansion in the cross-channel. This universal result will help to build up a flexible parametrization of these important hadronic non-perturbative quantities, using, for instance, the approaches based on the conformal PW expansion of GPDs such as the Mellin-Barnes integral or the dual parametrization techniques. (orig.)

  17. Random matrix analysis of the QCD sign problem for general topology

    International Nuclear Information System (INIS)

    Bloch, Jacques; Wettig, Tilo

    2009-01-01

    Motivated by the important role played by the phase of the fermion determinant in the investigation of the sign problem in lattice QCD at nonzero baryon density, we derive an analytical formula for the average phase factor of the fermion determinant for general topology in the microscopic limit of chiral random matrix theory at nonzero chemical potential, for both the quenched and the unquenched case. The formula is a nontrivial extension of the expression for zero topology derived earlier by Splittorff and Verbaarschot. Our analytical predictions are verified by detailed numerical random matrix simulations of the quenched theory.

  18. MGF approach to the capacity analysis of Generalized Two-Ray fading models

    KAUST Repository

    Rao, Milind; Lopez-Martinez, F. Javier; Alouini, Mohamed-Slim; Goldsmith, Andrea

    2015-01-01

    We propose a class of Generalized Two-Ray (GTR) fading channels that consists of two line of sight (LOS) components with random phase and a diffuse component. Observing that the GTR fading model can be expressed in terms of the underlying Rician distribution, we derive a closed-form expression for the moment generating function (MGF) of the signal-to-noise ratio (SNR) of this model. We then employ this approach to compute the ergodic capacity with receiver side information. The impact of the underlying phase difference between the LOS components on the average SNR of the signal received is also illustrated. © 2015 IEEE.

  19. GELATIO: a general framework for modular digital analysis of high-purity Ge detector signals

    International Nuclear Information System (INIS)

    Agostini, M; Pandola, L; Zavarise, P; Volynets, O

    2011-01-01

    GELATIO is a new software framework for advanced data analysis and digital signal processing developed for the GERDA neutrinoless double beta decay experiment. The framework is tailored to handle the full analysis flow of signals recorded by high purity Ge detectors and photo-multipliers from the veto counters. It is designed to support a multi-channel modular and flexible analysis, widely customizable by the user either via human-readable initialization files or via a graphical interface. The framework organizes the data into a multi-level structure, from the raw data up to the condensed analysis parameters, and includes tools and utilities to handle the data stream between the different levels. GELATIO is implemented in C++. It relies upon ROOT and its extension TAM, which provides compatibility with PROOF, enabling the software to run in parallel on clusters of computers or many-core machines. It was tested on different platforms and benchmarked in several GERDA-related applications. A stable version is presently available for the GERDA Collaboration and it is used to provide the reference analysis of the experiment data.

  20. GELATIO: a general framework for modular digital analysis of high-purity Ge detector signals

    Science.gov (United States)

    Agostini, M.; Pandola, L.; Zavarise, P.; Volynets, O.

    2011-08-01

    GELATIO is a new software framework for advanced data analysis and digital signal processing developed for the GERDA neutrinoless double beta decay experiment. The framework is tailored to handle the full analysis flow of signals recorded by high purity Ge detectors and photo-multipliers from the veto counters. It is designed to support a multi-channel modular and flexible analysis, widely customizable by the user either via human-readable initialization files or via a graphical interface. The framework organizes the data into a multi-level structure, from the raw data up to the condensed analysis parameters, and includes tools and utilities to handle the data stream between the different levels. GELATIO is implemented in C++. It relies upon ROOT and its extension TAM, which provides compatibility with PROOF, enabling the software to run in parallel on clusters of computers or many-core machines. It was tested on different platforms and benchmarked in several GERDA-related applications. A stable version is presently available for the GERDA Collaboration and it is used to provide the reference analysis of the experiment data.

  1. Food advertising in the age of obesity: content analysis of food advertising on general market and African American television.

    Science.gov (United States)

    Henderson, Vani R; Kelly, Bridget

    2005-01-01

    To document the types of foods advertised and weight-related nutritional claims made during advertisements appearing on general market and African American television programming. Content analysis of 553 food advertisements appearing during 101.5 prime-time television hours. Advertisements were classified according to general category (fast-food restaurant, sit-down restaurant, packaged food), specific food type, and the presence of a weight-related nutritional claim. The type of foods advertised and nutritional claims made on general market and African American programs were compared using t and chi-squared tests. More food advertisements appeared during African American programs than general market programs. These advertisements were more likely to be for fast food, candy, soda, or meat and less likely to be for cereals, grains and pasta, fruits and vegetables, dessert, or alcohol. Of all of the food advertisements, 14.9% made a weight-related nutritional claim. More claims related to fat content appeared during African American programming, whereas more light and lean claims appeared in general market advertisements. Practitioners and policy makers should be aware of the prevalence of food advertisements and their potential impact on knowledge and behavior and should consider working more closely with food manufacturers to encourage the creation and promotion of weight-friendly foods. Meanwhile, nutrition educators can help by teaching consumers critical thinking skills as may relate to food advertisements.

  2. Generalized Hyperalgesia in Children and Adults Diagnosed With Hypermobility Syndrome and Ehlers-Danlos Syndrome Hypermobility Type: A Discriminative Analysis.

    Science.gov (United States)

    Scheper, M C; Pacey, V; Rombaut, L; Adams, R D; Tofts, L; Calders, P; Nicholson, L L; Engelbert, R H H

    2017-03-01

    Lowered pressure-pain thresholds have been demonstrated in adults with Ehlers-Danlos syndrome hypermobility type (EDS-HT), but whether these findings are also present in children is unclear. Therefore, the objectives of the study were to determine whether generalized hyperalgesia is present in children with hypermobility syndrome (HMS)/EDS-HT, explore potential differences in pressure-pain thresholds between children and adults with HMS/EDS-HT, and determine the discriminative value of generalized hyperalgesia. Patients were classified in 1 of 3 groups: HMS/EDS-HT, hypermobile (Beighton score ≥4 of 9), and healthy controls. Descriptive data of age, sex, body mass index, Beighton score, skin laxity, and medication usage were collected. Generalized hyperalgesia was quantified by the average pressure-pain thresholds collected from 12 locations. Confounders collected were pain locations/intensity, fatigue, and psychological distress. Comparisons between children with HMS/EDS-HT and normative values, between children and adults with HMS/EDS-HT, and corrected confounders were analyzed with multivariate analysis of covariance. The discriminative value of generalized hyperalgesia employed to differentiate between HMS/EDS-HT, hypermobility, and controls was quantified with logistic regression. Significantly lower pressure-pain thresholds were found in children with HMS/EDS-HT compared to normative values (range -22.0% to -59.0%; P ≤ 0.05). When applying a threshold of 30.8 N/cm² for males and 29.0 N/cm² for females, the presence of generalized hyperalgesia discriminated between individuals with HMS/EDS-HT, hypermobility, and healthy controls (odds ratio 6.0). Children and adults with HMS/EDS-HT are characterized by hypermobility, chronic pain, and generalized hyperalgesia. The presence of generalized hyperalgesia may indicate involvement of the central nervous system in the development of chronic pain. © 2016, American College of Rheumatology.

  3. Meta-Analysis of Effect Sizes Reported at Multiple Time Points Using General Linear Mixed Model

    Science.gov (United States)

    Musekiwa, Alfred; Manda, Samuel O. M.; Mwambi, Henry G.; Chen, Ding-Geng

    2016-01-01

    Meta-analysis of longitudinal studies combines effect sizes measured at pre-determined time points. The most common approach involves performing separate univariate meta-analyses at individual time points. This simplistic approach ignores dependence between longitudinal effect sizes, which might result in less precise parameter estimates. In this paper, we show how to conduct a meta-analysis of longitudinal effect sizes, contrasting different covariance structures for the dependence between effect sizes, both within and between studies. We propose new combinations of covariance structures for the dependence between effect sizes and illustrate them with a practical example involving a meta-analysis of 17 trials comparing postoperative treatments for a type of cancer, where survival is measured at 6, 12, 18 and 24 months post randomization. Although the results from this particular data set show the benefit of accounting for within-study serial correlation between effect sizes, simulations are required to confirm these results. PMID:27798661
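
    A minimal sketch of the multivariate pooling step that such a joint analysis rests on is given below, assuming a fixed-effect model, known within-study covariance matrices of the effect sizes across time points, and a common mean per time point. This is generic generalized least squares, not the authors' full mixed-model specification.

    ```python
    import numpy as np

    def multivariate_meta_fixed(effects, covariances):
        """GLS pooling of per-study effect-size vectors (one entry per time point).

        effects:      list of length-T arrays, one per study
        covariances:  list of TxT within-study covariance matrices (assumed known)
        Returns the pooled length-T mean vector and its covariance matrix.
        """
        T = len(effects[0])
        xtv_x = np.zeros((T, T))
        xtv_y = np.zeros(T)
        for y, v in zip(effects, covariances):
            v_inv = np.linalg.inv(v)
            xtv_x += v_inv            # the per-study design matrix is the identity
            xtv_y += v_inv @ y
        pooled_cov = np.linalg.inv(xtv_x)
        return pooled_cov @ xtv_y, pooled_cov

    # Two toy studies with effects at two time points and correlated errors
    y1, y2 = np.array([0.30, 0.25]), np.array([0.40, 0.35])
    v = np.array([[0.04, 0.02], [0.02, 0.05]])
    print(multivariate_meta_fixed([y1, y2], [v, v]))
    ```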

  4. General Staining and Segmentation Procedures for High Content Imaging and Analysis.

    Science.gov (United States)

    Chambers, Kevin M; Mandavilli, Bhaskar S; Dolman, Nick J; Janes, Michael S

    2018-01-01

    Automated quantitative fluorescence microscopy, also known as high content imaging (HCI), is a rapidly growing analytical approach in cell biology. Because automated image analysis relies heavily on robust demarcation of cells and subcellular regions, reliable methods for labeling cells are a critical component of the HCI workflow. Labeling of cells for image segmentation is typically performed with fluorescent probes that bind DNA, for nuclear-based cell demarcation, or with probes that react with proteins, for image analysis based on whole-cell staining. These reagents, along with instrument and software settings, play an important role in the successful segmentation of cells in a population for automated and quantitative image analysis. In this chapter, we describe standard procedures for labeling and image segmentation in both live and fixed cell samples. The chapter also provides troubleshooting guidelines for some of the common problems associated with these aspects of HCI.
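
    As an illustration of what nuclear-based segmentation typically looks like in software, the sketch below thresholds a DNA-stain channel and labels the resulting objects. scikit-image and SciPy are used here purely for illustration; the chapter does not prescribe a particular toolkit, and the smoothing and size parameters are assumptions.

    ```python
    import numpy as np
    from scipy import ndimage as ndi
    from skimage import filters, measure

    def segment_nuclei(dna_channel, min_area=50):
        """Label nuclei in a DNA-stain image: smooth, Otsu-threshold,
        fill holes, then discard labeled objects below a size cutoff."""
        smoothed = filters.gaussian(dna_channel, sigma=2)
        mask = smoothed > filters.threshold_otsu(smoothed)
        mask = ndi.binary_fill_holes(mask)
        labels = measure.label(mask)
        for region in measure.regionprops(labels):
            if region.area < min_area:
                labels[labels == region.label] = 0   # drop debris-sized objects
        return labels

    # labels = segment_nuclei(dapi_image)   # 'dapi_image' is a placeholder 2-D array;
    #                                        # each detected nucleus gets one integer label
    ```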

  5. Generalization of proposed tendon friction correlation and its application to PCCV structural analysis

    International Nuclear Information System (INIS)

    Kashiwase, Takako; Nagasaka, Hideo

    2000-01-01

    The present paper extends the tendon friction coefficient correlation proposed in an earlier paper as a function of loading-end load and circumferential angle. The extended correlation further includes the effects of the number of strands in contact with the sheath, the tendon diameter, politicization of the tendon and the local tendon curvature. The validity of the correlation was confirmed against several sets of published measured data. A structural analysis of the middle cylinder part of a 1/4-scale PCCV (Prestressed Concrete Containment Vessel) model was conducted using the present friction coefficient correlation. The results were compared with an analysis using a constant friction coefficient, focusing on the tendon tension force distribution. (author)

  6. A dynamic general equilibrium analysis on fostering a hydrogen economy in Korea

    International Nuclear Information System (INIS)

    Bae, Jeong Hwan; Cho, Gyeong-Lyeob

    2010-01-01

    Hydrogen is anticipated to become one of the major alternative energy technologies for a sustainable energy system. This study analyzes the dynamic economic impacts of building a hydrogen economy in Korea employing a dynamic Computable General Equilibrium (CGE) model. As a frontier technology, hydrogen is featured as having a slow diffusion rate due to option value, positive externality, resistance of old technology, and complementary vintages. Without government intervention, hydrogen-derived energy will supply up to 6.5% of final energy demand by 2040. Simulation outcomes show that as price subsidy rates increase by 10%, 20%, and 30%, hydrogen demand will increase by 9.2%, 15.2%, and 37.7%, respectively, of final energy demand by 2040. The output of the transportation sector will increase significantly, while demands for oil and electricity will decline. Demands for coal and LNG will experience little change. Household consumption will decline because of the increase of income taxes. Overall GDP will increase because of the increase in exports and investments. CO2 emission will decline for medium and high subsidy rate cases, but increase for low subsidy cases. Ultimately, subsidy policy on hydrogen will not be an effective measure for mitigating CO2 emission in Korea when considering dynamic general equilibrium effects. (author)

  7. Bifurcation analysis on a generalized recurrent neural network with two interconnected three-neuron components

    International Nuclear Information System (INIS)

    Hajihosseini, Amirhossein; Maleki, Farzaneh; Rokni Lamooki, Gholam Reza

    2011-01-01

    Highlights: → We construct a recurrent neural network by generalizing a specific n-neuron network. → Several codimension 1 and 2 bifurcations take place in the newly constructed network. → The newly constructed network has higher capabilities to learn periodic signals. → The normal form theorem is applied to investigate dynamics of the network. → A series of bifurcation diagrams is given to support theoretical results. - Abstract: A class of recurrent neural networks is constructed by generalizing a specific class of n-neuron networks. It is shown that the newly constructed network experiences generic pitchfork and Hopf codimension one bifurcations. It is also proved that the emergence of generic Bogdanov-Takens, pitchfork-Hopf and Hopf-Hopf codimension two, and the degenerate Bogdanov-Takens bifurcation points in the parameter space is possible due to the intersections of codimension one bifurcation curves. The occurrence of bifurcations of higher codimensions significantly increases the capability of the newly constructed recurrent neural network to learn broader families of periodic signals.
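
    For readers who want to reproduce the flavor of such a bifurcation analysis numerically, the sketch below tracks the Jacobian eigenvalues of a generic three-neuron recurrent network dx/dt = -x + s*W*tanh(x) at the origin and reports where a complex eigenvalue pair crosses the imaginary axis, i.e. a Hopf point. The network, the weight matrix and the bifurcation parameter s are illustrative assumptions, not the construction used in the paper.

    ```python
    import numpy as np

    # Toy recurrent network dx/dt = -x + s * W @ tanh(x); the origin is a fixed
    # point, its Jacobian there is -I + s * W, and a Hopf bifurcation occurs where
    # a complex eigenvalue pair of that Jacobian crosses the imaginary axis.
    W = np.array([[1.0, -2.0, 0.3],
                  [2.0,  1.0, 0.3],
                  [0.0,  0.0, 0.5]])

    previous = None
    for s in np.linspace(0.1, 3.0, 300):
        eig = np.linalg.eigvals(-np.eye(3) + s * W)
        max_re = max(e.real for e in eig if abs(e.imag) > 1e-9)   # complex pair only
        if previous is not None and previous < 0.0 <= max_re:
            print(f"complex pair crosses the imaginary axis near s = {s:.3f}")   # ~1.0
            break
        previous = max_re
    ```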

  8. Analysis of railroad tank car releases using a generalized binomial model.

    Science.gov (United States)

    Liu, Xiang; Hong, Yili

    2015-11-01

    The United States is experiencing an unprecedented boom in shale oil production, leading to a dramatic growth in petroleum crude oil traffic by rail. In 2014, U.S. railroads carried over 500,000 tank carloads of petroleum crude oil, up from 9500 in 2008 (a 5300% increase). In light of continual growth in crude oil by rail, there is an urgent national need to manage this emerging risk. This need has been underscored in the wake of several recent crude oil release incidents. In contrast to highway transport, which usually involves a tank trailer, a crude oil train can carry a large number of tank cars, having the potential for a large, multiple-tank-car release incident. Previous studies exclusively assumed that railroad tank car releases in the same train accident are mutually independent, thereby estimating the number of tank cars releasing given the total number of tank cars derailed based on a binomial model. This paper specifically accounts for dependent tank car releases within a train accident. We estimate the number of tank cars releasing given the number of tank cars derailed based on a generalized binomial model. The generalized binomial model provides a significantly better description for the empirical tank car accident data through our numerical case study. This research aims to provide a new methodology and new insights regarding the further development of risk management strategies for improving railroad crude oil transportation safety. Copyright © 2015 Elsevier Ltd. All rights reserved.
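
    One common way to relax the independence assumption criticized here is to let the per-car release probability vary from accident to accident, which yields a beta-binomial ("generalized binomial") distribution for the number of releasing cars. The sketch below uses that form purely for illustration; it is not necessarily the exact generalized binomial model estimated in the paper, and the example probabilities are made up.

    ```python
    import numpy as np
    from scipy import stats

    def release_distribution(n_derailed, p_mean, rho):
        """Distribution of the number of releasing cars among n derailed cars,
        with mean release probability p_mean and intra-accident correlation rho
        (rho = 0 recovers the ordinary binomial model)."""
        if rho == 0:
            return stats.binom(n_derailed, p_mean)
        a = p_mean * (1.0 - rho) / rho
        b = (1.0 - p_mean) * (1.0 - rho) / rho
        return stats.betabinom(n_derailed, a, b)

    # Example: 10 derailed tank cars with a 20% mean release probability
    print(release_distribution(10, 0.2, 0.0).pmf(range(4)))   # independent releases
    print(release_distribution(10, 0.2, 0.3).pmf(range(4)))   # positively correlated releases
    ```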

  9. Threshold Estimation of Generalized Pareto Distribution Based on Akaike Information Criterion for Accurate Reliability Analysis

    International Nuclear Information System (INIS)

    Kang, Seunghoon; Lim, Woochul; Cho, Su-gil; Park, Sanghyun; Lee, Tae Hee; Lee, Minuk; Choi, Jong-su; Hong, Sup

    2015-01-01

    In order to perform estimations with high reliability, it is necessary to deal with the tail part of the cumulative distribution function (CDF) in greater detail compared to an overall CDF. The use of a generalized Pareto distribution (GPD) to model the tail part of a CDF is receiving more research attention with the goal of performing estimations with high reliability. Current studies on GPDs focus on ways to determine the appropriate number of sample points and their parameters. However, even if a proper estimation is made, it can be inaccurate as a result of an incorrect threshold value. Therefore, in this paper, a GPD based on the Akaike information criterion (AIC) is proposed to improve the accuracy of the tail model. The proposed method determines an accurate threshold value using the AIC with the overall samples before estimating the GPD over the threshold. To validate the accuracy of the method, its reliability is compared with that obtained using a general GPD model with an empirical CDF
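
    A minimal sketch of the threshold-by-AIC idea is given below: a GPD is fitted to the exceedances over each candidate threshold and the threshold with the smallest AIC is retained. The candidate grid, the maximum-likelihood fit, and the (rough) comparison of AIC values across different exceedance sets are our assumptions for illustration, not details taken from the paper.

    ```python
    import numpy as np
    from scipy import stats

    def select_gpd_threshold(sample, candidate_thresholds):
        """Fit a GPD to the exceedances over each candidate threshold and
        return the threshold (and fit) with the smallest AIC = 2k - 2*lnL."""
        best = None
        for u in candidate_thresholds:
            exceed = sample[sample > u] - u
            if exceed.size < 30:                        # too few points to fit a tail
                continue
            shape, loc, scale = stats.genpareto.fit(exceed, floc=0.0)
            loglik = np.sum(stats.genpareto.logpdf(exceed, shape, loc, scale))
            aic = 2 * 2 - 2 * loglik                    # k = 2 free parameters (shape, scale)
            if best is None or aic < best[0]:
                best = (aic, u, shape, scale)
        return best

    rng = np.random.default_rng(1)
    data = rng.lognormal(mean=0.0, sigma=1.0, size=5000)
    print(select_gpd_threshold(data, np.quantile(data, [0.80, 0.85, 0.90, 0.95])))
    ```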

  10. Threshold Estimation of Generalized Pareto Distribution Based on Akaike Information Criterion for Accurate Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Seunghoon; Lim, Woochul; Cho, Su-gil; Park, Sanghyun; Lee, Tae Hee [Hanyang University, Seoul (Korea, Republic of); Lee, Minuk; Choi, Jong-su; Hong, Sup [Korea Research Insitute of Ships and Ocean Engineering, Daejeon (Korea, Republic of)

    2015-02-15

    In order to perform estimations with high reliability, it is necessary to deal with the tail part of the cumulative distribution function (CDF) in greater detail compared to an overall CDF. The use of a generalized Pareto distribution (GPD) to model the tail part of a CDF is receiving more research attention with the goal of performing estimations with high reliability. Current studies on GPDs focus on ways to determine the appropriate number of sample points and their parameters. However, even if a proper estimation is made, it can be inaccurate as a result of an incorrect threshold value. Therefore, in this paper, a GPD based on the Akaike information criterion (AIC) is proposed to improve the accuracy of the tail model. The proposed method determines an accurate threshold value using the AIC with the overall samples before estimating the GPD over the threshold. To validate the accuracy of the method, its reliability is compared with that obtained using a general GPD model with an empirical CDF.

  11. How to improve mental health competency in general practice training?--a SWOT analysis.

    Science.gov (United States)

    van Marwijk, Harm

    2004-06-01

    It is quite evident there is room for improvement in the primary care management of common mental health problems. Patients respond positively when GPs adopt a more proactive role in this respect. The Dutch general practice curriculum is currently being renewed. The topics discussed here include the Strengths, Weaknesses, Opportunities and Threats (SWOT) of present primary mental healthcare teaching. What works well and what needs improving? Integrated teaching packages are needed to help general practice trainees manage various presentations of psychological distress. Such packages comprise training videotapes, in which models such as problem-solving treatment (PST) are demonstrated, as well as roleplaying material for new skills, self-report questionnaires for patients, and small-group video feedback of consultations. While GP trainees can effectively master such skills, it is important to query the level of proficiency required by registrars. Are these skills of use only to connoisseur GPs, or to all? More room for specialisation and differentiation among trainees may be the way forward. We have just developed a new curriculum for the obligatory three-month psychiatry housemanship. It is competency oriented, self-directed and assignment driven. This new curriculum will be evaluated in due course.

  12. Contributions to sensitivity analysis and generalized discriminant analysis; Contributions a l'analyse de sensibilite et a l'analyse discriminante generalisee

    Energy Technology Data Exchange (ETDEWEB)

    Jacques, J

    2005-12-15

    Two topics are studied in this thesis: sensitivity analysis and generalized discriminant analysis. Global sensitivity analysis of a mathematical model studies how the output variables of the model react to variations of its inputs. Variance-based methods quantify the share of the variance of the model response that is due to each input variable and to each subset of input variables. The first subject of this thesis is the impact of model uncertainty on the results of a sensitivity analysis. Two particular forms of uncertainty are studied: that due to a change of the reference model, and that due to the use of a simplified model in place of the reference model. A second problem studied in this thesis is that of models with correlated inputs. Since classical sensitivity indices lose their interpretability in the presence of correlated inputs, we propose a multidimensional approach consisting in expressing the sensitivity of the model output to groups of correlated variables. Applications in the field of nuclear engineering illustrate this work. Generalized discriminant analysis consists in classifying the individuals of a test sample into groups, using information contained in a training sample, when these two samples do not come from the same population. This work extends existing methods from a Gaussian context to the case of binary data. An application in public health illustrates the utility of the generalized discrimination models thus defined. (author)
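
    For background on the variance-based indices referred to here, the sketch below estimates first-order Sobol indices with the classical pick-freeze Monte Carlo scheme on a toy model. This is generic textbook material for independent inputs, not the thesis' own estimator, and it does not address the correlated-input case the thesis is concerned with.

    ```python
    import numpy as np

    def first_order_sobol(model, n_inputs, n_samples=100_000, seed=None):
        """Monte Carlo estimate of first-order Sobol indices
        S_i = Var(E[Y | X_i]) / Var(Y) using the pick-freeze scheme."""
        rng = np.random.default_rng(seed)
        a = rng.uniform(size=(n_samples, n_inputs))
        b = rng.uniform(size=(n_samples, n_inputs))
        ya, yb = model(a), model(b)
        var_y = np.var(np.concatenate([ya, yb]))
        indices = []
        for i in range(n_inputs):
            ab_i = a.copy()
            ab_i[:, i] = b[:, i]                        # A with its i-th column taken from B
            y_ab = model(ab_i)
            indices.append(np.mean(yb * (y_ab - ya)) / var_y)   # Saltelli-style estimator
        return np.array(indices)

    # Toy model: Y = X1 + 2*X2 with independent uniform inputs (X3 is inert)
    model = lambda x: x[:, 0] + 2.0 * x[:, 1]
    print(first_order_sobol(model, n_inputs=3, seed=0))   # roughly [0.2, 0.8, 0.0]
    ```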

  13. Contributions to sensitivity analysis and generalized discriminant analysis; Contributions a l'analyse de sensibilite et a l'analyse discriminante generalisee

    Energy Technology Data Exchange (ETDEWEB)

    Jacques, J

    2005-12-15

    Two topics are studied in this thesis: sensitivity analysis and generalized discriminant analysis. Global sensitivity analysis of a mathematical model studies how the output variables of the model react to variations of its inputs. Variance-based methods quantify the share of the variance of the model response that is due to each input variable and to each subset of input variables. The first subject of this thesis is the impact of model uncertainty on the results of a sensitivity analysis. Two particular forms of uncertainty are studied: that due to a change of the reference model, and that due to the use of a simplified model in place of the reference model. A second problem studied in this thesis is that of models with correlated inputs. Since classical sensitivity indices lose their interpretability in the presence of correlated inputs, we propose a multidimensional approach consisting in expressing the sensitivity of the model output to groups of correlated variables. Applications in the field of nuclear engineering illustrate this work. Generalized discriminant analysis consists in classifying the individuals of a test sample into groups, using information contained in a training sample, when these two samples do not come from the same population. This work extends existing methods from a Gaussian context to the case of binary data. An application in public health illustrates the utility of the generalized discrimination models thus defined. (author)

  14. General aspects of design and vessel nozzle analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Back, N.

    1980-01-01

    Aspects of design and a procedure for nozzle stress analysis are presented for loads under design, normal and abnormal, emergency, faulted and test conditions. For each condition, considerations are given on the stress calculation methods, the classification of stresses into the corresponding categories and the comparison with allowable limits according to the applicable codes. (M.C.K.)

  15. Error Ratio Analysis: Alternate Mathematics Assessment for General and Special Educators.

    Science.gov (United States)

    Miller, James H.; Carr, Sonya C.

    1997-01-01

    Eighty-seven elementary students in grades four, five, and six were administered a 30-item multiplication instrument to assess performance in computation across grade levels. An interpretation of student performance using error ratio analysis is provided and the use of this method with groups of students for instructional decision making is…

  16. A unified MGF-based capacity analysis of diversity combiners over generalized fading channels

    KAUST Repository

    Yilmaz, Ferkan; Alouini, Mohamed-Slim

    2012-01-01

    Unified exact ergodic capacity results for L-branch coherent diversity combiners including equal-gain combining (EGC) and maximal-ratio combining (MRC) are not known. This paper develops a novel generic framework for the capacity analysis of L-branch

  17. Sleep disordered breathing analysis in a general population using standard pulse oximeter signals.

    Science.gov (United States)

    Barak-Shinar, Deganit; Amos, Yariv; Bogan, Richard K

    2013-09-01

    Obstructive sleep apnea, reported as the apnea-hypopnea index (AHI), is usually measured in sleep laboratories using a high number of electrodes connected to the patient's body. In this study, we examined the use of a standard pulse oximeter system with an automated analysis based on the photoplethysmograph (PPG) signal for the diagnosis of sleep disordered breathing. Using a standard and simple device with high accuracy might provide a convenient diagnostic or screening solution for patient evaluation at home or in other out-of-center testing environments. The study included 140 consecutive patients who were referred routinely to a sleep laboratory [SleepMed Inc.] for the diagnosis of sleep disordered breathing. Each patient underwent an overnight polysomnography (PSG) study according to AASM guidelines in an AASM-accredited sleep laboratory. The automatic analysis is based on the photoplethysmographic and saturation signals only; those two signals were recorded for the entire night as part of the full overnight PSG sleep study. The AHI calculated from the PPG analysis was compared to the AHI obtained from the manually scored, gold standard full PSG. The AHI and total respiratory events measured by the pulse oximeter analysis correlated very well with the corresponding results obtained by the gold standard full PSG. The sensitivity and specificity for detecting AHI ≥ 5 and AHI ≥ 15 are both above 90%. The sensitivity and positive predictive value for the detection of respiratory events are both above 84%. The tested system yielded acceptable detection of sleep disordered breathing compared to the gold standard PSG in patients with moderate to severe sleep apnea. Accordingly, and given the convenience and simplicity of a standard pulse oximeter device, the new system can be considered suitable for home and ambulatory diagnosis or screening of patients with sleep disordered breathing.

  18. Tourism Contribution to Poverty Alleviation in Kenya: A Dynamic Computable General Equilibrium Analysis

    Science.gov (United States)

    Njoya, Eric Tchouamou; Seetaram, Neelu

    2017-01-01

    The aim of this article is to investigate the claim that tourism development can be an engine for poverty reduction in Kenya, using a dynamic, microsimulation computable general equilibrium model. The article improves on common practice in the literature by using the more comprehensive Foster-Greer-Thorbecke (FGT) index to measure poverty instead of headcount ratios only. The simulation results confirm previous findings that expansion of the tourism industry will benefit different sectors unevenly and will only marginally improve the poverty headcount, mainly because of the contraction of the agricultural sector caused by the appreciation of the real exchange rate. This article demonstrates that the effect on the poverty gap and poverty severity is, nevertheless, significant for both rural and urban areas, with a higher impact in urban areas. Tourism expansion enables poorer households to move closer to the poverty line. It is concluded that the tourism industry is pro-poor. PMID:29595836

  19. Social incidence and economic costs of carbon limits; A computable general equilibrium analysis for Switzerland

    Energy Technology Data Exchange (ETDEWEB)

    Stephan, G.; Van Nieuwkoop, R.; Wiedmer, T. (Institute for Applied Microeconomics, Univ. of Bern (Switzerland))

    1992-01-01

    Both distributional and allocational effects of limiting carbon dioxide emissions in a small and open economy are discussed. The analysis starts from the assumption that Switzerland attempts to stabilize its greenhouse gas emissions over the next 25 years, and evaluates the costs and benefits of the respective reduction programme. From a methodological viewpoint, it is illustrated how a computable general equilibrium approach can be adopted for identifying the economic effects of cutting greenhouse gas emissions at the national level. From a political economy point of view, the social incidence of a greenhouse policy is considered. It is shown in particular that public acceptance can be increased, and the economic costs of greenhouse policies reduced, if carbon taxes are accompanied by revenue redistribution. 8 tabs., 1 app., 17 refs.

  20. Zero-rating food in South Africa: A computable general equilibrium analysis

    Directory of Open Access Journals (Sweden)

    M Kearney

    2004-04-01

    Zero-rating food is considered to alleviate poverty among poor households, who spend the largest proportion of their income on food. However, it also results in a loss of revenue for government. A Computable General Equilibrium (CGE) model is used to analyze the combined effects of zero-rating food and of using alternative revenue sources to compensate for the loss in revenue. To prevent excessively high increases in the statutory VAT rates on business and financial services, two alternatives are investigated: increasing direct taxes or increasing VAT to 16 per cent. Increasing direct taxes is the most successful option, creating a more progressive tax structure while still generating a positive impact on GDP. The results indicate that zero-rating food combined with a proportional percentage increase in direct taxes can improve the welfare of poor households.