WorldWideScience

Sample records for base study analyses

  1. Pathway-based analyses.

    Science.gov (United States)

    Kent, Jack W

    2016-02-03

    New technologies for acquisition of genomic data, while offering unprecedented opportunities for genetic discovery, also impose severe burdens of interpretation and penalties for multiple testing. The Pathway-based Analyses Group of the Genetic Analysis Workshop 19 (GAW19) sought reduction of the multiple-testing burden through various approaches to aggregation of high-dimensional data in pathways informed by prior biological knowledge. Experimental methods tested included the use of "synthetic pathways" (random sets of genes) to estimate the power and false-positive error rate of methods applied to simulated data; data reduction via independent components analysis, single-nucleotide polymorphism (SNP)-SNP interaction, and use of gene sets to estimate genetic similarity; and general assessment of the efficacy of prior biological knowledge to reduce the dimensionality of complex genomic data. The work of this group explored several promising approaches to managing high-dimensional data, with the caveat that these methods are necessarily constrained by the quality of external bioinformatic annotation.
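
    The "synthetic pathway" idea — scoring random gene sets to calibrate a set-based test — can be made concrete with a short sketch. The Python snippet below is a minimal illustration, not the GAW19 groups' code: the gene count, set size and the choice of Fisher's method as the aggregation rule are assumptions made here for demonstration.

        # Estimate the empirical false-positive rate of a set-based test
        # using random gene sets ("synthetic pathways") under the null.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n_genes = 5000
        # Under the null, per-gene association p-values are uniform on [0, 1].
        gene_pvalues = rng.uniform(size=n_genes)

        def fisher_set_test(pvals):
            """Aggregate per-gene p-values within a gene set by Fisher's method."""
            chi2 = -2.0 * np.log(pvals).sum()
            return stats.chi2.sf(chi2, df=2 * len(pvals))

        set_size, n_sets, alpha = 20, 1000, 0.05
        hits = 0
        for _ in range(n_sets):
            synthetic_pathway = rng.choice(n_genes, size=set_size, replace=False)
            if fisher_set_test(gene_pvalues[synthetic_pathway]) < alpha:
                hits += 1

        print(f"empirical false-positive rate: {hits / n_sets:.3f} (nominal {alpha})")

    With real data the per-gene statistics would come from the association analysis itself, and the same random-set machinery can be used to estimate power by injecting simulated signal into selected genes.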

  2. New insights into survival trend analyses in cancer population-based studies: the SUDCAN methodology.

    Science.gov (United States)

    Uhry, Zoé; Bossard, Nadine; Remontet, Laurent; Iwaz, Jean; Roche, Laurent

    2017-01-01

    The main objective of the SUDCAN study was to compare, for 15 cancer sites, the trends in net survival and excess mortality rates from cancer 5 years after diagnosis between six European Latin countries (Belgium, France, Italy, Portugal, Spain and Switzerland). The data were extracted from the EUROCARE-5 database. The study period ranged from 6 years (Portugal, 2000-2005) to 18 years (Switzerland, 1989-2007). Trend analyses were carried out separately for each country and cancer site; the number of cases ranged from 1500 to 104 000. We developed an original flexible excess rate modelling strategy that accounts for the continuous effects of age, year of diagnosis, time since diagnosis and their interactions. Nineteen models were constructed; they differed in the modelling of the effect of the year of diagnosis in terms of linearity, proportionality and interaction with age. The final model was chosen according to the Akaike Information Criterion. The fit was assessed graphically by comparing model estimates against nonparametric (Pohar-Perme) net survival estimates. Out of the 90 analyses carried out, the effect of the year of diagnosis on the excess mortality rate depended on age in 61 and was nonproportional in 64; it was nonlinear in 27 out of the 75 analyses where this effect was considered. The model fit was overall satisfactory. We successfully analysed 15 cancer sites in six countries. The refined methodology proved necessary for detailed trend analyses. It is hoped that three-dimensional parametric modelling will be used more widely in net survival trend studies as it has major advantages over stratified analyses.

  3. [Health risks in different living circumstances of mothers. Analyses based on a population study].

    Science.gov (United States)

    Sperlich, Stefanie

    2014-12-01

    The objective of this study was to determine the living circumstances ('Lebenslagen') of mothers which are associated with elevated health risks. Data were derived from a cross-sectional population-based sample of German women (n = 3129) with underage children. By means of a two-step cluster analysis, ten different maternal living circumstances were identified which proved to be distinct with respect to indicators of socioeconomic position, employment status and family-related factors. Of the ten living circumstances, one could be attributed to higher socioeconomic status (SES), while five were assigned to a middle SES and four to a lower SES. In line with previous findings, mothers with a high SES predominantly showed the best health, while mothers with a low SES tended to be at higher health risk with respect to subjective health, mental health (anxiety and depression), obesity and smoking. However, there were important health differences between the different living circumstances within the middle and lower SES. In addition, varying health risks were found among different living circumstances of single mothers, pointing to the significance of family- and job-related living conditions in establishing health risks. With this exploratory analysis strategy, small-scale living conditions could be detected which were associated with specific health risks. This approach seems particularly suitable for providing a more precise definition of target groups for health promotion. The findings encourage a more extensive application of the concept of living conditions in medical sociology research as well as in health monitoring.

  4. Analysing a Web-Based E-Commerce Learning Community: A Case Study in Brazil.

    Science.gov (United States)

    Joia, Luiz Antonio

    2002-01-01

    Demonstrates the use of a Web-based participative virtual learning environment for graduate students in Brazil enrolled in an electronic commerce course in a Masters in Business Administration program. Discusses learning communities; computer-supported collaborative work and collaborative learning; influences on student participation; the role of…

  5. A Study of Spectral Integration and Normalization in NMR-based Metabonomic Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Webb-Robertson, Bobbie-Jo M.; Lowry, David F.; Jarman, Kristin H.; Harbo, Sam J.; Meng, Quanxin; Fuciarelli, Alfred F.; Pounds, Joel G.; Lee, Monica T.

    2005-09-15

    Metabonomics involves the quantitation of the dynamic multivariate metabolic response of an organism to a pathological event or genetic modification (Nicholson, Lindon and Holmes, 1999). The analysis of these data involves the use of appropriate multivariate statistical methods. Exploratory Data Analysis (EDA) linear projection methods, primarily Principal Component Analysis (PCA), have been documented as a valuable pattern recognition technique for 1H NMR spectral data (Brindle et al., 2002, Potts et al., 2001, Robertson et al., 2000, Robosky et al., 2002). Prior to PCA the raw data are typically processed through four steps: (1) baseline correction, (2) endogenous peak removal, (3) integration over spectral regions to reduce the number of variables, and (4) normalization. The effect of the size of the spectral integration regions and of normalization has not been well studied. We assess the variability structure and classification accuracy of two distinctly different datasets via PCA and a leave-one-out cross-validation approach under two normalization approaches and an array of spectral integration regions. This study indicates that, independent of the normalization method, the classification accuracy achieved in metabonomic studies is not highly sensitive to the size of the spectral integration region. Additionally, for both datasets, data scaled to mean zero and unit variance (auto-scaled) showed higher variability in classification accuracy across spectral integration window widths than data scaled to the total intensity of the spectrum.
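
    The two preprocessing choices examined above — the width of the spectral integration (binning) regions and the normalization applied afterwards — are easy to prototype. The sketch below uses synthetic data and an arbitrary bin width; it is a minimal illustration of the workflow, not the study's actual pipeline.

        # Bin a set of 1H NMR spectra into fixed-width regions, then apply two
        # alternative normalizations before PCA.
        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(1)
        spectra = rng.random((20, 32768))   # 20 samples x 32768 spectral points (synthetic)

        def bin_spectra(x, bin_size):
            """Integrate consecutive spectral points into fixed-width regions."""
            n_bins = x.shape[1] // bin_size
            return x[:, :n_bins * bin_size].reshape(x.shape[0], n_bins, bin_size).sum(axis=2)

        binned = bin_spectra(spectra, bin_size=256)

        # Normalization 1: scale each spectrum to its total intensity.
        total_norm = binned / binned.sum(axis=1, keepdims=True)
        # Normalization 2: auto-scaling (mean zero, unit variance per bin).
        auto_scaled = (binned - binned.mean(axis=0)) / binned.std(axis=0)

        scores = PCA(n_components=2).fit_transform(total_norm)
        print(scores.shape)   # (20, 2) -- scores fed to classification / inspection

    Repeating the classification step over a grid of bin widths and both normalizations reproduces the kind of sensitivity comparison described in the abstract.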

  6. Reduction and technical simplification of testing protocol for walking based on repeatability analyses: An Interreg IVa pilot study

    Directory of Open Access Journals (Sweden)

    Nejc Sarabon

    2010-12-01

    The aim of this study was to define the most appropriate gait measurement protocols to be used in our future studies in the Mobility in Ageing project. A group of young healthy volunteers took part in the study. Each subject carried out a 10-metre walking test at five different speeds (preferred, very slow, very fast, slow, and fast). Each walking speed was repeated three times, making a total of 15 trials, which were carried out in a random order. Each trial was simultaneously analysed by three observers using three different technical approaches: a stop watch, photo cells and an electronic kinematic dress. In analysing the repeatability of the trials, the results showed that of the five self-selected walking speeds, three (preferred, very fast, and very slow) had a significantly higher repeatability of the average walking velocity, step length and cadence than the other two speeds. Additionally, the data showed that one of the three technical methods for gait assessment has better metric characteristics than the other two. In conclusion, based on repeatability and on technical and organizational simplification, this study helped us to define a simple and reliable walking test to be used in the main study of the project.

  7. Failure assessments of corroded pipelines with axial defects using stress-based criteria: Numerical studies and verification analyses

    International Nuclear Information System (INIS)

    Chiodo, Mario S.G.; Ruggieri, Claudio

    2009-01-01

    Conventional procedures used to assess the integrity of corroded piping systems with axial defects generally employ simplified failure criteria based upon a plastic collapse failure mechanism incorporating the tensile properties of the pipe material. These methods establish acceptance criteria for defects based on limited experimental data for low strength structural steels, which do not necessarily address specific requirements for the high grade steels currently used. For these cases, failure assessments may be overly conservative or show significant scatter in their predictions, leading to unnecessary repair or replacement of in-service pipelines. Motivated by these observations, this study examines the applicability of a stress-based criterion built upon plastic instability analysis to predict the failure pressure of corroded pipelines with axial defects. A central focus is to gain additional insight into the effects of defect geometry and material properties on the attainment of a local limit load to support the development of stress-based burst strength criteria. The work provides an extensive body of results which lend further support to adopting failure criteria for corroded pipelines based upon ligament instability analyses. A verification study conducted on burst testing of large-diameter pipe specimens with different defect lengths shows the effectiveness of a stress-based criterion using local ligament instability in burst pressure predictions, even though the adopted burst criterion exhibits a potential dependence on defect geometry and possibly on the material's strain hardening capacity. Overall, the results presented here suggest that the use of stress-based criteria based upon plastic instability analysis of the defect ligament is a valid engineering tool for integrity assessments of pipelines with axial corroded defects.
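
    For orientation, the kind of limit-load (plastic collapse) assessment that such stress-based criteria refine can be illustrated with the widely used "modified B31G" estimate for an axial corrosion defect. The sketch below is only that common simplified formula with illustrative pipe dimensions; it is not the local ligament-instability criterion developed in this study.

        # Modified B31G (0.85dL) burst-pressure estimate for an axially corroded pipe.
        import math

        def modified_b31g_burst_pressure(D, t, d, L, smys):
            """Failure pressure (MPa) for an axial defect of depth d and length L (mm)
            in a pipe of outer diameter D and wall thickness t (mm); smys in MPa."""
            flow_stress = smys + 69.0                      # SMYS + 69 MPa
            z = L**2 / (D * t)
            if z <= 50.0:
                M = math.sqrt(1.0 + 0.6275 * z - 0.003375 * z**2)   # Folias bulging factor
            else:
                M = 0.032 * z + 3.3
            hoop_at_failure = flow_stress * (1.0 - 0.85 * d / t) / (1.0 - 0.85 * d / (t * M))
            return 2.0 * hoop_at_failure * t / D

        # Example: X60 pipe (SMYS ~ 414 MPa) with a 50% deep, 200 mm long defect.
        print(f"{modified_b31g_burst_pressure(D=508, t=14.3, d=7.15, L=200, smys=414):.1f} MPa")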

  8. Failure assessments of corroded pipelines with axial defects using stress-based criteria: Numerical studies and verification analyses

    Energy Technology Data Exchange (ETDEWEB)

    Chiodo, Mario S.G. [Department of Naval Architecture and Ocean Engineering, University of Sao Paulo, Av. Prof. Mello Moraes, 2231 (PNV-EPUSP), Sao Paulo, SP 05508-030 (Brazil); Ruggieri, Claudio [Department of Naval Architecture and Ocean Engineering, University of Sao Paulo, Av. Prof. Mello Moraes, 2231 (PNV-EPUSP), Sao Paulo, SP 05508-030 (Brazil)], E-mail: claudio.ruggieri@poli.usp.br

    2009-02-15

    Conventional procedures used to assess the integrity of corroded piping systems with axial defects generally employ simplified failure criteria based upon a plastic collapse failure mechanism incorporating the tensile properties of the pipe material. These methods establish acceptance criteria for defects based on limited experimental data for low strength structural steels, which do not necessarily address specific requirements for the high grade steels currently used. For these cases, failure assessments may be overly conservative or show significant scatter in their predictions, leading to unnecessary repair or replacement of in-service pipelines. Motivated by these observations, this study examines the applicability of a stress-based criterion built upon plastic instability analysis to predict the failure pressure of corroded pipelines with axial defects. A central focus is to gain additional insight into the effects of defect geometry and material properties on the attainment of a local limit load to support the development of stress-based burst strength criteria. The work provides an extensive body of results which lend further support to adopting failure criteria for corroded pipelines based upon ligament instability analyses. A verification study conducted on burst testing of large-diameter pipe specimens with different defect lengths shows the effectiveness of a stress-based criterion using local ligament instability in burst pressure predictions, even though the adopted burst criterion exhibits a potential dependence on defect geometry and possibly on the material's strain hardening capacity. Overall, the results presented here suggest that the use of stress-based criteria based upon plastic instability analysis of the defect ligament is a valid engineering tool for integrity assessments of pipelines with axial corroded defects.

  9. Age and gender effects on normal regional cerebral blood flow studied using two different voxel-based statistical analyses

    International Nuclear Information System (INIS)

    Pirson, A.S.; George, J.; Krug, B.; Vander Borght, T.; Van Laere, K.; Jamart, J.; D'Asseler, Y.; Minoshima, S.

    2009-01-01

    Fully automated analysis programs are increasingly used to aid the reading of regional cerebral blood flow SPECT studies. They are increasingly based on the comparison of the patient study with a normal database. In this study, we evaluate the ability of Three-Dimensional Stereotactic Surface Projection (3D-SSP) to isolate the effects of age and gender in a previously studied normal population. The results were also compared with those obtained using Statistical Parametric Mapping (SPM99). Methods: Eighty-nine 99mTc-ECD SPECT studies performed in carefully screened healthy volunteers (46 females, 43 males; age 20-81 years) were analysed using 3D-SSP. A multivariate analysis based on the general linear model was performed with regions as intra-subject factor, gender as inter-subject factor and age as covariate. Results: Both age and gender had a significant interaction effect with regional tracer uptake. An age-related decline (p < 0.001) was found in the anterior cingulate gyrus, left frontal association cortex and left insula. Bilateral occipital association and left primary visual cortical uptake showed a significant relative increase with age (p < 0.001). Concerning the gender effect, women showed higher uptake (p < 0.01) in the parietal and right sensorimotor cortices. An age by gender interaction (p < 0.01) was only found in the left medial frontal cortex. The results were consistent with those obtained with SPM99. Conclusion: 3D-SSP analysis of normal rCBF variability is consistent with the literature and with other automated voxel-based techniques, which highlight the effects of both age and gender. (authors)

  10. Limitations of Species Delimitation Based on Phylogenetic Analyses: A Case Study in the Hypogymnia hypotrypa Group (Parmeliaceae, Ascomycota.

    Directory of Open Access Journals (Sweden)

    Xinli Wei

    Delimiting species boundaries among closely related lineages often requires a range of independent data sets and analytical approaches. Similar to other organismal groups, robust species circumscriptions in fungi are increasingly investigated within an empirical framework. Here we attempt to delimit species boundaries in a closely related clade of lichen-forming fungi endemic to Asia, the Hypogymnia hypotrypa group (Parmeliaceae). In the current classification, the Hypogymnia hypotrypa group includes two species: H. hypotrypa and H. flavida, which are separated based on distinctive reproductive modes, the former producing soredia, which are absent in the latter. We reexamined the relationship between these two species using phenotypic characters and molecular sequence data (ITS, GPD, and MCM7 sequences) to address species boundaries in this group. In addition to morphological investigations, we used Bayesian clustering to identify potential genetic groups in the H. hypotrypa/H. flavida clade. We also used a variety of empirical, sequence-based species delimitation approaches, including the "Automatic Barcode Gap Discovery" (ABGD), the Poisson tree process model (PTP), the General Mixed Yule Coalescent (GMYC), and the multispecies coalescent approach BPP. Different species delimitation scenarios were compared using Bayes factor delimitation analysis, in addition to comparisons of pairwise genetic distances and pairwise fixation indices (FST). The majority of the species delimitation analyses implemented in this study failed to support H. hypotrypa and H. flavida as distinct lineages, as did the Bayesian clustering analysis. However, strong support for the evolutionary independence of H. hypotrypa and H. flavida was inferred using BPP and further supported by Bayes factor delimitation. In spite of rigorous morphological comparisons and a wide range of sequence-based approaches to delimit species, species boundaries in the H. hypotrypa group remain uncertain.

  11. Audio-visual perception of 3D cinematography: an fMRI study using condition-based and computation-based analyses.

    Directory of Open Access Journals (Sweden)

    Akitoshi Ogawa

    The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard "condition-based" designs and "computational" methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie, both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and with the complexity of auditory multi-source signals. The blocked analyses associated 3D viewing with activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli.

  12. Audio-visual perception of 3D cinematography: an fMRI study using condition-based and computation-based analyses.

    Science.gov (United States)

    Ogawa, Akitoshi; Bordier, Cecile; Macaluso, Emiliano

    2013-01-01

    The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard "condition-based" designs, as well as "computational" methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-sources signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli.

  13. PC based uranium enrichment analyser

    International Nuclear Information System (INIS)

    Madan, V.K.; Gopalakrishana, K.R.; Bairi, B.R.

    1991-01-01

    It is important to measure the enrichment of unirradiated nuclear fuel elements during production as a quality control measure. An IBM PC based system has recently been tested for enrichment measurements for the Nuclear Fuel Complex (NFC), Hyderabad. As required by NFC, the system is easy to calibrate. It is easy to switch the system from measuring the enrichment of fuel elements to that of pellets, and the data and results are stored automatically. The system uses an IBM PC plug-in card to acquire data. The card incorporates programmable interval timers (8253-5); the counter/timer devices are accessed as I/O-mapped I/O. A novel algorithm has been incorporated to make the system more reliable. The application software has been written in BASIC. (author). 9 refs., 1 fig

  14. Analyses of Crime Patterns in NIBRS Data Based on a Novel Graph Theory Clustering Method: Virginia as a Case Study

    Directory of Open Access Journals (Sweden)

    Peixin Zhao

    2014-01-01

    This paper suggests a novel clustering method for analyzing the National Incident-Based Reporting System (NIBRS) data, which includes the determination of correlations between different crime types, the development of a likelihood index for crimes to occur in a jurisdiction, and the clustering of jurisdictions based on crime type. The method was tested using the 2005 assault data from 121 jurisdictions in Virginia as a test case. The analyses of these data show that some different crime types are correlated and that some crime parameters are correlated with different crime types. The analyses also show that certain jurisdictions within Virginia share certain crime patterns. This information assists with constructing a pattern for a specific crime type and can be used to determine whether a jurisdiction may be more likely to see this type of crime occur in its area.
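
    The correlation and clustering steps described above can be illustrated generically. The snippet below is not the paper's graph-theory method: it simply correlates crime-type counts across jurisdictions and groups jurisdictions by their offence profiles, with made-up counts and an arbitrary number of clusters.

        # Correlate crime types across jurisdictions and cluster jurisdictions
        # on their normalized crime profiles.
        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import pdist

        rng = np.random.default_rng(2)
        # Rows: 121 jurisdictions; columns: counts for five offence types (synthetic).
        counts = rng.poisson(lam=[50, 30, 12, 8, 5], size=(121, 5))

        # Correlation between crime types (columns) across jurisdictions.
        crime_type_corr = np.corrcoef(counts, rowvar=False)
        print(np.round(crime_type_corr, 2))

        # Cluster jurisdictions on their relative crime-type profiles.
        profiles = counts / counts.sum(axis=1, keepdims=True)
        Z = linkage(pdist(profiles), method="ward")
        labels = fcluster(Z, t=4, criterion="maxclust")   # 4 clusters, arbitrary choice
        print(np.bincount(labels)[1:])                    # jurisdictions per cluster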

  15. Uptake of systematic reviews and meta-analyses based on individual participant data in clinical practice guidelines: descriptive study

    NARCIS (Netherlands)

    Vale, C.L.; Rydzewska, L.H.; Rovers, M.M.; Emberson, J.R.; Gueyffier, F.; Stewart, L.A.

    2015-01-01

    OBJECTIVE: To establish the extent to which systematic reviews and meta-analyses of individual participant data (IPD) are being used to inform the recommendations included in published clinical guidelines. DESIGN: Descriptive study. SETTING: Database maintained by the Cochrane IPD Meta-analysis

  16. The Influence of Study-Level Inference Models and Study Set Size on Coordinate-Based fMRI Meta-Analyses

    Directory of Open Access Journals (Sweden)

    Han Bossier

    2018-01-01

    Given the increasing number of neuroimaging studies, there is a growing need to summarize published results. Coordinate-based meta-analyses use the locations of statistically significant local maxima, possibly with the associated effect sizes, to aggregate studies. In this paper, we investigate the influence of key characteristics of a coordinate-based meta-analysis on (1) the balance between false and true positives and (2) the activation reliability of the outcome of a coordinate-based meta-analysis. More particularly, we consider the influence of the group-level model chosen at the study level [fixed effects, ordinary least squares (OLS), or mixed effects models], the type of coordinate-based meta-analysis [Activation Likelihood Estimation (ALE), which only uses peak locations, versus fixed effects and random effects meta-analyses that take into account both peak location and height], and the number of studies included in the analysis (from 10 to 35). To do this, we apply a resampling scheme to a large dataset (N = 1,400) to create a test condition and compare this with an independent evaluation condition. The test condition corresponds to subsampling participants into studies and combining these using meta-analyses. The evaluation condition corresponds to a high-powered group analysis. We observe the best performance when using mixed effects models in individual studies combined with a random effects meta-analysis. Moreover, the performance increases with the number of studies included in the meta-analysis. When peak height is not taken into consideration, we show that the popular ALE procedure is a good alternative in terms of the balance between type I and II errors. However, it requires more studies compared to other procedures in terms of activation reliability. Finally, we discuss the differences, interpretations, and limitations of our results.
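
    The fixed- versus random-effects contrast at the heart of this comparison can be shown with a small sketch. The snippet below pools made-up effect sizes reported at a single peak location across studies, using inverse-variance (fixed effects) weighting and a DerSimonian-Laird random-effects estimate; it is a generic illustration, not the paper's evaluation pipeline.

        # Fixed-effects vs random-effects pooling of per-study peak effect sizes.
        import numpy as np

        y = np.array([0.42, 0.35, 0.58, 0.20, 0.47])   # per-study effect sizes (illustrative)
        v = np.array([0.02, 0.03, 0.05, 0.02, 0.04])   # their sampling variances (illustrative)

        # Fixed-effects (inverse-variance) pooled estimate.
        w_fe = 1.0 / v
        mu_fe = (w_fe * y).sum() / w_fe.sum()

        # Random-effects pooled estimate with DerSimonian-Laird between-study variance.
        Q = (w_fe * (y - mu_fe) ** 2).sum()
        c = w_fe.sum() - (w_fe ** 2).sum() / w_fe.sum()
        tau2 = max(0.0, (Q - (len(y) - 1)) / c)
        w_re = 1.0 / (v + tau2)
        mu_re = (w_re * y).sum() / w_re.sum()

        print(f"fixed effects: {mu_fe:.3f}, random effects: {mu_re:.3f}, tau^2: {tau2:.3f}")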

  17. Suicidality and aggression during antidepressant treatment: systematic review and meta-analyses based on clinical study reports.

    Science.gov (United States)

    Sharma, Tarang; Guski, Louise Schow; Freund, Nanna; Gøtzsche, Peter C

    2016-01-27

    Objective: To study serious harms associated with selective serotonin and serotonin-norepinephrine reuptake inhibitors. Design: Systematic review and meta-analysis. Main outcomes: Mortality and suicidality; secondary outcomes were aggressive behaviour and akathisia. Data sources: Clinical study reports for duloxetine, fluoxetine, paroxetine, sertraline, and venlafaxine obtained from the European and UK drug regulators, and summary trial reports for duloxetine and fluoxetine from Eli Lilly's website. Eligibility criteria: Double blind placebo controlled trials that contained any patient narratives or individual patient listings of harms. Data extraction: Two researchers extracted data independently; the outcomes were meta-analysed by Peto's exact method (fixed effect model). Results: We included 70 trials (64,381 pages of clinical study reports) with 18,526 patients. These trials had limitations in the study design and discrepancies in reporting, which may have led to serious under-reporting of harms. For example, some outcomes appeared only in individual patient listings in appendices, which we had for only 32 trials, and we did not have case report forms for any of the trials. Differences in mortality (all deaths were in adults, odds ratio 1.28, 95% confidence interval 0.40 to 4.06), suicidality (1.21, 0.84 to 1.74), and akathisia (2.04, 0.93 to 4.48) were not significant, whereas patients taking antidepressants displayed more aggressive behaviour (1.93, 1.26 to 2.95). For adults, the odds ratios were 0.81 (0.51 to 1.28) for suicidality, 1.09 (0.55 to 2.14) for aggression, and 2.00 (0.79 to 5.04) for akathisia. The corresponding values for children and adolescents were 2.39 (1.31 to 4.33), 2.79 (1.62 to 4.81), and 2.15 (0.48 to 9.65). In the summary trial reports on Eli Lilly's website, almost all deaths were noted, but all suicidal ideation events were missing, and the information on the remaining outcomes was incomplete. Because of the shortcomings identified and having only partial access to appendices with no access to case report forms, the harms
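
    Peto's one-step odds ratio, the pooling method named above, combines 2x2 tables without continuity corrections and is well suited to rare events. The sketch below computes it for a few made-up trials; the counts are illustrative only and are not taken from the review.

        # Peto (one-step) fixed-effect odds ratio for a set of 2x2 trial tables.
        import math

        # Each trial: (events_treatment, n_treatment, events_control, n_control)
        trials = [(4, 120, 1, 118), (2, 200, 2, 205), (6, 310, 3, 300)]

        sum_o_minus_e, sum_v = 0.0, 0.0
        for et, nt, ec, nc in trials:
            N, d = nt + nc, et + ec                        # total patients, total events
            E = nt * d / N                                 # expected events on treatment
            V = nt * nc * d * (N - d) / (N**2 * (N - 1))   # hypergeometric variance
            sum_o_minus_e += et - E
            sum_v += V

        log_or = sum_o_minus_e / sum_v
        se = 1.0 / math.sqrt(sum_v)
        lo, hi = math.exp(log_or - 1.96 * se), math.exp(log_or + 1.96 * se)
        print(f"Peto OR = {math.exp(log_or):.2f}, 95% CI {lo:.2f} to {hi:.2f}")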

  18. Studies and analyses of the space shuttle main engine. Failure information propagation model data base and software

    Science.gov (United States)

    Tischer, A. E.

    1987-01-01

    The failure information propagation model (FIPM) data base was developed to store and manipulate the large amount of information anticipated for the various Space Shuttle Main Engine (SSME) FIPMs. The organization and structure of the FIPM data base is described, including a summary of the data fields and key attributes associated with each FIPM data file. The menu-driven software developed to facilitate and control the entry, modification, and listing of data base records is also discussed. The transfer of the FIPM data base and software to the NASA Marshall Space Flight Center is described. Complete listings of all of the data base definition commands and software procedures are included in the appendixes.

  19. The effect of English-language restriction on systematic review-based meta-analyses: a systematic review of empirical studies.

    Science.gov (United States)

    Morrison, Andra; Polisena, Julie; Husereau, Don; Moulton, Kristen; Clark, Michelle; Fiander, Michelle; Mierzwinski-Urban, Monika; Clifford, Tammy; Hutton, Brian; Rabb, Danielle

    2012-04-01

    The English language is generally perceived to be the universal language of science. However, exclusive reliance on English-language studies may not represent all of the evidence. Excluding languages other than English (LOE) may introduce a language bias and lead to erroneous conclusions. We conducted a comprehensive literature search using bibliographic databases and grey literature sources. Studies were eligible for inclusion if they measured the effect of excluding randomized controlled trials (RCTs) reported in LOE from systematic review-based meta-analyses (SR/MA) for one or more outcomes. None of the included studies found major differences between summary treatment effects in English-language-restricted meta-analyses and LOE-inclusive meta-analyses. Findings differed regarding the methodological and reporting quality of trials reported in LOE. The precision of pooled estimates improved with the inclusion of LOE trials. Overall, we found no evidence of a systematic bias from the use of language restrictions in systematic review-based meta-analyses in conventional medicine. Further research is needed to determine the impact of language restriction on systematic reviews in particular fields of medicine.

  20. Increased risk of stroke in hypertensive women using hormone therapy: analyses based on the Danish Nurse Study

    DEFF Research Database (Denmark)

    Løkkegaard, Ellen; Jovanovic, Zorana; Heitmann, Berit L

    2003-01-01

    by presence of risk factors for stroke. DESIGN: Prospective cohort study. SETTING: In 1993, the Danish Nurse Study was established, and questionnaires on lifestyle and HT use were sent to all Danish nurses older than 44 years, of whom 19,898 (85.8%) replied. PARTICIPANTS: Postmenopausal women (n = 13...

  1. Working at the Nexus of Generic and Content-Specific Teaching Practices: An Exploratory Study Based on TIMSS Secondary Analyses

    Science.gov (United States)

    Charalambous, Charalambos Y.; Kyriakides, Ermis

    2017-01-01

    For years scholars have attended to either generic or content-specific teaching practices attempting to understand instructional quality and its effects on student learning. Drawing on the TIMSS 2007 and 2011 databases, this exploratory study empirically tests the hypothesis that attending to both types of practices can help better explain student…

  2. Is the Relationship between Common Mental Disorder and Adiposity Bidirectional? Prospective Analyses of a UK General Population-Based Study.

    Science.gov (United States)

    Fezeu, Léopold K; Batty, G David; Batty, David G; Gale, Catharine R; Kivimaki, Mika; Hercberg, Serge; Czernichow, Sebastien

    2015-01-01

    The direction of the association between mental health and adiposity is poorly understood. Our objective was to empirically examine this link in a UK study. This is a prospective cohort study of 3 388 people (men) aged ≥ 18 years at study induction who participated in both the UK Health and Lifestyle Survey at baseline (HALS-1, 1984/1985) and the re-survey (HALS-2, 1991/1992). At both survey examinations, body mass index, waist circumference and self-reported common mental disorder (the 30-item General Health Questionnaire, GHQ) were measured. Logistic regression models were used to compute odds ratios (OR) and accompanying 95% confidence intervals (CI) for the associations between (1) baseline common mental disorder (GHQ score > 4) and subsequent general and abdominal obesity and (2) baseline general and abdominal obesity and common mental disorder at re-survey. After controlling for a range of covariates, participants with common mental disorder at baseline experienced greater odds of subsequently becoming overweight (women, OR: 1.30, 1.03 - 1.64; men, 1.05, 0.81 - 1.38) and obese (women, 1.26, 0.82 - 1.94; men, OR: 2.10, 1.23 - 3.55) than those who were free of common mental disorder. Similarly, having baseline common mental disorder was also related to a greater risk of developing moderate (1.57, 1.21 - 2.04) and severe (1.48, 1.09 - 2.01) abdominal obesity (women only). Baseline general or abdominal obesity was not associated with the risk of future common mental disorder. The findings of the present study suggest that the direction of association between common mental disorders and adiposity is from common mental disorder to increased future risk of adiposity, as opposed to the converse.

  3. Load time functions (LTF) for large commercial aircraft based on Riera approach and finite element analyses – a parametric study

    International Nuclear Information System (INIS)

    Iliev, A.

    2013-01-01

    Conclusions: In cases of a complex geometry of the target structure, a careful evaluation of the predefined load time function should be performed. Special attention should be paid to the different deceleration of airplane parts and their equivalent load forces. In the case of cylindrical structures with a relatively small diameter (in comparison to the airplane wing span), the impact of the engines should be investigated separately. When auxiliary structures surround the reactor containment, the impact load will be reduced due to the initial destruction of part of the airplane on the surrounding auxiliary structures. For the case study, this reduction was found to be non-significant. However, if important equipment is situated in the surrounding auxiliary buildings, the engines may produce higher equivalent forces than for a normal planar target structure.
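
    For background, a Riera-type load time function treats the impacting aircraft as a one-dimensional missile: at each instant the force on a rigid target equals the crushing force of the fuselage section currently being destroyed plus the momentum flux of the mass arriving at the crush front. The sketch below integrates this for a uniform mass distribution and a constant crushing force; both are simplifying assumptions made here for illustration, whereas real analyses (and the parametric study above) use the aircraft's actual mass and crushing-strength distributions.

        # Riera load time function: F(t) = Pc[x(t)] + mu[x(t)] * v(t)^2, with the
        # uncrushed (rigid) part decelerated by the crushing force Pc.
        length = 40.0            # fuselage length (m), assumed
        total_mass = 2.0e5       # total aircraft mass (kg), assumed
        v0 = 100.0               # impact velocity (m/s), assumed
        mu = total_mass / length # mass per unit length (kg/m), uniform here
        Pc = 2.0e7               # crushing force (N), assumed constant

        dt, t, x, v = 1e-4, 0.0, 0.0, v0
        forces = []
        while x < length and v > 0.0:
            forces.append(Pc + mu * v**2)        # crushing force + momentum flux
            m_rigid = total_mass - mu * x        # uncrushed mass still flying
            v -= Pc / m_rigid * dt               # rigid part decelerated by Pc
            x += v * dt
            t += dt

        print(f"peak force {max(forces)/1e6:.1f} MN, impact duration {t*1e3:.0f} ms")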

  4. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic prog...

  5. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic programs and evaluated using incremental tabled evaluation, a technique for efficiently updating memo tables in response to changes in facts and rules. The approach has been implemented and integrated into the Eclipse IDE. Our measurements show that this technique is effective for automatically...

  6. Analysing organic transistors based on interface approximation

    International Nuclear Information System (INIS)

    Akiyama, Yuto; Mori, Takehiko

    2014-01-01

    Temperature-dependent characteristics of organic transistors are analysed thoroughly using interface approximation. In contrast to amorphous silicon transistors, it is characteristic of organic transistors that the accumulation layer is concentrated on the first monolayer, and it is appropriate to consider interface charge rather than band bending. On the basis of this model, observed characteristics of hexamethylenetetrathiafulvalene (HMTTF) and dibenzotetrathiafulvalene (DBTTF) transistors with various surface treatments are analysed, and the trap distribution is extracted. In turn, starting from a simple exponential distribution, we can reproduce the temperature-dependent transistor characteristics as well as the gate voltage dependence of the activation energy, so we can investigate various aspects of organic transistors self-consistently under the interface approximation. Small deviation from such an ideal transistor operation is discussed assuming the presence of an energetically discrete trap level, which leads to a hump in the transfer characteristics. The contact resistance is estimated by measuring the transfer characteristics up to the linear region

  7. Theoretical analyses of superconductivity in iron based ...

    African Journals Online (AJOL)

    This paper focuses on the theoretical analysis of superconductivity in the iron-based superconductor Ba1−xKxFe2As2. After reviewing the current findings on this system, we suggest that a phonon-exciton combined mechanism gives the right order of superconducting transition temperature (TC) for Ba1−xKxFe2As2. By developing ...

  8. Radiobiological analyse based on cell cluster models

    International Nuclear Information System (INIS)

    Lin Hui; Jing Jia; Meng Damin; Xu Yuanying; Xu Liangfeng

    2010-01-01

    The influence of cell cluster dimension on EUD and TCP for targeted radionuclide therapy was studied using radiobiological methods. The radiobiological features of a tumor with an activity-lacking core were evaluated and analyzed by associating EUD, TCP and SF. The results show that EUD increases with tumor dimension under a homogeneous activity distribution. If the extra-cellular activity is taken into consideration, the EUD increases by 47%. With activity lacking in the tumor center and the requirement of TCP = 0.90, the α cross-fire influence of 211At could make up for a maximum (48 μm)³ activity-lacking region for the Nucleus source, but (72 μm)³ for the Cytoplasm, Cell Surface, Cell and Voxel sources. In the clinic, the physician could prefer the suggested dose of the Cell Surface source in case of failure of local tumor control due to under-dose. Generally, TCP can well exhibit the difference in effect between under-dose and due-dose, but not between due-dose and over-dose, which makes TCP more suitable for therapy plan choice. EUD can well exhibit the difference between different models and activity distributions, which makes it more suitable for research work. When using EUD to study the influence of an inhomogeneous activity distribution, one should keep the configuration and volume of the former and latter models consistent. (authors)

  9. A STUDY TO ANALYSE THE EFFICACY OF MODIFIED PILATES BASED EXERCISES AND THERAPEUTIC EXERCISES IN INDIVIDUALS WITH CHRONIC NON SPECIFIC LOW BACK PAIN: A RANDOMIZED CONTROLLED TRIAL

    OpenAIRE

    U. Albert Anand; P. Mariet Caroline; B. Arun; G. Lakshmi Gomathi

    2014-01-01

    Background: Chronic low back pain is an expensive and difficult condition to treat. Low back pain is the most common musculoskeletal symptom, seen in 85% of individuals at some point in their lifetime. One of the interventions widely used by physiotherapists in the treatment of chronic non-specific low back pain (CNLBP) is exercise therapy based upon the Pilates principles. Objective: The purpose of the study was to find out the effect of Modified Pilates based exercises for patients with ...

  10. Energy, exergy and sustainability analyses of hybrid renewable energy based hydrogen and electricity production and storage systems: Modeling and case study

    International Nuclear Information System (INIS)

    Caliskan, Hakan; Dincer, Ibrahim; Hepbasli, Arif

    2013-01-01

    In this study, hybrid renewable energy based hydrogen and electricity production and storage systems are conceptually modeled and analyzed in detail through energy, exergy and sustainability approaches. Several subsystems are considered, namely a hybrid geothermal energy-wind turbine-solar photovoltaic (PV) panel, an inverter, an electrolyzer, a hydrogen storage system, a Proton Exchange Membrane Fuel Cell (PEMFC), a battery and a loading system. Also, a case study based on a hybrid wind–solar renewable energy system is conducted and its results are presented. In addition, the dead state temperatures are considered as 0 °C, 10 °C, 20 °C and 30 °C, while the environment temperature is 30 °C. The maximum efficiencies of the wind turbine, solar PV panel, electrolyzer and PEMFC are calculated as 26.15%, 9.06%, 53.55%, and 33.06% through energy analysis, and 71.70%, 9.74%, 53.60%, and 33.02% through exergy analysis, respectively. Also, the overall exergy efficiency, ranging from 5.838% to 5.865%, is directly proportional to the dead state temperature and becomes higher than the corresponding energy efficiency of 3.44% for the entire system. -- Highlights: ► Developing a three-hybrid renewable energy (geothermal–wind–solar)-based system. ► Undertaking a parametric study at various dead state temperatures. ► Investigating the effect of dead state temperatures on exergy efficiency
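
    The gap between the energy and exergy figures quoted above comes from referencing all flows to a dead state and weighting each stream by its work potential. The sketch below shows, in a very reduced form, how an exergy efficiency for a wind + solar PV block can be assembled, with solar exergy approximated by the Petela factor; all the input numbers are assumptions for illustration and are not taken from the paper's model.

        # Exergy efficiency of a simple wind + PV block referenced to a dead state T0.
        T0 = 303.15            # dead state temperature (K), 30 degC as in the study
        T_sun = 5778.0         # apparent sun temperature (K)

        solar_radiation = 50.0e3     # W of solar radiation on the PV panel (assumed)
        wind_kinetic_power = 80.0e3  # W of kinetic power through the rotor (assumed)
        electric_output = 12.0e3     # W of electrical output (assumed)

        # Petela factor: exergy-to-energy ratio of blackbody radiation at T_sun.
        petela = 1.0 - (4.0 / 3.0) * (T0 / T_sun) + (1.0 / 3.0) * (T0 / T_sun) ** 4
        exergy_in = solar_radiation * petela + wind_kinetic_power  # kinetic energy is pure exergy
        exergy_out = electric_output                               # electricity is pure exergy

        print(f"exergy efficiency = {exergy_out / exergy_in:.1%}")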

  11. In service monitoring based on fatigue analyses, possibilities and limitations

    International Nuclear Information System (INIS)

    Dittmar, S.; Binder, F.

    2004-01-01

    German LWR reactors are equipped with monitoring systems which enable a comparison of real transients with the load case catalogues and fatigue catalogues used in fatigue analyses. The accuracy of this information depends on the accuracy of the measurements, on the consideration of parameters influencing fatigue (medium, component surface, component size, etc.), and on the accuracy of the load analyses. The contribution attempts a critical evaluation, also in view of the fact that real fatigue damage is often impossible to quantify on the basis of fatigue analyses at a later stage. The effects of considering or neglecting various influencing factors are discussed, as well as the consequences of the scatter of the material characteristics on which the analyses are based. Possible measures to be taken in operational monitoring are derived. (orig.) [de]

  12. Mid-Adolescent Predictors of Adult Drinking Levels in Early Adulthood and Gender Differences: Longitudinal Analyses Based on the South Australian School Leavers Study

    Directory of Open Access Journals (Sweden)

    Paul H. Delfabbro

    2016-01-01

    There is considerable public health interest in understanding which factors during adolescence predict longer-term drinking patterns in adulthood. The aim of this study was to examine gender differences in the age-15 social and psychological predictors of less healthy drinking patterns in early adulthood. The study investigates the relative importance of internalising problems, other risky health behaviours, and peer relationships after controlling for family background characteristics. A sample of 812 young people who provided complete alcohol consumption data from the age of 15 to 20 years (5 measurement points) was drawn from South Australian secondary schools and given a detailed survey concerning their psychological and social wellbeing. Respondents were classified into two groups based upon a percentile division: those who drank at levels consistently below NHMRC guidelines and those who consistently drank at higher levels. The results showed that poorer age-15 scores on measures of psychological wellbeing, including the GHQ-12, self-esteem, and life satisfaction, as well as engagement in health-related behaviours such as smoking or drug-taking, were associated with higher drinking levels in early adulthood. The pattern of results was generally similar for both genders. Higher drinking levels were most strongly associated with smoking and marijuana use and poorer psychological wellbeing during adolescence.

  13. A bromine-based dichroic X-ray polarization analyser

    CERN Document Server

    Collins, S P; Brown, S D; Thompson, P

    2001-01-01

    We have demonstrated the advantages offered by dichroic X-ray polarization filters for linear polarization analysis, and describe such a device, based on a dibromoalkane/urea inclusion compound. The polarizer has been successfully tested by analysing the polarization of magnetic diffraction from holmium.

  14. Analyser-based phase contrast image reconstruction using geometrical optics

    International Nuclear Information System (INIS)

    Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A

    2007-01-01

    Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 μm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser
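
    The rocking-curve fit mentioned above is straightforward to reproduce. The snippet below fits a symmetric Pearson type VII function to a synthetic rocking curve; the curve, its noise level and the starting parameters are illustrative assumptions, not data from the ELETTRA measurements.

        # Fit an analyser rocking curve with a symmetric Pearson type VII function.
        import numpy as np
        from scipy.optimize import curve_fit

        def pearson_vii(theta, amp, theta0, w, m):
            """Peak of height amp at theta0, half-width w, shape exponent m."""
            return amp * (1.0 + ((theta - theta0) / w) ** 2 * (2.0 ** (1.0 / m) - 1.0)) ** (-m)

        rng = np.random.default_rng(3)
        theta = np.linspace(-40.0, 40.0, 201)             # analyser angle (arbitrary units)
        measured = pearson_vii(theta, 1.0, 0.0, 10.0, 1.8) + rng.normal(0.0, 0.01, theta.size)

        popt, _ = curve_fit(pearson_vii, theta, measured, p0=[1.0, 0.0, 8.0, 1.5])
        print("amplitude, centre, width, m =", np.round(popt, 3))
        # The slope of the fitted curve at the working point then feeds the
        # two-image phase/attenuation retrieval described in the abstract.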

  15. Analyser-based phase contrast image reconstruction using geometrical optics.

    Science.gov (United States)

    Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A

    2007-07-21

    Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 μm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser.

  16. Analyser-based x-ray imaging for biomedical research

    International Nuclear Information System (INIS)

    Suortti, Pekka; Keyriläinen, Jani; Thomlinson, William

    2013-01-01

    Analyser-based imaging (ABI) is one of the several phase-contrast x-ray imaging techniques being pursued at synchrotron radiation facilities. With advancements in compact source technology, there is a possibility that ABI will become a clinical imaging modality. This paper presents the history of ABI as it has developed from its laboratory source to synchrotron imaging. The fundamental physics of phase-contrast imaging is presented both in a general sense and specifically for ABI. The technology is dependent on the use of perfect crystal monochromator optics. The theory of the x-ray optics is developed and presented in a way that will allow optimization of the imaging for specific biomedical systems. The advancement of analytical algorithms to produce separate images of the sample absorption, refraction angle map and small-angle x-ray scattering is detailed. Several detailed applications to biomedical imaging are presented to illustrate the broad range of systems and body sites studied preclinically to date: breast, cartilage and bone, soft tissue and organs. Ultimately, the application of ABI in clinical imaging will depend partly on the availability of compact sources with sufficient x-ray intensity comparable with that of the current synchrotron environment. (paper)

  17. Model-Based Recursive Partitioning for Subgroup Analyses.

    Science.gov (United States)

    Seibold, Heidi; Zeileis, Achim; Hothorn, Torsten

    2016-05-01

    The identification of patient subgroups with differential treatment effects is the first step towards individualised treatments. A current draft guideline by the EMA discusses potentials and problems in subgroup analyses and formulated challenges to the development of appropriate statistical procedures for the data-driven identification of patient subgroups. We introduce model-based recursive partitioning as a procedure for the automated detection of patient subgroups that are identifiable by predictive factors. The method starts with a model for the overall treatment effect as defined for the primary analysis in the study protocol and uses measures for detecting parameter instabilities in this treatment effect. The procedure produces a segmented model with differential treatment parameters corresponding to each patient subgroup. The subgroups are linked to predictive factors by means of a decision tree. The method is applied to the search for subgroups of patients suffering from amyotrophic lateral sclerosis that differ with respect to their Riluzole treatment effect, the only currently approved drug for this disease.

  18. Model-based Recursive Partitioning for Subgroup Analyses

    OpenAIRE

    Seibold, Heidi; Zeileis, Achim; Hothorn, Torsten

    2016-01-01

    The identification of patient subgroups with differential treatment effects is the first step towards individualised treatments. A current draft guideline by the EMA discusses potentials and problems in subgroup analyses and formulated challenges to the development of appropriate statistical procedures for the data-driven identification of patient subgroups. We introduce model-based recursive partitioning as a procedure for the automated detection of patient subgroups that are identifiable by...

  19. Connecting micro and macro: bringing case-studies and model-based approaches together in analysing patterns of vulnerability to global environmental change

    NARCIS (Netherlands)

    Dijk, van J.W.M.

    2012-01-01

    The objective of this project is to build bridges between quantitative system dynamic simulation models that are developed at PBL (IMAGE/GISMO) and qualitative case-studies by attempting to upscale lessons learned from local case-studies through Qualitative Comparative Analysis (QCA) and by

  20. Multivariate analyses to assess the effects of surgeon and hospital volume on cancer survival rates: a nationwide population-based study in Taiwan.

    Directory of Open Access Journals (Sweden)

    Chun-Ming Chang

    BACKGROUND: Positive associations between caseloads and outcomes have been validated for several procedures and cancer treatments. However, there is limited information available on the combined effects of surgeon and hospital caseloads. We used nationwide population-based data to explore the association between surgeon and hospital caseloads and survival rates for major cancers. METHODOLOGY: A total of 11,677 patients with incident cancer diagnosed in 2002 were identified from the Taiwan National Health Insurance Research Database. Survival analysis, the Cox proportional hazards model, and propensity scores were used to assess the relationship between 5-year survival rates and different caseload combinations. RESULTS: Based on the Cox proportional hazards model, cancer patients treated by low-volume surgeons in low-volume hospitals had poorer survival rates, with hazard ratios ranging from 1.3 in head and neck cancer to 1.8 in lung cancer after adjusting for patients' demographic variables, co-morbidities, and treatment modality. When analyzed using the propensity scores, the adjusted 5-year survival rates were poorer for patients treated by low-volume surgeons in low-volume hospitals, compared to those treated by high-volume surgeons in high-volume hospitals (P<0.005). CONCLUSIONS: After adjusting for differences in the case mix, cancer patients treated by low-volume surgeons in low-volume hospitals had poorer 5-year survival rates. Payers may target quality-of-care improvement efforts at low-volume surgeons.
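
    The adjusted comparison described above rests on a Cox proportional hazards model with a volume indicator plus patient covariates. The sketch below shows the general shape of such a fit using the lifelines library on made-up data; the data frame, column names and covariates are assumptions for illustration, not the study's variables.

        # Cox proportional hazards model with a low-volume indicator and covariates.
        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(4)
        n = 500
        df = pd.DataFrame({
            "time_years": rng.exponential(4.0, n).clip(max=5.0),  # follow-up, censored at 5 y
            "died": rng.integers(0, 2, n),                        # event indicator
            "low_volume": rng.integers(0, 2, n),                  # low-volume surgeon & hospital
            "age": rng.normal(62.0, 10.0, n),
            "comorbidity_score": rng.integers(0, 4, n),
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="time_years", event_col="died")
        cph.print_summary()   # hazard ratios (exp(coef)) with confidence intervals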

  1. Quantitative metagenomic analyses based on average genome size normalization

    DEFF Research Database (Denmark)

    Frank, Jeremy Alexander; Sørensen, Søren Johannes

    2011-01-01

    provide not just a census of the community members but direct information on metabolic capabilities and potential interactions among community members. Here we introduce a method for the quantitative characterization and comparison of microbial communities based on the normalization of metagenomic data...... marine sources using both conventional small-subunit (SSU) rRNA gene analyses and our quantitative method to calculate the proportion of genomes in each sample that are capable of a particular metabolic trait. With both environments, to determine what proportion of each community they make up and how......). These analyses demonstrate how genome proportionality compares to SSU rRNA gene relative abundance and how factors such as average genome size and SSU rRNA gene copy number affect sampling probability and therefore both types of community analysis....
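
    The normalization idea named in the title — expressing gene counts per "genome equivalent" rather than per read — reduces to a short calculation. The sketch below uses made-up numbers purely to show the arithmetic; it is not the authors' pipeline and assumes the trait is encoded by a single-copy gene.

        # Express hits to a single-copy trait gene as a fraction of genome equivalents.
        total_bp_sequenced = 2.5e9       # total metagenome size (bp), assumed
        average_genome_size = 4.2e6      # estimated average genome size (bp), assumed
        genome_equivalents = total_bp_sequenced / average_genome_size

        trait_gene_hits = 310.0          # length-normalized hits to the trait gene, assumed

        fraction_with_trait = trait_gene_hits / genome_equivalents
        print(f"{genome_equivalents:.0f} genome equivalents; "
              f"{fraction_with_trait:.1%} of genomes carry the trait")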

  2. Process for carrying out analyses based on concurrent reactions

    Energy Technology Data Exchange (ETDEWEB)

    Glover, J S; Shepherd, B P

    1980-01-03

    The invention refers to a process for carrying out analyses based on concurrent (competing) reactions. A portion of the compound to be analysed, together with a standard quantity of the same compound in labelled form, is subjected to a common reaction with a standard quantity of a reagent, which must be less than the sum of the two portions of the reacting compound. The portions of the labelled starting compound and of the labelled final compound resulting from the competition are separated in a tube (e.g. by centrifuging) after a forced phase change (precipitation, absorption, etc.), and the radioactivity of the two phases in contact is measured separately. The shielded measuring device developed for this purpose, suitable for centrifuge tubes of known dimensions, is also included in the patent claims. The measurement of the insulin concentration of a defined serum is given as an example of the applications of the method (radioimmunoassay).

  3. Delineating Potential Karst Water-Bearing Structures based on Resistivity Anomalies and Microtremor Analyses-A Case Study in Yunnan Province, China

    Science.gov (United States)

    Gan, F.; Su, C.; Liu, W.; Zhao, W.

    2016-12-01

    Heterogeneity, anisotropy and rugged landforms make it challenging for geophysicists to locate drilling sites by profiling water-bearing structures in karst regions. If only one geophysical method is used to achieve this objective, low-resistivity anomalies interpreted as water-rich zones could actually be zones rich in marl and shale. In this study, integrated geophysical methods were used to locate a favorable drilling position for the provision of karst water to Juede village, which had been experiencing severe water shortages over a prolonged period. According to site conditions and hydrogeological data, appropriate geophysical profiles were conducted, approximately perpendicular to the direction of groundwater flow. In general, significant changes in resistivity occur between water-filled caves/fractures and competent rocks. Thus, electrical and electromagnetic methods have been widely applied to search for karst groundwater indirectly. First, electrical resistivity tomography was carried out to discern shallow resistivity distributions within the profile, where the low-resistivity anomalies were of most interest. Second, one short audio-frequency magnetotelluric profile was used to ascertain the vertical and horizontal extent of these low-resistivity anomalies. Third, the microtremor H/V spectral ratio method was applied to identify potential water-bearing structures from the low-resistivity anomalies and to differentiate them from the interference of marl and shale with low resistivity. Finally, anomaly depths were estimated by interpreting Schlumberger sounding data to determine an optimal drilling site. The study shows that karst hydrogeology and geophysical methods can be effectively integrated for the purposes of karst groundwater exploration.
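
    The microtremor H/V spectral ratio used in the third step is essentially the smoothed ratio of the horizontal to vertical amplitude spectra of three-component ambient-noise recordings. A minimal sketch of that computation, assuming equal-length traces sampled at fs Hz; it is not the survey's actual processing chain.

    ```python
    # Hedged sketch of a Nakamura-style H/V spectral ratio from three-component
    # ambient-noise traces (north, east, vertical), all the same length.
    import numpy as np

    def hv_spectral_ratio(north, east, vertical, fs, smooth=11):
        """Return frequencies and the horizontal-to-vertical spectral ratio."""
        n = len(vertical)
        window = np.hanning(n)
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)

        def amp_spectrum(trace):
            spec = np.abs(np.fft.rfft((trace - np.mean(trace)) * window))
            kernel = np.ones(smooth) / smooth          # simple moving-average smoothing
            return np.convolve(spec, kernel, mode="same")

        h = np.sqrt(amp_spectrum(north) * amp_spectrum(east))  # geometric mean of horizontals
        v = amp_spectrum(vertical)
        return freqs, h / np.maximum(v, 1e-12)
    ```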

  4. Developments based on stochastic and determinist methods for studying complex nuclear systems; Developpements utilisant des methodes stochastiques et deterministes pour l'analyse de systemes nucleaires complexes

    Energy Technology Data Exchange (ETDEWEB)

    Giffard, F.X

    2000-05-19

    In the field of reactor and fuel cycle physics, particle transport plays an important role. Neutronic design, operation and evaluation calculations of nuclear systems make use of large and powerful computer codes. However, current limitations in terms of computer resources make it necessary to introduce simplifications and approximations in order to keep calculation time and cost within reasonable limits. Two different types of methods are available in these codes. The first one is the deterministic method, which is applicable in most practical cases but requires approximations. The other method is the Monte Carlo method, which does not make these approximations but which generally requires exceedingly long running times. The main motivation of this work is to investigate the possibility of a combined use of the two methods in such a way as to retain their advantages while avoiding their drawbacks. Our work has mainly focused on the speed-up of 3-D continuous energy Monte Carlo calculations (TRIPOLI-4 code) by means of an optimized biasing scheme derived from importance maps obtained from the deterministic code ERANOS. The application of this method to two different practical shielding-type problems has demonstrated its efficiency: speed-up factors of 100 have been reached. In addition, the method offers the advantage of being easily implemented as it is not very sensitive to the choice of the importance mesh grid. It has also been demonstrated that significant speed-ups can be achieved by this method in the case of coupled neutron-gamma transport problems, provided that the interdependence of the neutron and photon importance maps is taken into account. Complementary studies are necessary to tackle a problem brought out by this work, namely undesirable jumps in the Monte Carlo variance estimates. (author)
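
    The speed-up described above rests on importance sampling: histories are drawn from a biased distribution that favours the rare, important events, and each history carries a weight equal to the ratio of the true to the biased density so the estimate stays unbiased. A toy 1-D illustration of that principle (a purely absorbing slab with invented parameters; not the TRIPOLI-4/ERANOS scheme itself):

    ```python
    # Hedged sketch: deep-penetration transmission estimated by analog Monte Carlo
    # and by importance sampling of the free-flight distance with statistical weights.
    import numpy as np

    rng = np.random.default_rng(0)
    sigma_t, thickness = 1.0, 10.0          # cross-section and slab depth (mean-free-path units)
    exact = np.exp(-sigma_t * thickness)    # analytic transmission for a purely absorbing slab

    def analog(n):
        flights = rng.exponential(1.0 / sigma_t, n)
        return np.mean(flights > thickness)

    def biased(n, sigma_b=0.3):
        """Sample flights from a stretched exponential and reweight each history."""
        flights = rng.exponential(1.0 / sigma_b, n)
        weights = (sigma_t / sigma_b) * np.exp(-(sigma_t - sigma_b) * flights)
        return np.mean(weights * (flights > thickness))

    print(exact, analog(100_000), biased(100_000))  # biased estimate has far lower variance
    ```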

  6. Unconscious analyses of visual scenes based on feature conjunctions.

    Science.gov (United States)

    Tachibana, Ryosuke; Noguchi, Yasuki

    2015-06-01

    To efficiently process a cluttered scene, the visual system analyzes statistical properties or regularities of visual elements embedded in the scene. It is controversial, however, whether those scene analyses could also work for stimuli unconsciously perceived. Here we show that our brain performs the unconscious scene analyses not only using a single featural cue (e.g., orientation) but also based on conjunctions of multiple visual features (e.g., combinations of color and orientation information). Subjects foveally viewed a stimulus array (duration: 50 ms) where 4 types of bars (red-horizontal, red-vertical, green-horizontal, and green-vertical) were intermixed. Although a conscious perception of those bars was inhibited by a subsequent mask stimulus, the brain correctly analyzed the information about color, orientation, and color-orientation conjunctions of those invisible bars. The information of those features was then used for the unconscious configuration analysis (statistical processing) of the central bars, which induced a perceptual bias and illusory feature binding in visible stimuli at peripheral locations. While statistical analyses and feature binding are normally 2 key functions of the visual system to construct coherent percepts of visual scenes, our results show that a high-level analysis combining those 2 functions is correctly performed by unconscious computations in the brain. (c) 2015 APA, all rights reserved.

  7. Vocational Teachers and Professionalism - A Model Based on Empirical Analyses

    DEFF Research Database (Denmark)

    Duch, Henriette Skjærbæk; Andreasen, Karen E

    Vocational Teachers and Professionalism - A Model Based on Empirical Analyses. Several theorists have developed models to illustrate the processes of adult learning and professional development (e.g. Illeris, Argyris, Engeström; Wahlgren & Aarkorg, Kolb and Wenger). Models can sometimes be criticized … emphasis on the adult employee, the organization, its surroundings as well as other contextual factors. Our concern is adult vocational teachers attending a pedagogical course and teaching at vocational colleges. The aim of the paper is to discuss different models and develop a model concerning teachers … at vocational colleges based on empirical data in a specific context, a vocational teacher-training course in Denmark. By offering a basis and concepts for the analysis of practice, such a model is meant to support the development of vocational teachers' professionalism at courses and in organizational contexts …

  8. Methods for analysing cardiovascular studies with repeated measures

    NARCIS (Netherlands)

    Cleophas, T. J.; Zwinderman, A. H.; van Ouwerkerk, B. M.

    2009-01-01

    Background. Repeated measurements in a single subject are generally more similar than unrepeated measurements in different subjects. Unrepeated analyses of repeated data cause underestimation of the treatment effects. Objective. To review methods adequate for the analysis of cardiovascular studies
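
    One standard way to avoid the underestimation described above is a linear mixed-effects model with a random intercept per subject, so that within-subject correlation is modelled rather than ignored. A hedged sketch with statsmodels; the long-format column names (sbp, treatment, visit, subject) are hypothetical.

    ```python
    # Hedged sketch: random-intercept mixed model for repeated cardiovascular
    # measurements. Column names are hypothetical stand-ins, not from the review.
    import pandas as pd
    import statsmodels.formula.api as smf

    def fit_repeated_measures(df: pd.DataFrame):
        """Treatment effect on systolic blood pressure, accounting for within-subject correlation."""
        model = smf.mixedlm("sbp ~ treatment + visit", data=df, groups=df["subject"])
        return model.fit()

    # usage: result = fit_repeated_measures(long_format_df); print(result.summary())
    ```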

  9. The MAFLA (Mississippi, Alabama, Florida) Study, Grain Size Analyses

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The MAFLA (Mississippi, Alabama, Florida) Study was funded by NOAA as part of the Outer Continental Shelf Program. Dr. L.J. Doyle produced grain size analyses in the...

  10. Operational Satellite-based Surface Oil Analyses (Invited)

    Science.gov (United States)

    Streett, D.; Warren, C.

    2010-12-01

    During the Deepwater Horizon spill, NOAA imagery analysts in the Satellite Analysis Branch (SAB) issued more than 300 near-real-time satellite-based oil spill analyses. These analyses were used by the oil spill response community for planning, issuing surface oil trajectories and tasking assets (e.g., oil containment booms, skimmers, overflights). SAB analysts used both Synthetic Aperture Radar (SAR) and high resolution visible/near IR multispectral satellite imagery as well as a variety of ancillary datasets. Satellite imagery used included ENVISAT ASAR (ESA), TerraSAR-X (DLR), Cosmo-Skymed (ASI), ALOS (JAXA), Radarsat (MDA), ENVISAT MERIS (ESA), SPOT (SPOT Image Corp.), Aster (NASA), MODIS (NASA), and AVHRR (NOAA). Ancillary datasets included ocean current information, wind information, location of natural oil seeps and a variety of in situ oil observations. The analyses were available as JPEGs, PDFs, shapefiles and Google KML files, and were also posted on a variety of websites including Geoplatform and ERMA. The complete archive, from the very first analysis issued just 5 hours after the rig sank through the final analysis issued in August, is still publicly available on the NOAA/NESDIS website http://www.ssd.noaa.gov/PS/MPS/deepwater.html SAB personnel also served as the Deepwater Horizon International Disaster Charter Project Manager (at the official request of the USGS). The Project Manager's primary responsibility was to acquire and oversee the processing and dissemination of satellite data generously donated by numerous private companies and nations in support of the oil spill response, including some of the imagery described above. SAB has begun to address a number of goals that will improve our routine oil spill response as well as help assure that we are ready for the next spill of national significance. We hope to (1) secure a steady, abundant and timely stream of suitable satellite imagery even in the absence of large-scale emergencies such as

  11. Seismic risk analyses in the German Risk Study, phase B

    International Nuclear Information System (INIS)

    Hosser, D.; Liemersdorf, H.

    1991-01-01

    The paper discusses some aspects of the seismic risk part of the German Risk Study for Nuclear Power Plants, Phase B. First simplified analyses in Phase A of the study allowed only a rough classification of structures and systems of the PWR reference plant according to their seismic risk contribution. These studies were extended in Phase B using improved models for the dynamic analyses of buildings, structures and components as well as for the probabilistic analyses of seismic loading, failure probabilities and event trees. The methodology of deriving probabilistic seismic load descriptions is explained and compared with the methods in Phase A of the study and in other studies. Some details of the linear and nonlinear dynamic analyses of structures are reported in order to demonstrate the influence of different assumptions for material behaviour and failure criteria. The probabilistic structural and event tree analyses are discussed with respect to distribution assumptions, acceptable simplifications and model uncertainties. Some results for the PWR reference plant are given. (orig.)

  12. Trajectory data analyses for pedestrian space-time activity study.

    Science.gov (United States)

    Qi, Feng; Du, Fei

    2013-02-25

    It is well recognized that human movement in the spatial and temporal dimensions has direct influence on disease transmission(1-3). An infectious disease typically spreads via contact between infected and susceptible individuals in their overlapped activity spaces. Therefore, daily mobility-activity information can be used as an indicator to measure exposures to risk factors of infection. However, a major difficulty and thus the reason for paucity of studies of infectious disease transmission at the micro scale arise from the lack of detailed individual mobility data. Previously in transportation and tourism research detailed space-time activity data often relied on the time-space diary technique, which requires subjects to actively record their activities in time and space. This is highly demanding for the participants and collaboration from the participants greatly affects the quality of data(4). Modern technologies such as GPS and mobile communications have made possible the automatic collection of trajectory data. The data collected, however, is not ideal for modeling human space-time activities, limited by the accuracies of existing devices. There is also no readily available tool for efficient processing of the data for human behavior study. We present here a suite of methods and an integrated ArcGIS desktop-based visual interface for the pre-processing and spatiotemporal analyses of trajectory data. We provide examples of how such processing may be used to model human space-time activities, especially with error-rich pedestrian trajectory data, that could be useful in public health studies such as infectious disease transmission modeling. The procedure presented includes pre-processing, trajectory segmentation, activity space characterization, density estimation and visualization, and a few other exploratory analysis methods. Pre-processing is the cleaning of noisy raw trajectory data. We introduce an interactive visual pre-processing interface as well as an
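
    Two of the processing steps listed above, trajectory segmentation and density estimation of the activity space, are easy to express in code. The hedged sketch below assumes a table of GPS fixes with hypothetical columns t (seconds), x and y (projected coordinates); the paper's own tool is an ArcGIS desktop interface, not this code.

    ```python
    # Hedged sketch: split a GPS trajectory into trips at large time gaps and
    # estimate an activity-space density surface with a Gaussian KDE.
    import numpy as np
    import pandas as pd
    from scipy.stats import gaussian_kde

    def segment_trajectory(points: pd.DataFrame, max_gap_s: float = 300.0) -> list:
        """Split ordered GPS fixes into segments wherever the time gap exceeds max_gap_s."""
        points = points.sort_values("t").reset_index(drop=True)
        gap = points["t"].diff().fillna(0.0)
        segment_id = (gap > max_gap_s).cumsum()
        return [seg for _, seg in points.groupby(segment_id)]

    def activity_space_density(points: pd.DataFrame, grid_size: int = 100):
        """Gaussian KDE of visited locations evaluated on a regular grid."""
        kde = gaussian_kde(np.vstack([points["x"], points["y"]]))
        xs = np.linspace(points["x"].min(), points["x"].max(), grid_size)
        ys = np.linspace(points["y"].min(), points["y"].max(), grid_size)
        gx, gy = np.meshgrid(xs, ys)
        density = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)
        return gx, gy, density
    ```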

  13. FY01 Supplemental Science and Performance Analyses, Volume 1: Scientific Bases and Analyses, Part 1 and 2

    International Nuclear Information System (INIS)

    Dobson, David

    2001-01-01

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S and ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S and ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S and ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23 013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054] [DIRS 124754]). By making the large amount of information developed on Yucca Mountain available in stages, the DOE intends to provide the public and interested parties with time to review the available materials and to formulate

  14. Conducting Meta-Analyses Based on p Values

    Science.gov (United States)

    van Aert, Robbie C. M.; Wicherts, Jelte M.; van Assen, Marcel A. L. M.

    2016-01-01

    Because of overwhelming evidence of publication bias in psychology, techniques to correct meta-analytic estimates for such bias are greatly needed. The methodology on which the p-uniform and p-curve methods are based has great promise for providing accurate meta-analytic estimates in the presence of publication bias. However, in this article, we show that in some situations, p-curve behaves erratically, whereas p-uniform may yield implausible estimates of negative effect size. Moreover, we show that (and explain why) p-curve and p-uniform result in overestimation of effect size under moderate-to-large heterogeneity and may yield unpredictable bias when researchers employ p-hacking. We offer hands-on recommendations on applying and interpreting results of meta-analyses in general and p-uniform and p-curve in particular. Both methods as well as traditional methods are applied to a meta-analysis on the effect of weight on judgments of importance. We offer guidance for applying p-uniform or p-curve using R and a user-friendly web application for applying p-uniform. PMID:27694466
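
    The key idea behind p-uniform can be sketched compactly: conditional on statistical significance, the p-values of the included studies are uniformly distributed at the true effect size, so one estimator searches for the effect at which the sum of their negative logs equals its expectation. The simplified sketch below assumes normally distributed effect sizes with known standard errors and one-sided tests at alpha = .05; it illustrates the principle and is not the authors' p-uniform R package or web application.

    ```python
    # Hedged, simplified sketch of the p-uniform principle for normally distributed
    # effect sizes with known standard errors (one-sided tests, alpha = .05).
    import numpy as np
    from scipy.stats import norm
    from scipy.optimize import brentq

    def p_uniform_estimate(effects, ses, alpha=0.05):
        effects, ses = np.asarray(effects, float), np.asarray(ses, float)
        crit = norm.ppf(1 - alpha) * ses           # significance thresholds on the effect scale
        keep = effects > crit                      # p-uniform uses only significant studies
        y, se, yc = effects[keep], ses[keep], crit[keep]
        k = len(y)

        def gap(delta):
            # log conditional p-values: log P(Y > y | Y > crit) under effect delta
            log_q = norm.logsf((y - delta) / se) - norm.logsf((yc - delta) / se)
            return -np.sum(log_q) - k              # zero when -log(q) sums to its expectation k

        return brentq(gap, -10.0, 10.0)

    # Illustrative data only: three significant studies (standardized effects, SEs).
    print(p_uniform_estimate([0.45, 0.60, 0.38], [0.18, 0.22, 0.15]))
    ```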

  15. PC based 8K multichannel analyser for nuclear spectroscopy

    International Nuclear Information System (INIS)

    Jain, S.K.; Gupta, J.D.; Suman Kumari, B.

    1989-01-01

    An IBM-PC based 8K multichannel analyser (MCA) has been developed which incorporates all the features of an advanced system, such as very high throughput for data acquisition in both PHA and MCS modes, fast real-time display, extensive display manipulation facilities, various preset controls and concurrent data processing. The compact system hardware consists of a 2 bit wide NIM module and a PC add-on card. Because of the external acquisition hardware, the system, after initial programming by the PC, can acquire data independently, allowing the PC to be switched off. To attain very high throughput, the most desirable feature of an MCA, a dual-port memory architecture has been used. The asymmetric dual-port RAM housed in the NIM module offers 24-bit parallel access to the ADC and 8-bit wide access to the PC, which results in a fast real-time histogram display on the monitor. The PC emulation software is menu driven and user friendly. It integrates a comprehensive set of commonly required application routines for concurrent data processing. After the transfer of know-how to the Electronic Corporation of India Ltd. (ECIL), this system is being produced at ECIL. (author). 5 refs., 4 figs

  16. Progress Report on Computational Analyses of Water-Based NSTF

    Energy Technology Data Exchange (ETDEWEB)

    Lv, Q. [Argonne National Lab. (ANL), Argonne, IL (United States); Kraus, A. [Argonne National Lab. (ANL), Argonne, IL (United States); Hu, R. [Argonne National Lab. (ANL), Argonne, IL (United States); Bucknor, M. [Argonne National Lab. (ANL), Argonne, IL (United States); Lisowski, D. [Argonne National Lab. (ANL), Argonne, IL (United States); Nunez, D. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-08-01

    CFD analysis has been focused on important component-level phenomena using STARCCM+ to supplement the system analysis of integral system behavior. A notable area of interest was the cavity region. This area is of particular interest for CFD analysis due to the multi-dimensional flow and complex heat transfer (thermal radiation heat transfer and natural convection), which are not simulated directly by RELAP5. CFD simulations allow for the estimation of the boundary heat flux distribution along the riser tubes, which is needed in the RELAP5 simulations. The CFD results can also provide additional data to help establish what level of modeling detail is necessary in RELAP5. It was found that the flow profiles in the cavity region are simpler for the water-based concept than for the air-cooled concept. The local heat flux noticeably increases axially, and is higher in the fins than in the riser tubes. These results were utilized in RELAP5 simulations as boundary conditions, to provide better temperature predictions in the system level analyses. It was also determined that temperatures were higher in the fins than the riser tubes, but within design limits for thermal stresses. Higher temperature predictions were identified in the edge fins, in part due to additional thermal radiation from the side cavity walls.

  17. RELAP5 analyses and support of Oconee-1 PTS studies

    International Nuclear Information System (INIS)

    Charlton, T.R.

    1983-01-01

    The integrity of a reactor vessel during a severe overcooling transient with primary system pressurization is a current safety concern and has been identified as Unresolved Safety Issue (USI) A-49 by the US Nuclear Regulatory Commission (NRC). Resolution of USI A-49, denoted as Pressurized Thermal Shock (PTS), is being examined by the US NRC-sponsored PTS integration study. In support of this study, the Idaho National Engineering Laboratory (INEL) has performed RELAP5/MOD1.5 thermal-hydraulic analyses of selected overcooling transients. These transient analyses were performed for the Oconee-1 pressurized water reactor (PWR), which is a Babcock and Wilcox-designed nuclear steam supply system

  18. Age and gender effects on normal regional cerebral blood flow studied using two different voxel-based statistical analyses; Effets de l'age et du genre sur la perfusion cerebrale regionale etudiee par deux methodes d'analyse statistique voxel-par-voxel

    Energy Technology Data Exchange (ETDEWEB)

    Pirson, A.S.; George, J.; Krug, B.; Vander Borght, T. [Universite Catholique de Louvain, Service de Medecine Nucleaire, Cliniques Universitaires de Mont-Godinne, Yvoir (Belgium); Van Laere, K. [Leuven Univ. Hospital, Nuclear Medicine Div. (Belgium); Jamart, J. [Universite Catholique de Louvain, Dept. de Biostatistiques, Cliniques Universitaires de Mont-Godinne, Yvoir (Belgium); D' Asseler, Y. [Ghent Univ., Medical Signal and Image Processing Dept. (MEDISIP), Faculty of applied sciences (Belgium); Minoshima, S. [Washington Univ., Dept. of Radiology, Seattle (United States)

    2009-10-15

    Fully automated analysis programs have been applied more and more to aid the reading of regional cerebral blood flow SPECT studies. They are increasingly based on the comparison of the patient study with a normal database. In this study, we evaluate the ability of Three-Dimensional Stereotactic Surface Projection (3D-SSP) to isolate effects of age and gender in a previously studied normal population. The results were also compared with those obtained using Statistical Parametric Mapping (SPM99). Methods: Eighty-nine 99mTc-ECD SPECT studies performed in carefully screened healthy volunteers (46 females, 43 males; age 20-81 years) were analysed using 3D-SSP. A multivariate analysis based on the general linear model was performed with regions as the intra-subject factor, gender as the inter-subject factor and age as a covariate. Results: Both age and gender had a significant interaction effect with regional tracer uptake. An age-related decline (p < 0.001) was found in the anterior cingulate gyrus, left frontal association cortex and left insula. Bilateral occipital association and left primary visual cortical uptake showed a significant relative increase with age (p < 0.001). Concerning the gender effect, women showed higher uptake (p < 0.01) in the parietal and right sensorimotor cortices. An age by gender interaction (p < 0.01) was only found in the left medial frontal cortex. The results were consistent with those obtained with SPM99. Conclusion: 3D-SSP analysis of normal rCBF variability is consistent with the literature and other automated voxel-based techniques, which highlight the effects of both age and gender. (authors)
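
    The general-linear-model step described in the Methods amounts to regressing regional uptake on age (covariate) and gender (factor). A hedged per-region version with statsmodels; the long-format column names (uptake, age, gender, region) are hypothetical stand-ins for the study's data structure.

    ```python
    # Hedged sketch: per-region ordinary least squares with age as covariate and
    # gender as factor. Column names are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    def regional_age_gender_effects(df: pd.DataFrame) -> pd.DataFrame:
        """Return per-region age slopes and gender contrasts with p-values."""
        rows = []
        for region, sub in df.groupby("region"):
            fit = smf.ols("uptake ~ age + C(gender)", data=sub).fit()
            rows.append({
                "region": region,
                "age_slope": fit.params["age"],
                "age_p": fit.pvalues["age"],
                "gender_diff": fit.params.filter(like="C(gender)").iloc[0],
                "gender_p": fit.pvalues.filter(like="C(gender)").iloc[0],
            })
        return pd.DataFrame(rows)
    ```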

  19. Monte Carlo parameter studies and uncertainty analyses with MCNP5

    International Nuclear Information System (INIS)

    Brown, F. B.; Sweezy, J. E.; Hayes, R.

    2004-01-01

    A software tool called mcnp_pstudy has been developed to automate the setup, execution, and collection of results from a series of MCNP5 Monte Carlo calculations. This tool provides a convenient means of performing parameter studies, total uncertainty analyses, parallel job execution on clusters, stochastic geometry modeling, and other types of calculations where a series of MCNP5 jobs must be performed with varying problem input specifications. (authors)
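
    In spirit, such a tool expands a templated input deck over a grid of parameter values and launches one MCNP5 run per combination. The sketch below is a generic stand-in for that workflow; the template file, its placeholders and the "mcnp5" command line are illustrative assumptions, not mcnp_pstudy's actual syntax.

    ```python
    # Hedged sketch of a parameter-study driver: write one input deck per parameter
    # combination from a template and launch the runs. All file names, placeholders
    # and the executable name are hypothetical.
    import itertools
    import pathlib
    import subprocess

    TEMPLATE = pathlib.Path("input.tmpl").read_text()   # contains "{radius}" and "{density}"

    radii = [8.0, 8.5, 9.0]
    densities = [18.7, 18.9]

    for i, (r, d) in enumerate(itertools.product(radii, densities)):
        deck = TEMPLATE.format(radius=r, density=d)
        case = pathlib.Path(f"case_{i:03d}.inp")
        case.write_text(deck)
        # one job per combination; on a cluster this line would be a queue submission
        subprocess.run(["mcnp5", f"i={case}"], check=False)
    ```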

  20. The Network for Analysing Longitudinal Population-based HIV/AIDS data on Africa (ALPHA): Data on mortality, by HIV status and stage on the HIV care continuum, among the general population in seven longitudinal studies between 1989 and 2014.

    Science.gov (United States)

    Slaymaker, Emma; McLean, Estelle; Wringe, Alison; Calvert, Clara; Marston, Milly; Reniers, Georges; Kabudula, Chodziwadziwa Whiteson; Crampin, Amelia; Price, Alison; Michael, Denna; Urassa, Mark; Kwaro, Daniel; Sewe, Maquins; Eaton, Jeffrey W; Rhead, Rebecca; Nakiyingi-Miiro, Jessica; Lutalo, Tom; Nabukalu, Dorean; Herbst, Kobus; Hosegood, Victoria; Zaba, Basia

    2017-11-06

    Timely progression of people living with HIV (PLHIV) from the point of infection through the pathway from diagnosis to treatment is important in ensuring effective care and treatment of HIV and preventing HIV-related deaths and onwards transmission of infection.  Reliable, population-based estimates of new infections are difficult to obtain for the generalised epidemics in sub-Saharan Africa.  Mortality data indicate disease burden and, if disaggregated along the continuum from diagnosis to treatment, can also reflect the coverage and quality of different HIV services.  Neither routine statistics nor observational clinical studies can estimate mortality prior to linkage to care nor following disengagement from care.  For this, population-based data are required. The Network for Analysing Longitudinal Population-based HIV/AIDS data on Africa brings together studies in Kenya, Malawi, South Africa, Tanzania, Uganda, and Zimbabwe.  Eight studies have the necessary data to estimate mortality by HIV status, and seven can estimate mortality at different stages of the HIV care continuum.  This data note describes a harmonised dataset containing anonymised individual-level information on survival by HIV status for adults aged 15 and above. Among PLHIV, the dataset provides information on survival during different periods: prior to diagnosis of infection; following diagnosis but before linkage to care; in pre-antiretroviral treatment (ART) care; in the first six months after ART initiation; among people continuously on ART for 6+ months; and among people who have ever interrupted ART.

  1. Simulation-based Investigations of Electrostatic Beam Energy Analysers

    CERN Document Server

    Pahl, Hannes

    2015-01-01

    An energy analyser is needed to measure the beam energy profile behind the REX-EBIS at ISOLDE. The device should be able to operate with an accuracy of 1 V at voltages up to 30 kV. In order to find a working concept for an electrostatic energy analyser, different designs were evaluated with simulations. A spherical device and its design issues are presented. The potential deformation effects of grids at high voltages and their influence on the energy resolution were investigated. First tests were made with a grid-free ring-electrode device and show promising results.

  2. Economic evaluation of algae biodiesel based on meta-analyses

    Science.gov (United States)

    Zhang, Yongli; Liu, Xiaowei; White, Mark A.; Colosi, Lisa M.

    2017-08-01

    The objective of this study is to elucidate the economic viability of algae-to-energy systems at a large scale, by developing a meta-analysis of five previously published economic evaluations of systems producing algae biodiesel. Data from original studies were harmonised into a standardised framework using financial and technical assumptions. Results suggest that the selling price of algae biodiesel under the base case would be $5.00-10.31/gal, higher than the selected benchmarks: $3.77/gal for petroleum diesel, and $4.21/gal for commercial biodiesel (B100) from conventional vegetable oil or animal fat. However, the projected selling price of algal biodiesel ($2.76-4.92/gal), following anticipated improvements, would be competitive. A scenario-based sensitivity analysis reveals that the price of algae biodiesel is most sensitive to algae biomass productivity, algae oil content, and algae cultivation cost. This indicates that the improvements in the yield, quality, and cost of algae feedstock could be the key factors to make algae-derived biodiesel economically viable.
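
    The scenario-based sensitivity analysis mentioned in the results can be illustrated with a deliberately simple unit-cost model, varying one driver at a time around a base case. The cost structure and every number below are placeholders, not the harmonised framework or data of the study.

    ```python
    # Hedged sketch: one-at-a-time sensitivity of a toy algal-biodiesel selling
    # price to productivity, oil content and cultivation cost. Illustrative only.
    def price_per_gal(productivity_g_m2_d, oil_content, cultivation_usd_m2_yr,
                      conversion_usd_gal=1.0, gal_oil_per_kg=0.3):
        biomass_kg_m2_yr = productivity_g_m2_d * 365 / 1000.0
        gal_per_m2_yr = biomass_kg_m2_yr * oil_content * gal_oil_per_kg
        return cultivation_usd_m2_yr / gal_per_m2_yr + conversion_usd_gal

    base = dict(productivity_g_m2_d=20.0, oil_content=0.25, cultivation_usd_m2_yr=2.0)
    print("base case:", round(price_per_gal(**base), 2), "USD/gal")

    # vary each driver +/-30% while holding the others at base values
    for key in base:
        for factor in (0.7, 1.3):
            scenario = dict(base, **{key: base[key] * factor})
            print(f"{key} x{factor}:", round(price_per_gal(**scenario), 2), "USD/gal")
    ```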

  3. Techniques for Scaling Up Analyses Based on Pre-interpretations

    DEFF Research Database (Denmark)

    Gallagher, John Patrick; Henriksen, Kim Steen; Banda, Gourinath

    2005-01-01

    a variety of analyses, both generic (such as mode analysis) and program-specific (with respect to a type describing some particular property of interest). Previous work demonstrated the approach using pre-interpretations over small domains. In this paper we present techniques that allow the method...

  4. Orbitrap-based mass analyser for in-situ characterization of asteroids: ILMA, Ion Laser Mass Analyser

    Science.gov (United States)

    Briois, C.; Cotti, H.; Thirkell, L.; French Space Orbitrap Consortium [Aradj, K.; Bouabdellah, A.; Boukrara, A.; Carrasco, N.; Chalumeau, G.; Chapelon, O.; Colin, F.; Coll, P.; Engrand, C.; Grand, N.; Kukui, A.; Lebreton, J.-P.; Pennanech, C.; Szopa, C.; Thissen, R.; Vuitton, V.; Zapf, P.]; Makarov, A.

    2014-07-01

    For about a decade, the boundaries between comets and carbonaceous asteroids have been fading [1,2]. The Rosetta mission will no doubt bring a new wealth of data on the composition of comets. But as promising as it may look, the mass resolving power of the mass spectrometers onboard (so far the best on a space mission) will only partially account for the diversity of chemical structures present. ILMA (Ion-Laser Mass Analyser) is a new-generation high mass resolution LDI-MS (Laser Desorption-Ionization Mass Spectrometer) instrument concept using the Orbitrap technique, which has been developed in the frame of the two Marco Polo & Marco Polo-R proposals to the ESA Cosmic Vision program. Flagged by ESA as an instrument concept of interest for the mission in 2012, it has been under study for a few years in the frame of a Research and Technology (R&T) development programme between 5 French laboratories (LPC2E, IPAG, LATMOS, LISA, CSNSM) [3,4], partly funded by the French Space Agency (CNES). The work is undertaken in close collaboration with the Thermo Fisher Scientific Company, which commercialises Orbitrap-based laboratory instruments. The R&T activities are currently concentrating on the core elements of the Orbitrap analyser that are required to reach a sufficient maturity level for allowing design studies of future space instruments. A prototype is under development at LPC2E and a mass resolution (m/Δm FWHM) of 100,000 has been obtained at m/z = 150 for a background pressure of 10^{-8} mbar. ILMA would be a key instrument to measure the molecular, elemental and isotopic composition of objects such as carbonaceous asteroids, comets, or other bodies devoid of atmosphere such as the surface of an icy satellite, the Moon, or Mercury.

  5. Analysing Leontiev Tube Capabilities in the Space-based Plants

    Directory of Open Access Journals (Sweden)

    N. L. Shchegolev

    2017-01-01

    Full Text Available The paper presents a review of publications dedicated to the gas-dynamic temperature stratification device (the Leontiev tube) and shows the main factors affecting its efficiency. It describes an experimental installation used to obtain data on the magnitude of energy separation in air in order to demonstrate the operability of this device. The assumption that there is an optimal relationship between the flow velocities in the subsonic and supersonic channels of the gas-dynamic temperature stratification device is experimentally confirmed. The paper analyses possible ways to raise the efficiency of power plants of various basing (including space basing) and shows that, currently, the mainstream approach to increasing the efficiency of their operation is to complicate design solutions. A scheme of a closed gas-turbine space-based plant using a mixture of inert gases (a helium-xenon one) for operation is proposed. It differs from the simplest variants in the absence of a cooler-radiator and in the integration of a gas-dynamic temperature stratification device and a heat compressor. Based on the equations of one-dimensional gas dynamics, it is shown that the ability to restore total pressure while removing heat in the thermal compressor determines the operating capability of this scheme. An exploratory study of creating a heat compressor is performed, and it is shown that when operating on gases with a Prandtl number close to 1, the total pressure does not increase. The operating capability conditions of the heat compressor are operation on gases with a low value of the Prandtl number (a helium-xenon mixture) at high supersonic velocities and with a longitudinal pressure gradient available. It is shown that there is a region of low Prandtl numbers (Pr < 0.3) for which, with a longitudinal pressure gradient available in the supersonic flow of a viscous gas, the total pressure can be restored.

  6. Agreement between the results of meta-analyses from case reports and from clinical studies regarding the efficacy of laronidase therapy in patients with mucopolysaccharidosis type I who initiated enzyme replacement therapy in adult age: An example of case reports meta-analyses as an useful tool for evidence-based medicine in rare diseases.

    Science.gov (United States)

    Sampayo-Cordero, Miguel; Miguel-Huguet, Bernat; Pardo-Mateos, Almudena; Moltó-Abad, Marc; Muñoz-Delgado, Cecilia; Pérez-López, Jordi

    2018-02-01

    Case reports might have a prominent role in the rare diseases field, due to the small number of patients affected by one such disease. A previous systematic review regarding the efficacy of laronidase therapy in patients with mucopolysaccharidosis type I (MPS-I) who initiated enzyme replacement therapy (ERT) in adult age has been published. The review included a meta-analysis of 19 clinical studies and the description of eleven case reports. It was of interest to perform a meta-analysis of those case reports to explore the role of such meta-analyses as a tool for evidence-based medicine in rare diseases. The study included all case reports with a standard treatment regimen. The primary analysis was the percentage of case reports showing an improvement in a specific outcome; the improvement was confirmed as such only when that percentage was statistically higher than 5%. The outcomes that met this criterion were ranked and compared to the GRADE criteria obtained by those same outcomes in the previous meta-analysis of clinical studies. Three outcomes showed a significant improvement: urine glycosaminoglycans, liver volume and the 6-minute walking test. Positive and negative predictive values, sensitivity and specificity for the results of the meta-analysis of case reports as compared to that of clinical studies were 100%, 88.9%, 75% and 100%, respectively. Accordingly, absolute (Rho=0.82, 95%CI: 0.47 to 0.95) and relative agreement (Kappa=0.79, 95%CI: 0.593 to 0.99) between the number of case reports with improvement in a specific outcome and the GRADE evidence score for that outcome were good. Sensitivity analysis showed that agreement between the meta-analysis of case reports and that of the clinical studies was good only when using a strong confirmatory strategy for outcome improvement in case reports. We found an agreement between the results of meta-analyses from case reports and from clinical studies in the efficacy of laronidase therapy in
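
    Two of the statistics described above translate directly into code: a one-sided binomial test of whether the share of case reports showing improvement exceeds 5%, and a kappa statistic for agreement between case-report-based and clinical-study-based ratings. A hedged sketch with invented counts and labels, not the study data:

    ```python
    # Hedged sketch: binomial test against a 5% threshold and Cohen's kappa for
    # agreement between two evidence ratings. All numbers are illustrative.
    from scipy.stats import binomtest
    from sklearn.metrics import cohen_kappa_score

    # e.g. 7 of 11 case reports improved on an outcome
    result = binomtest(7, 11, p=0.05, alternative="greater")
    print("improvement confirmed" if result.pvalue < 0.05 else "not confirmed", result.pvalue)

    # agreement between case-report-based and clinical-study-based ratings per outcome
    case_report_rating = ["improved", "improved", "no_change", "improved"]
    clinical_rating    = ["improved", "improved", "no_change", "no_change"]
    print("kappa:", cohen_kappa_score(case_report_rating, clinical_rating))
    ```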

  7. A Web-based Tool Combining Different Type Analyses

    DEFF Research Database (Denmark)

    Henriksen, Kim Steen; Gallagher, John Patrick

    2006-01-01

    … of both, and they can be goal-dependent or goal-independent. We describe a prototype tool that can be accessed from a web browser, allowing various type analyses to be run. The first goal of the tool is to allow the analysis results to be examined conveniently by clicking on points in the original program … the minimal "domain model" of the program with respect to the corresponding pre-interpretation, which can give more precise information than the original descriptive type.

  8. Graphic-based musculoskeletal model for biomechanical analyses and animation.

    Science.gov (United States)

    Chao, Edmund Y S

    2003-04-01

    The ability to combine physiology and engineering analyses with computer sciences has opened the door to the possibility of creating the 'Virtual Human' reality. This paper presents a broad foundation for a full-featured biomechanical simulator for the human musculoskeletal system physiology. This simulation technology unites the expertise in biomechanical analysis and graphic modeling to investigate joint and connective tissue mechanics at the structural level and to visualize the results in both static and animated forms together with the model. Adaptable anatomical models including prosthetic implants and fracture fixation devices and a robust computational infrastructure for static, kinematic, kinetic, and stress analyses under varying boundary and loading conditions are incorporated on a common platform, the VIMS (Virtual Interactive Musculoskeletal System). Within this software system, a manageable database containing long bone dimensions, connective tissue material properties and a library of skeletal joint system functional activities and loading conditions are also available and they can easily be modified, updated and expanded. Application software is also available to allow end-users to perform biomechanical analyses interactively. This paper details the design, capabilities, and features of the VIMS development at Johns Hopkins University, an effort possible only through academic and commercial collaborations. Examples using these models and the computational algorithms in a virtual laboratory environment are used to demonstrate the utility of this unique database and simulation technology. This integrated system will impact on medical education, basic research, device development and application, and clinical patient care related to musculoskeletal diseases, trauma, and rehabilitation.

  9. An Apple II -based bidimensional pulse height analyser

    International Nuclear Information System (INIS)

    Bateman, J.E.; Flesher, A.C.; Honeyman, R.N.; Pritchard, T.E.; Price, W.P.R.

    1984-06-01

    The implementation of a pulse height analyser function in an Apple II microcomputer using minimal purpose-built hardware is described. Except for a small interface module, the system consists of two suites of software, one giving a conventional one-dimensional analysis on a span of 1024 channels, and the other a two-dimensional analysis on a 128 x 128 image format. Using the recently introduced ACCELERATOR coprocessor card, the system performs with a dead time per event of less than 50 μs. Full software facilities are provided for display, storage and processing of the data using standard Applesoft BASIC. (author)

  10. Kernel based eigenvalue-decomposition methods for analysing ham

    DEFF Research Database (Denmark)

    Christiansen, Asger Nyman; Nielsen, Allan Aasbjerg; Møller, Flemming

    2010-01-01

    … methods, such as PCA, MAF or MNF. We therefore investigated the applicability of kernel-based versions of these transformations. This meant implementing the kernel-based methods and developing new theory, since kernel-based MAF and MNF are not yet described in the literature. The traditional methods only … have two factors that are useful for segmentation, and none of them can be used to segment the two types of meat. The kernel-based methods have many useful factors and they are able to capture the subtle differences in the images. This is illustrated in Figure 1. A comparison of the most … useful factor of PCA and of kernel-based PCA, respectively, is shown in Figure 2. The factor of the kernel-based PCA turned out to be able to segment the two types of meat, and in general that factor is much more distinct compared to the traditional factor. After the orthogonal transformation, a simple thresholding …
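
    As a hedged illustration of this family of methods (kernel versions of eigenvalue-decomposition transforms), the sketch below applies scikit-learn's kernel PCA to multispectral pixels. It stands in for, and is not, the authors' kernel MAF/MNF implementation, and the image shape and kernel parameters are hypothetical.

    ```python
    # Hedged sketch: kernel PCA on multispectral pixels, fitting the kernel on a
    # random pixel subset to keep the Gram matrix small.
    import numpy as np
    from sklearn.decomposition import KernelPCA

    def kernel_pca_factors(image: np.ndarray, n_factors: int = 3, sample: int = 2000):
        """image: (rows, cols, bands) array; returns (rows, cols, n_factors) factor images."""
        rows, cols, bands = image.shape
        pixels = image.reshape(-1, bands).astype(float)

        rng = np.random.default_rng(0)
        subset = pixels[rng.choice(len(pixels), size=min(sample, len(pixels)), replace=False)]
        kpca = KernelPCA(n_components=n_factors, kernel="rbf", gamma=1e-3).fit(subset)

        factors = kpca.transform(pixels)
        return factors.reshape(rows, cols, n_factors)
    ```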

  11. Sensitivity and uncertainty analyses in aging risk-based prioritizations

    International Nuclear Information System (INIS)

    Hassan, M.; Uryas'ev, S.; Vesely, W.E.

    1993-01-01

    Aging risk evaluations of nuclear power plants using Probabilistic Risk Analyses (PRAs) involve assessments of the impact of aging structures, systems, and components (SSCs) on plant core damage frequency (CDF). These assessments can be used to prioritize the contributors to aging risk reflecting the relative risk potential of the SSCs. Aging prioritizations are important for identifying the SSCs contributing most to plant risk and can provide a systematic basis on which aging risk control and management strategies for a plant can be developed. However, these prioritizations are subject to variabilities arising from uncertainties in data, and/or from various modeling assumptions. The objective of this paper is to present an evaluation of the sensitivity of aging prioritizations of active components to uncertainties in aging risk quantifications. Approaches for robust prioritization of SSCs also are presented which are less susceptible to the uncertainties

  12. Evaluation of an optoacoustic based gas analysing device

    Science.gov (United States)

    Markmann, Janine; Lange, Birgit; Theisen-Kunde, Dirk; Danicke, Veit; Mayorov, Fedor; Eckert, Sebastian; Kettmann, Pascal; Brinkmann, Ralf

    2017-07-01

    The relative occurrence of volatile organic compounds in human respiratory gas is disease-specific (ppb range). A prototype of a gas analysing device using two tuneable laser systems, an OPO-laser (2.5 to 10 μm) and a CO2-laser (9 to 11 μm), and an optoacoustic measurement cell was developed to detect concentrations in the ppb range. The sensitivity and resolution of the system were determined by test gas measurements, measuring ethylene and sulfur hexafluoride with the CO2-laser and butane with the OPO-laser. The system sensitivity was found to be 13 ppb for sulfur hexafluoride, 17 ppb for ethylene and … Respiratory gas samples of 8 healthy volunteers were investigated by irradiation with 17 laser lines of the CO2-laser. Several of those lines overlap with strong absorption bands of ammonia. As it is known that ammonia concentration increases with age, a separation of subjects below and above 35 years of age was aimed for. To evaluate the data, the first seven gas samples were used to train a discriminant analysis algorithm. The eighth subject was then assigned correctly to the group >35 years, with an age of 49 years.

  13. A Fuzzy Logic Based Method for Analysing Test Results

    Directory of Open Access Journals (Sweden)

    Le Xuan Vinh

    2017-11-01

    Full Text Available Network operators must perform many tasks to ensure smooth operation of the network, such as planning, monitoring, etc. Among those tasks, regular testing of network performance and network errors, and troubleshooting, is very important. Meaningful test results will allow the operators to evaluate network performance and any shortcomings, and to better plan for network upgrades. Due to the diverse and mainly unquantifiable nature of network testing results, there is a need to develop a method for systematically and rigorously analysing these results. In this paper, we present STAM (System Test-result Analysis Method), which employs a bottom-up hierarchical processing approach using fuzzy logic. STAM is capable of combining all test results into a quantitative description of the network performance in terms of network stability, the significance of various network errors, and the performance of each functional block within the network. The validity of this method has been successfully demonstrated in assisting the testing of a VoIP system at the Research Institute of Post and Telecoms in Vietnam. The paper is organized as follows. The first section gives an overview of fuzzy logic theory, the concepts of which will be used in the development of STAM. The next section describes STAM. The last section, demonstrating STAM's capability, presents a success story in which STAM is successfully applied.

  14. Visualizing Confidence in Cluster-Based Ensemble Weather Forecast Analyses.

    Science.gov (United States)

    Kumpf, Alexander; Tost, Bianca; Baumgart, Marlene; Riemer, Michael; Westermann, Rudiger; Rautenhaus, Marc

    2018-01-01

    In meteorology, cluster analysis is frequently used to determine representative trends in ensemble weather predictions in a selected spatio-temporal region, e.g., to reduce a set of ensemble members to simplify and improve their analysis. Identified clusters (i.e., groups of similar members), however, can be very sensitive to small changes of the selected region, so that clustering results can be misleading and bias subsequent analyses. In this article, we, a team of visualization scientists and meteorologists, deliver visual analytics solutions to analyze the sensitivity of clustering results with respect to changes of a selected region. We propose an interactive visual interface that enables simultaneous visualization of a) the variation in composition of identified clusters (i.e., their robustness), b) the variability in cluster membership for individual ensemble members, and c) the uncertainty in the spatial locations of identified trends. We demonstrate that our solution shows meteorologists how representative a clustering result is, and with respect to which changes in the selected region it becomes unstable. Furthermore, our solution helps to identify those ensemble members which stably belong to a given cluster and can thus be considered similar. In a real-world application case we show how our approach is used to analyze the clustering behavior of different regions in a forecast of "Tropical Cyclone Karl", guiding the user towards the cluster robustness information required for subsequent ensemble analysis.

  15. A Cyber-Attack Detection Model Based on Multivariate Analyses

    Science.gov (United States)

    Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi

    In the present paper, we propose a novel cyber-attack detection model that applies two multivariate-analysis methods to the audit data observed on a host machine. The statistical techniques used here are the well-known Hayashi's quantification method IV and the cluster analysis method. We quantify the observed qualitative audit event sequences via the quantification method IV, and collect similar audit event sequences into the same groups based on the cluster analysis. It is shown in simulation experiments that our model can improve cyber-attack detection accuracy in some realistic cases where both normal and attack activities are intermingled.
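
    The overall pipeline (quantify qualitative audit event sequences, then group similar sequences) can be sketched as follows. TF-IDF encoding and k-means are stand-ins for the quantification method IV scoring and the clustering used in the paper, and the event strings are invented.

    ```python
    # Hedged sketch: numeric encoding of audit event sequences followed by cluster
    # analysis; sessions outside the dominant cluster warrant inspection.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.cluster import KMeans

    sessions = [
        "login read_file read_file logout",
        "login read_file write_file logout",
        "login sudo modify_passwd open_socket open_socket",
        "login read_file logout",
    ]

    X = TfidfVectorizer().fit_transform(sessions)      # event-frequency representation
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    print(labels)
    ```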

  16. Analysing co-articulation using frame-based feature trajectories

    CSIR Research Space (South Africa)

    Badenhorst, J

    2010-11-01

    Full Text Available The authors investigate several approaches aimed at a more detailed understanding of co-articulation in spoken utterances. They find that the Euclidean difference between instantaneous frame-based feature values and the mean values of these features...

  17. PCR and RFLP analyses based on the ribosomal protein operon

    Science.gov (United States)

    Differentiation and classification of phytoplasmas have been primarily based on the highly conserved 16Sr RNA gene. RFLP analysis of 16Sr RNA gene sequences has identified 31 16Sr RNA (16Sr) groups and more than 100 16Sr subgroups. Classification of phytoplasma strains can however, become more refin...

  18. Basing assessment and treatment of problem behavior on behavioral momentum theory: Analyses of behavioral persistence.

    Science.gov (United States)

    Schieltz, Kelly M; Wacker, David P; Ringdahl, Joel E; Berg, Wendy K

    2017-08-01

    The connection, or bridge, between applied and basic behavior analysis has been long-established (Hake, 1982; Mace & Critchfield, 2010). In this article, we describe how clinical decisions can be based more directly on behavioral processes and how basing clinical procedures on behavioral processes can lead to improved clinical outcomes. As a case in point, we describe how applied behavior analyses of maintenance, and specifically the long-term maintenance of treatment effects related to problem behavior, can be adjusted and potentially enhanced by basing treatment on Behavioral Momentum Theory. We provide a brief review of the literature including descriptions of two translational studies that proposed changes in how differential reinforcement of alternative behavior treatments are conducted based on Behavioral Momentum Theory. We then describe current clinical examples of how these translations are continuing to impact the definitions, designs, analyses, and treatment procedures used in our clinical practice. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Complementary Exploratory and Confirmatory Factor Analyses of the French WISC-V: Analyses Based on the Standardization Sample.

    Science.gov (United States)

    Lecerf, Thierry; Canivez, Gary L

    2017-12-28

    Interpretation of the French Wechsler Intelligence Scale for Children-Fifth Edition (French WISC-V; Wechsler, 2016a) is based on a 5-factor model including Verbal Comprehension (VC), Visual Spatial (VS), Fluid Reasoning (FR), Working Memory (WM), and Processing Speed (PS). Evidence for the French WISC-V factorial structure was established exclusively through confirmatory factor analyses (CFAs). However, as recommended by Carroll (1995); Reise (2012), and Brown (2015), factorial structure should derive from both exploratory factor analysis (EFA) and CFA. The first goal of this study was to examine the factorial structure of the French WISC-V using EFA. The 15 French WISC-V primary and secondary subtest scaled scores intercorrelation matrix was used and factor extraction criteria suggested from 1 to 4 factors. To disentangle the contribution of first- and second-order factors, the Schmid and Leiman (1957) orthogonalization transformation (SLT) was applied. Overall, no EFA evidence for 5 factors was found. Results indicated that the g factor accounted for about 67% of the common variance and that the contributions of the first-order factors were weak (3.6 to 11.9%). CFA was used to test numerous alternative models. Results indicated that bifactor models produced better fit to these data than higher-order models. Consistent with previous studies, findings suggested dominance of the general intelligence factor and that users should thus emphasize the Full Scale IQ (FSIQ) when interpreting the French WISC-V. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  20. Road user behaviour analyses based on video detections

    DEFF Research Database (Denmark)

    Agerholm, Niels; Tønning, Charlotte; Madsen, Tanja Kidholm Osmann

    2017-01-01

    To a large extent, traffic safety improvements rely on reliable and full-coverage accident registration. This is difficult to obtain in practice. Hence, surrogate measures such as traffic conflict studies can contribute with more information. To make these studies more efficient, a software tool called RUBA … and user-friendliness, the conclusion is that, at present, the RUBA software supports a number of traffic behaviour studies more efficiently and reliably than what is obtainable by human observers.

  1. Associations of indoor carbon dioxide concentrations, VOCS, environmental susceptibilities with mucous membrane and lower respiratory sick building syndrome symptoms in the BASE study: Analyses of the 100 building dataset

    Energy Technology Data Exchange (ETDEWEB)

    Apte, M.G.; Erdmann, C.A.

    2002-10-01

    Using the 100 office-building Building Assessment Survey and Evaluation (BASE) Study dataset, we performed multivariate logistic regression analyses to quantify the associations between indoor minus outdoor CO2 (dCO2) concentrations and mucous membrane (MM) and lower respiratory system (Lresp) Sick Building Syndrome (SBS) symptoms, adjusting for age, sex, smoking status, presence of carpet in workspace, thermal exposure, relative humidity, and a marker for entrained automobile exhaust. Using principal components analysis we identified a number of possible sources of 73 measured volatile organic compounds in the office buildings, and assessed the impact of these VOCs on the probability of presenting the SBS symptoms. Additionally, the analysis adjusted for the risks of predisposition to SBS symptoms associated with the allergic, asthmatic, and environmentally sensitive subpopulations within the office buildings. Adjusted odds ratios (ORs) for statistically significant, dose-dependent associations (p<0.05) for dry eyes, sore throat, nose/sinus congestion, and wheeze symptoms with 100-ppm increases in dCO2 ranged from 1.1 to 1.2. These results suggest that increases in the ventilation rates per person among typical office buildings will, on average, significantly reduce the prevalence of several SBS symptoms, by up to 80%, even when these buildings meet the existing ASHRAE ventilation standards for office buildings. VOC sources were observed to play a role in direct association with mucous membrane and lower respiratory irritation, and possibly to be indirectly involved in indoor chemical reactions with ozone that produce irritating compounds associated with SBS symptoms. O-xylene, possibly emitted from furniture coatings, was associated with shortness of breath (OR at the maximum concentration = 8, p < 0.05). The environmental sensitivities of a large subset of the office building population add to the overall risk of SBS symptoms (ORs
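
    The adjusted odds ratios quoted above come from multivariate logistic regressions of each symptom on dCO2 (per 100 ppm) plus personal and workspace covariates. A hedged sketch with statsmodels; the column names are hypothetical stand-ins for the BASE variables, not the study's actual dataset.

    ```python
    # Hedged sketch: adjusted odds ratio for a symptom per 100-ppm increase in dCO2.
    # Column names are hypothetical.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    def symptom_odds_ratio(df: pd.DataFrame, symptom: str = "sore_throat"):
        df = df.copy()
        df["dco2_100ppm"] = df["dco2_ppm"] / 100.0
        fit = smf.logit(f"{symptom} ~ dco2_100ppm + age + C(sex) + C(smoker) "
                        "+ C(carpet) + rel_humidity", data=df).fit(disp=0)
        or_ci = np.exp(fit.conf_int().loc["dco2_100ppm"])
        return np.exp(fit.params["dco2_100ppm"]), tuple(or_ci)
    ```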

  2. THE GOAL OF VALUE-BASED MEDICINE ANALYSES: COMPARABILITY. THE CASE FOR NEOVASCULAR MACULAR DEGENERATION

    Science.gov (United States)

    Brown, Gary C.; Brown, Melissa M.; Brown, Heidi C.; Kindermann, Sylvia; Sharma, Sanjay

    2007-01-01

    Purpose To evaluate the comparability of articles in the peer-reviewed literature assessing the (1) patient value and (2) cost-utility (cost-effectiveness) associated with interventions for neovascular age-related macular degeneration (ARMD). Methods A search was performed in the National Library of Medicine database of 16 million peer-reviewed articles using the key words cost-utility, cost-effectiveness, value, verteporfin, pegaptanib, laser photocoagulation, ranibizumab, and therapy. All articles that used an outcome of quality-adjusted life-years (QALYs) were studied in regard to (1) percent improvement in quality of life, (2) utility methodology, (3) utility respondents, (4) types of costs included (eg, direct healthcare, direct nonhealthcare, indirect), (5) cost bases (eg, Medicare, National Health Service in the United Kingdom), and (6) study cost perspective (eg, government, societal, third-party insurer). To qualify as a value-based medicine analysis, the patient value had to be measured using the outcome of the QALYs conferred by respective interventions. As with value-based medicine analyses, patient-based time tradeoff utility analysis had to be utilized, patient utility respondents were necessary, and direct medical costs were used. Results Among 21 cost-utility analyses performed on interventions for neovascular macular degeneration, 15 (71%) met value-based medicine criteria. The 6 others (29%) were not comparable owing to (1) varying utility methodology, (2) varying utility respondents, (3) differing costs utilized, (4) differing cost bases, and (5) varying study perspectives. Among value-based medicine studies, laser photocoagulation confers a 4.4% value gain (improvement in quality of life) for the treatment of classic subfoveal choroidal neovascularization. Intravitreal pegaptanib confers a 5.9% value gain (improvement in quality of life) for classic, minimally classic, and occult subfoveal choroidal neovascularization, and photodynamic therapy

  3. The goal of value-based medicine analyses: comparability. The case for neovascular macular degeneration.

    Science.gov (United States)

    Brown, Gary C; Brown, Melissa M; Brown, Heidi C; Kindermann, Sylvia; Sharma, Sanjay

    2007-01-01

    To evaluate the comparability of articles in the peer-reviewed literature assessing the (1) patient value and (2) cost-utility (cost-effectiveness) associated with interventions for neovascular age-related macular degeneration (ARMD). A search was performed in the National Library of Medicine database of 16 million peer-reviewed articles using the key words cost-utility, cost-effectiveness, value, verteporfin, pegaptanib, laser photocoagulation, ranibizumab, and therapy. All articles that used an outcome of quality-adjusted life-years (QALYs) were studied in regard to (1) percent improvement in quality of life, (2) utility methodology, (3) utility respondents, (4) types of costs included (eg, direct healthcare, direct nonhealthcare, indirect), (5) cost bases (eg, Medicare, National Health Service in the United Kingdom), and (6) study cost perspective (eg, government, societal, third-party insurer). To qualify as a value-based medicine analysis, the patient value had to be measured using the outcome of the QALYs conferred by respective interventions. As with value-based medicine analyses, patient-based time tradeoff utility analysis had to be utilized, patient utility respondents were necessary, and direct medical costs were used. Among 21 cost-utility analyses performed on interventions for neovascular macular degeneration, 15 (71%) met value-based medicine criteria. The 6 others (29%) were not comparable owing to (1) varying utility methodology, (2) varying utility respondents, (3) differing costs utilized, (4) differing cost bases, and (5) varying study perspectives. Among value-based medicine studies, laser photocoagulation confers a 4.4% value gain (improvement in quality of life) for the treatment of classic subfoveal choroidal neovascularization. Intravitreal pegaptanib confers a 5.9% value gain (improvement in quality of life) for classic, minimally classic, and occult subfoveal choroidal neovascularization, and photodynamic therapy with verteporfin confers

  4. Design of the storage location based on the ABC analyses

    Science.gov (United States)

    Jemelka, Milan; Chramcov, Bronislav; Kříž, Pavel

    2016-06-01

    The paper focuses on process efficiency and saving storage costs. Maintaining inventory through a putaway strategy takes personnel time and costs money. The aim is to control inventory in the best way. The ABC classification, based on Vilfredo Pareto's principle, is used for the design of the warehouse layout. The new design of storage locations reduces the travel distance of fork-lift trucks and total costs, and it increases inventory process efficiency. The suggested solutions and an evaluation of the achieved results are described in detail. The proposed solutions were implemented in a real warehouse operation.
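
    The ABC idea itself can be sketched in a few lines: rank items by annual consumption value and assign classes by cumulative share. The 80%/95% cut-offs below are a common convention, not thresholds taken from the paper.

```python
# Hedged sketch of ABC classification by cumulative value share (Pareto principle).
def abc_classify(items, a_cut=0.80, b_cut=0.95):
    """items: dict mapping SKU -> annual consumption value."""
    total = sum(items.values())
    ranked = sorted(items.items(), key=lambda kv: kv[1], reverse=True)
    classes, cumulative = {}, 0.0
    for sku, value in ranked:
        cumulative += value / total
        classes[sku] = "A" if cumulative <= a_cut else "B" if cumulative <= b_cut else "C"
    return classes

demo = {"SKU1": 50000, "SKU2": 23000, "SKU3": 9000, "SKU4": 4000, "SKU5": 1500}
print(abc_classify(demo))   # class-A fast movers would get the closest storage locations
```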

  5. DNA-energetics-based analyses suggest additional genes in ...

    Indian Academy of Sciences (India)

    2012-06-25

  6. A Bibliography of Generative-Based Grammatical Analyses of Spanish.

    Science.gov (United States)

    Nuessel, Frank H.

    One hundred sixty-eight books, articles, and dissertations written between 1960 and 1973 are listed in this bibliography of linguistic studies of the Spanish language within the grammatical theory originated by Noam Chomsky in his "Syntactic Structures" (1957). The present work is divided into two general categories: (1) phonology and (2) syntax…

  7. Coalescent-based genome analyses resolve the early branches of the euarchontoglires.

    Directory of Open Access Journals (Sweden)

    Vikas Kumar

    Full Text Available Despite numerous large-scale phylogenomic studies, certain parts of the mammalian tree are extraordinarily difficult to resolve. We used the coding regions from 19 completely sequenced genomes to study the relationships within the super-clade Euarchontoglires (Primates, Rodentia, Lagomorpha, Dermoptera and Scandentia), because the placement of Scandentia within this clade is controversial. The difficulty in resolving this issue is due to the short time spans between the early divergences of Euarchontoglires, which may cause incongruent gene trees. The conflict in the data can be depicted by network analyses, and the contentious relationships are best reconstructed by coalescent-based analyses. This method is expected to be superior to analyses of concatenated data in reconstructing a species tree from numerous gene trees. The total concatenated dataset used to study the relationships in this group comprises 5,875 protein-coding genes (9,799,170 nucleotides) from all orders except Dermoptera (flying lemurs). Reconstruction of the species tree from 1,006 gene trees using coalescent models placed Scandentia as sister group to the primates, which is in agreement with maximum likelihood analyses of concatenated nucleotide sequence data. Additionally, both analytical approaches favoured the tarsier as sister taxon to Anthropoidea, thus belonging to the haplorrhine clade. When divergence times are short, such as in radiations over periods of a few million years, even genome-scale analyses struggle to resolve phylogenetic relationships. On these short branches, processes such as incomplete lineage sorting and possibly hybridization occur and make it preferable to base phylogenomic analyses on coalescent methods.

  8. CVD diamond Brewster window: feasibility study by FEM analyses

    Directory of Open Access Journals (Sweden)

    Vaccaro A.

    2012-09-01

    Full Text Available Chemical vapor deposition (CVD) diamond windows are a crucial component in heating and current drive (H&CD) applications. In order to minimize the amount of reflected power from the diamond disc, its thickness must match the desired beam wavelength, thus proper targeting of the plasma requires movable beam reflectors. This is the case, for instance, for the ITER electron cyclotron H&CD system. However, looking at DEMO, the higher heat loads and neutron fluxes could make the use of movable parts close to the plasma difficult. The issue might be solved by using gyrotrons able to tune the beam frequency to the desired resonance, but this concept requires transmission windows that work over a given frequency range, such as the Brewster window. It consists of a CVD diamond disc brazed to two copper cuffs at the Brewster angle. The brazing process is carried out at about 800°C and the temperature is then decreased to room temperature. Diamond and copper have very different thermal expansion coefficients, therefore high stresses build up during the cool-down phase that might lead to failure of the disc. Considering also the complex geometry of the window, with the skewed position of the disc, analyses are required in the first place to check its feasibility. The cool-down phase was simulated by FEM structural analyses for several geometric and constraint configurations of the window. A study of indirect cooling of the window by water was also performed considering a HE11 mode beam. The results are reported here.

  9. Evidence for Endothermy in Pterosaurs Based on Flight Capability Analyses

    Science.gov (United States)

    Jenkins, H. S.; Pratson, L. F.

    2005-12-01

    Previous attempts to constrain flight capability in pterosaurs have relied heavily on the fossil record, using bone articulation and apparent muscle allocation to evaluate flight potential (Frey et al., 1997; Padian, 1983; Bramwell, 1974). However, the physical parameters necessary for flight in pterosaurs remain loosely defined, and few systematic approaches to constraining flight capability have been synthesized (Templin, 2000; Padian, 1983). Here we present a new method to assess flight capability in pterosaurs as a function of humerus length and flight velocity. By creating an energy-balance model to evaluate the power required for flight against the power available to the animal, we derive a U-shaped power curve and infer optimal flight speeds and maximal wingspan lengths for the pterosaurs Quetzalcoatlus northropi and Pteranodon ingens. Our model corroborates empirically derived power curves for the modern black-billed magpie (Pica pica) and accurately reproduces the mechanical power curve for modern cockatiels (Nymphicus hollandicus) (Tobalske et al., 2003). When we adjust our model to include an endothermic metabolic rate for pterosaurs, we find a maximal wingspan length of 18 meters for Q. northropi. Model runs using an ectothermic metabolism derive maximal wingspans of 6-8 meters. As estimates based on fossil evidence show total wingspan lengths reaching up to 15 meters for Q. northropi, we conclude that large pterosaurs may have been endothermic and therefore more metabolically similar to birds than to reptiles.
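
    The energy-balance idea can be illustrated with a toy U-shaped power curve: induced power falls with speed while parasite power rises with its cube, and the minimum of their sum gives an optimal flight speed. The coefficients, mass and wingspan below are placeholders, not values from the study.

```python
# Toy flight power curve: required power = induced + parasite terms.
import numpy as np

def power_required(v, mass, wingspan, rho=1.225, cd_body=0.1, area=0.5):
    weight = mass * 9.81
    p_induced = 2.0 * weight**2 / (np.pi * rho * wingspan**2 * v)   # lift-induced term
    p_parasite = 0.5 * rho * area * cd_body * v**3                   # body-drag term
    return p_induced + p_parasite

v = np.linspace(3.0, 40.0, 400)                  # candidate flight speeds (m/s)
p_req = power_required(v, mass=200.0, wingspan=11.0)
v_opt = v[np.argmin(p_req)]                      # minimum-power flight speed
print(f"minimum-power speed ~ {v_opt:.1f} m/s, power ~ {p_req.min() / 1e3:.1f} kW")
```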

  10. Automatic image-based analyses using a coupled quadtree-SBFEM/SCM approach

    Science.gov (United States)

    Gravenkamp, Hauke; Duczek, Sascha

    2017-10-01

    Quadtree-based domain decomposition algorithms offer an efficient option to create meshes for automatic image-based analyses. Without introducing hanging nodes the scaled boundary finite element method (SBFEM) can directly operate on such meshes by only discretizing the edges of each subdomain. However, the convergence of a numerical method that relies on a quadtree-based geometry approximation is often suboptimal due to the inaccurate representation of the boundary. To overcome this problem a combination of the SBFEM with the spectral cell method (SCM) is proposed. The basic idea is to treat each uncut quadtree cell as an SBFEM polygon, while all cut quadtree cells are computed employing the SCM. This methodology not only reduces the required number of degrees of freedom but also avoids a two-dimensional quadrature in all uncut quadtree cells. Numerical examples including static, harmonic, modal and transient analyses of complex geometries are studied, highlighting the performance of this novel approach.
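
    For intuition, a quadtree decomposition of a binary image can be sketched as a recursive split that stops when a cell is homogeneous or reaches a minimum size; in the approach described above, the homogeneous (uncut) cells would be handled as SBFEM polygons and the cut cells by the SCM. The image and thresholds below are illustrative.

```python
# Hedged sketch of a quadtree decomposition of a square binary image.
import numpy as np

def quadtree(img, x0, y0, size, min_size, cells):
    block = img[y0:y0 + size, x0:x0 + size]
    homogeneous = block.min() == block.max()
    if size <= min_size or homogeneous:
        cells.append((x0, y0, size, homogeneous))   # leaf cell
        return
    half = size // 2
    for dx, dy in ((0, 0), (half, 0), (0, half), (half, half)):
        quadtree(img, x0 + dx, y0 + dy, half, min_size, cells)

img = np.zeros((64, 64), dtype=int)
img[16:48, 8:40] = 1                                # a simple rectangular inclusion
cells = []
quadtree(img, 0, 0, 64, 4, cells)
cut = sum(1 for c in cells if not c[3])
print(f"{len(cells)} leaf cells, of which {cut} are cut cells at the minimum size")
```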

  11. Phylogenetic trait-based analyses of ecological networks.

    Science.gov (United States)

    Rafferty, Nicole E; Ives, Anthony R

    2013-10-01

    Ecological networks of two interacting guilds of species, such as flowering plants and pollinators, are common in nature, and studying their structure can yield insights into their resilience to environmental disturbances. Here we develop analytical methods for exploring the strengths of interactions within bipartite networks consisting of two guilds of phylogenetically related species. We then apply these methods to investigate the resilience of a plant-pollinator community to anticipated climate change. The methods allow the statistical assessment of, for example, whether closely related pollinators are more likely to visit plants with similar relative frequencies, and whether closely related pollinators tend to visit closely related plants. The methods can also incorporate trait information, allowing us to identify which plant traits are likely responsible for attracting different pollinators. These questions are important for our study of 14 prairie plants and their 22 insect pollinators. Over the last 70 years, six of the plants have advanced their flowering, while eight have not. When we experimentally forced earlier flowering times, five of the six advanced-flowering species experienced higher pollinator visitation rates, whereas only one of the eight other species had more visits; this network thus appears resilient to climate change, because those species with advanced flowering have ample pollinators earlier in the season. Using the methods developed here, we show that advanced-flowering plants did not have a distinct pollinator community from the other eight species. Furthermore, pollinator phylogeny did not explain pollinator community composition; closely related pollinators were not more likely to visit the same plant species. However, differences among pollinator communities visiting different plants were explained by plant height, floral color, and symmetry. As a result, closely related plants attracted similar numbers of pollinators. By parsing out

  12. Cooling tower wood sampling and analyses: A case study

    International Nuclear Information System (INIS)

    Haymore, J.L.

    1985-01-01

    Extensive wood sampling and analyses programs were initiated on crossflow and counterflow cooling towers that have been in service since 1951 and 1955, respectively. Wood samples were taken from all areas of the towers and were subjected to biological, chemical and physical tests. The tests and results for the analyses are discussed. The results indicate the degree of wood deterioration, and areas of the towers which experience the most advanced degree of degradation

  13. A web-based endpoint adjudication system for interim analyses in clinical trials.

    Science.gov (United States)

    Nolen, Tracy L; Dimmick, Bill F; Ostrosky-Zeichner, Luis; Kendrick, Amy S; Sable, Carole; Ngai, Angela; Wallace, Dennis

    2009-02-01

    A data monitoring committee (DMC) is often employed to assess trial progress and review safety data and efficacy endpoints throughout a trial. Interim analyses performed for the DMC should use data that are as complete and verified as possible. Such analyses are complicated when data verification involves subjective study endpoints or requires clinical expertise to determine each subject's status with respect to the study endpoint. Therefore, procedures are needed to obtain adjudicated data for interim analyses in an efficient manner. In the past, methods for handling such data included using locally reported results as surrogate endpoints, adjusting analysis methods for unadjudicated data, or simply performing the adjudication as rapidly as possible. These methods all have inadequacies that make their sole usage suboptimal. For a study of prophylaxis for invasive candidiasis, adjudication of both study eligibility criteria and clinical endpoints prior to two interim analyses was required. Because the study was expected to enroll at a moderate rate and the sponsor required adjudicated endpoints to be used for interim analyses, an efficient process for adjudication was required. We created a web-based endpoint adjudication system (WebEAS) that allows for expedited review by the endpoint adjudication committee (EAC). This system automatically identifies when a subject's data are complete, creates a subject profile from the study data, and assigns EAC reviewers. The reviewers use the WebEAS to review the subject profile and submit their completed review form. The WebEAS then compares the reviews, assigns an additional review as a tiebreaker if needed, and stores the adjudicated data. The study for which this system was originally built was administratively closed after 10 months with only 38 subjects enrolled. The adjudication process was finalized and the WebEAS system activated prior to study closure. Some website accessibility issues presented initially. However
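
    The adjudication flow described (two independent reviews, with an additional review only when they disagree) can be reduced to a small sketch; the function and return values are illustrative, not the actual WebEAS interface.

```python
# Hedged sketch of the review/tiebreak logic for one subject.
def adjudicate(review_1, review_2, tiebreak):
    """Each argument is a callable returning 'endpoint met' or 'endpoint not met'."""
    r1, r2 = review_1(), review_2()
    if r1 == r2:
        return r1            # concordant reviews: adjudication complete
    return tiebreak()        # discordant reviews: assign an additional reviewer

result = adjudicate(lambda: "endpoint met",
                    lambda: "endpoint not met",
                    lambda: "endpoint met")
print(result)                # -> 'endpoint met' (decided by the tiebreak review)
```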

  14. Analysing task design and students' responses to context-based problems through different analytical frameworks

    Science.gov (United States)

    Broman, Karolina; Bernholt, Sascha; Parchmann, Ilka

    2015-05-01

    Background: Context-based learning approaches are used to enhance students' interest in, and knowledge about, science. According to different empirical studies, students' interest is improved by applying these more non-conventional approaches, while effects on learning outcomes are less coherent. Hence, further insights are needed into the structure of context-based problems in comparison to traditional problems, and into students' problem-solving strategies. Therefore, a suitable framework is necessary, both for the analysis of tasks and strategies. Purpose: The aim of this paper is to explore traditional and context-based tasks as well as students' responses to exemplary tasks to identify a suitable framework for future design and analyses of context-based problems. The paper discusses different established frameworks and applies the Higher-Order Cognitive Skills/Lower-Order Cognitive Skills (HOCS/LOCS) taxonomy and the Model of Hierarchical Complexity in Chemistry (MHC-C) to analyse traditional tasks and students' responses. Sample: Upper secondary students (n=236) at the Natural Science Programme, i.e. possible future scientists, are investigated to explore learning outcomes when they solve chemistry tasks, both more conventional as well as context-based chemistry problems. Design and methods: A typical chemistry examination test has been analysed, first the test items in themselves (n=36), and thereafter 236 students' responses to one representative context-based problem. Content analysis using HOCS/LOCS and MHC-C frameworks has been applied to analyse both quantitative and qualitative data, allowing us to describe different problem-solving strategies. Results: The empirical results show that both frameworks are suitable to identify students' strategies, mainly focusing on recall of memorized facts when solving chemistry test items. Almost all test items were also assessing lower order thinking. The combination of frameworks with the chemistry syllabus has been

  15. Novel citation-based search method for scientific literature: application to meta-analyses.

    Science.gov (United States)

    Janssens, A Cecile J W; Gwinn, M

    2015-10-13

    Finding eligible studies for meta-analysis and systematic reviews relies on keyword-based searching as the gold standard, despite its inefficiency. Searching based on direct citations is not sufficiently comprehensive. We propose a novel strategy that ranks articles on their degree of co-citation with one or more "known" articles before reviewing their eligibility. In two independent studies, we aimed to reproduce the results of literature searches for sets of published meta-analyses (n = 10 and n = 42). For each meta-analysis, we extracted co-citations for the randomly selected 'known' articles from the Web of Science database, counted their frequencies and screened all articles with a score above a selection threshold. In the second study, we extended the method by retrieving direct citations for all selected articles. In the first study, we retrieved 82% of the studies included in the meta-analyses while screening only 11% as many articles as were screened for the original publications. Articles that we missed were published in non-English languages, published before 1975, published very recently, or available only as conference abstracts. In the second study, we retrieved 79% of included studies while screening half the original number of articles. Citation searching appears to be an efficient and reasonably accurate method for finding articles similar to one or more articles of interest for meta-analysis and reviews.
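
    The ranking step of this strategy amounts to counting, for every candidate article, how often it appears in reference lists together with at least one of the "known" articles, and then screening only candidates above a chosen score threshold. A small sketch with made-up citation data:

```python
# Hedged sketch of co-citation scoring against a set of known articles.
from collections import Counter

def rank_by_cocitation(known_articles, citing_papers, threshold=2):
    """citing_papers: iterable of reference lists (sets of article IDs)."""
    known = set(known_articles)
    scores = Counter()
    for refs in citing_papers:
        if refs & known:                    # this paper cites at least one known article
            scores.update(refs - known)     # credit every co-cited candidate
    return [article for article, score in scores.most_common() if score >= threshold]

citing = [{"A", "X", "Y"}, {"A", "B", "X"}, {"B", "X", "Z"}, {"C", "Y"}]
print(rank_by_cocitation(known_articles={"A", "B"}, citing_papers=citing))   # -> ['X']
```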

  16. Physical characterization of biomass-based pyrolysis liquids. Application of standard fuel oil analyses

    Energy Technology Data Exchange (ETDEWEB)

    Oasmaa, A; Leppaemaeki, E; Koponen, P; Levander, J; Tapola, E [VTT Energy, Espoo (Finland). Energy Production Technologies

    1998-12-31

    The main purpose of the study was to test the applicability of standard fuel oil methods developed for petroleum-based fuels to pyrolysis liquids. In addition, research on sampling, homogeneity, stability, miscibility and corrosivity was carried out. The standard methods have been tested for several different pyrolysis liquids. Recommendations on sampling, sample size and small modifications of the standard methods are presented. In general, most of the methods can be used as such, but the accuracy of the analysis can be improved by minor modifications. Fuel oil analyses not suitable for pyrolysis liquids have been identified. Homogeneity of the liquids is the most critical factor for accurate analysis. The presence of air bubbles may disturb several analyses. Sample preheating and prefiltration should be avoided when possible. The former may cause changes in the composition and structure of the pyrolysis liquid. The latter may remove part of the organic material along with the particles. The size of the sample should be determined on the basis of the homogeneity and the water content of the liquid. The basic analyses of the Technical Research Centre of Finland (VTT) include water, pH, solids, ash, Conradson carbon residue, heating value, CHN, density, viscosity, pour point, flash point, and stability. Additional analyses are carried out when needed. (orig.) 53 refs.

  17. Studies and Analyses of Vulnerabilities in Aided Adversarial Decision Making

    National Research Council Canada - National Science Library

    Llinas, James

    1998-01-01

    .... The "aid" in the analysis (i.e., an automated decision aid) focuses upon a generic data fusion processor that estimates situation and threat states based on multisensor/multisource-based data assessments...

  18. FY01 Supplemental Science and Performance Analysis: Volume 1, Scientific Bases and Analyses

    International Nuclear Information System (INIS)

    Bodvarsson, G.S.; Dobson, David

    2001-01-01

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S and ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S and ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S and ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054 [DIRS 124754

  19. FY01 Supplemental Science and Performance Analysis: Volume 1,Scientific Bases and Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Bodvarsson, G.S.; Dobson, David

    2001-05-30

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S&ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S&ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S&ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054 [DIRS 124754

  20. CrusView: a Java-based visualization platform for comparative genomics analyses in Brassicaceae species.

    Science.gov (United States)

    Chen, Hao; Wang, Xiangfeng

    2013-09-01

    In plants and animals, chromosomal breakage and fusion events based on conserved syntenic genomic blocks lead to conserved patterns of karyotype evolution among species of the same family. However, karyotype information has not been well utilized in genomic comparison studies. We present CrusView, a Java-based bioinformatic application utilizing Standard Widget Toolkit/Swing graphics libraries and a SQLite database for performing visualized analyses of comparative genomics data in Brassicaceae (crucifer) plants. Compared with similar software and databases, one of the unique features of CrusView is its integration of karyotype information when comparing two genomes. This feature allows users to perform karyotype-based genome assembly and karyotype-assisted genome synteny analyses with preset karyotype patterns of the Brassicaceae genomes. Additionally, CrusView is a local program, which gives its users high flexibility when analyzing unpublished genomes and allows users to upload self-defined genomic information so that they can visually study the associations between genome structural variations and genetic elements, including chromosomal rearrangements, genomic macrosynteny, gene families, high-frequency recombination sites, and tandem and segmental duplications between related species. This tool will greatly facilitate karyotype, chromosome, and genome evolution studies using visualized comparative genomics approaches in Brassicaceae species. CrusView is freely available at http://www.cmbb.arizona.edu/CrusView/.

  1. Study of thermal-hydraulic analyses with CIP method

    International Nuclear Information System (INIS)

    Doi, Yoshihiro

    1996-09-01

    A new type of numerical scheme, CIP, has been proposed for solving hyperbolic-type equations, and it is attracting attention as a less numerically diffusive scheme. The C-CUP method with the CIP scheme has been adopted for numerical simulations that treat compressible and incompressible fluids, phase-change phenomena and mixture fluids. To evaluate the applicability of the CIP scheme and the C-CUP method to thermal-hydraulic analyses related to Fast Breeder Reactors (FBRs), the scheme and the method were reviewed. The features of the CIP scheme and the procedure of the C-CUP method are presented. The CIP scheme is used to solve the linear hyperbolic-type equations for the advection terms in the basic equations of fluids. A key point of the scheme is that, to solve the equation, the profile between grid points is described by a cubic polynomial together with the spatial derivatives of the solution. The scheme can capture steep changes in the solution and suppress numerical error. In the C-CUP method, the basic equations of fluids are divided into advection terms and the remaining terms. The advection terms are solved with the CIP scheme and the remaining terms with a finite difference method. The C-CUP method is robust against numerical instability, but fluid mass is not strictly conserved because the fluid equations are treated in nonconservative form. Numerical analyses with the CIP scheme and the C-CUP method have been performed for phase change, mixtures and moving objects. These analyses rely on the fact that the scheme and the method are robust to steep density changes and useful for interface tracking. (author)
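
    As a concrete illustration of the advection step, here is a minimal one-dimensional CIP sketch for f_t + u f_x = 0 with constant u > 0 on a periodic grid: both the value f and its spatial derivative g are advanced using an upwind cubic profile. This is a generic textbook-style CIP update written for illustration, not code from the report.

```python
# Hedged 1D CIP sketch: advect f and g = df/dx with an upwind cubic profile.
import numpy as np

def cip_step(f, g, u, dx, dt):
    fup, gup = np.roll(f, 1), np.roll(g, 1)   # upwind neighbour for u > 0 (periodic)
    D = -dx                                   # signed distance to the upwind point
    a = (g + gup) / D**2 - 2.0 * (fup - f) / D**3
    b = 3.0 * (fup - f) / D**2 - (2.0 * g + gup) / D
    xi = -u * dt                              # departure-point offset from each node
    f_new = a * xi**3 + b * xi**2 + g * xi + f
    g_new = 3.0 * a * xi**2 + 2.0 * b * xi + g
    return f_new, g_new

n = 200
dx = 1.0 / n
u, dt = 1.0, 0.4 * dx                         # CFL number 0.4
x = np.arange(n) * dx
f = np.exp(-200.0 * (x - 0.3) ** 2)           # smooth initial pulse
g = np.gradient(f, dx)
for _ in range(200):
    f, g = cip_step(f, g, u, dx, dt)
print("peak after transport:", round(float(f.max()), 3))   # stays close to 1
```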

  2. Features of Computer-Based Decision Aids: Systematic Review, Thematic Synthesis, and Meta-Analyses.

    Science.gov (United States)

    Syrowatka, Ania; Krömker, Dörthe; Meguerditchian, Ari N; Tamblyn, Robyn

    2016-01-26

    Patient information and education, such as decision aids, are gradually moving toward online, computer-based environments. Considerable research has been conducted to guide content and presentation of decision aids. However, given the relatively new shift to computer-based support, little attention has been given to how multimedia and interactivity can improve upon paper-based decision aids. The first objective of this review was to summarize published literature into a proposed classification of features that have been integrated into computer-based decision aids. Building on this classification, the second objective was to assess whether integration of specific features was associated with higher-quality decision making. Relevant studies were located by searching MEDLINE, Embase, CINAHL, and CENTRAL databases. The review identified studies that evaluated computer-based decision aids for adults faced with preference-sensitive medical decisions and reported quality of decision-making outcomes. A thematic synthesis was conducted to develop the classification of features. Subsequently, meta-analyses were conducted based on standardized mean differences (SMD) from randomized controlled trials (RCTs) that reported knowledge or decisional conflict. Further subgroup analyses compared pooled SMDs for decision aids that incorporated a specific feature to other computer-based decision aids that did not incorporate the feature, to assess whether specific features improved quality of decision making. Of 3541 unique publications, 58 studies met the target criteria and were included in the thematic synthesis. The synthesis identified six features: content control, tailoring, patient narratives, explicit values clarification, feedback, and social support. A subset of 26 RCTs from the thematic synthesis was used to conduct the meta-analyses. As expected, computer-based decision aids performed better than usual care or alternative aids; however, some features performed better than
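
    The subgroup comparison described here rests on inverse-variance pooling of standardized mean differences (SMDs) within each subgroup. A small sketch with made-up SMDs and variances, using fixed-effect pooling for brevity (the review itself may have used a random-effects model):

```python
# Hedged sketch: fixed-effect inverse-variance pooling of SMDs by subgroup.
import numpy as np

def pooled_smd(smds, variances):
    w = 1.0 / np.asarray(variances, dtype=float)
    est = float(np.sum(w * np.asarray(smds, dtype=float)) / np.sum(w))
    se = float(np.sqrt(1.0 / np.sum(w)))
    return est, (est - 1.96 * se, est + 1.96 * se)

subgroups = {
    "aids with the feature": ([0.45, 0.30, 0.52], [0.02, 0.03, 0.04]),
    "aids without the feature": ([0.20, 0.15, 0.25, 0.10], [0.02, 0.05, 0.03, 0.04]),
}
for label, (smds, var) in subgroups.items():
    est, ci = pooled_smd(smds, var)
    print(f"{label}: pooled SMD {est:.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f})")
```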

  3. Features of Computer-Based Decision Aids: Systematic Review, Thematic Synthesis, and Meta-Analyses

    Science.gov (United States)

    Krömker, Dörthe; Meguerditchian, Ari N; Tamblyn, Robyn

    2016-01-01

    Background Patient information and education, such as decision aids, are gradually moving toward online, computer-based environments. Considerable research has been conducted to guide content and presentation of decision aids. However, given the relatively new shift to computer-based support, little attention has been given to how multimedia and interactivity can improve upon paper-based decision aids. Objective The first objective of this review was to summarize published literature into a proposed classification of features that have been integrated into computer-based decision aids. Building on this classification, the second objective was to assess whether integration of specific features was associated with higher-quality decision making. Methods Relevant studies were located by searching MEDLINE, Embase, CINAHL, and CENTRAL databases. The review identified studies that evaluated computer-based decision aids for adults faced with preference-sensitive medical decisions and reported quality of decision-making outcomes. A thematic synthesis was conducted to develop the classification of features. Subsequently, meta-analyses were conducted based on standardized mean differences (SMD) from randomized controlled trials (RCTs) that reported knowledge or decisional conflict. Further subgroup analyses compared pooled SMDs for decision aids that incorporated a specific feature to other computer-based decision aids that did not incorporate the feature, to assess whether specific features improved quality of decision making. Results Of 3541 unique publications, 58 studies met the target criteria and were included in the thematic synthesis. The synthesis identified six features: content control, tailoring, patient narratives, explicit values clarification, feedback, and social support. A subset of 26 RCTs from the thematic synthesis was used to conduct the meta-analyses. As expected, computer-based decision aids performed better than usual care or alternative aids; however

  4. A Game-based Corpus for Analysing the Interplay between Game Context and Player Experience

    DEFF Research Database (Denmark)

    Shaker, Noor; Yannakakis, Georgios N.; Asteriadis, Stylianos

    2011-01-01

    Recognizing players' affective state while playing video games has been the focus of many recent research studies. In this paper we describe the process that has been followed to build a corpus based on game events and recorded video sessions from human players while playing Super Mario Bros. We present different types of information that have been extracted from game context, player preferences and perception of the game, as well as user features, automatically extracted from video recordings. We run a number of initial experiments to analyse players' behavior while playing video games as a case...

  5. Liver volume, intrahepatic fat and body weight in the course of a lifestyle interventional study. Analysis with quantitative MR-based methods; Lebervolumen, Leberfettanteil und Koerpergewicht im Verlauf einer Lebensstilinterventionsstudie. Eine Analyse mit quantitativen MR-basierten Methoden

    Energy Technology Data Exchange (ETDEWEB)

    Bongers, M.N. [Klinikum der Eberhard-Karls-Universitaet Tuebingen, Abteilung fuer Diagnostische und Interventionelle Radiologie, Tuebingen (Germany); Universitaetsklinikum Tuebingen, Sektion fuer Experimentelle Radiologie der Abteilung fuer Diagnostische und Interventionelle Radiologie, Tuebingen (Germany); Stefan, N.; Fritsche, A.; Haering, H.U. [Universitaetsklinikum Tuebingen, Innere Medizin IV - Endokrinologie und Diabetologie, Angiologie, Nephrologie und Klinische Chemie, Tuebingen (Germany); Helmholtz-Zentrum Muenchen an der Universitaet Tuebingen, Institut fuer Diabetes-Forschung und Metabolische Erkrankungen (IDM), Tuebingen (Germany); Nikolaou, K. [Klinikum der Eberhard-Karls-Universitaet Tuebingen, Abteilung fuer Diagnostische und Interventionelle Radiologie, Tuebingen (Germany); Schick, F. [Universitaetsklinikum Tuebingen, Sektion fuer Experimentelle Radiologie der Abteilung fuer Diagnostische und Interventionelle Radiologie, Tuebingen (Germany); Machann, J. [Universitaetsklinikum Tuebingen, Sektion fuer Experimentelle Radiologie der Abteilung fuer Diagnostische und Interventionelle Radiologie, Tuebingen (Germany); Helmholtz-Zentrum Muenchen an der Universitaet Tuebingen, Institut fuer Diabetes-Forschung und Metabolische Erkrankungen (IDM), Tuebingen (Germany); Deutsches Zentrum fuer Diabetesforschung (DZD), Neuherberg (Germany)

    2015-04-01

    The aim of this study was to investigate potential associations between changes in liver volume, the amount of intrahepatic lipids (IHL) and body weight during lifestyle interventions. In a prospective study, 150 patients with an increased risk for developing type 2 diabetes mellitus were included who followed a caloric restriction diet for 6 months. In the retrospective analysis 18 women and 9 men (age range 22-71 years) with an average body mass index (BMI) of 32 kg/m{sup 2} were enrolled. The liver volume was determined at the beginning and after 6 months by three-dimensional magnetic resonance imaging (3D-MRI, gradient echo, opposed-phase) and IHLs were quantified by volume-selective MR spectroscopy in single voxel stimulated echo acquisition mode (STEAM). Univariable and multivariable correlation analyses between changes of liver volume (Δliver volume), intrahepatic lipids (ΔIHL) and body weight (ΔBW) were performed. Univariable correlation analysis in the whole study cohort showed associations between ΔIHL and ΔBW (r = 0.69; p < 0.0001), ΔIHL and Δliver volume (r = 0.66; p = 0.0002) as well as ΔBW and Δliver volume (r = 0.5; p = 0.0073). Multivariable correlation analysis revealed that changes of liver volume are primarily determined by changes in IHL, independent of changes in body weight (β = 0.0272; 95 % CI: 0.0155-0.034; p < 0.0001). Changes of liver volume during lifestyle interventions are independent of changes of body weight and primarily determined by changes of IHL. These results show the reversibility of augmented liver volume in steatosis if it is possible to reduce IHLs during lifestyle interventions. (orig.) [German original, translated] Can associations between changes in liver volume, the proportion of intrahepatic lipids and body weight during a lifestyle intervention be identified? In a prospective intervention study, 150 subjects at increased risk of diabetes underwent a 6-month dietary
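
    The multivariable step corresponds to an ordinary regression of the liver-volume change on the IHL and body-weight changes. A sketch on simulated data that merely mimics the reported direction of effects (the sample size matches the 27 analysed subjects, but all coefficients and noise levels are invented):

```python
# Hedged sketch: delta liver volume regressed on delta IHL and delta body weight.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 27
d_ihl = rng.normal(-5.0, 4.0, n)                                 # change in IHL (percentage points)
d_bw = 0.4 * d_ihl + rng.normal(-4.0, 2.0, n)                    # change in body weight (kg)
d_vol = 0.03 * d_ihl + 0.002 * d_bw + rng.normal(0.0, 0.02, n)   # change in liver volume (L)
df = pd.DataFrame({"d_vol": d_vol, "d_ihl": d_ihl, "d_bw": d_bw})

fit = smf.ols("d_vol ~ d_ihl + d_bw", df).fit()
print(fit.params)   # in this toy setup, d_ihl dominates once d_bw is adjusted for
```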

  6. Hormone Receptor Expression Analyses in Neoplastic and Non-Neoplastic Canine Mammary Tissue by a Bead Based Multiplex Branched DNA Assay: A Gene Expression Study in Fresh Frozen and Formalin-Fixed, Paraffin-Embedded Samples.

    Directory of Open Access Journals (Sweden)

    Annika Mohr

    Full Text Available Immunohistochemistry (IHC) is currently considered the method of choice for steroid hormone receptor status evaluation in human breast cancer and, therefore, it is commonly utilized for assessing canine mammary tumors. In case of low hormone receptor expression, IHC is limited and thus is complemented by molecular analyses. In the present study, a multiplex bDNA assay was evaluated as a method for hormone receptor gene expression detection in canine mammary tissues. Estrogen receptor (ESR1), progesterone receptor (PGR), prolactin receptor (PRLR) and growth hormone receptor (GHR) gene expressions were evaluated in neoplastic and non-neoplastic canine mammary tissues. A set of 119 fresh frozen and 180 formalin-fixed, paraffin-embedded (FFPE) samples was comparatively analyzed and used for assay evaluation. Furthermore, a possible association between the hormone receptor expression in different histological subtypes of canine malignant mammary tumors and the castration status, breed and invasive growth of the tumor was analyzed. The multiplex bDNA assay proved to be more sensitive for fresh frozen specimens. The hormone receptor expression found was significantly decreased in malignant mammary tumors in comparison to non-neoplastic tissue and benign mammary tumors. Among the histological subtypes the lowest gene expression levels of ESR1, PGR and PRLR were found in solid, anaplastic and ductal carcinomas. In summary, the evaluation showed that the measurement of hormone receptors with the multiplex bDNA assay represents a practicable method for obtaining detailed quantitative information about gene expression in canine mammary tissue for future studies. Still, comparison with IHC or quantitative real-time PCR is needed for further validation of the present method.

  7. An efficient method for studying and analysing the propagation ...

    African Journals Online (AJOL)

    The paper describes a method, based on the solution of travelling-wave phenomena in polyphase systems by the use of matrix methods, of deriving the basic matrices of the conductor system taking into account the effect of conductor geometry, conductor internal impedance and the earth-return path. It is then shown how ...

  8. The Seismic Reliability of Offshore Structures Based on Nonlinear Time History Analyses

    International Nuclear Information System (INIS)

    Hosseini, Mahmood; Karimiyani, Somayyeh; Ghafooripour, Amin; Jabbarzadeh, Mohammad Javad

    2008-01-01

    Regarding the damage caused by past earthquakes to offshore structures, which are vital structures in the oil and gas industries, it is important that their seismic design is performed with very high reliability. Accepting Nonlinear Time History Analyses (NLTHA) as the most reliable seismic analysis method, in this paper an offshore platform of jacket type with a height of 304 feet, having a deck of 96 feet by 94 feet, and weighing 290 million pounds has been studied. At first, some Push-Over Analyses (POA) have been performed to recognize the more critical members of the jacket, based on the range of their plastic deformations. Then NLTHA have been performed by using the 3-component accelerograms of 100 earthquakes, covering a wide range of frequency content, and normalized to three Peak Ground Acceleration (PGA) levels of 0.3 g, 0.65 g, and 1.0 g. By using the results of NLTHA, the damage and rupture probabilities of the critical members have been studied to assess the reliability of the jacket structure. Since different structural members of the jacket have different effects on the stability of the platform, an "importance factor" has been considered for each critical member based on its location and orientation in the structure, and then the reliability of the whole structure has been obtained by combining the reliabilities of the critical members, each having its specific importance factor

  9. How distributed processing produces false negatives in voxel-based lesion-deficit analyses.

    Science.gov (United States)

    Gajardo-Vidal, Andrea; Lorca-Puls, Diego L; Crinion, Jennifer T; White, Jitrachote; Seghier, Mohamed L; Leff, Alex P; Hope, Thomas M H; Ludersdorfer, Philipp; Green, David W; Bowman, Howard; Price, Cathy J

    2018-07-01

    In this study, we hypothesized that if the same deficit can be caused by damage to one or another part of a distributed neural system, then voxel-based analyses might miss critical lesion sites because preservation of each site will not be consistently associated with preserved function. The first part of our investigation used voxel-based multiple regression analyses of data from 359 right-handed stroke survivors to identify brain regions where lesion load is associated with picture naming abilities after factoring out variance related to object recognition, semantics and speech articulation so as to focus on deficits arising at the word retrieval level. A highly significant lesion-deficit relationship was identified in left temporal and frontal/premotor regions. Post-hoc analyses showed that damage to either of these sites caused the deficit of interest in less than half the affected patients (76/162 = 47%). After excluding all patients with damage to one or both of the identified regions, our second analysis revealed a new region, in the anterior part of the left putamen, which had not been previously detected because many patients had the deficit of interest after temporal or frontal damage that preserved the left putamen. The results illustrate how (i) false negative results arise when the same deficit can be caused by different lesion sites; (ii) some of the missed effects can be unveiled by adopting an iterative approach that systematically excludes patients with lesions to the areas identified in previous analyses, (iii) statistically significant voxel-based lesion-deficit mappings can be driven by a subset of patients; (iv) focal lesions to the identified regions are needed to determine whether the deficit of interest is the consequence of focal damage or much more extensive damage that includes the identified region; and, finally, (v) univariate voxel-based lesion-deficit mappings cannot, in isolation, be used to predict outcome in other patients

  10. SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.

    Science.gov (United States)

    Chu, Annie; Cui, Jenny; Dinov, Ivo D

    2009-03-01

    The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for most

  11. Parametric analyses of summative scores may lead to conflicting inferences when comparing groups: A simulation study.

    Science.gov (United States)

    Khan, Asaduzzaman; Chien, Chi-Wen; Bagraith, Karl S

    2015-04-01

    To investigate whether using a parametric statistic in comparing groups leads to different conclusions when using summative scores from rating scales compared with using their corresponding Rasch-based measures. A Monte Carlo simulation study was designed to examine between-group differences in the change scores derived from summative scores from rating scales, and those derived from their corresponding Rasch-based measures, using 1-way analysis of variance. The degree of inconsistency between the 2 scoring approaches (i.e. summative and Rasch-based) was examined, using varying sample sizes, scale difficulties and person ability conditions. This simulation study revealed scaling artefacts that could arise from using summative scores rather than Rasch-based measures for determining the changes between groups. The group differences in the change scores were statistically significant for summative scores under all test conditions and sample size scenarios. However, none of the group differences in the change scores were significant when using the corresponding Rasch-based measures. This study raises questions about the validity of the inference on group differences of summative score changes in parametric analyses. Moreover, it provides a rationale for the use of Rasch-based measures, which can allow valid parametric analyses of rating scale data.
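
    A minimal sketch of one arm of such a simulation: two groups answer a 10-item, five-category rating scale before and after an intervention, and the between-group difference in summative change scores is tested with one-way ANOVA. The Rasch-based arm of the comparison is omitted, and all distributions are invented for illustration.

```python
# Hedged sketch: group comparison of summative change scores by one-way ANOVA.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(42)

def summative_scores(n, shift):
    """Sum of 10 ordinal items (categories 0-4); `shift` skews the category probabilities."""
    p = np.array([0.2, 0.2, 0.2, 0.2, 0.2]) + shift * np.array([-0.10, -0.05, 0.0, 0.05, 0.10])
    items = rng.choice(5, size=(n, 10), p=p / p.sum())
    return items.sum(axis=1)

change_a = summative_scores(100, 0.5) - summative_scores(100, 0.0)   # group A pre/post change
change_b = summative_scores(100, 0.8) - summative_scores(100, 0.0)   # group B pre/post change
stat, pval = f_oneway(change_a, change_b)
print(f"F = {stat:.2f}, p = {pval:.3f}")
```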

  12. Limitations and risks of meta-analyses of longevity studies

    DEFF Research Database (Denmark)

    Sebastiani, Paola; Bae, Harold; Gurinovich, Anastasia

    2017-01-01

    Searching for genetic determinants of human longevity has been challenged by the rarity of data sets with large numbers of individuals who have reached extreme old age, inconsistent definitions of the phenotype, and the difficulty of defining appropriate controls. Meta-analysis - a statistical method to summarize results from different studies - has become a common tool in genetic epidemiology to accrue large sample sizes for powerful genetic association studies. In conducting a meta-analysis of studies of human longevity, however, particular attention must be paid to the definition of cases and controls (including their health status) and to the effect of possible confounders such as sex and ethnicity upon the genetic effect to be estimated. We will show examples of how a meta-analysis can inflate the false negative rates of genetic association studies or bias estimates of the association...

  13. 1995 and 1996 Upper Three Runs Dye Study Data Analyses

    International Nuclear Information System (INIS)

    Chen, K.F.

    1998-06-01

    This report presents an analysis of dye tracer studies conducted on Upper Three Runs. The revised STREAM code was used to analyze these studies and derive a stream velocity and a dispersion coefficient for use in aqueous transport models. These models will be used to facilitate the establishment of aqueous effluent limits and provide contaminant transport information to emergency management in the event of a release

  14. Advanced exergy-based analyses applied to a system including LNG regasification and electricity generation

    Energy Technology Data Exchange (ETDEWEB)

    Morosuk, Tatiana; Tsatsaronis, George; Boyano, Alicia; Gantiva, Camilo [Technische Univ. Berlin (Germany)

    2012-07-01

    Liquefied natural gas (LNG) will contribute more in the future than in the past to the overall energy supply in the world. The paper discusses the application of advanced exergy-based analyses to a recently developed LNG-based cogeneration system. These analyses include advanced exergetic, advanced exergoeconomic, and advanced exergoenvironmental analyses in which thermodynamic inefficiencies (exergy destruction), costs, and environmental impacts have been split into avoidable and unavoidable parts. With the aid of these analyses, the potentials for improving the thermodynamic efficiency and for reducing the overall cost and the overall environmental impact are revealed. The objectives of this paper are to demonstrate (a) the potential for generating electricity while regasifying LNG and (b) some of the capabilities associated with advanced exergy-based methods. The most important subsystems and components are identified, and suggestions for improving them are made. (orig.)

  15. Genome-based comparative analyses of Antarctic and temperate species of Paenibacillus.

    Directory of Open Access Journals (Sweden)

    Melissa Dsouza

    Full Text Available Antarctic soils represent a unique environment characterised by extremes of temperature, salinity, elevated UV radiation, low nutrient and low water content. Despite the harshness of this environment, members of 15 bacterial phyla have been identified in soils of the Ross Sea Region (RSR. However, the survival mechanisms and ecological roles of these phyla are largely unknown. The aim of this study was to investigate whether strains of Paenibacillus darwinianus owe their resilience to substantial genomic changes. For this, genome-based comparative analyses were performed on three P. darwinianus strains, isolated from gamma-irradiated RSR soils, together with nine temperate, soil-dwelling Paenibacillus spp. The genome of each strain was sequenced to over 1,000-fold coverage, then assembled into contigs totalling approximately 3 Mbp per genome. Based on the occurrence of essential, single-copy genes, genome completeness was estimated at approximately 88%. Genome analysis revealed between 3,043-3,091 protein-coding sequences (CDSs, primarily associated with two-component systems, sigma factors, transporters, sporulation and genes induced by cold-shock, oxidative and osmotic stresses. These comparative analyses provide an insight into the metabolic potential of P. darwinianus, revealing potential adaptive mechanisms for survival in Antarctic soils. However, a large proportion of these mechanisms were also identified in temperate Paenibacillus spp., suggesting that these mechanisms are beneficial for growth and survival in a range of soil environments. These analyses have also revealed that the P. darwinianus genomes contain significantly fewer CDSs and have a lower paralogous content. Notwithstanding the incompleteness of the assemblies, the large differences in genome sizes, determined by the number of genes in paralogous clusters and the CDS content, are indicative of genome content scaling. Finally, these sequences are a resource for further

  16. Seismic fragility analyses of nuclear power plant structures based on the recorded earthquake data in Korea

    International Nuclear Information System (INIS)

    Cho, Sung Gook; Joe, Yang Hee

    2005-01-01

    By nature, the seismic fragility analysis results will be considerably affected by the statistical data of design information and site-dependent ground motions. The engineering characteristics of small magnitude earthquake spectra recorded in the Korean peninsula during the last several years are analyzed in this paper. An improved method of seismic fragility analysis is evaluated by comparative analyses to verify its efficiency for practical application to nuclear power plant structures. The effects of the recorded earthquake on the seismic fragilities of Korean nuclear power plant structures are also evaluated from the comparative studies. Observing the obtained results, the proposed method is more efficient for the multi-modes structures. The case study results show that seismic fragility analysis based on the Newmark's spectra in Korea might over-estimate the seismic capacities of Korean facilities
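
    For context, a seismic fragility curve is commonly represented as a lognormal function of peak ground acceleration, defined by a median capacity and a composite logarithmic standard deviation. The sketch below uses invented parameter values purely to illustrate the shape of such a curve, not results from this study.

```python
# Hedged sketch of a lognormal fragility curve P(failure | PGA = a).
import numpy as np
from scipy.stats import norm

def fragility(a, Am=0.9, beta=0.45):
    """Median capacity Am (in g) and log-standard deviation beta are assumed values."""
    return float(norm.cdf(np.log(a / Am) / beta))

for pga in (0.3, 0.65, 1.0):
    print(f"PGA {pga:g} g -> P(failure) = {fragility(pga):.2f}")
```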

  17. Seismic fragility analyses of nuclear power plant structures based on the recorded earthquake data in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Sung Gook [Department of Civil and Environmental System Engineering, University of Incheon, 177 Dohwa-dong, Nam-gu, Incheon 402-749 (Korea, Republic of)]. E-mail: sgcho@incheon.ac.kr; Joe, Yang Hee [Department of Civil and Environmental System Engineering, University of Incheon, 177 Dohwa-dong, Nam-gu, Incheon 402-749 (Korea, Republic of)

    2005-08-01

    By nature, the seismic fragility analysis results will be considerably affected by the statistical data of design information and site-dependent ground motions. The engineering characteristics of small magnitude earthquake spectra recorded in the Korean peninsula during the last several years are analyzed in this paper. An improved method of seismic fragility analysis is evaluated by comparative analyses to verify its efficiency for practical application to nuclear power plant structures. The effects of the recorded earthquake on the seismic fragilities of Korean nuclear power plant structures are also evaluated from the comparative studies. Observing the obtained results, the proposed method is more efficient for the multi-modes structures. The case study results show that seismic fragility analysis based on the Newmark's spectra in Korea might over-estimate the seismic capacities of Korean facilities.

  18. Individual-based analyses reveal limited functional overlap in a coral reef fish community.

    Science.gov (United States)

    Brandl, Simon J; Bellwood, David R

    2014-05-01

    Detailed knowledge of a species' functional niche is crucial for the study of ecological communities and processes. The extent of niche overlap, functional redundancy and functional complementarity is of particular importance if we are to understand ecosystem processes and their vulnerability to disturbances. Coral reefs are among the most threatened marine systems, and anthropogenic activity is changing the functional composition of reefs. The loss of herbivorous fishes is particularly concerning as the removal of algae is crucial for the growth and survival of corals. Yet, the foraging patterns of the various herbivorous fish species are poorly understood. Using a multidimensional framework, we present novel individual-based analyses of species' realized functional niches, which we apply to a herbivorous coral reef fish community. In calculating niche volumes for 21 species, based on their microhabitat utilization patterns during foraging, and computing functional overlaps, we provide a measurement of functional redundancy or complementarity. Complementarity is the inverse of redundancy and is defined as less than 50% overlap in niche volumes. The analyses reveal extensive complementarity with an average functional overlap of just 15.2%. Furthermore, the analyses divide herbivorous reef fishes into two broad groups. The first group (predominantly surgeonfishes and parrotfishes) comprises species feeding on exposed surfaces and predominantly open reef matrix or sandy substrata, resulting in small niche volumes and extensive complementarity. In contrast, the second group consists of species (predominantly rabbitfishes) that feed over a wider range of microhabitats, penetrating the reef matrix to exploit concealed surfaces of various substratum types. These species show high variation among individuals, leading to large niche volumes, more overlap and less complementarity. These results may have crucial consequences for our understanding of herbivorous processes on
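
    As a simplified stand-in for the multidimensional niche-volume comparison used in the paper, pairwise overlap between two species' microhabitat utilization profiles can be illustrated with Schoener's index, where values below 0.5 would correspond to complementarity in the sense defined above. The utilization counts below are invented.

```python
# Hedged sketch: Schoener's overlap index between two utilization profiles.
import numpy as np

def schoener_overlap(counts_1, counts_2):
    p1 = np.asarray(counts_1, dtype=float)
    p2 = np.asarray(counts_2, dtype=float)
    p1, p2 = p1 / p1.sum(), p2 / p2.sum()           # convert counts to proportions
    return 1.0 - 0.5 * np.abs(p1 - p2).sum()        # 1 = identical use, 0 = no overlap

surgeonfish = [60, 25, 10, 5]    # bites on open matrix, sand, crevices, macroalgae
rabbitfish = [20, 10, 50, 20]
print(f"overlap = {schoener_overlap(surgeonfish, rabbitfish):.2f}")   # < 0.5 -> complementary
```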

  19. Study of a bubble chamber's pictures automatic analyser: Coccinelle

    International Nuclear Information System (INIS)

    Jaeger, J.-J.

    1974-01-01

    The automatic scanning and measuring system 'Coccinelle', built at the Laboratoire de Physique Corpusculaire of the Collège de France, is specially designed for pictures from the new large bubble chambers such as BEBC and Mirabelle. The device uses the spot of a high-precision cathode ray tube to analyze the picture within a scanning window. The signal of a photomultiplier located behind the picture yields, after processing, the useful information in the form of coordinates of the analyzed points. Electronics connected to a computer generate the movements of the spot and the backward and forward movement of the films, and supply the information required by the programs for geometrical reconstruction of the tracks. Operation of the device relies on the collaboration of a programmed automaton and a human operator, who interact through conversational facilities: a TV display, a light pen and a function keyboard. Beyond a detailed description of the whole device, this thesis focuses especially on the following electronic parts: sweep generation of the spot and photomultiplier signal processing.

  20. Analysing chemical equilibrium conditions when studying butyl acetate synthesis

    OpenAIRE

    Álvaro Orjuela Londoño; Fernando Leiva Lenis; Luis Alejandro Boyacá Mendivelso; Gerardo Rodríguez Niño; Luis María Carballo Suárez

    2010-01-01

    This work studied the liquid-phase esterification reaction of acetic acid and butyl alcohol (P atm = 560 mmHg), using an ion exchange resin (Lewatit K-2431) as catalyst. A set of assays was carried out to determine the effect of catalyst load, temperature and molar ratio (acid/alcohol) on the chemical equilibrium constant. Selective sorption of components on the resin matrix was noticed; its effect on equilibrium conditions was verified by using different acid/alcohol starting ratios. A non-ide...

  1. Benefits of Exercise Training For Computer-Based Staff: A Meta Analyses

    Directory of Open Access Journals (Sweden)

    Mothna Mohammed

    2017-04-01

    Background: Office workers sit at work for approximately 8 hours a day and, as a result, many of them do not have enough time for any form of physical exercise. This can lead to musculoskeletal discomfort, especially low back pain, and recently many researchers have focused on home/office-based exercise training for the prevention/treatment of low back pain in this population. Objective: This meta-analysis discusses the exercises most recently suggested for office workers, based on the mechanisms and theories behind low back pain in this group. Method: The author collected relevant papers published previously on the subject. Google Scholar, Scopus, and PubMed were used as sources to find the articles. Only articles published using the same methodology, including the keywords office workers, musculoskeletal discomforts, low back pain, and exercise training, were selected. Studies that failed to report sufficient sample statistics, or lacked a substantial review of past academic scholarship and/or clear methodologies, were excluded. Results: Limited evidence regarding the prevention of, and treatment methods for, musculoskeletal discomfort, especially in the low back, among office workers is available. The findings showed that training exercises had a significant effect (p<0.05) on low back pain discomfort scores and decreased pain levels in response to office-based exercise training. Conclusion: Office-based exercise training can affect pain/discomfort scores among office workers through positive effects on the flexibility and strength of muscles. As such, it should be suggested to occupational therapists as a practical way of treating/preventing low back pain among office workers.

  2. Comparative Analyses of Zebrafish Anxiety-Like Behavior Using Conflict-Based Novelty Tests.

    Science.gov (United States)

    Kysil, Elana V; Meshalkina, Darya A; Frick, Erin E; Echevarria, David J; Rosemberg, Denis B; Maximino, Caio; Lima, Monica Gomes; Abreu, Murilo S; Giacomini, Ana C; Barcellos, Leonardo J G; Song, Cai; Kalueff, Allan V

    2017-06-01

    Modeling of stress and anxiety in adult zebrafish (Danio rerio) is increasingly utilized in neuroscience research and central nervous system (CNS) drug discovery. Representing the most commonly used zebrafish anxiety models, the novel tank test (NTT) focuses on zebrafish diving in response to potentially threatening stimuli, whereas the light-dark test (LDT) is based on fish scototaxis (innate preference for dark vs. bright areas). Here, we systematically evaluate the utility of these two tests, combining meta-analyses of published literature with comparative in vivo behavioral and whole-body endocrine (cortisol) testing. Overall, the NTT and LDT behaviors demonstrate a generally good cross-test correlation in vivo, whereas meta-analyses of published literature show that both tests have similar sensitivity to zebrafish anxiety-like states. Finally, NTT evokes higher levels of cortisol, likely representing a more stressful procedure than LDT. Collectively, our study reappraises NTT and LDT for studying anxiety-like states in zebrafish, and emphasizes their developing utility for neurobehavioral research. These findings can help optimize drug screening procedures by choosing more appropriate models for testing anxiolytic or anxiogenic drugs.

  3. Multicollinearity in Regression Analyses Conducted in Epidemiologic Studies.

    Science.gov (United States)

    Vatcheva, Kristina P; Lee, MinJae; McCormick, Joseph B; Rahbar, Mohammad H

    2016-04-01

    The adverse impact of ignoring multicollinearity on findings and data interpretation in regression analysis is very well documented in the statistical literature. The failure to identify and report multicollinearity could result in misleading interpretations of the results. A review of the epidemiological literature in PubMed from January 2004 to December 2013 illustrated the need for greater attention to identifying and minimizing the effect of multicollinearity in analyses of data from epidemiologic studies. We used simulated datasets and real-life data from the Cameron County Hispanic Cohort to demonstrate the adverse effects of multicollinearity in regression analysis and encourage researchers to consider diagnostics for multicollinearity as one of the steps in regression analysis.
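
    A common diagnostic for the problem described above is the variance inflation factor (VIF). The sketch below computes VIFs from first principles with NumPy; the simulated predictors are illustrative only, not the Cameron County Hispanic Cohort data.

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of a predictor matrix X.

    VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing column j
    on all remaining columns (with an intercept).
    """
    X = np.asarray(X, dtype=float)
    n, k = X.shape
    out = np.empty(k)
    for j in range(k):
        y = X[:, j]
        Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1.0 - resid @ resid / np.sum((y - y.mean()) ** 2)
        out[j] = 1.0 / (1.0 - r2)
    return out

# Two strongly collinear predictors and one independent predictor.
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
X = np.column_stack([x1, x1 + rng.normal(scale=0.1, size=200), rng.normal(size=200)])
print(vif(X))  # first two VIFs are large, the third is near 1
```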

  4. Politicized Civil Society in Bangladesh: Case Study Analyses

    Directory of Open Access Journals (Sweden)

    Farhat Tasnim

    2017-03-01

    Although civil society in Bangladesh is recognized for its vibrant performance in social development, it is often criticized for its inability to ensure good governance and democracy. The aim of this paper is to point out the reasons for this failure of civil society. Through case studies of five civil society organizations representing different sectors and levels of civil society, the paper concludes that civil society organizations in Bangladesh are often politicized and co-opted by different political parties. In a typical scenario, civil society can provide a counterbalance to, or even monitor, the state at both the national and the local level. However, in Bangladesh the civil society organizations have often compromised their autonomy and aligned themselves with certain political parties or political blocs. In such a vulnerable position, civil society can hardly play its expected role in ensuring good governance and strengthening democracy.

  5. Three new hydrochlorothiazide cocrystals: Structural analyses and solubility studies

    Science.gov (United States)

    Ranjan, Subham; Devarapalli, Ramesh; Kundu, Sudeshna; Vangala, Venu R.; Ghosh, Animesh; Reddy, C. Malla

    2017-04-01

    Hydrochlorothiazide (HCT) is a diuretic BCS class IV drug with poor aqueous solubility and low permeability, leading to poor oral absorption. The present work explores the cocrystallization technique to enhance the aqueous solubility of HCT. Three new cocrystals of HCT with the water-soluble coformers phenazine (PHEN), 4-dimethylaminopyridine (DMAP) and picolinamide (PICA) were prepared successfully by the solution crystallization method and characterized by single-crystal X-ray diffraction (SCXRD), powder X-ray diffraction (PXRD), Fourier transform infrared spectroscopy (FT-IR), differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA). Structural characterization revealed that the cocrystals with PHEN, DMAP and PICA exist in the P21/n, P21/c and P21/n space groups, respectively. Compared to the free drug, improved solubility of the HCT-DMAP (4-fold) and HCT-PHEN (1.4-fold) cocrystals and decreased solubility of HCT-PICA (0.5-fold) were determined after 4 h in phosphate buffer, pH 7.4, at 25 °C using the shake-flask method. HCT-DMAP showed a greater increase in solubility than all previously reported cocrystals of HCT, suggesting the role of the coformer. The study demonstrates that the selection of coformer can have a pronounced impact on the physicochemical properties of HCT and that cocrystallization can be a promising approach to improve the aqueous solubility of drugs.

  6. GIS-based Approaches to Catchment Area Analyses of Mass Transit

    DEFF Research Database (Denmark)

    Andersen, Jonas Lohmann Elkjær; Landex, Alex

    2009-01-01

    Catchment area analyses of stops or stations are used to investigate the potential number of travelers to public transportation. These analyses are considered a strong decision tool in the planning process of mass transit, especially railroads. Catchment area analyses are GIS-based buffer and overlay analyses with different approaches depending on the desired level of detail. A simple but straightforward approach to implement is the Circular Buffer Approach, where catchment areas are circular. A more detailed approach is the Service Area Approach, where catchment areas are determined by a street network search to simulate the actual walking distances. A refinement of the Service Area Approach is to implement additional time resistance in the network search to simulate obstacles in the walking environment. This paper reviews and compares the different GIS-based catchment area approaches, their level
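
    A minimal sketch of the Circular Buffer Approach described above, using Shapely to intersect a circular catchment with a hypothetical zone polygon; the station location, radius and zone geometry are invented for illustration, and the Service Area Approach would instead require a street-network search.

```python
from shapely.geometry import Point, Polygon

def circular_catchment_share(stop_xy, radius_m, zone_polygon):
    """Circular Buffer Approach: share of a zone that lies within
    straight-line walking distance of a stop."""
    catchment = Point(stop_xy).buffer(radius_m)
    return catchment.intersection(zone_polygon).area / zone_polygon.area

# Hypothetical 1 km x 1 km residential zone and a station with a 600 m catchment.
zone = Polygon([(0, 0), (1000, 0), (1000, 1000), (0, 1000)])
share = circular_catchment_share((500, 0), 600, zone)
print(f"{share:.1%} of the zone is covered")
```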

  7. Meta-Analyses of Human Cell-Based Cardiac Regeneration Therapies

    DEFF Research Database (Denmark)

    Gyöngyösi, Mariann; Wojakowski, Wojciech; Navarese, Eliano P

    2016-01-01

    In contrast to multiple publication-based meta-analyses involving clinical cardiac regeneration therapy in patients with recent myocardial infarction, a recently published meta-analysis based on individual patient data reported no effect of cell therapy on left ventricular function or clinical...

  8. Exacerbation heterogeneity in COPD: subgroup analyses from the FLAME study

    Directory of Open Access Journals (Sweden)

    Vogelmeier CF

    2018-04-01

    Claus F Vogelmeier,1 Kenneth R Chapman,2 Marc Miravitlles,3 Nicolas Roche,4 Jørgen Vestbo,5 Chau Thach,6 Donald Banerji,6 Robert Fogel,6 Francesco Patalano,7 Petter Olsson,8 Konstantinos Kostikas,7 Jadwiga A Wedzicha9 1Member of the German Center for Lung Research (DZL), Department of Medicine, Pulmonary and Critical Care Medicine, University Medical Center Giessen and Marburg, Philipps-Universität Marburg, Marburg, Germany; 2Asthma and Airway Centre, University Health Network and University of Toronto, Toronto, ON, Canada; 3Pneumology Department, Hospital Universitari Vall d’Hebron, CIBER de Enfermedades Respiratorias (CIBERES), Barcelona, Spain; 4Service de Pneumologie AP-HP, Cochin Hospital, University Paris Descartes (EA2511), Paris, France; 5Institute of Infection, Immunity and Respiratory Medicine, The University of Manchester and Manchester University NHS Foundation Trust, Manchester, UK; 6Novartis Pharmaceuticals Corporation, East Hanover, NJ, USA; 7Novartis Pharma AG, Basel, Switzerland; 8Novartis Sverige AB, Täby, Sweden; 9National Heart and Lung Institute, Imperial College London, London, UK Background: The FLAME study compared once-daily indacaterol/glycopyrronium (IND/GLY) 110/50 µg with twice-daily salmeterol/fluticasone (SFC) 50/500 µg in symptomatic patients with moderate to very severe COPD and a history of exacerbations in the previous year. Methods: This prespecified and post hoc subgroup analysis evaluated treatment efficacy on (1) moderate/severe exacerbations according to prior exacerbation history and treatment, and (2) types of exacerbations according to health care resource utilization (HCRU) during 1-year follow-up. Results: IND/GLY reduced the rate of moderate/severe exacerbations versus SFC in patients with a history of 1 exacerbation (rate ratio [RR]: 0.83, 95% CI: 0.75–0.93), ≥2 exacerbations (RR: 0.85, 95% CI: 0.70–1.03), and ≥2 exacerbations or ≥1 hospitalization in the previous year (RR: 0.86, 95% CI: 0.74

  9. Association between Adult Height and Risk of Colorectal, Lung, and Prostate Cancer : Results from Meta-analyses of Prospective Studies and Mendelian Randomization Analyses

    NARCIS (Netherlands)

    Khankari, Nikhil K.; Shu, Xiao Ou; Wen, Wanqing; Kraft, Peter; Lindström, Sara; Peters, Ulrike; Schildkraut, Joellen; Schumacher, Fredrick; Bofetta, Paolo; Risch, Angela; Bickeböller, Heike; Amos, Christopher I.; Easton, Douglas; Eeles, Rosalind A.; Gruber, Stephen B.; Haiman, Christopher A.; Hunter, David J.; Chanock, Stephen J.; Pierce, Brandon L.; Zheng, Wei; Blalock, Kendra; Campbell, Peter T.; Casey, Graham; Conti, David V.; Edlund, Christopher K.; Figueiredo, Jane; James Gauderman, W.; Gong, Jian; Green, Roger C.; Harju, John F.; Harrison, Tabitha A.; Jacobs, Eric J.; Jenkins, Mark A.; Jiao, Shuo; Li, Li; Lin, Yi; Manion, Frank J.; Moreno, Victor; Mukherjee, Bhramar; Raskin, Leon; Schumacher, Fredrick R.; Seminara, Daniela; Severi, Gianluca; Stenzel, Stephanie L.; Thomas, Duncan C.; Hopper, John L.; Southey, Melissa C.; Makalic, Enes; Schmidt, Daniel F.; Fletcher, Olivia; Peto, Julian; Gibson, Lorna; dos Santos Silva, Isabel; Ahsan, Habib; Whittemore, Alice; Waisfisz, Quinten; Meijers-Heijboer, Hanne; Adank, Muriel; van der Luijt, Rob B.; Uitterlinden, Andre G.; Hofman, Albert; Meindl, Alfons; Schmutzler, Rita K.; Müller-Myhsok, Bertram; Lichtner, Peter; Nevanlinna, Heli; Muranen, Taru A.; Aittomäki, Kristiina; Blomqvist, Carl; Chang-Claude, Jenny; Hein, Rebecca; Dahmen, Norbert; Beckman, Lars; Crisponi, Laura; Hall, Per; Czene, Kamila; Irwanto, Astrid; Liu, Jianjun; Easton, Douglas F.; Turnbull, Clare; Rahman, Nazneen; Eeles, Rosalind; Kote-Jarai, Zsofia; Muir, Kenneth; Giles, Graham; Neal, David; Donovan, Jenny L.; Hamdy, Freddie C.; Wiklund, Fredrik; Gronberg, Henrik; Haiman, Christopher; Schumacher, Fred; Travis, Ruth; Riboli, Elio; Hunter, David; Gapstur, Susan; Berndt, Sonja; Chanock, Stephen; Han, Younghun; Su, Li; Wei, Yongyue; Hung, Rayjean J.; Brhane, Yonathan; McLaughlin, John; Brennan, Paul; McKay, James D.; Rosenberger, Albert; Houlston, Richard S.; Caporaso, Neil; Teresa Landi, Maria; Heinrich, Joachim; Wu, Xifeng; Ye, Yuanqing; Christiani, David C.

    2016-01-01

    Background: Observational studies examining associations between adult height and risk of colorectal, prostate, and lung cancers have generated mixed results. We conducted meta-analyses using data from prospective cohort studies and further carried out Mendelian randomization analyses, using

  10. Secondary Data Analyses of Subjective Outcome Evaluation Data Based on Nine Databases

    Directory of Open Access Journals (Sweden)

    Daniel T. L. Shek

    2012-01-01

    The purpose of this study was to evaluate the effectiveness of the Tier 1 Program of the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) in Hong Kong by analyzing 1,327 school-based program reports submitted by program implementers. In each report, program implementers were invited to write down five conclusions based on an integration of the subjective outcome evaluation data collected from the program participants and program implementers. Secondary data analyses were carried out by aggregating nine databases, with 14,390 meaningful units extracted from 6,618 conclusions. Results showed that most of the conclusions were positive in nature. The findings generally showed that the workers perceived the program and the program implementers to be positive, and they also pointed out that the program could promote the holistic development of the program participants in societal, familial, interpersonal, and personal domains. However, difficulties encountered during program implementation (2.15%) and recommendations for improvement (16.26%) were also reported. In conjunction with the evaluation findings based on other strategies, the present study suggests that the Tier 1 Program of the Project P.A.T.H.S. is beneficial to the holistic development of the program participants.

  11. Chemometrical characterization of four italian rice varieties based on genetic and chemical analyses.

    Science.gov (United States)

    Brandolini, Vincenzo; Coïsson, Jean Daniel; Tedeschi, Paola; Barile, Daniela; Cereti, Elisabetta; Maietti, Annalisa; Vecchiati, Giorgio; Martelli, Aldo; Arlorio, Marco

    2006-12-27

    This paper describes a method for achieving qualitative identification of four rice varieties from two different Italian regions. To estimate the presence of genetic diversity among the four rice varieties, we used polymerase chain reaction-randomly amplified polymorphic DNA (PCR-RAPD) markers, and to elucidate whether a relationship exists between the ground and the specific characteristics of the product, we studied proximate composition, fatty acid composition, mineral content, and total antioxidant capacity. Using principal component analysis on genomic and compositional data, we were able to classify rice samples according to their variety and their district of production. This work also examined the discrimination ability of different parameters. It was found that genomic data give the best discrimination based on varieties, indicating that RAPD assays could be useful in discriminating among closely related species, while compositional analyses do not depend on the genetic characters only but are related to the production area.

  12. Net Energy, CO2 Emission and Land-Based Cost-Benefit Analyses of Jatropha Biodiesel: A Case Study of the Panzhihua Region of Sichuan Province in China

    Directory of Open Access Journals (Sweden)

    Xiangzheng Deng

    2012-06-01

    Bioenergy is currently regarded as a renewable energy source with a high growth potential. Forest-based biodiesel, with the significant advantage of not competing with grain production on cultivated land, has been considered a promising substitute for diesel fuel by many countries, including China. Consequently, extracting biodiesel from Jatropha curcas has become a growing industry. However, many key issues related to the development of this industry are still not fully resolved and the prospects for the industry are complicated. The aim of this paper is to evaluate the net energy, CO2 emission, and cost efficiency of Jatropha biodiesel as a substitute fuel in China, to help resolve some of the key issues, by studying data from a region of China that is well suited to growing Jatropha. Our results show that: (1) Jatropha biodiesel is preferable to diesel fuel for global warming mitigation because of the carbon sink created during Jatropha tree growth. (2) The net energy yield of Jatropha biodiesel is much lower than that of fossil fuel, owing to the high energy consumption during Jatropha plantation establishment and in the conversion from seed oil to diesel fuel. Therefore, the energy efficiencies of Jatropha production and its conversion to biodiesel need to be improved. (3) Due to currently low profit and high risk in the study area, farmers have little incentive to continue or increase Jatropha production. (4) It is necessary to provide more subsidies and preferential policies for Jatropha plantations if this industry is to grow. It is also necessary for local government to set realistic objectives and make rational plans to choose proper sites for Jatropha biodiesel development, and the work reported here should assist that effort. Future research focused on breeding high-yield varieties, development of efficient field

  13. Liberalisation in network based industries. An economic analysis by case studies of railway, telecommunication and energy utilities; Liberalisierung von Netzindustrien. Eine oekonomische Analyse am Beispiel der Eisenbahn, der Telekommunikation und der Leitungsgebundenen Energieversorgung

    Energy Technology Data Exchange (ETDEWEB)

    Schulze, A.

    2006-07-01

    The liberalisation of network-based industries is an economic problem that, on the one hand, raises a multiplicity of theoretically unresolved questions and for which, on the other hand, practical experience in German economic policy now exists. The causes of the economic problems lie not only in certain industry characteristics of network-based industries, but also in the past failure of economic policy to give these network-based sectors the special treatment they require. Competition in network-based industries, however, runs up against limits, because the infrastructure necessary for producing network-based services is typically a natural monopoly, not open to competitive attack, in the hands of an established, vertically integrated supplier. This creates extensive possibilities for discriminating against competitors, from which the need for competition-policy action arises. The present work analyses these possibilities and discusses alternative solutions to the discrimination problem. (orig.)

  14. Analyses of Trawling Track and Fishing Activity Based on the Data of Vessel Monitoring System (VMS):A Case Study of the Single Otter Trawl Vessels in the Zhoushan Fishing Ground

    Institute of Scientific and Technical Information of China (English)

    WANG Yang; WANG Yingbin; ZHENG Ji

    2015-01-01

    The original purpose of the Vessel Monitoring System (VMS) is the enforcement and control of vessel sailing. With the application of VMS to fishing vessels, more and more population dynamics studies have used VMS data to improve the accuracy of fisheries stock assessment. In this paper, we simulated the trawl trajectory under different time intervals using the cubic Hermite spline (cHs) interpolation method, based on the VMS data of 8 single otter trawl vessels (36,000 data items in total) fishing in the Zhoushan fishing ground from September 2012 to December 2012, and selected the appropriate time interval. We then determined the vessels' activities (fishing or non-fishing) by comparing VMS speed data with the corresponding speeds from logbooks. The results showed that the error of the simulated trajectory greatly increased when the time intervals of the VMS data were longer than 30 minutes. Comparing the speeds from the VMS with those from the corresponding logbooks, we found that the vessels' speeds during fishing were between 2.5 kn and 5.0 kn. The cHs interpolation method is a new choice for improving the accuracy of estimating the sailing trajectory, and the VMS can be used to determine the vessels' activities through analysis of their trajectories and speeds. Therefore, when fishery information is limited, VMS can be one of the important data sources for fisheries stock assessment, and more attention should be paid to its construction and application in fisheries stock assessment and management.
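
    The sketch below illustrates the two steps described above: cubic Hermite interpolation of positions between VMS fixes (here via SciPy's CubicHermiteSpline, with invented position and velocity values) and classification of fishing activity from reported speeds using the 2.5–5.0 kn band found in the study. It is only a schematic stand-in for the paper's workflow.

```python
import numpy as np
from scipy.interpolate import CubicHermiteSpline

# Hypothetical VMS fixes: time (h), easting/northing (km) and velocity components (km/h).
t = np.array([0.0, 0.5, 1.0, 1.5])
x = np.array([0.0, 2.0, 3.5, 4.2]); vx = np.array([5.0, 3.5, 2.0, 1.0])
y = np.array([0.0, 1.0, 2.5, 3.8]); vy = np.array([2.0, 3.0, 2.8, 2.5])

# Cubic Hermite interpolation of the track between half-hourly VMS fixes.
track_x = CubicHermiteSpline(t, x, vx)
track_y = CubicHermiteSpline(t, y, vy)
t_fine = np.linspace(t[0], t[-1], 50)
positions = np.column_stack([track_x(t_fine), track_y(t_fine)])

# Classify activity from reported speed (knots): 2.5-5.0 kn taken as fishing.
speeds_kn = np.array([4.1, 3.2, 8.5, 2.7])
fishing = (speeds_kn >= 2.5) & (speeds_kn <= 5.0)
print(positions[:3])
print(fishing)
```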

  15. What is needed to eliminate new pediatric HIV infections: The contribution of model-based analyses

    Science.gov (United States)

    Doherty, Katie; Ciaranello, Andrea

    2013-01-01

    Purpose of Review: Computer simulation models can identify key clinical, operational, and economic interventions that will be needed to achieve the elimination of new pediatric HIV infections. In this review, we summarize recent findings from model-based analyses of strategies for prevention of mother-to-child HIV transmission (MTCT). Recent Findings: In order to achieve elimination of MTCT (eMTCT), model-based studies suggest that scale-up of services will be needed in several domains: uptake of services and retention in care (the PMTCT “cascade”), interventions to prevent HIV infections in women and reduce unintended pregnancies (the “four-pronged approach”), efforts to support medication adherence through long periods of pregnancy and breastfeeding, and strategies to make breastfeeding safer and/or shorter. Models also project the economic resources that will be needed to achieve these goals in the most efficient ways to allocate limited resources for eMTCT. Results suggest that currently recommended PMTCT regimens (WHO Option A, Option B, and Option B+) will be cost-effective in most settings. Summary: Model-based results can guide future implementation science, by highlighting areas in which additional data are needed to make informed decisions and by outlining critical interventions that will be necessary in order to eliminate new pediatric HIV infections. PMID:23743788

  16. Applications of high lateral and energy resolution imaging XPS with a double hemispherical analyser based spectromicroscope

    International Nuclear Information System (INIS)

    Escher, M.; Winkler, K.; Renault, O.; Barrett, N.

    2010-01-01

    The design and applications of an instrument for imaging X-ray photoelectron spectroscopy (XPS) are reviewed. The instrument is based on a photoelectron microscope and a double hemispherical analyser whose symmetric configuration avoids the spherical aberration (α²-term) inherent in standard analysers. The analyser allows high-transmission imaging without sacrificing the lateral and energy resolution of the instrument. The importance of high transmission, especially for highest-resolution imaging XPS with monochromated laboratory X-ray sources, is outlined, and the close interrelation of energy resolution, lateral resolution and analyser transmission is illustrated. Chemical imaging applications using a monochromatic laboratory Al Kα source are shown, with a lateral resolution of 610 nm. Examples of measurements made using synchrotron and laboratory ultra-violet light show the broad field of applications, from imaging of core-level electrons with chemical-shift identification to high-resolution threshold photoelectron emission microscopy (PEEM), work function imaging and band structure imaging.

  17. Historical Weathering Based on Chemical Analyses of Two Spodosols in Southern Sweden

    International Nuclear Information System (INIS)

    Melkerud, Per-Arne; Bain, Derek C.; Olsson, Mats T.

    2003-01-01

    Chemical weathering losses were calculated for two conifer stands in relation to ongoing studies on the effects of liming and ash amendments on chemical status, soil solution chemistry and soil genesis. Weathering losses were based on elemental depletion trends in soil profiles since deglaciation and exposure to the weathering environment. Gradients in total geochemical composition were assumed to reflect alteration over time. The study sites were Horroed and Hassloev in southern Sweden, both located on sandy loamy Weichselian till at altitudes of 85 and 190 m a.s.l., respectively. Aliquots from volume-determined samples from a number of soil levels were fused with lithium metaborate, dissolved in HNO3, and analysed by ICP-AES. Results indicated the highest cumulative weathering losses at Hassloev. The weathering losses for the elements are in the following order: Si > Al > K > Na > Ca > Mg. Total annual losses for Ca+Mg+K+Na, expressed in mmolc m⁻² yr⁻¹, amounted to c. 28 and 58 at Horroed and Hassloev, respectively. Variations between study sites could not be explained by differences in bulk density, geochemistry or mineralogy. The accumulated weathering losses since deglaciation were larger in the uppermost 15 cm than in deeper B horizons for most elements studied

  18. Voxel-based morphometry analyses of in-vivo MRI in the aging mouse lemur primate

    Directory of Open Access Journals (Sweden)

    Stephen John Sawiak

    2014-05-01

    Cerebral atrophy is one of the most widespread brain alterations associated with aging. A clear relationship has been established between age-associated cognitive impairments and cerebral atrophy. The mouse lemur (Microcebus murinus) is a small primate used as a model of age-related neurodegenerative processes. It is the first nonhuman primate in which cerebral atrophy has been correlated with cognitive deficits. Previous studies of cerebral atrophy in this model were based on time-consuming manual delineation or measurement of selected brain regions from magnetic resonance images (MRI). These measures could not be used to analyse regions that cannot be easily outlined, such as the nucleus basalis of Meynert or the subiculum. In humans, morphometric assessment of structural changes with age is generally performed with automated procedures such as voxel-based morphometry (VBM). The objective of our work was to perform user-independent assessment of age-related morphological changes in the whole brain of large mouse lemur populations using VBM. The study was based on the SPMMouse toolbox of SPM 8 and involved thirty mouse lemurs aged from 1.9 to 11.3 years. The automatic method revealed for the first time atrophy in regions where manual delineation is prohibitive (nucleus basalis of Meynert, subiculum, prepiriform cortex, Brodmann areas 13-16, hypothalamus, putamen, thalamus, corpus callosum). Some of these regions are described as particularly sensitive to age-associated alterations in humans. The method also revealed age-associated atrophy in cortical regions (cingulate, occipital, parietal), the nucleus septalis, and the caudate. Manual measures performed in some of these regions were in good agreement with the results of the automatic measures. The templates generated in this study as well as the toolbox for SPM8 can be downloaded. These tools will be valuable for future evaluation of the various treatments that are tested to modulate cerebral aging in lemurs.

  19. Simple Crosscutting Concerns Are Not So Simple : Analysing Variability in Large-Scale Idioms-Based Implementations

    NARCIS (Netherlands)

    Bruntink, M.; Van Deursen, A.; d’Hondt, M.; Tourwé, T.

    2007-01-01

    This paper describes a method for studying idioms-based implementations of crosscutting concerns, and our experiences with it in the context of a real-world, large-scale embedded software system. In particular, we analyse a seemingly simple concern, tracing, and show that it exhibits significant

  20. Statistical parameters of random heterogeneity estimated by analysing coda waves based on finite difference method

    Science.gov (United States)

    Emoto, K.; Saito, T.; Shiomi, K.

    2017-12-01

    Short-period (2 s) seismograms. We found that the energy of the coda of long-period seismograms shows a spatially flat distribution. This phenomenon is well known in short-period seismograms and results from the scattering by small-scale heterogeneities. We estimate the statistical parameters that characterize the small-scale random heterogeneity by modelling the spatiotemporal energy distribution of long-period seismograms. We analyse three moderate-size earthquakes that occurred in southwest Japan. We calculate the spatial distribution of the energy density recorded by a dense seismograph network in Japan at the period bands of 8-16 s, 4-8 s and 2-4 s and model them by using 3-D finite difference (FD) simulations. Compared to conventional methods based on statistical theories, we can calculate more realistic synthetics by using the FD simulation. It is not necessary to assume a uniform background velocity, body or surface waves and scattering properties considered in general scattering theories. By taking the ratio of the energy of the coda area to that of the entire area, we can separately estimate the scattering and the intrinsic absorption effects. Our result reveals the spectrum of the random inhomogeneity in a wide wavenumber range including the intensity around the corner wavenumber as P(m) = 8πε²a³/(1 + a²m²)², where ε = 0.05 and a = 3.1 km, even though past studies analysing higher-frequency records could not detect the corner. Finally, we estimate the intrinsic attenuation by modelling the decay rate of the energy. The method proposed in this study is suitable for quantifying the statistical properties of long-wavelength subsurface random inhomogeneity, which leads the way to characterizing a wider wavenumber range of spectra, including the corner wavenumber.
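
    The estimated heterogeneity spectrum quoted above can be evaluated directly; the short sketch below simply codes that published expression with the reported parameter values (ε = 0.05, a = 3.1 km).

```python
import numpy as np

def heterogeneity_spectrum(m, epsilon=0.05, a=3.1):
    """Power spectral density of the random inhomogeneity,
    P(m) = 8*pi*eps^2*a^3 / (1 + a^2*m^2)^2, using the values
    estimated in the study (eps = 0.05, a = 3.1 km)."""
    return 8.0 * np.pi * epsilon**2 * a**3 / (1.0 + a**2 * m**2) ** 2

wavenumbers = np.logspace(-2, 1, 5)  # 1/km
print(heterogeneity_spectrum(wavenumbers))
```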

  1. Improving Control Room Design and Operations Based on Human Factors Analyses, or How Much Human Factors Upgrade is Enough?

    Energy Technology Data Exchange (ETDEWEB)

    HIGGINS,J.C.; OHARA,J.M.; ALMEIDA,P.

    2002-09-19

    The Jose Cabrera nuclear power plant is a one-loop Westinghouse pressurized water reactor. In the control room, the displays and controls used by operators for the emergency operating procedures are distributed on front and back panels. This configuration contributed to risk in the probabilistic safety assessment where important operator actions are required. This study was undertaken to evaluate the impact of the design on crew performance and plant safety and to develop design improvements. Five potential effects were identified. Then NUREG-0711 [1] programmatic human factors analyses were conducted to systematically evaluate the control room layout to determine if there was evidence of the potential effects. These analyses included operating experience review, PSA review, task analyses, and walkthrough simulations. Based on the results of these analyses, a variety of control room modifications were identified. From the alternatives, a selection was made that provided a reasonable balance between performance, risk and economics, and modifications were made to the plant.

  2. TAXONOMY AND GENETIC RELATIONSHIPS OF PANGASIIDAE, ASIAN CATFISHES, BASED ON MORPHOLOGICAL AND MOLECULAR ANALYSES

    Directory of Open Access Journals (Sweden)

    Rudhy Gustiano

    2007-12-01

    Pangasiids are economically important riverine catfishes generally residing in fresh water from the Indian subcontinent to the Indonesian Archipelago. The systematics of this family are still poorly known. Consequently, the lack of such basic information impedes understanding of the biology of the pangasiids and the study of their aquaculture potential, as well as improvement of seed production and growth performance. The objectives of the present study are to clarify the phylogeny of this family based on a biometric analysis and molecular evidence from 12S ribosomal mtDNA, using a total of 1070 specimens. The study revealed that 28 species are recognised as valid in Pangasiidae. Four genera are also recognized, Helicophagus Bleeker 1858, Pangasianodon Chevey 1930, Pteropangasius Fowler 1937, and Pangasius Valenciennes 1840, instead of two as reported by previous workers. The phylogenetic analysis supported the recognised genera and the genetic relationships among taxa. Overall, trees from the different analyses show similar topologies and confirm the hypothesis derived from geological history, palaeontology, and similar models in other taxa of fishes from the same area. The oldest genus may already have existed when the Asian mainland was still connected to the islands in the southern part about 20 million years ago.

  3. Risk-based analyses in support of California hazardous site remediation

    International Nuclear Information System (INIS)

    Ringland, J.T.

    1995-08-01

    The California Environmental Enterprise (CEE) is a joint program of the Department of Energy (DOE), Lawrence Livermore National Laboratory, Lawrence Berkeley Laboratory, and Sandia National Laboratories. Its goal is to make DOE laboratory expertise accessible to hazardous site cleanups in the state. This support might involve working directly with parties responsible for individual cleanups, or it might involve working with the California Environmental Protection Agency to develop tools that would be applicable across a broad range of sites. As part of its initial year's activities, the CEE supported a review to examine where laboratory risk and risk-based systems analysis capabilities might be most effectively applied. To this end, this study draws the following observations. The labs have a clear role in analyses supporting the demonstration and transfer of laboratory characterization or remediation technologies. The labs may have opportunities in developing broadly applicable analysis tools and computer codes for problems such as site characterization or efficient management of resources. Analysis at individual sites, separate from supporting lab technologies or prototyping general tools, may be appropriate only in limited circumstances. In any of these roles, the labs' capabilities extend beyond health risk assessment to the broader areas of risk management and risk-based systems analysis.

  4. Airway management education: simulation based training versus non-simulation based training-A systematic review and meta-analyses.

    Science.gov (United States)

    Sun, Yanxia; Pan, Chuxiong; Li, Tianzuo; Gan, Tong J

    2017-02-01

    Simulation-based training (SBT) has become a standard for medical education. However, the efficacy of simulation-based training in airway management education remains unclear. The aim of this study was to evaluate all published evidence comparing the effectiveness of SBT for airway management versus non-simulation-based training (NSBT) on learner and patient outcomes. A systematic review with meta-analyses was used. Data were derived from PubMed, EMBASE, CINAHL, Scopus, the Cochrane Controlled Trials Register and the Cochrane Database of Systematic Reviews from inception to May 2016. Published comparative trials that evaluated the effect of SBT on airway management training compared with NSBT were considered. Effect sizes with 95% confidence intervals (CI) were calculated for the outcome measures. Seventeen eligible studies were included. SBT was associated with improved behavior performance [standardized mean difference (SMD): 0.30, 95% CI: 0.06 to 0.54] in comparison with NSBT. However, the benefits of SBT were not seen in time-skill (SMD: -0.13, 95% CI: -0.82 to 0.52), written examination score (SMD: 0.39, 95% CI: -0.09 to 0.86) or success rate of procedure completion on patients [relative risk (RR): 1.26, 95% CI: 0.96 to 1.66]. SBT may not be superior to NSBT for airway management training.
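
    As an illustration of how pooled SMD estimates such as those above are typically obtained, the sketch below performs fixed-effect inverse-variance pooling with standard errors recovered from reported 95% CIs. It is not the authors' analysis code, and the per-study numbers are invented.

```python
import numpy as np

def pooled_smd(smd, ci_low, ci_high, z=1.96):
    """Fixed-effect inverse-variance pooling of standardized mean differences.

    Standard errors are recovered from the reported 95% confidence intervals.
    """
    smd, ci_low, ci_high = map(np.asarray, (smd, ci_low, ci_high))
    se = (ci_high - ci_low) / (2 * z)
    w = 1.0 / se**2
    est = np.sum(w * smd) / np.sum(w)
    se_pooled = np.sqrt(1.0 / np.sum(w))
    return est, (est - z * se_pooled, est + z * se_pooled)

# Hypothetical per-study SMDs with their 95% CIs.
est, ci = pooled_smd([0.42, 0.18, 0.35], [0.05, -0.20, 0.02], [0.79, 0.56, 0.68])
print(f"pooled SMD = {est:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```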

  5. PCA-based algorithm for calibration of spectrophotometric analysers of food

    International Nuclear Information System (INIS)

    Morawski, Roman Z; Miekina, Andrzej

    2013-01-01

    Spectrophotometric analysers of food, being instruments for determination of the composition of food products and ingredients, are today of growing importance for the food industry, as well as for food distributors and consumers. Their metrological performance depends significantly on the numerical performance of the available means for spectrophotometric data processing, in particular the means for calibration of the analysers. In this paper, a new algorithm for this purpose is proposed, viz. an algorithm using principal component analysis (PCA). It is almost as efficient as PLS-based calibration algorithms, but much simpler.
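
    A hedged sketch of what a PCA-based calibration can look like in practice (principal-component regression on centred spectra); the synthetic spectra and analyte signal below are assumptions for illustration, not the algorithm of the paper.

```python
import numpy as np

def pcr_fit(spectra, concentrations, n_components):
    """Principal-component regression: project centred spectra onto the first
    few principal components and regress the analyte concentration on the scores."""
    X = np.asarray(spectra, float)
    y = np.asarray(concentrations, float)
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc = X - x_mean
    # Principal components via SVD of the centred spectra.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_components].T              # loadings
    T = Xc @ P                           # scores
    b, *_ = np.linalg.lstsq(T, y - y_mean, rcond=None)
    return x_mean, y_mean, P, b

def pcr_predict(model, spectra):
    x_mean, y_mean, P, b = model
    return (np.asarray(spectra, float) - x_mean) @ P @ b + y_mean

# Synthetic example: 40 spectra of 120 wavelengths with a linear analyte signal plus noise.
rng = np.random.default_rng(1)
conc = rng.uniform(0, 10, size=40)
pure = np.sin(np.linspace(0, 3, 120))
spectra = np.outer(conc, pure) + rng.normal(scale=0.05, size=(40, 120))
model = pcr_fit(spectra, conc, n_components=3)
print(pcr_predict(model, spectra[:5]), conc[:5])
```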

  6. A Server-Client-Based Graphical Development Environment for Physics Analyses (VISPA)

    International Nuclear Information System (INIS)

    Bretz, H-P; Erdmann, M; Fischer, R; Hinzmann, A; Klingebiel, D; Komm, M; Müller, G; Rieger, M; Steffens, J; Steggemann, J; Urban, M; Winchen, T

    2012-01-01

    The Visual Physics Analysis (VISPA) project provides a graphical development environment for data analysis. It addresses the typical development cycle of (re-)designing, executing, and verifying an analysis. We present the new server-client-based web application of the VISPA project to perform physics analyses via a standard internet browser. This enables individual scientists to work with a large variety of devices including touch screens, and teams of scientists to share, develop, and execute analyses on a server via the web interface.

  7. Comprehensive logic based analyses of Toll-like receptor 4 signal transduction pathway.

    Directory of Open Access Journals (Sweden)

    Mahesh Kumar Padwal

    Among the 13 TLRs in vertebrate systems, only TLR4 utilizes both the Myeloid differentiation factor 88 (MyD88) and the Toll/Interleukin-1 receptor (TIR)-domain-containing adapter interferon-β-inducing factor (TRIF) adaptors to transduce signals triggering host-protective immune responses. Earlier studies on the pathway combined various experimental data in the form of one comprehensive map of TLR signaling. But in the absence of adequate kinetic parameters, quantitative mathematical models that reveal emerging systems-level properties and dynamic inter-regulation among the kinases/phosphatases of the TLR4 network are not yet available. So, here we used a reaction stoichiometry-based and parameter-independent logical modeling formalism to build a TLR4 signaling network model that captured the feedback regulations, the interdependencies between signaling kinases and phosphatases, and the outcome of simulated infections. The analyses of the TLR4 signaling network revealed 360 feedback loops, 157 negative and 203 positive, of which 334 loops had the phosphatase PP1 as an essential component. The network elements' interdependency (positive or negative dependencies) under perturbation conditions such as phosphatase knockouts revealed interdependencies between the dual-specific phosphatases MKP-1 and MKP-3 and the kinases in MAPK modules, and the role of PP2A in the auto-regulation of Calmodulin kinase-II. Our simulations under specific kinase or phosphatase gene-deficiency or inhibition conditions corroborated several previously reported experimental data. The simulations mimicking Yersinia pestis and E. coli infections identified the key perturbations in the network and potential drug targets. Thus, our analyses of TLR4 signaling highlight the role of phosphatases as key regulatory factors in determining the global interdependencies among the network elements; uncover novel signaling connections; and identify potential drug targets for
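
    Counting signed feedback loops, as reported above, can be illustrated on a toy signed directed graph: enumerate the simple cycles and take the product of the edge signs around each cycle. The sketch uses NetworkX on a tiny hypothetical fragment, not the full TLR4 map of the paper.

```python
import networkx as nx
import numpy as np

# Toy signed directed network (hypothetical edges, not the full TLR4 map):
# +1 = activation, -1 = inhibition.
edges = [("TLR4", "MyD88", +1), ("MyD88", "IRAK", +1), ("IRAK", "NFkB", +1),
         ("NFkB", "MKP-1", +1), ("MKP-1", "IRAK", -1),   # negative feedback
         ("NFkB", "TLR4", +1)]                            # positive feedback

G = nx.DiGraph()
for u, v, s in edges:
    G.add_edge(u, v, sign=s)

positive, negative = 0, 0
for cycle in nx.simple_cycles(G):
    # Product of edge signs around the closed loop determines its overall sign.
    signs = [G[u][v]["sign"] for u, v in zip(cycle, cycle[1:] + cycle[:1])]
    if np.prod(signs) > 0:
        positive += 1
    else:
        negative += 1
print(f"{positive} positive and {negative} negative feedback loops")
```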

  8. Assessing the perceived quality of brachial artery Flow Mediated Dilation studies for inclusion in meta-analyses and systematic reviews: Description of data employed in the development of a scoring tool based on currently accepted guidelines

    Directory of Open Access Journals (Sweden)

    Arno Greyling

    2016-09-01

    Brachial artery Flow Mediated Dilation (FMD) is widely used as a non-invasive measure of endothelial function. Adherence to expert consensus guidelines on FMD measurement has been found to be of vital importance for obtaining reproducible data. This article lists the literature data considered in the development of a tool to aid the objective judgement of the extent to which published studies adhered to expert guidelines for FMD measurement. Application of this tool in a systematic review of FMD studies (http://dx.doi.org/10.1016/j.atherosclerosis.2016.03.011) (Greyling et al., 2016) [1] indicated that adherence to expert consensus guidelines is strongly correlated with the reproducibility of FMD data. Keywords: Cardiovascular disease, Atherosclerosis, Endothelial function, Reproducibility, Methodology

  9. Engineering design and exergy analyses for combustion gas turbine based power generation system

    International Nuclear Information System (INIS)

    Sue, D.-C.; Chuang, C.-C.

    2004-01-01

    This paper presents the engineering design and theoretical exergetic analyses of plants for combustion gas turbine based power generation systems. Exergy analysis is performed based on the first and second laws of thermodynamics for power generation systems. The results show that exergy analyses for a steam cycle system predict the plant efficiency more precisely. The plant efficiency for partial-load operation is lower than for full-load operation. Increasing the pinch points will decrease the combined cycle plant efficiency. The engineering design is based on inlet air cooling and natural gas preheating to increase the net power output and efficiency. To evaluate the energy utilization, one combined cycle unit and one cogeneration system, consisting of gas turbine generators, heat recovery steam generators, and one steam turbine generator with steam extracted for process use, have been analyzed. The analytical results are used for engineering design and component selection.
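
    Exergy analyses of this kind rest on the specific flow exergy, e = (h - h0) - T0*(s - s0), evaluated relative to a dead state; the sketch below codes that textbook relation with purely illustrative numbers, not the plant data of the study.

```python
def specific_flow_exergy(h, s, h0, s0, T0=298.15):
    """Specific flow exergy e = (h - h0) - T0*(s - s0): the maximum work
    obtainable from a stream relative to the dead state (h0, s0, T0)."""
    return (h - h0) - T0 * (s - s0)

# Hypothetical turbine-exhaust state versus an ambient dead state
# (enthalpies in kJ/kg, entropies in kJ/(kg K)).
e_gas = specific_flow_exergy(h=900.0, s=7.60, h0=300.0, s0=6.85)
print(f"specific exergy of the exhaust stream: {e_gas:.1f} kJ/kg")
```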

  10. Determination of the spatial response of neutron based analysers using a Monte Carlo based method

    International Nuclear Information System (INIS)

    Tickner, James

    2000-01-01

    One of the principal advantages of using thermal neutron capture (TNC, also called prompt gamma neutron activation analysis or PGNAA) or neutron inelastic scattering (NIS) techniques for measuring elemental composition is the high penetrating power of both the incident neutrons and the resultant gamma-rays, which means that large sample volumes can be interrogated. Gauges based on these techniques are widely used in the mineral industry for on-line determination of the composition of bulk samples. However, attenuation of both neutrons and gamma-rays in the sample and geometric (source/detector distance) effects typically result in certain parts of the sample contributing more to the measured composition than others. In turn, this introduces errors in the determination of the composition of inhomogeneous samples. This paper discusses a combined Monte Carlo/analytical method for estimating the spatial response of a neutron gauge. Neutron propagation is handled using a Monte Carlo technique which allows an arbitrarily complex neutron source and gauge geometry to be specified. Gamma-ray production and detection is calculated analytically which leads to a dramatic increase in the efficiency of the method. As an example, the method is used to study ways of reducing the spatial sensitivity of on-belt composition measurements of cement raw meal
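
    A heavily simplified sketch of the combined approach described above: neutron interaction sites are sampled by Monte Carlo (here just an exponential penetration model), while gamma-ray escape and detection are weighted analytically by exp(-mu*d)/(4*pi*r^2). All geometry, attenuation and source values are invented for illustration and do not represent a real gauge design.

```python
import numpy as np

rng = np.random.default_rng(2)

# --- Monte Carlo part: crude neutron transport --------------------------------
# Sample thermal-neutron capture sites in a slab sample, assuming an exponential
# penetration depth and a Gaussian lateral beam spread (illustrative numbers).
n_histories = 100_000
depth = rng.exponential(scale=5.0, size=n_histories)        # cm into the sample
lateral = rng.normal(scale=10.0, size=(n_histories, 2))     # cm, beam spread
capture = depth < 20.0                                       # sample is 20 cm thick

# --- Analytical part: gamma-ray escape and detection ---------------------------
# Weight each capture by exp(-mu*d) / (4*pi*r^2) for a detector above the sample.
mu = 0.06                                                    # 1/cm, gamma attenuation
detector = np.array([0.0, 0.0, -30.0])                       # 30 cm above the surface
sites = np.column_stack([lateral[capture], depth[capture]])
r = np.linalg.norm(sites - detector, axis=1)
weights = np.exp(-mu * depth[capture]) / (4.0 * np.pi * r**2)

# Spatial response: detected-signal contribution binned by depth in the sample.
response, edges = np.histogram(depth[capture], bins=20, range=(0, 20), weights=weights)
print(response / response.sum())
```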

  11. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    Science.gov (United States)

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    Due to the upcoming deluge of genome data, the need for storing and processing large-scale genome data, easy access to biomedical analysis tools, and efficient data sharing and retrieval has presented significant challenges. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via the HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. A Versatile Software Package for Inter-subject Correlation Based Analyses of fMRI

    Directory of Open Access Journals (Sweden)

    Jukka-Pekka Kauppi

    2014-01-01

    In inter-subject correlation (ISC) based analysis of functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects in the corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modelling the stimulus, and thus ISC is a potential method to analyze fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC-based approach for analyzing complex fMRI data, no generic software tools have been made available for this purpose, limiting widespread use of ISC-based analysis techniques in the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC-based analyses. Many advanced computations such as comparison of ISCs between different stimuli, time-window ISC, and inter-subject phase synchronization are supported by the toolbox. The analyses are coupled with re-sampling based statistical inference. The ISC-based analyses are data and computation intensive, and the ISC Toolbox is equipped with mechanisms to execute the parallel computations in a cluster environment automatically, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox and the implementation of the toolbox, and demonstrate the possible use of the toolbox by summarizing selected example applications. We also report computation time experiments using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces the computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/.

  13. A versatile software package for inter-subject correlation based analyses of fMRI.

    Science.gov (United States)

    Kauppi, Jukka-Pekka; Pajula, Juha; Tohka, Jussi

    2014-01-01

    In the inter-subject correlation (ISC) based analysis of functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects in the corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modeling the stimulus, and thus ISC is a potential method to analyze fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC-based approach for analyzing complex fMRI data, no generic software tools have been made available for this purpose, limiting widespread use of ISC-based analysis techniques in the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC-based analyses. Many advanced computations such as comparison of ISCs between different stimuli, time window ISC, and inter-subject phase synchronization are supported by the toolbox. The analyses are coupled with re-sampling based statistical inference. The ISC-based analyses are data and computation intensive, and the ISC Toolbox is equipped with mechanisms to execute the parallel computations in a cluster environment automatically, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine, or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox and the implementation of the toolbox, and demonstrate the possible use of the toolbox by summarizing selected example applications. We also report computation time experiments using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces the computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/
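
    At its core, ISC reduces to averaging pairwise Pearson correlations of the subjects' time series voxel by voxel. The sketch below shows that core computation on synthetic data; it is not the ISC Toolbox implementation and omits the toolbox's resampling-based inference and cluster-execution machinery.

```python
import numpy as np

def inter_subject_correlation(data):
    """Mean pairwise Pearson correlation across subjects for every voxel.

    data : array of shape (n_subjects, n_timepoints, n_voxels)
    """
    data = np.asarray(data, float)
    n_sub = data.shape[0]
    # z-score each subject's time series per voxel, then average products over time.
    z = (data - data.mean(axis=1, keepdims=True)) / data.std(axis=1, keepdims=True)
    pair_sums = np.zeros(data.shape[2])
    n_pairs = 0
    for i in range(n_sub):
        for j in range(i + 1, n_sub):
            pair_sums += (z[i] * z[j]).mean(axis=0)
            n_pairs += 1
    return pair_sums / n_pairs

# Synthetic data: 5 subjects, 200 time points, 50 voxels, shared signal in voxel 0.
rng = np.random.default_rng(3)
shared = rng.normal(size=200)
data = rng.normal(size=(5, 200, 50))
data[:, :, 0] += 2.0 * shared
isc = inter_subject_correlation(data)
print(isc[0], isc[1:].mean())  # high ISC for voxel 0, near zero elsewhere
```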

  14. Interface properties of SrTiO3-based heterostructures studied by spectroscopy and high-resolution microscopy; Spektroskopie und hochaufloesende Mikroskopie zur Analyse der Grenzflaecheneigenschaften in SrTiO3-basierten Heterostrukturen

    Energy Technology Data Exchange (ETDEWEB)

    Pfaff, Florian Georg

    2017-02-10

    growth conditions. In the present work, for instance, a significant increase in the charge carrier concentration as well as the 2DES spatial extension can be observed for samples grown at very low oxygen pressures, which is related to the creation of oxygen vacancies in the SrTiO3 substrate. It is microscopically shown for the first time that sharp interfaces with a very low density of defects can also be grown at very low oxygen partial pressures. In addition, no significant effect of oxygen vacancies on specific structural properties is seen. Furthermore, a detailed analysis of the atomic spacing reveals a lattice distortion within the LaAlO3 film which shows a significant dependence on the growth parameters used and, supported by density functional theory, points towards a complex interplay of electronic reconstruction, surface oxygen vacancies and lattice distortions as the driving mechanism for the 2DES formation. Besides the study of the structural properties of the interface in LaAlO3/SrTiO3 heterostructures by means of transmission electron microscopy, the electronic structure of the 2DES is analyzed by resonant inelastic X-ray scattering (RIXS) measurements, which show clear indications of localized charge carriers below the critical thickness for conductivity of four unit cells. Moreover, a Raman-like and a fluorescence-like signal can be identified by excitation-energy-dependent RIXS and attributed to the electronic character of the intermediate state. Similar results are obtained on γ-Al2O3/SrTiO3 heterostructures, which strengthens this interpretation and could be a hint that a similar ground state, and interface magnetism, are also present in this system. By using resonant photoelectron spectroscopy, the Ti 3d valence electrons can be directly observed and analyzed. Comparative measurements on LaAlO3/SrTiO3 and γ-Al2O3/SrTiO3 indicate the existence of different types of

  15. Loss of Flow Accident (LOFA) analyses using LabView-based NRR simulator

    Energy Technology Data Exchange (ETDEWEB)

    Arafa, Amany Abdel Aziz; Saleh, Hassan Ibrahim [Atomic Energy Authority, Cairo (Egypt). Radiation Engineering Dept.; Ashoub, Nagieb [Atomic Energy Authority, Cairo (Egypt). Reactor Physics Dept.

    2016-12-15

    This paper presents a generic Loss of Flow Accident (LOFA) scenario module that is integrated into the LabView-based simulator to imitate the behavior of a Nuclear Research Reactor (NRR) under different user-defined LOFA scenarios. It also provides analyses of a LOFA in a single fuel channel and its impact on operational transactions and on the behavior of the reactor. The generic LOFA scenario module includes the graphs needed to clarify the effects of the LOFA under study. Furthermore, the percentage of the loss of mass flow rate, the mode of flow reduction, and the start time and transient time of the LOFA are user defined, to add flexibility to the LOFA scenarios. The objective of integrating such a generic LOFA module is to be able to deal with such incidents and avoid their significant effects. It is also useful in developing expertise in this area and in reducing operator training and simulation costs. The results of the implemented generic LOFA module agree well with those of the COBRA-IIIC code and the earlier guidebook for this series of transients.
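
    A minimal sketch of the kind of user-defined flow transient described above (loss fraction, reduction mode, start time and transient time as parameters); the functional forms and numbers are illustrative assumptions, not the simulator's actual thermal-hydraulic model.

```python
import numpy as np

def mass_flow_transient(t, flow_nominal, loss_fraction, t_start, t_transient, mode="linear"):
    """User-defined loss-of-flow transient: the flow drops from its nominal value
    by `loss_fraction`, starting at `t_start`, over a transient time `t_transient`."""
    t = np.asarray(t, float)
    final = flow_nominal * (1.0 - loss_fraction)
    if mode == "linear":
        ramp = np.clip((t - t_start) / t_transient, 0.0, 1.0)
        return flow_nominal - (flow_nominal - final) * ramp
    elif mode == "exponential":
        decay = 1.0 - np.exp(-np.maximum(t - t_start, 0.0) / t_transient)
        return flow_nominal - (flow_nominal - final) * decay
    raise ValueError("mode must be 'linear' or 'exponential'")

# Example: 80% loss of flow starting at t = 10 s with a 20 s transient time.
time = np.linspace(0, 60, 7)  # s
print(mass_flow_transient(time, 50.0, 0.8, t_start=10.0, t_transient=20.0))
```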

  16. Register-based studies of healthcare costs

    DEFF Research Database (Denmark)

    Kruse, Marie; Christiansen, Terkel

    2011-01-01

    Introduction: The aim of this paper is to provide an overview and a few examples of how national registers are used in analyses of healthcare costs in Denmark. Research topics: The paper focuses on health economic analyses based on register data. For the sake of simplicity, the studies are divided...... into three main categories: economic evaluations of healthcare interventions, cost-of-illness analyses, and other analyses such as assessments of healthcare productivity. Conclusion: We examined a number of studies using register-based data on healthcare costs. Use of register-based data renders...

  17. A protein relational database and protein family knowledge bases to facilitate structure-based design analyses.

    Science.gov (United States)

    Mobilio, Dominick; Walker, Gary; Brooijmans, Natasja; Nilakantan, Ramaswamy; Denny, R Aldrin; Dejoannis, Jason; Feyfant, Eric; Kowticwar, Rupesh K; Mankala, Jyoti; Palli, Satish; Punyamantula, Sairam; Tatipally, Maneesh; John, Reji K; Humblet, Christine

    2010-08-01

    The Protein Data Bank is the most comprehensive source of experimental macromolecular structures. It can, however, be difficult at times to locate relevant structures with the Protein Data Bank search interface. This is particularly true when searching for complexes containing specific interactions between protein and ligand atoms. Moreover, searching within a family of proteins can be tedious. For example, one cannot search for some conserved residue as residue numbers vary across structures. We describe herein three databases, Protein Relational Database, Kinase Knowledge Base, and Matrix Metalloproteinase Knowledge Base, containing protein structures from the Protein Data Bank. In Protein Relational Database, atom-atom distances between protein and ligand have been precalculated allowing for millisecond retrieval based on atom identity and distance constraints. Ring centroids, centroid-centroid and centroid-atom distances and angles have also been included permitting queries for pi-stacking interactions and other structural motifs involving rings. Other geometric features can be searched through the inclusion of residue pair and triplet distances. In Kinase Knowledge Base and Matrix Metalloproteinase Knowledge Base, the catalytic domains have been aligned into common residue numbering schemes. Thus, by searching across Protein Relational Database and Kinase Knowledge Base, one can easily retrieve structures wherein, for example, a ligand of interest is making contact with the gatekeeper residue.
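
    As an illustration of the kind of query such precalculated distance tables make fast, the following sketch builds a toy contact table in SQLite and retrieves protein-ligand contacts by atom identity and distance range. The schema, table and column names are hypothetical and are not the published Protein Relational Database design.

      import sqlite3

      # Toy contact table: precalculated protein-ligand atom-atom distances.
      conn = sqlite3.connect(":memory:")
      conn.execute("CREATE TABLE contact (pdb_id TEXT, protein_atom TEXT, ligand_atom TEXT, distance REAL)")
      conn.executemany(
          "INSERT INTO contact VALUES (?, ?, ?, ?)",
          [("1ABC", "SER OG", "N1", 2.9), ("2XYZ", "SER OG", "N3", 4.1)],
      )
      # Retrieve contacts where a serine hydroxyl lies within hydrogen-bond
      # distance of a ligand nitrogen (atom identity plus distance constraint).
      rows = conn.execute(
          "SELECT pdb_id, ligand_atom, distance FROM contact "
          "WHERE protein_atom = 'SER OG' AND ligand_atom LIKE 'N%' "
          "AND distance BETWEEN 2.5 AND 3.5"
      ).fetchall()
      print(rows)  # [('1ABC', 'N1', 2.9)]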

  18. Estimation of effective block conductivities based on discrete network analyses using data from the Aespoe site

    International Nuclear Information System (INIS)

    La Pointe, P.R.; Wallmann, P.; Follin, S.

    1995-09-01

    Numerical continuum codes may be used for assessing the role of regional groundwater flow in far-field safety analyses of a nuclear waste repository at depth. The focus of this project is to develop and evaluate one method based on Discrete Fracture Network (DFN) models to estimate block-scale permeability values for continuum codes. Data from the Aespoe HRL and surrounding area are used. 57 refs, 76 figs, 15 tabs

  19. Neural Spike-Train Analyses of the Speech-Based Envelope Power Spectrum Model

    Science.gov (United States)

    Rallapalli, Varsha H.

    2016-01-01

    Diagnosing and treating hearing impairment is challenging because people with similar degrees of sensorineural hearing loss (SNHL) often have different speech-recognition abilities. The speech-based envelope power spectrum model (sEPSM) has demonstrated that the signal-to-noise ratio (SNRENV) from a modulation filter bank provides a robust speech-intelligibility measure across a wider range of degraded conditions than many long-standing models. In the sEPSM, noise (N) is assumed to: (a) reduce S + N envelope power by filling in dips within clean speech (S) and (b) introduce an envelope noise floor from intrinsic fluctuations in the noise itself. While the promise of SNRENV has been demonstrated for normal-hearing listeners, it has not been thoroughly extended to hearing-impaired listeners because of limited physiological knowledge of how SNHL affects speech-in-noise envelope coding relative to noise alone. Here, envelope coding to speech-in-noise stimuli was quantified from auditory-nerve model spike trains using shuffled correlograms, which were analyzed in the modulation-frequency domain to compute modulation-band estimates of neural SNRENV. Preliminary spike-train analyses show strong similarities to the sEPSM, demonstrating feasibility of neural SNRENV computations. Results suggest that individual differences can occur based on differential degrees of outer- and inner-hair-cell dysfunction in listeners currently diagnosed into the single audiological SNHL category. The predicted acoustic-SNR dependence in individual differences suggests that the SNR-dependent rate of susceptibility could be an important metric in diagnosing individual differences. Future measurements of the neural SNRENV in animal studies with various forms of SNHL will provide valuable insight for understanding individual differences in speech-in-noise intelligibility.

  20. [Research on fast classification based on LIBS technology and principal component analyses].

    Science.gov (United States)

    Yu, Qi; Ma, Xiao-Hong; Wang, Rui; Zhao, Hua-Feng

    2014-11-01

    Laser-induced breakdown spectroscopy (LIBS) and principal component analysis (PCA) were combined to study aluminum alloy classification in the present article. Classification experiments were done on thirteen different standard samples of aluminum alloy belonging to 4 different types, and the results suggested that the LIBS-PCA method can be used for fast classification of aluminum alloys. PCA was used to analyze the spectral data from the LIBS experiments; the three principal components contributing the most were identified, the principal component scores of the spectra were calculated, and the scores were plotted in three-dimensional coordinates. It was found that the spectrum sample points show a clear convergence according to the type of aluminum alloy they belong to. This result established the three principal components and the preliminary aluminum alloy type zoning. In order to verify its accuracy, 20 different aluminum alloy samples were used in the same experiments to test the type zoning. The experimental results showed that the spectrum sample points all located in the area corresponding to their aluminum alloy type, which proved the correctness of the earlier type zoning based on the standard samples. On this basis, identification of an unknown type of aluminum alloy can be performed. All the experimental results showed that the accuracy of the principal component analysis method based on laser-induced breakdown spectroscopy is more than 97.14%, and that it can classify the different types effectively. Compared to commonly used chemical methods, laser-induced breakdown spectroscopy can detect samples in situ and quickly, with little sample preparation; therefore, using the combination of LIBS and PCA in areas such as quality testing and on-line industrial control can save a lot of time and cost, and greatly improve the efficiency of detection.
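
    The workflow described above (projection of spectra onto a few principal components, followed by classification by alloy type) can be sketched in a few lines. The example below uses synthetic spectra and nearest-centroid classification in PC space; it is illustrative only and does not reproduce the measured LIBS data or the exact classification rule of the study.

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(0)
      n_channels = 500
      # Four hypothetical alloy types, each represented by a noisy template spectrum.
      templates = rng.random((4, n_channels))
      labels = np.repeat(np.arange(4), 10)
      spectra = templates[labels] + 0.05 * rng.standard_normal((40, n_channels))

      # Project onto the three principal components that capture the most variance.
      pca = PCA(n_components=3)
      scores = pca.fit_transform(spectra)
      print("explained variance ratios:", pca.explained_variance_ratio_)

      # Classify an unknown spectrum by the nearest class centroid in PC space.
      centroids = np.array([scores[labels == k].mean(axis=0) for k in range(4)])
      unknown = pca.transform((templates[2] + 0.05 * rng.standard_normal(n_channels))[None, :])
      print("predicted type:", int(np.argmin(np.linalg.norm(centroids - unknown, axis=1))))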

  1. A Review of Sparsity-Based Methods for Analysing Radar Returns from Helicopter Rotor Blades

    Science.gov (United States)

    2016-09-01

    performance study of these algorithms in the particular problem of analysing backscatter signals from rotating blades. The report is organised as follows...provide further insight into the behaviour of the techniques. Here, the algorithms for MP, OMP, CGP, gOMP and ROMP terminate when 10 atoms are

  2. A New Optimization Method for Centrifugal Compressors Based on 1D Calculations and Analyses

    Directory of Open Access Journals (Sweden)

    Pei-Yuan Li

    2015-05-01

    Full Text Available This paper presents an optimization design method for centrifugal compressors based on one-dimensional calculations and analyses. It consists of two parts: (1) centrifugal compressor geometry optimization based on one-dimensional calculations and (2) matching optimization of the vaned diffuser with an impeller based on the required throat area. A low pressure stage centrifugal compressor in a MW level gas turbine is optimized by this method. One-dimensional calculation results show that D3/D2 is too large in the original design, resulting in the low efficiency of the entire stage. Based on the one-dimensional optimization results, the geometry of the diffuser has been redesigned. The outlet diameter of the vaneless diffuser has been reduced, and the original single stage diffuser has been replaced by a tandem vaned diffuser. After optimization, the entire stage pressure ratio is increased by approximately 4%, and the efficiency is increased by approximately 2%.

  3. A Study for Visual Realism of Designed Pictures on Computer Screens by Investigation and Brain-Wave Analyses.

    Science.gov (United States)

    Wang, Lan-Ting; Lee, Kun-Chou

    2016-08-01

    In this article, the visual realism of designed pictures on computer screens is studied by investigation and brain-wave analyses. Practical electroencephalogram (EEG) measurements are always time-varying and fluctuating, so that conventional statistical techniques are not adequate for their analysis. This study proposes a new scheme based on "fingerprinting" to analyze the EEG. Fingerprinting is a technique of probabilistic pattern recognition used in electrical engineering, much like the identification of human fingerprints in a criminal investigation. The goal of this study was to assess whether subjective preference for pictures could be manifested physiologically by EEG fingerprinting analyses. The most important advantage of the fingerprinting technique is that it does not require accurate measurement. Instead, it uses probabilistic classification. Participants' preference for pictures can be assessed using fingerprinting analyses of physiological EEG measurements.

  4. Cost-of-illness studies and cost-effectiveness analyses in anxiety disorders: a systematic review.

    Science.gov (United States)

    Konnopka, Alexander; Leichsenring, Falk; Leibing, Eric; König, Hans-Helmut

    2009-04-01

    To review cost-of-illness studies (COI) and cost-effectiveness analyses (CEA) conducted for anxiety disorders. Based on a database search in Pubmed, PsychINFO and NHS EED, studies were classified according to various criteria. Cost data were inflated and converted to 2005 US-$ purchasing power parities (PPP). We finally identified 20 COI and 11 CEA, most of which concentrated on panic disorder (PD) and generalized anxiety disorder (GAD). Differing inclusion of cost categories limited the comparability of the COI. PD and GAD tended to show higher direct costs per case, but lower direct costs per inhabitant, than social and specific phobias. Different measures of effectiveness severely limited the comparability of the CEA. Overall, the CEA analysed 26 therapeutic or interventional strategies, mostly compared to standard treatment, 8 of them resulting in better effectiveness and lower costs than the comparator. Anxiety disorders cause considerable costs. More research on phobias, more standardised inclusion of cost categories in COI, and a wider use of comparable effectiveness measures (like QALYs) in CEA are needed.

  5. Issues and approaches in risk-based aging analyses of passive components

    International Nuclear Information System (INIS)

    Uryasev, S.P.; Samanta, P.K.; Vesely, W.E.

    1994-01-01

    In previous NRC-sponsored work a general methodology was developed to quantify the risk contributions from aging components at nuclear plants. The methodology allowed Probabilistic Risk Analyses (PRAs) to be modified to incorporate age-dependent component failure rates, and also aging maintenance models, to evaluate and prioritize the aging contributions from active components using the linear aging failure rate model and empirical component aging rates. In the present paper, this methodology is extended to passive components (for example, the pipes, heat exchangers, and the vessel). The analysis of passive components brings in issues different from those of active components. Here, we specifically focus on three aspects that need to be addressed in risk-based aging prioritization of passive components
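
    A minimal numerical sketch of the linear aging failure rate model referred to above: the failure rate grows linearly with component age, lambda(t) = lambda0 + a*t. The numbers are purely illustrative and are not plant data.

      import numpy as np

      lambda0 = 1.0e-6   # baseline failure rate per hour (illustrative)
      a = 5.0e-10        # aging rate per hour^2 (illustrative)
      t = np.linspace(0.0, 40.0 * 8760.0, 200)   # 40 years expressed in hours

      failure_rate = lambda0 + a * t
      # Average increase over the period, a simple measure of the aging
      # contribution by which a risk-based prioritization could rank components.
      average_increase = a * t[-1] / 2.0
      print(f"failure rate at 40 y: {failure_rate[-1]:.2e} /h, "
            f"average increase: {average_increase:.2e} /h")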

  6. DESIGNING EAP MATERIALS BASED ON INTERCULTURAL CORPUS ANALYSES: THE CASE OF LOGICAL MARKERS IN RESEARCH ARTICLES

    Directory of Open Access Journals (Sweden)

    Pilar Mur Dueñas

    2009-10-01

    Full Text Available The ultimate aim of intercultural analyses in English for Academic Purposes is to help non-native scholars function successfully in the international disciplinary community in English. The aim of this paper is to show how corpus-based intercultural analyses can be useful to design EAP materials on a particular metadiscourse category, logical markers, in research article writing. The paper first describes the analysis carried out of additive, contrastive and consecutive logical markers in a corpus of research articles in English and in Spanish in a particular discipline, Business Management. Differences were found in their frequency and also in the use of each of the sub-categories. Then, five activities designed on the basis of these results are presented. They are aimed at raising Spanish Business scholars' awareness of the specific uses and pragmatic function of frequent logical markers in international research articles in English.

  7. Integrated approach for fusion multi-physics coupled analyses based on hybrid CAD and mesh geometries

    Energy Technology Data Exchange (ETDEWEB)

    Qiu, Yuefeng, E-mail: yuefeng.qiu@kit.edu; Lu, Lei; Fischer, Ulrich

    2015-10-15

    Highlights: • Integrated approach for neutronics, thermal and structural analyses was developed. • MCNP5/6, TRIPOLI-4 were coupled with CFX, Fluent and ANSYS Workbench. • A novel meshing approach has been proposed for describing MC geometry. - Abstract: Coupled multi-physics analyses on fusion reactor devices require high-fidelity neutronic models, and flexible, accurate data exchanging between various calculation codes. An integrated coupling approach has been developed to enable the conversion of CAD, mesh, or hybrid geometries for Monte Carlo (MC) codes MCNP5/6, TRIPOLI-4, and translation of nuclear heating data for CFD codes Fluent, CFX and structural mechanical software ANSYS Workbench. The coupling approach has been implemented based on SALOME platform with CAD modeling, mesh generation and data visualization capabilities. A novel meshing approach has been developed for generating suitable meshes for MC geometry descriptions. The coupling approach has been concluded to be reliable and efficient after verification calculations of several application cases.

  8. Tracing common origins of Genomic Islands in prokaryotes based on genome signature analyses.

    Science.gov (United States)

    van Passel, Mark Wj

    2011-09-01

    Horizontal gene transfer constitutes a powerful and innovative force in evolution, but often little is known about the actual origins of transferred genes. Sequence alignments are generally of limited use in tracking the original donor, since still only a small fraction of the total genetic diversity is thought to be uncovered. Alternatively, approaches based on similarities in the genome specific relative oligonucleotide frequencies do not require alignments. Even though the exact origins of horizontally transferred genes may still not be established using these compositional analyses, it does suggest that compositionally very similar regions are likely to have had a common origin. These analyses have shown that up to a third of large acquired gene clusters that reside in the same genome are compositionally very similar, indicative of a shared origin. This brings us closer to uncovering the original donors of horizontally transferred genes, and could help in elucidating possible regulatory interactions between previously unlinked sequences.
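
    A minimal sketch of a genome-signature comparison of the kind described above: relative tetranucleotide frequencies are computed for two sequences and compared by their mean absolute difference, with small values suggesting compositional similarity and hence a possible common origin. The sequences are toy strings, not real genomic islands.

      from itertools import product

      def tetra_freqs(seq):
          """Relative frequencies of all 256 tetranucleotides in a sequence."""
          kmers = ["".join(p) for p in product("ACGT", repeat=4)]
          counts = dict.fromkeys(kmers, 0)
          for i in range(len(seq) - 3):
              k = seq[i:i + 4]
              if k in counts:
                  counts[k] += 1
          total = max(sum(counts.values()), 1)
          return {k: c / total for k, c in counts.items()}

      seq_a = "ATGCGT" * 300   # toy "genomic island" A
      seq_b = "ATGCGA" * 300   # toy "genomic island" B
      fa, fb = tetra_freqs(seq_a), tetra_freqs(seq_b)
      dissimilarity = sum(abs(fa[k] - fb[k]) for k in fa) / len(fa)
      print(f"mean absolute tetranucleotide frequency difference: {dissimilarity:.6f}")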

  9. How and for whom does web-based acceptance and commitment therapy work? Mediation and moderation analyses of web-based ACT for depressive symptoms.

    Science.gov (United States)

    Pots, Wendy T M; Trompetter, Hester R; Schreurs, Karlein M G; Bohlmeijer, Ernst T

    2016-05-23

    Acceptance and Commitment Therapy (ACT) has been demonstrated to be effective in reducing depressive symptoms. However, little is known about how and for whom therapeutic change occurs, specifically in web-based interventions. This study focuses on the mediators, moderators and predictors of change during a web-based ACT intervention. Data from 236 adults from the general population with mild to moderate depressive symptoms, randomized to either web-based ACT (n = 82) or one of two control conditions (web-based Expressive Writing (EW; n = 67) and a waiting list (n = 87)), were analysed. Single and multiple mediation analyses were performed using PROCESS, and exploratory linear regression analyses were used to examine mediators, moderators and predictors of pre- to post-treatment and follow-up change in depressive symptoms. The treatment effect of ACT versus the waiting list was mediated by psychological flexibility and two mindfulness facets. The treatment effect of ACT versus EW was not significantly mediated. The moderator analyses demonstrated that the effects of web-based ACT did not vary according to baseline patient characteristics when compared to both control groups. However, higher baseline depressive symptoms and positive mental health and lower baseline anxiety were identified as predictors of outcome across all conditions. Similar results were found at follow-up. The findings of this study corroborate the evidence that psychological flexibility and mindfulness are distinct process mechanisms that mediate the effects of a web-based ACT intervention. The results indicate that there are no restrictions to the allocation of a web-based ACT intervention and that web-based ACT can work for different subpopulations. Netherlands Trial Register NTR2736. Registered 6 February 2011.

  10. Safety assessment of historical masonry churches based on pre-assigned kinematic limit analysis, FE limit and pushover analyses

    Energy Technology Data Exchange (ETDEWEB)

    Milani, Gabriele, E-mail: milani@stru.polimi.it; Valente, Marco, E-mail: milani@stru.polimi.it [Department of Architecture, Built Environment and Construction Engineering (ABC), Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milan (Italy)

    2014-10-06

    This study presents some results of a comprehensive numerical analysis of three masonry churches damaged by the recent Emilia-Romagna (Italy) seismic events that occurred in May 2012. The numerical study comprises: (a) pushover analyses conducted with a commercial code, standard nonlinear material models and two different horizontal load distributions; (b) FE kinematic limit analyses performed using a non-commercial software based on a preliminary homogenization of the masonry materials and a subsequent limit analysis with triangular elements and interfaces; (c) kinematic limit analyses conducted in agreement with the Italian code and based on the a-priori assumption of preassigned failure mechanisms, where the masonry material is considered unable to withstand tensile stresses. All models are capable of giving information on the active failure mechanism and the base shear at failure, which, if properly made non-dimensional with the weight of the structure, also gives an indication of the horizontal peak ground acceleration causing the collapse of the church. The results obtained from all three models indicate that the collapse is usually due to the activation of partial mechanisms (apse, façade, lateral walls, etc.). Moreover, the horizontal peak ground acceleration associated with the collapse is much lower than that required in that seismic zone by the Italian code for ordinary buildings. These outcomes highlight that structural upgrading interventions would be extremely beneficial for a considerable reduction of the seismic vulnerability of such kind of historical structures.

  11. Safety assessment of historical masonry churches based on pre-assigned kinematic limit analysis, FE limit and pushover analyses

    International Nuclear Information System (INIS)

    Milani, Gabriele; Valente, Marco

    2014-01-01

    This study presents some results of a comprehensive numerical analysis of three masonry churches damaged by the recent Emilia-Romagna (Italy) seismic events that occurred in May 2012. The numerical study comprises: (a) pushover analyses conducted with a commercial code, standard nonlinear material models and two different horizontal load distributions; (b) FE kinematic limit analyses performed using a non-commercial software based on a preliminary homogenization of the masonry materials and a subsequent limit analysis with triangular elements and interfaces; (c) kinematic limit analyses conducted in agreement with the Italian code and based on the a-priori assumption of preassigned failure mechanisms, where the masonry material is considered unable to withstand tensile stresses. All models are capable of giving information on the active failure mechanism and the base shear at failure, which, if properly made non-dimensional with the weight of the structure, also gives an indication of the horizontal peak ground acceleration causing the collapse of the church. The results obtained from all three models indicate that the collapse is usually due to the activation of partial mechanisms (apse, façade, lateral walls, etc.). Moreover, the horizontal peak ground acceleration associated with the collapse is much lower than that required in that seismic zone by the Italian code for ordinary buildings. These outcomes highlight that structural upgrading interventions would be extremely beneficial for a considerable reduction of the seismic vulnerability of such kind of historical structures.

  12. Molecular systematics of Indian Alysicarpus (Fabaceae) based on analyses of nuclear ribosomal DNA sequences.

    Science.gov (United States)

    Gholami, Akram; Subramaniam, Shweta; Geeta, R; Pandey, Arun K

    2017-06-01

    Alysicarpus Necker ex Desvaux (Fabaceae, Desmodieae) consists of ~30 species that are distributed in tropical and subtropical regions of the world. In India, the genus is represented by ca. 18 species, of which seven are endemic. Sequences of the nuclear internal transcribed spacer from 38 accessions representing 16 Indian species were subjected to phylogenetic analyses. The ITS sequence data strongly support the monophyly of the genus Alysicarpus. Analyses revealed four major well-supported clades within Alysicarpus. Ancestral state reconstructions were done for two morphological characters, namely calyx length in relation to pod (macrocalyx and microcalyx) and pod surface ornamentation (transversely rugose and nonrugose). The present study is the first report on molecular systematics of Indian Alysicarpus.

  13. Activity Based Learning in a Freshman Global Business Course: Analyses of Preferences and Demographic Differences

    Science.gov (United States)

    Levine, Mark F.; Guy, Paul W.

    2007-01-01

    The present study investigates pre-business students' reaction to Activity Based Learning in a lower division core required course entitled Introduction to Global Business in the business curriculum at California State University Chico. The study investigates students' preference for Activity Based Learning in comparison to a more traditional…

  14. Performance analyses of naval ships based on engineering level of simulation at the initial design stage

    Directory of Open Access Journals (Sweden)

    Dong-Hoon Jeong

    2017-07-01

    Full Text Available Naval ships are assigned many and varied missions. Their performance is critical for mission success, and depends on the specifications of the components. This is why performance analyses of naval ships are required at the initial design stage. Since the design and construction of naval ships take a very long time and incur a huge cost, Modeling and Simulation (M & S) is an effective method for performance analyses. Thus, in this study, a simulation core is proposed to analyze the performance of naval ships considering their specifications. This simulation core can perform engineering-level simulations, considering the mathematical models for naval ships, such as maneuvering equations and passive sonar equations. Also, the simulation models of the simulation core follow the Discrete EVent system Specification (DEVS) and Discrete Time System Specification (DTSS) formalisms, so that simulations can progress over discrete events and discrete times. In addition, applying the DEVS and DTSS formalisms makes the structure of the simulation models flexible and reusable. To verify the applicability of this simulation core, it was applied to simulations for the performance analyses of a submarine in an Anti-SUrface Warfare (ASUW) mission. These simulations were composed of two scenarios. The first scenario, of submarine diving, carried out maneuvering performance analysis by analyzing the pitch angle variation and depth variation of the submarine over time. The second scenario, of submarine detection, carried out detection performance analysis by analyzing how well the sonar of the submarine resolves adjacent targets. The results of these simulations ensure that the simulation core of this study could be applied to the performance analyses of naval ships considering their specifications.
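
    In the spirit of the discrete-time (DTSS-like) models mentioned above, the following toy sketch steps a submarine diving transient forward in fixed time increments, tracking pitch angle and depth. The coefficients and the first-order pitch response are invented for illustration and are not the paper's simulation core.

      import math

      dt = 1.0                 # time step, s
      pitch_cmd = -10.0        # commanded pitch angle, degrees (bow down)
      tau = 20.0               # first-order pitch response time constant, s
      speed = 5.0              # forward speed, m/s

      pitch, depth = 0.0, 50.0
      for step in range(300):
          # first-order lag toward the commanded pitch angle
          pitch += dt * (pitch_cmd - pitch) / tau
          # depth rate follows the vertical component of the forward speed
          depth -= speed * math.sin(math.radians(pitch)) * dt
      print(f"after {300 * dt:.0f} s: pitch {pitch:.1f} deg, depth {depth:.1f} m")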

  15. MULTI-DIMENSIONAL MASS SPECTROMETRY-BASED SHOTGUN LIPIDOMICS AND NOVEL STRATEGIES FOR LIPIDOMIC ANALYSES

    Science.gov (United States)

    Han, Xianlin; Yang, Kui; Gross, Richard W.

    2011-01-01

    Since our last comprehensive review on multi-dimensional mass spectrometry-based shotgun lipidomics (Mass Spectrom. Rev. 24 (2005), 367), many new developments in the field of lipidomics have occurred. These developments include new strategies and refinements for shotgun lipidomic approaches that use direct infusion, including novel fragmentation strategies, identification of multiple new informative dimensions for mass spectrometric interrogation, and the development of new bioinformatic approaches for enhanced identification and quantitation of the individual molecular constituents that comprise each cell’s lipidome. Concurrently, advances in liquid chromatography-based platforms and novel strategies for quantitative matrix-assisted laser desorption/ionization mass spectrometry for lipidomic analyses have been developed. Through the synergistic use of this repertoire of new mass spectrometric approaches, the power and scope of lipidomics has been greatly expanded to accelerate progress toward the comprehensive understanding of the pleiotropic roles of lipids in biological systems. PMID:21755525

  16. An Integrated Software Suite for Surface-based Analyses of Cerebral Cortex

    Science.gov (United States)

    Van Essen, David C.; Drury, Heather A.; Dickson, James; Harwell, John; Hanlon, Donna; Anderson, Charles H.

    2001-01-01

    The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database. PMID:11522765

  17. An integrated software suite for surface-based analyses of cerebral cortex

    Science.gov (United States)

    Van Essen, D. C.; Drury, H. A.; Dickson, J.; Harwell, J.; Hanlon, D.; Anderson, C. H.

    2001-01-01

    The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database.

  18. Analyses of criticality and reactivity for TRACY experiments based on JENDL-3.3 data library

    International Nuclear Information System (INIS)

    Sono, Hiroki; Miyoshi, Yoshinori; Nakajima, Ken

    2003-01-01

    The parameters on criticality and reactivity employed for computational simulations of the TRACY supercritical experiments were analyzed using a recently revised nuclear data library, JENDL-3.3. The parameters based on the JENDL-3.3 library were compared to those based on two formerly used libraries, JENDL-3.2 and ENDF/B-VI. In the analyses, the computational codes MVP, MCNP version 4C and TWOTRAN were used. The following conclusions were obtained from the analyses: (1) The computational biases of the effective neutron multiplication factor attributable to the nuclear data libraries and to the computational codes do not depend on the TRACY experimental conditions, such as fuel conditions. (2) The fractional discrepancies in the kinetic parameters and coefficients of reactivity are within ∼5% between the three libraries. By comparison between calculations and measurements of the parameters, the JENDL-3.3 library is expected to give values closer to the measurements than the JENDL-3.2 and ENDF/B-VI libraries. (3) While the reactivity worth of transient rods expressed in the $ unit shows a ∼5% discrepancy between the three libraries, in accordance with their respective β eff values, there is little discrepancy when it is expressed in the Δk/k unit. (author)
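
    The dependence of the dollar unit on β eff explains why the transient-rod worth can differ between libraries when expressed in $ but not in Δk/k; a small illustration follows, with placeholder β eff values rather than the actual library results.

      # rho[$] = rho[dk/k] / beta_eff
      rho_dk_k = 0.006          # transient-rod worth in delta-k/k (example value)
      beta_eff = {"JENDL-3.3": 0.0072, "JENDL-3.2": 0.0069, "ENDF/B-VI": 0.0068}
      for lib, b in beta_eff.items():
          print(f"{lib}: {rho_dk_k / b:.2f} $")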

  19. Reviewing PSA-based analyses to modify technical specifications at nuclear power plants

    International Nuclear Information System (INIS)

    Samanta, P.K.; Martinez-Guridi, G.; Vesely, W.E.

    1995-12-01

    Changes to Technical Specifications (TSs) at nuclear power plants (NPPs) require review and approval by the United States Nuclear Regulatory Commission (USNRC). Currently, many requests for changes to TSs use analyses that are based on a plant's probabilistic safety assessment (PSA). This report presents an approach to reviewing such PSA-based submittals for changes to TSs. We discuss the basic objectives of reviewing a PSA-based submittal to modify NPP TSs; the methodology of reviewing a TS submittal; and the differing roles of a PSA review, a PSA computer code review, and a review of a TS submittal. To illustrate this approach, we discuss our review of changes to allowed outage time (AOT) and surveillance test interval (STI) in the TS for the South Texas Project Nuclear Generating Station. Based on the experience gained, a checklist of items is given for future reviewers; it can be used to verify that the submittal contains sufficient information, and also that the review has addressed the relevant issues. Finally, recommended steps in the review process and the expected findings of each step are discussed

  20. Performance Analyses of Renewable and Fuel Power Supply Systems for Different Base Station Sites

    Directory of Open Access Journals (Sweden)

    Josip Lorincz

    2014-11-01

    Full Text Available Base station sites (BSSs) powered with renewable energy sources have gained the attention of cellular operators during the last few years. This is because such “green” BSSs impose significant reductions in the operational expenditures (OPEX) of telecom operators due to the possibility of on-site renewable energy harvesting. In this paper, the green BSS power supply system parameters detected through remote and centralized real-time sensing are presented. An implemented sensing system based on a wireless sensor network enables reliable collection and post-processing analyses of many parameters, such as: total charging/discharging current of the power supply system, battery voltage and temperature, wind speed, etc. As an example, yearly sensing results for three different BSS configurations powered by solar and/or wind energy are discussed in terms of renewable energy supply (RES) system performance. In the case of powering those BSSs with standalone systems based on a fuel generator, fuel consumption models expressing the interdependence between generator load and fuel consumption are proposed. This has allowed an energy-efficiency comparison of the fuel-powered and RES systems, which is presented in terms of the OPEX and carbon dioxide (CO2) reductions. Additionally, approaches based on different BSS air-conditioning systems and the on/off regulation of daily fuel generator activity are proposed and validated in terms of energy and capital expenditure (CAPEX) savings.
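
    A minimal sketch of a fuel consumption model of the kind proposed above: hourly diesel generator consumption expressed as an affine function of the electrical load. The coefficients and the load profile are illustrative placeholders, not the fitted values from the paper.

      import numpy as np

      def fuel_rate(load_kw, rated_kw=10.0, a=0.246, b=0.08415):
          """Litres per hour at a given load (a, b in l/kWh; illustrative values)."""
          return a * load_kw + b * rated_kw

      loads = np.array([2.0, 4.0, 6.0, 8.0, 10.0])   # example BSS load profile, kW
      hours_per_level = 8760.0 / len(loads)          # equal time spent at each load
      annual_litres = fuel_rate(loads).sum() * hours_per_level
      print(f"estimated annual fuel use: {annual_litres:,.0f} l")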

  1. Devising a New Model of Demand-Based Learning Integrated with Social Networks and Analyses of its Performance

    Directory of Open Access Journals (Sweden)

    Bekim Fetaji

    2018-02-01

    Full Text Available The focus of the research study is to devise a new model for demand-based learning that will be integrated with social networks such as Facebook, Twitter and others. The study investigates this by reviewing the published literature and carrying out a case study analysis in order to examine the new model's practical implementation. The study focuses on analyzing demand-based learning and investigating how it can be improved by devising a specific model that incorporates social network use. Statistical analyses of the questionnaire results, addressing the research questions and hypotheses, showed that there is a need for introducing new models into the teaching process. The originality lies in introducing the social login approach to an educational environment; this approach is regarded as a contribution towards developing a demand-based web application, which aims to modernize the educational pattern of communication, introduce the social login approach, increase knowledge transfer, and improve learners' performance and skills. Insights and recommendations are provided, argued for and discussed.

  2. A comparison between geostatistical analyses and sedimentological studies at the Hartebeestfontein gold mine

    International Nuclear Information System (INIS)

    Magri, E.J.

    1978-01-01

    For life-of-mine planning, as well as for short- and medium-term planning of grades and mine layouts, it is extremely important to have a clear understanding of the patterns followed by the distribution of gold and uranium within the mining area. This study is an attempt to reconcile the geostatistical approach to the determination of ore-shoot directions, via an analysis of the spatial distribution of gold and uranium values, with the sedimentological approach, which is based on the direct measurement of geological features. For the routine geostatistical estimation of ore reserves, the Hartebeestfontein gold mine was divided into 11 sections. In each of these sections, the ore-shoot directions were calculated for gold and uranium from the anisotropies disclosed by geostatistical variogram analyses. This study presents a comparison of these results with those obtained from direct geological measurements of paleo-current directions. The results suggest that geological and geostatistical studies could be of significant mutual benefit

  3. Multi-Criteria Analyses of Urban Planning for City Expansion: A Case Study of Zamora, Spain

    Directory of Open Access Journals (Sweden)

    Marco Criado

    2017-10-01

    Full Text Available This study has established a methodology to determine the most environmentally suitable area for the expansion of Zamora (Spain) using geographic information system (GIS) technology. The objective was to develop a GIS-based methodology for the identification of urban peripheral areas that are suitable for the accommodation of new buildings and services, that are compliant with environmental criteria, and that guarantee an adequate quality of life for the future population such that extra construction costs are avoided. The methodological core is based on two multi-criteria analyses (MCAs): MCA-1 determines areas suitable for building—the most environmentally sustainable areas that do not present risks or discomforts to the population—by analyzing the restrictive factors; MCA-2 takes the sectors that received a favorable evaluation in MCA-1, determines which of those have a lower economic overhead for construction, and analyzes the different conditioning criteria related to their pre-existing infrastructures. Finally, the location of the sectors is determined by a decision factor that satisfies some strategic need of the municipality.
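
    A minimal raster sketch of the two-step scheme described above: a first pass masks out cells excluded by restrictive factors (MCA-1), and a second pass ranks the remaining cells by a weighted sum of cost-related criteria (MCA-2). The layers, weights and grid are invented for illustration and do not correspond to the Zamora dataset.

      import numpy as np

      rng = np.random.default_rng(1)
      shape = (100, 100)

      # MCA-1: restrictive factors (e.g. flood risk, protected areas) -> buildable mask
      flood_risk = rng.random(shape) > 0.9
      protected = rng.random(shape) > 0.95
      buildable = ~(flood_risk | protected)

      # MCA-2: conditioning criteria normalised to [0, 1]; lower cost is better
      dist_to_roads = rng.random(shape)
      dist_to_sewer = rng.random(shape)
      slope = rng.random(shape)
      cost = 0.4 * dist_to_roads + 0.4 * dist_to_sewer + 0.2 * slope

      suitability = np.where(buildable, 1.0 - cost, np.nan)
      best = np.unravel_index(np.nanargmax(suitability), shape)
      print("most suitable cell:", best)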

  4. The Hanford study: issues in analysing and interpreting data from occupational studies

    International Nuclear Information System (INIS)

    Gilbert, E.S.

    1987-01-01

    Updated analyses of workers at the Hanford Site provided no evidence of a correlation of radiation exposure and mortality from all cancers or mortality from leukemia. Potentially confounding factors were examined, and to the extent possible taken account of in these analyses. Risk estimates for leukemia and for all cancers except leukemia were calculated and compared with those from other sources. For leukemia, consideration was given to modifying factors such as age at exposure and time from exposure. (author)

  5. Novel citation-based search method for scientific literature: application to meta-analyses

    NARCIS (Netherlands)

    Janssens, A.C.J.W.; Gwinn, M.

    2015-01-01

    Background: Finding eligible studies for meta-analysis and systematic reviews relies on keyword-based searching as the gold standard, despite its inefficiency. Searching based on direct citations is not sufficiently comprehensive. We propose a novel strategy that ranks articles on their degree of

  6. Sensitivity Study of Poisson's Ratio Used in Soil Structure Interaction (SSI) Analyses

    International Nuclear Information System (INIS)

    Han, Seung-ju; You, Dong-Hyun; Jang, Jung-bum; Yun, Kwan-hee

    2016-01-01

    The preliminary review for Design Certification (DC) of APR1400 was accepted by the NRC on March 4, 2015. After the acceptance of the application for standard DC of APR1400, KHNP has responded to the Requests for Additional Information (RAIs) raised by the NRC to undertake a full design certification review. Design certification is achieved through the NRC's rulemaking process, and is founded on the staff's review of the application, which addresses the various safety issues associated with the proposed nuclear power plant design, independent of a specific site. One of the RAIs issued by the USNRC pertaining to Design Control Document (DCD) Ch. 3.7 'Seismic Design' notes that DCD Tables 3.7A-1 and 3.7A-2 show Poisson's ratios in the S1 and S2 soil profiles used for SSI analysis as great as 0.47 and 0.48, respectively. Based on staff experience, use of Poisson's ratios approaching these values may result in numerical instability of the SSI analysis results. A sensitivity study is performed using the ACS SASSI NI model of APR1400 with the S1 and S2 soil profiles to demonstrate that the Poisson's ratio values used in the SSI analyses of the S1 and S2 soil profile cases do not produce numerical instabilities in the SSI analysis results. No abrupt changes or spurious peaks, which tend to indicate the existence of numerical sensitivities in the SASSI solutions, appear in the computed transfer functions of the original SSI analyses that have maximum dynamic Poisson's ratio values of 0.47 and 0.48, or in the re-computed transfer functions that have maximum dynamic Poisson's ratio values limited to 0.42 and 0.45

  7. Stress and deflection analyses of floating roofs based on a load-modifying method

    International Nuclear Information System (INIS)

    Sun Xiushan; Liu Yinghua; Wang Jianbin; Cen Zhangzhi

    2008-01-01

    This paper proposes a load-modifying method for the stress and deflection analyses of floating roofs used in cylindrical oil storage tanks. The formulations of loads and deformations are derived according to the equilibrium analysis of floating roofs. Based on these formulations, the load-modifying method is developed to conduct a geometrically nonlinear analysis of floating roofs with the finite element (FE) simulation. In the procedure with the load-modifying method, the analysis is carried out through a series of iterative computations until a convergence is achieved within the error tolerance. Numerical examples are given to demonstrate the validity and reliability of the proposed method, which provides an effective and practical numerical solution to the design and analysis of floating roofs
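
    The iterative character of the load-modifying method can be sketched schematically: the load applied to the roof is updated from the current deflection and the analysis is repeated until successive solutions agree within a tolerance. Here solve_roof() is a placeholder for the geometrically nonlinear FE solution, and the load-deflection relations are purely illustrative, not the paper's formulation.

      def solve_roof(load):
          # placeholder for the FE analysis: deflection grows sublinearly with load
          return 0.01 * load ** 0.8

      load, tol = 100.0, 1e-6
      deflection = solve_roof(load)
      for iteration in range(100):
          # modify the load according to the buoyancy implied by the current deflection
          new_load = 100.0 + 5.0 * deflection
          new_deflection = solve_roof(new_load)
          if abs(new_deflection - deflection) < tol:
              break
          load, deflection = new_load, new_deflection
      print(f"converged after {iteration + 1} iterations, deflection = {deflection:.4f} m")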

  8. CrusView: A Java-Based Visualization Platform for Comparative Genomics Analyses in Brassicaceae Species

    Science.gov (United States)

    Chen, Hao; Wang, Xiangfeng

    2013-01-01

    In plants and animals, chromosomal breakage and fusion events based on conserved syntenic genomic blocks lead to conserved patterns of karyotype evolution among species of the same family. However, karyotype information has not been well utilized in genomic comparison studies. We present CrusView, a Java-based bioinformatic application utilizing Standard Widget Toolkit/Swing graphics libraries and a SQLite database for performing visualized analyses of comparative genomics data in Brassicaceae (crucifer) plants. Compared with similar software and databases, one of the unique features of CrusView is its integration of karyotype information when comparing two genomes. This feature allows users to perform karyotype-based genome assembly and karyotype-assisted genome synteny analyses with preset karyotype patterns of the Brassicaceae genomes. Additionally, CrusView is a local program, which gives its users high flexibility when analyzing unpublished genomes and allows users to upload self-defined genomic information so that they can visually study the associations between genome structural variations and genetic elements, including chromosomal rearrangements, genomic macrosynteny, gene families, high-frequency recombination sites, and tandem and segmental duplications between related species. This tool will greatly facilitate karyotype, chromosome, and genome evolution studies using visualized comparative genomics approaches in Brassicaceae species. CrusView is freely available at http://www.cmbb.arizona.edu/CrusView/. PMID:23898041

  9. Statistical analyses in the study of solar wind-magnetosphere coupling

    International Nuclear Information System (INIS)

    Baker, D.N.

    1985-01-01

    Statistical analyses provide a valuable method for establishing initially the existence (or lack of existence) of a relationship between diverse data sets. Statistical methods also allow one to make quantitative assessments of the strengths of observed relationships. This paper reviews the essential techniques and underlying statistical bases for the use of correlative methods in solar wind-magnetosphere coupling studies. Techniques of visual correlation and time-lagged linear cross-correlation analysis are emphasized, but methods of multiple regression, superposed epoch analysis, and linear prediction filtering are also described briefly. The long history of correlation analysis in the area of solar wind-magnetosphere coupling is reviewed with the assessments organized according to data averaging time scales (minutes to years). It is concluded that these statistical methods can be very useful first steps, but that case studies and various advanced analysis methods should be employed to understand fully the average response of the magnetosphere to solar wind input. It is clear that many workers have not always recognized underlying assumptions of statistical methods and thus the significance of correlation results can be in doubt. Long-term averages (greater than or equal to 1 hour) can reveal gross relationships, but only when dealing with high-resolution data (1 to 10 min) can one reach conclusions pertinent to magnetospheric response time scales and substorm onset mechanisms
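
    A minimal sketch of the time-lagged linear cross-correlation technique emphasized above: a synthetic solar wind driver series is correlated with a lagged response series over a range of lags, and the lag of peak correlation is reported. Both series are synthetic placeholders, not spacecraft or index data.

      import numpy as np

      rng = np.random.default_rng(2)
      n, true_lag = 2000, 40                      # samples, imposed lag in time steps
      driver = rng.standard_normal(n)
      response = np.roll(driver, true_lag) + 0.5 * rng.standard_normal(n)

      lags = np.arange(0, 120)
      corr = [np.corrcoef(driver[: n - L], response[L:])[0, 1] for L in lags]
      best = int(lags[int(np.argmax(corr))])
      print(f"peak correlation {max(corr):.2f} at lag {best} steps")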

  10. Validation of a fully autonomous phosphate analyser based on a microfluidic lab-on-a-chip

    DEFF Research Database (Denmark)

    Slater, Conor; Cleary, J.; Lau, K.T.

    2010-01-01

    of long-term operation. This was proven by a bench top calibration of the analyser using standard solutions and also by comparing the analyser's performance to a commercially available phosphate monitor installed at a waste water treatment plant. The output of the microfluidic lab-on-a-chip analyser...

  11. Quantitative X-ray Map Analyser (Q-XRMA): A new GIS-based statistical approach to Mineral Image Analysis

    Science.gov (United States)

    Ortolano, Gaetano; Visalli, Roberto; Godard, Gaston; Cirrincione, Rosolino

    2018-06-01

    We present a new ArcGIS®-based tool developed in the Python programming language for calibrating EDS/WDS X-ray element maps, with the aim of acquiring quantitative information of petrological interest. The calibration procedure is based on a multiple linear regression technique that takes into account interdependence among elements and is constrained by the stoichiometry of minerals. The procedure requires an appropriate number of spot analyses for use as internal standards and provides several test indexes for a rapid check of calibration accuracy. The code is based on an earlier image-processing tool designed primarily for classifying minerals in X-ray element maps; the original Python code has now been enhanced to yield calibrated maps of mineral end-members or the chemical parameters of each classified mineral. The semi-automated procedure can be used to extract a dataset that is automatically stored within queryable tables. As a case study, the software was applied to an amphibolite-facies garnet-bearing micaschist. The calibrated images obtained for both anhydrous (i.e., garnet and plagioclase) and hydrous (i.e., biotite) phases show a good fit with corresponding electron microprobe analyses. This new GIS-based tool package can thus find useful application in petrology and materials science research. Moreover, the huge quantity of data extracted opens new opportunities for the development of a thin-section microchemical database that, using a GIS platform, can be linked with other major global geoscience databases.
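
    The calibration idea can be sketched as a multiple linear regression mapping raw map counts of several elements to the concentrations measured at a few spot analyses (the internal standards), after which the fit is applied to every pixel. All numbers below are synthetic, and the actual tool also applies stoichiometric constraints and accuracy tests not shown here.

      import numpy as np

      rng = np.random.default_rng(3)
      n_spots, n_elements = 12, 3                      # e.g. three element channels
      counts_at_spots = rng.random((n_spots, n_elements)) * 1000.0
      true_coeff = np.array([0.02, 0.005, 0.01])
      wt_at_spots = counts_at_spots @ true_coeff + rng.normal(0.0, 0.1, n_spots)

      # Least-squares fit with an intercept; using all channels at once reflects
      # the interdependence among elements mentioned above.
      X = np.column_stack([counts_at_spots, np.ones(n_spots)])
      coef, *_ = np.linalg.lstsq(X, wt_at_spots, rcond=None)

      # Apply the calibration to a whole (synthetic) element map.
      map_counts = rng.random((50, 50, n_elements)) * 1000.0
      calibrated = map_counts @ coef[:-1] + coef[-1]
      print("calibrated map shape:", calibrated.shape,
            "mean wt%:", round(float(calibrated.mean()), 2))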

  12. Altered Brain Activity in Unipolar Depression Revisited: Meta-analyses of Neuroimaging Studies.

    Science.gov (United States)

    Müller, Veronika I; Cieslik, Edna C; Serbanescu, Ilinca; Laird, Angela R; Fox, Peter T; Eickhoff, Simon B

    2017-01-01

    major depressive disorder. For meta-analyses with a minimum of 17 experiments available, separate analyses were performed for increases and decreases. In total, 57 studies with 99 individual neuroimaging experiments comprising in total 1058 patients were included; 34 of them tested cognitive and 65 emotional processing. Overall analyses across cognitive processing experiments (P > .29) and across emotional processing experiments (P > .47) revealed no significant results. Similarly, no convergence was found in analyses investigating positive (all P > .15), negative (all P > .76), or memory (all P > .48) processes. Analyses that restricted inclusion of confounds (eg, medication, comorbidity, age) did not change the results. Inconsistencies exist across individual experiments investigating aberrant brain activity in UD and replication problems across previous neuroimaging meta-analyses. For individual experiments, these inconsistencies may relate to use of uncorrected inference procedures, differences in experimental design and contrasts, or heterogeneous clinical populations; meta-analytically, differences may be attributable to varying inclusion and exclusion criteria or rather liberal statistical inference approaches.

  13. Effects of Computer-Based Training on Procedural Modifications to Standard Functional Analyses

    Science.gov (United States)

    Schnell, Lauren K.; Sidener, Tina M.; DeBar, Ruth M.; Vladescu, Jason C.; Kahng, SungWoo

    2018-01-01

    Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to…

  14. Identification among morphologically similar Argyreia (Convolvulaceae) based on leaf anatomy and phenetic analyses.

    Science.gov (United States)

    Traiperm, Paweena; Chow, Janene; Nopun, Possathorn; Staples, G; Swangpol, Sasivimon C

    2017-12-01

    The genus Argyreia Lour. is one of the species-rich Asian genera in the family Convolvulaceae. Several species complexes were recognized in which taxon delimitation was imprecise, especially when examining herbarium materials without fully developed open flowers. The main goal of this study is to investigate and describe leaf anatomy for some morphologically similar Argyreia using epidermal peeling, leaf and petiole transverse sections, and scanning electron microscopy. Phenetic analyses including cluster analysis and principal component analysis were used to investigate the similarity of these morpho-types. Anatomical differences observed between the morpho-types include epidermal cell walls and the trichome types on the leaf epidermis. Additional differences in the leaf and petiole transverse sections include the epidermal cell shape of the adaxial leaf blade, the leaf margins, and the petiole transverse sectional outline. The phenogram from cluster analysis using the UPGMA method represented four groups with an R value of 0.87. Moreover, the important quantitative and qualitative leaf anatomical traits of the four groups were confirmed by the principal component analysis of the first two components. The results from phenetic analyses confirmed the anatomical differentiation between the morpho-types. Leaf anatomical features regarded as particularly informative for morpho-type differentiation can be used to supplement macro morphological identification.

  15. Genome based analyses of six hexacorallian species reject the “naked coral” hypothesis

    KAUST Repository

    Wang, Xin

    2017-09-23

    Scleractinian corals are the foundation species of the coral-reef ecosystem. Their calcium carbonate skeletons form extensive structures that are home to millions of species, making coral reefs one of the most diverse ecosystems of our planet. However, our understanding of how reef-building corals have evolved the ability to calcify and become the ecosystem builders they are today is hampered by uncertain relationships within their subclass Hexacorallia. Corallimorpharians have been proposed to originate from a complex scleractinian ancestor that lost the ability to calcify in response to increasing ocean acidification, suggesting the possibility for corals to lose and regain the ability to calcify. Here we employed a phylogenomic approach using whole-genome data from six hexacorallian species to resolve the evolutionary relationship between reef-building corals and their non-calcifying relatives. Phylogenetic analysis based on 1,421 single-copy orthologs, as well as gene presence/absence and synteny information, converged on the same topologies, showing strong support for scleractinian monophyly and a corallimorpharian sister clade. Our broad phylogenomic approach using sequence-based and sequence-independent analyses provides unambiguous evidence for the monophyly of scleractinian corals and the rejection of corallimorpharians as descendants of a complex coral ancestor.

  16. Genome based analyses of six hexacorallian species reject the “naked coral” hypothesis

    KAUST Repository

    Wang, Xin; Drillon, Guénola; Ryu, Taewoo; Voolstra, Christian R.; Aranda, Manuel

    2017-01-01

    Scleractinian corals are the foundation species of the coral-reef ecosystem. Their calcium carbonate skeletons form extensive structures that are home to millions of species, making coral reefs one of the most diverse ecosystems of our planet. However, our understanding of how reef-building corals have evolved the ability to calcify and become the ecosystem builders they are today is hampered by uncertain relationships within their subclass Hexacorallia. Corallimorpharians have been proposed to originate from a complex scleractinian ancestor that lost the ability to calcify in response to increasing ocean acidification, suggesting the possibility for corals to lose and regain the ability to calcify. Here we employed a phylogenomic approach using whole-genome data from six hexacorallian species to resolve the evolutionary relationship between reef-building corals and their non-calcifying relatives. Phylogenetic analysis based on 1,421 single-copy orthologs, as well as gene presence/absence and synteny information, converged on the same topologies, showing strong support for scleractinian monophyly and a corallimorpharian sister clade. Our broad phylogenomic approach using sequence-based and sequence-independent analyses provides unambiguous evidence for the monophyly of scleractinian corals and the rejection of corallimorpharians as descendants of a complex coral ancestor.

  17. Quantitative Prediction of Coalbed Gas Content Based on Seismic Multiple-Attribute Analyses

    Directory of Open Access Journals (Sweden)

    Renfang Pan

    2015-09-01

    Full Text Available Accurate prediction of gas planar distribution is crucial to selection and development of new CBM exploration areas. Based on seismic attributes, well logging and testing data, we found that seismic absorption attenuation, after eliminating the effects of burial depth, shows an evident correlation with CBM gas content; positive structure curvature has a negative correlation with gas content; and density has a negative correlation with gas content. It is feasible to use the hydrocarbon index (P*G) and pseudo-Poisson ratio attributes for detection of gas enrichment zones. Based on seismic multiple-attribute analyses, a multiple linear regression equation was established between the seismic attributes and gas content at the drilling wells. Application of this equation to the seismic attributes at locations other than the drilling wells yielded a quantitative prediction of planar gas distribution. Prediction calculations were performed for two different models, one using pre-stack inversion and the other one disregarding pre-stack inversion. A comparison of the results indicates that both models predicted a similar trend for gas content distribution, except that the model using pre-stack inversion yielded a prediction result with considerably higher precision than the other model.
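
    A minimal sketch of the regression step described above: a multiple linear regression is fitted between seismic attributes and gas content at the wells, then applied to attribute values away from the wells to predict the planar gas distribution. Attribute values and gas contents below are synthetic placeholders.

      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(4)
      # attributes at 15 wells: absorption attenuation, curvature, density, P*G
      attrs_at_wells = rng.random((15, 4))
      gas_at_wells = (8.0 + 6.0 * attrs_at_wells[:, 0] - 3.0 * attrs_at_wells[:, 1]
                      - 2.0 * attrs_at_wells[:, 2] + rng.normal(0.0, 0.3, 15))  # m3/t

      model = LinearRegression().fit(attrs_at_wells, gas_at_wells)
      # Predict gas content for attribute values sampled away from the wells.
      grid_attrs = rng.random((200, 4))
      gas_map = model.predict(grid_attrs)
      print("predicted gas content range (m3/t):",
            round(float(gas_map.min()), 1), "-", round(float(gas_map.max()), 1))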

  18. Analyses of microstructural and elastic properties of porous SOFC cathodes based on focused ion beam tomography

    Science.gov (United States)

    Chen, Zhangwei; Wang, Xin; Giuliani, Finn; Atkinson, Alan

    2015-01-01

    Mechanical properties of porous SOFC electrodes are largely determined by their microstructures. Measurements of the elastic properties and microstructural parameters can be achieved by modelling of the digitally reconstructed 3D volumes based on the real electrode microstructures. However, the reliability of such measurements is greatly dependent on the processing of the raw images acquired for reconstruction. In this work, the actual microstructures of La0.6Sr0.4Co0.2Fe0.8O3-δ (LSCF) cathodes sintered at an elevated temperature were reconstructed based on dual-beam FIB/SEM tomography. Key microstructural and elastic parameters were estimated and correlated. Analyses of their sensitivity to the grayscale threshold value applied in the image segmentation were performed. The important microstructural parameters included porosity, tortuosity, specific surface area, particle and pore size distributions, and inter-particle neck size distribution, which may have varying extents of effect on the elastic properties simulated from the microstructures using FEM. Results showed that different threshold value ranges resulted in different degrees of sensitivity for a specific parameter. The estimated porosity and tortuosity were more sensitive than the surface area to volume ratio. Pore and neck size were found to be less sensitive than particle size. Results also showed that the modulus was essentially sensitive to the porosity, which was largely controlled by the threshold value.
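
    To make the threshold-sensitivity idea concrete, here is a minimal sketch that segments a synthetic grayscale volume at several threshold values and reports the resulting porosity. It is not the authors' FIB/SEM pipeline, only an illustration of how a single segmentation parameter shifts a microstructural estimate.

```python
# Sketch: sensitivity of estimated porosity to the grayscale segmentation threshold.
# The 3D "image" is synthetic noise; a real analysis would use the FIB/SEM stack.
import numpy as np

rng = np.random.default_rng(1)
volume = rng.normal(loc=128, scale=40, size=(64, 64, 64))  # synthetic grayscale voxels

for threshold in (100, 110, 120, 130, 140):
    pore_mask = volume < threshold        # voxels classified as pore space
    porosity = pore_mask.mean()           # pore volume fraction
    print(f"threshold={threshold:3d}  porosity={porosity:.3f}")
```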

  19. Exploratory study on a statistical method to analyse time resolved data obtained during nanomaterial exposure measurements

    International Nuclear Information System (INIS)

    Clerc, F; Njiki-Menga, G-H; Witschger, O

    2013-01-01

    Most of the measurement strategies that are suggested at the international level to assess workplace exposure to nanomaterials rely on devices measuring, in real time, airborne particle concentrations (according to different metrics). Since none of the instruments used to measure aerosols can distinguish a particle of interest from the background aerosol, the statistical analysis of time resolved data requires special attention. So far, very few approaches have been used for statistical analysis in the literature, ranging from simple qualitative analysis of graphs to the implementation of more complex statistical models. To date, there is still no consensus on a particular approach, and the search for an appropriate and robust method is ongoing. In this context, this exploratory study investigates a statistical method to analyse time resolved data based on a Bayesian probabilistic approach. To investigate and illustrate the use of this statistical method, particle number concentration data have been used from a workplace study that investigated the potential for exposure via inhalation from cleanout operations by sandpapering of a reactor producing nanocomposite thin films. In this workplace study, the background issue has been addressed through the near-field and far-field approaches, and several size integrated and time resolved devices have been used. The analysis of the results presented here focuses only on data obtained with two handheld condensation particle counters. While one was measuring at the source of the released particles, the other one was measuring in parallel far-field. The Bayesian probabilistic approach allows a probabilistic modelling of data series, and the observed task is modelled in the form of probability distributions. The probability distributions issuing from time resolved data obtained at the source can be compared with the probability distributions issuing from the time resolved data obtained far-field, leading in a
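
    The abstract does not spell out the model, but one simple Bayesian treatment of time-resolved particle counts is a Poisson likelihood with a Gamma prior, which yields a Gamma posterior for the mean count rate. The sketch below, on invented near-field and far-field readings, compares the two posteriors; it is an assumed illustration, not the authors' implementation.

```python
# Sketch: Bayesian comparison of near-field vs far-field particle number counts.
# Poisson counts with a Gamma(1, 1) prior give a Gamma posterior; data are synthetic.
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(2)
near_counts = rng.poisson(lam=1200, size=60)   # counts per second at the source
far_counts = rng.poisson(lam=900, size=60)     # counts per second far-field

def posterior(counts, a0=1.0, b0=1.0):
    """Gamma posterior (shape, rate) over the mean count rate."""
    return a0 + counts.sum(), b0 + len(counts)

a_near, b_near = posterior(near_counts)
a_far, b_far = posterior(far_counts)

# Monte Carlo estimate of P(near-field rate > far-field rate)
s_near = gamma.rvs(a_near, scale=1.0 / b_near, size=20000, random_state=3)
s_far = gamma.rvs(a_far, scale=1.0 / b_far, size=20000, random_state=4)
print("P(near > far) ≈", (s_near > s_far).mean())
```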

  20. Feasibility study on sensor data fusion for the CP-140 aircraft: fusion architecture analyses

    Science.gov (United States)

    Shahbazian, Elisa

    1995-09-01

    Loral Canada completed (May 1995) a Department of National Defense (DND) Chief of Research and Development (CRAD) contract, to study the feasibility of implementing a multi- sensor data fusion (MSDF) system onboard the CP-140 Aurora aircraft. This system is expected to fuse data from: (a) attributed measurement oriented sensors (ESM, IFF, etc.); (b) imaging sensors (FLIR, SAR, etc.); (c) tracking sensors (radar, acoustics, etc.); (d) data from remote platforms (data links); and (e) non-sensor data (intelligence reports, environmental data, visual sightings, encyclopedic data, etc.). Based on purely theoretical considerations a central-level fusion architecture will lead to a higher performance fusion system. However, there are a number of systems and fusion architecture issues involving fusion of such dissimilar data: (1) the currently existing sensors are not designed to provide the type of data required by a fusion system; (2) the different types (attribute, imaging, tracking, etc.) of data may require different degree of processing, before they can be used within a fusion system efficiently; (3) the data quality from different sensors, and more importantly from remote platforms via the data links must be taken into account before fusing; and (4) the non-sensor data may impose specific requirements on the fusion architecture (e.g. variable weight/priority for the data from different sensors). This paper presents the analyses performed for the selection of the fusion architecture for the enhanced sensor suite planned for the CP-140 aircraft in the context of the mission requirements and environmental conditions.
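
    As a toy illustration of why data quality must be weighted before fusing, the sketch below combines position estimates from two dissimilar sources with an inverse-variance rule, a common building block of central-level fusion; the numbers are invented and unrelated to the CP-140 sensor suite.

```python
# Sketch: inverse-variance fusion of two independent position estimates,
# e.g. an onboard radar track and a lower-quality data-link report.
import numpy as np

z_radar = np.array([10.2, 4.9])        # measured position (km), higher accuracy
var_radar = 0.2 ** 2
z_link = np.array([10.8, 4.3])         # remote-platform report, lower accuracy
var_link = 1.0 ** 2

w_radar, w_link = 1.0 / var_radar, 1.0 / var_link
z_fused = (w_radar * z_radar + w_link * z_link) / (w_radar + w_link)
var_fused = 1.0 / (w_radar + w_link)

print("fused position:", z_fused)
print("fused variance:", var_fused)    # smaller than either input variance
```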

  1. Analyses of Research Topics in the Field of Informetrics Based on the Method of Topic Modeling

    Directory of Open Access Journals (Sweden)

    Sung-Chien Lin

    2014-07-01

    Full Text Available In this study, we used the approach of topic modeling to uncover the possible structure of research topics in the field of Informetrics, to explore the distribution of the topics over years, and to compare the core journals. In order to infer the structure of the topics in the field, the data of the papers published in the Journal of Informetrics and Scientometrics from 2007 to 2013 were retrieved from the Web of Science database as input for the topic modeling approach. The results of this study show that when the number of topics was set to 10, the topic model had the smallest perplexity. Although the data scope and analysis methods differ from previous studies, the topics generated in this study are consistent with the results produced by expert analyses. Empirical case studies and measurements of bibliometric indicators were considered important in every year of the analytic period, and the field showed increasing stability. Both core journals broadly paid attention to all of the topics in the field of Informetrics: the Journal of Informetrics put particular emphasis on the construction and applications of bibliometric indicators, while Scientometrics focused on the evaluation and the factors of productivity of countries, institutions, domains, and journals.
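
    A minimal sketch of the model-selection step, choosing the number of LDA topics by perplexity with scikit-learn on a toy corpus; the journals' actual corpus and preprocessing are not reproduced here.

```python
# Sketch: choose the number of LDA topics by perplexity on a toy corpus.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "citation analysis of journals and impact indicators",
    "h index and research productivity of institutions",
    "co-authorship networks and collaboration patterns",
    "bibliometric indicators for research evaluation",
    "altmetrics and social media mentions of papers",
    "topic models for mapping scientific fields",
]
X = CountVectorizer(stop_words="english").fit_transform(docs)

for n_topics in (2, 5, 10):
    lda = LatentDirichletAllocation(n_components=n_topics, random_state=0).fit(X)
    # Lower perplexity indicates a better fit of the topic model to the corpus.
    print(f"{n_topics:2d} topics -> perplexity {lda.perplexity(X):.1f}")
```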

  2. Comparison of plasma input and reference tissue models for analysing [(11)C]flumazenil studies

    NARCIS (Netherlands)

    Klumpers, Ursula M. H.; Veltman, Dick J.; Boellaard, Ronald; Comans, Emile F.; Zuketto, Cassandra; Yaqub, Maqsood; Mourik, Jurgen E. M.; Lubberink, Mark; Hoogendijk, Witte J. G.; Lammertsma, Adriaan A.

    2008-01-01

    A single-tissue compartment model with plasma input is the established method for analysing [(11)C]flumazenil ([(11)C]FMZ) studies. However, arterial cannulation and measurement of metabolites are time-consuming. Therefore, a reference tissue approach is appealing, but this approach has not been

  3. Aroma profile of Garnacha Tintorera-based sweet wines by chromatographic and sensorial analyses.

    Science.gov (United States)

    Noguerol-Pato, R; González-Álvarez, M; González-Barreiro, C; Cancho-Grande, B; Simal-Gándara, J

    2012-10-15

    The aroma profiles of three Garnacha Tintorera-based wines were studied: a base wine, a naturally sweet wine, and a mixture of naturally sweet wine with another sweet wine obtained by fortification with spirits. The aroma fingerprint was traced by GC-MS analysis of volatile compounds and by sensorial analysis of odours and tastes. Among the volatile compounds, sotolon (73 μg/L) and acetoin (122 μg/L) were the two main compounds found in the naturally sweet wine. With regard to the odorant series, those most dominant for the Garnacha Tintorera base wine were floral, fruity and spicy. Instead, the most marked odorant series affected by off-vine drying of the grapes were floral, caramelized and vegetal-wood. Finally, the odorant series affected by the switch-off of alcoholic fermentation with ethanol 96% (v/v) fit for human consumption, followed by oak barrel aging, were caramelized and vegetal-wood. A partial least squares analysis (PLS-2) was used to detect correlations between sets of sensory data (those obtained with mouth and nose) with the ultimate aim of improving our current understanding of the flavour of Garnacha Tintorera red wines, both base and sweet. Based on the sensory dataset analysis, the descriptors with the highest weight for separating base and sweet wines from Garnacha Tintorera were sweetness, dried fruit and caramel (for sweet wines) vs. bitterness, astringency and geranium (for base wines). Copyright © 2012 Elsevier Ltd. All rights reserved.
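
    To illustrate the PLS-2 step relating one block of sensory descriptors to another, here is a minimal scikit-learn sketch with synthetic matrices; the descriptor names are placeholders, not the study's data.

```python
# Sketch: PLS-2 regression between two blocks of sensory descriptors (synthetic).
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(5)
n_wines = 12
X_nose = rng.normal(size=(n_wines, 4))    # e.g. floral, fruity, caramel, vegetal-wood
Y_mouth = np.column_stack([               # e.g. sweetness, bitterness, astringency
    0.8 * X_nose[:, 2] + rng.normal(scale=0.2, size=n_wines),
    -0.6 * X_nose[:, 0] + rng.normal(scale=0.2, size=n_wines),
    -0.5 * X_nose[:, 1] + rng.normal(scale=0.2, size=n_wines),
])

pls = PLSRegression(n_components=2).fit(X_nose, Y_mouth)
print("X-block loadings:\n", pls.x_loadings_)
print("R^2 of the PLS-2 model:", pls.score(X_nose, Y_mouth))
```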

  4. Evaluating the Accuracy of Results for Teacher Implemented Trial-Based Functional Analyses.

    Science.gov (United States)

    Rispoli, Mandy; Ninci, Jennifer; Burke, Mack D; Zaini, Samar; Hatton, Heather; Sanchez, Lisa

    2015-09-01

    Trial-based functional analysis (TBFA) allows for the systematic and experimental assessment of challenging behavior in applied settings. The purposes of this study were to evaluate a professional development package focused on training three Head Start teachers to conduct TBFAs with fidelity during ongoing classroom routines. To assess the accuracy of the TBFA results, the effects of a function-based intervention derived from the TBFA were compared with the effects of a non-function-based intervention. Data were collected on child challenging behavior and appropriate communication. An A-B-A-C-D design was utilized in which A represented baseline, and B and C consisted of either function-based or non-function-based interventions counterbalanced across participants, and D represented teacher implementation of the most effective intervention. Results showed that the function-based intervention produced greater decreases in challenging behavior and greater increases in appropriate communication than the non-function-based intervention for all three children. © The Author(s) 2015.

  5. Data analyses and modelling for risk based monitoring of mycotoxins in animal feed

    NARCIS (Netherlands)

    Ine van der Fels-Klerx, H.J.; Adamse, Paulien; Punt, Ans; Asselt, van Esther D.

    2018-01-01

    Following legislation, European Member States should have multi-annual control programs for contaminants, such as for mycotoxins, in feed and food. These programs need to be risk based implying the checks are regular and proportional to the estimated risk for animal and human health. This study

  6. Analyses of Receptive and Productive Korean EFL Vocabulary: Computer-Based Vocabulary Learning Program

    Science.gov (United States)

    Kim, Scott Sungki

    2013-01-01

    The present research study investigated the effects of 8 versions of a computer-based vocabulary learning program on receptive and productive knowledge levels of college students. The participants were 106 male and 103 female Korean EFL students from Kyungsung University and Kwandong University in Korea. Students who participated in versions of…

  7. Ecology of Subglacial Lake Vostok (Antarctica, Based on Metagenomic/Metatranscriptomic Analyses of Accretion Ice

    Directory of Open Access Journals (Sweden)

    Tom D'Elia

    2013-03-01

    Full Text Available Lake Vostok is the largest of the nearly 400 subglacial Antarctic lakes and has been continuously buried by glacial ice for 15 million years. Extreme cold, heat (from possible hydrothermal activity), pressure (from the overriding glacier) and dissolved oxygen (delivered by melting meteoric ice), in addition to limited nutrients and complete darkness, combine to produce one of the most extreme environments on Earth. Metagenomic/metatranscriptomic analyses of ice that accreted over a shallow embayment and over the southern main lake basin indicate the presence of thousands of species of organisms (94% Bacteria, 6% Eukarya, and two Archaea). The predominant bacterial sequences were closest to those from species of Firmicutes, Proteobacteria and Actinobacteria, while the predominant eukaryotic sequences were most similar to those from species of ascomycetous and basidiomycetous Fungi. Based on the sequence data, the lake appears to contain a mixture of autotrophs and heterotrophs capable of performing nitrogen fixation, nitrogen cycling, carbon fixation and nutrient recycling. Sequences closest to those of psychrophiles and thermophiles indicate a cold lake with possible hydrothermal activity. Sequences most similar to those from marine and aquatic species suggest the presence of marine and freshwater regions.

  8. Aleatoric and epistemic uncertainties in sampling based nuclear data uncertainty and sensitivity analyses

    International Nuclear Information System (INIS)

    Zwermann, W.; Krzykacz-Hausmann, B.; Gallner, L.; Klein, M.; Pautz, A.; Velkov, K.

    2012-01-01

    Sampling based uncertainty and sensitivity analyses due to epistemic input uncertainties, i.e. to an incomplete knowledge of uncertain input parameters, can be performed with arbitrary application programs to solve the physical problem under consideration. For the description of steady-state particle transport, direct simulations of the microscopic processes with Monte Carlo codes are often used. This introduces an additional source of uncertainty, the aleatoric sampling uncertainty, which is due to the randomness of the simulation process performed by sampling, and which adds to the total combined output sampling uncertainty. So far, this aleatoric part of uncertainty is minimized by running a sufficiently large number of Monte Carlo histories for each sample calculation, thus making its impact negligible as compared to the impact from sampling the epistemic uncertainties. Obviously, this process may cause high computational costs. The present paper shows that in many applications reliable epistemic uncertainty results can also be obtained with substantially lower computational effort by performing and analyzing two appropriately generated series of samples with much smaller number of Monte Carlo histories each. The method is applied along with the nuclear data uncertainty and sensitivity code package XSUSA in combination with the Monte Carlo transport code KENO-Va to various critical assemblies and a full scale reactor calculation. It is shown that the proposed method yields output uncertainties and sensitivities equivalent to the traditional approach, with a high reduction of computing time by factors of the magnitude of 100. (authors)
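
    The two-series idea can be sketched numerically: run two statistically independent replicate series over the same epistemic samples, each with few Monte Carlo histories, and use the covariance between the replicates to separate the epistemic variance from the aleatoric (statistical) part. The toy model below is purely illustrative and is not the XSUSA/KENO-Va workflow.

```python
# Sketch: separating epistemic from aleatoric variance with two replicate series.
# For each epistemic sample i, two independent runs share the same input x_i;
# the covariance between the two series estimates the epistemic variance alone.
import numpy as np

rng = np.random.default_rng(6)
n_samples = 200
true_response = rng.normal(loc=1.0, scale=0.05, size=n_samples)   # epistemic spread

sigma_aleatoric = 0.02          # statistical noise from few Monte Carlo histories
series_1 = true_response + rng.normal(scale=sigma_aleatoric, size=n_samples)
series_2 = true_response + rng.normal(scale=sigma_aleatoric, size=n_samples)

var_total = 0.5 * (series_1.var(ddof=1) + series_2.var(ddof=1))
var_epistemic = np.cov(series_1, series_2, ddof=1)[0, 1]          # shared part
var_aleatoric = var_total - var_epistemic                         # statistical part

print(f"total variance      {var_total:.6f}")
print(f"epistemic estimate  {var_epistemic:.6f} (true {0.05**2:.6f})")
print(f"aleatoric estimate  {var_aleatoric:.6f} (true {sigma_aleatoric**2:.6f})")
```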

  9. Analysing the operative experience of basic surgical trainees in Ireland using a web-based logbook

    LENUS (Irish Health Repository)

    Lonergan, Peter E

    2011-09-25

    Abstract Background There is concern about the adequacy of operative exposure in surgical training programmes, in the context of changing work practices. We aimed to quantify the operative exposure of all trainees on the National Basic Surgical Training (BST) programme in Ireland and compare the results with arbitrary training targets. Methods Retrospective analysis of data obtained from a web-based logbook (http://www.elogbook.org) for all general surgery and orthopaedic training posts between July 2007 and June 2009. Results 104 trainees recorded 23,918 operations between two 6-month general surgery posts. The most common general surgery operation performed was simple skin excision with trainees performing an average of 19.7 (± 9.9) over the 2-year training programme. Trainees most frequently assisted with cholecystectomy with an average of 16.0 (± 11.0) per trainee. Comparison of trainee operative experience to arbitrary training targets found that 2-38% of trainees achieved the targets for 9 emergency index operations and 24-90% of trainees achieved the targets for 8 index elective operations. 72 trainees also completed a 6-month post in orthopaedics and recorded 7,551 operations. The most common orthopaedic operation that trainees performed was removal of metal, with an average of 2.90 (± 3.27) per trainee. The most common orthopaedic operation that trainees assisted with was total hip replacement, with an average of 10.46 (± 6.21) per trainee. Conclusions A centralised web-based logbook provides valuable data to analyse training programme performance. Analysis of logbooks raises concerns about operative experience at junior trainee level. The provision of adequate operative exposure for trainees should be a key performance indicator for training programmes.

  10. Groundwater flow analyses in Japan. 1. Case studies in Hokkaido and Northeast Japan

    International Nuclear Information System (INIS)

    Inaba, Hideo; Maekawa, Keisuke; Koide, Kaoru; Yanagizawa, Koichi

    1995-01-01

    An extensive study program has been carried out to estimate the hydrogeological characteristics of the deep underground in Japan. As a part of this program, groundwater flow analyses in Hokkaido and Northeast Japan were conducted. For the analyses of these areas, hydrogeological models representing the topography, geology and distribution of hydraulic conductivity were developed using available information from the open literature. By use of these models, steady state three-dimensional groundwater flow under a saturated/unsaturated condition was calculated by means of the finite element method. The results are as follows: (1) Distribution of piezometric head corresponds with topography in the study area. (2) Piezometric head distribution is hydrostatic below E.L.-1000m in the study area. (3) Hydraulic gradient in the study area is less than 0.04 below E.L.-500m. (4) Difference of boundary conditions at the shore side of these models does not affect the results of the analyses. (author)

  11. Deconstructing tolerance with clobazam: Post hoc analyses from an open-label extension study.

    Science.gov (United States)

    Gidal, Barry E; Wechsler, Robert T; Sankar, Raman; Montouris, Georgia D; White, H Steve; Cloyd, James C; Kane, Mary Clare; Peng, Guangbin; Tworek, David M; Shen, Vivienne; Isojarvi, Jouko

    2016-10-25

    To evaluate potential development of tolerance to adjunctive clobazam in patients with Lennox-Gastaut syndrome. Eligible patients enrolled in open-label extension study OV-1004, which continued until clobazam was commercially available in the United States or for a maximum of 2 years outside the United States. Enrolled patients started at 0.5 mg·kg⁻¹·d⁻¹ clobazam, not to exceed 40 mg/d. After 48 hours, dosages could be adjusted up to 2.0 mg·kg⁻¹·d⁻¹ (maximum 80 mg/d) on the basis of efficacy and tolerability. Post hoc analyses evaluated mean dosages and drop-seizure rates for the first 2 years of the open-label extension based on responder categories and baseline seizure quartiles in OV-1012. Individual patient listings were reviewed for dosage increases ≥40% and increasing seizure rates. Data from 200 patients were included. For patients free of drop seizures, there was no notable change in dosage over 24 months. For responder groups still exhibiting drop seizures, dosages were increased. Weekly drop-seizure rates for 100% and ≥75% responders demonstrated a consistent response over time. Few patients had a dosage increase ≥40% associated with an increase in seizure rates. Two-year findings suggest that the majority of patients do not develop tolerance to the antiseizure actions of clobazam. Observed dosage increases may reflect best efforts to achieve seizure freedom. It is possible that the clinical development of tolerance to clobazam has been overstated. NCT00518713 and NCT01160770. This study provides Class III evidence that the majority of patients do not develop tolerance to clobazam over 2 years of treatment. © 2016 American Academy of Neurology.

  12. A Systematic Review of Cardiovascular Outcomes-Based Cost-Effectiveness Analyses of Lipid-Lowering Therapies.

    Science.gov (United States)

    Wei, Ching-Yun; Quek, Ruben G W; Villa, Guillermo; Gandra, Shravanthi R; Forbes, Carol A; Ryder, Steve; Armstrong, Nigel; Deshpande, Sohan; Duffy, Steven; Kleijnen, Jos; Lindgren, Peter

    2017-03-01

    Previous reviews have evaluated economic analyses of lipid-lowering therapies using lipid levels as surrogate markers for cardiovascular disease. However, drug approval and health technology assessment agencies have stressed that surrogates should only be used in the absence of clinical endpoints. The aim of this systematic review was to identify and summarise the methodologies, weaknesses and strengths of economic models based on atherosclerotic cardiovascular disease event rates. Cost-effectiveness evaluations of lipid-lowering therapies using cardiovascular event rates in adults with hyperlipidaemia were sought in Medline, Embase, Medline In-Process, PubMed and NHS EED and conference proceedings. Search results were independently screened, extracted and quality checked by two reviewers. Searches until February 2016 retrieved 3443 records, from which 26 studies (29 publications) were selected. Twenty-two studies evaluated secondary prevention (four also assessed primary prevention), two considered only primary prevention and two included mixed primary and secondary prevention populations. Most studies (18) based treatment-effect estimates on single trials, although more recent evaluations deployed meta-analyses (5/10 over the last 10 years). Markov models (14 studies) were most commonly used and only one study employed discrete event simulation. Models varied particularly in terms of health states and treatment-effect duration. No studies used a systematic review to obtain utilities. Most studies took a healthcare perspective (21/26) and sourced resource use from key trials instead of local data. Overall, reporting quality was suboptimal. This review reveals methodological changes over time, but reporting weaknesses remain, particularly with respect to transparency of model reporting.
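
    Because Markov models dominate this literature, a bare-bones three-state cohort model (event-free, post-event, dead) is sketched below; the transition probabilities, costs and utilities are invented placeholders, not values from any reviewed study.

```python
# Sketch: 3-state Markov cohort model for a lipid-lowering therapy evaluation.
# States: 0 = event-free, 1 = post-CV-event, 2 = dead. Annual cycles, toy inputs.
import numpy as np

P = np.array([            # annual transition probabilities (rows sum to 1)
    [0.94, 0.04, 0.02],   # from event-free
    [0.00, 0.90, 0.10],   # from post-event
    [0.00, 0.00, 1.00],   # dead is absorbing
])
cost = np.array([500.0, 3000.0, 0.0])     # annual cost per state
utility = np.array([0.85, 0.70, 0.0])     # annual QALY weight per state
discount = 0.035

cohort = np.array([1.0, 0.0, 0.0])        # everyone starts event-free
total_cost = total_qaly = 0.0
for year in range(1, 41):                 # 40-year horizon
    cohort = cohort @ P
    d = (1 + discount) ** -year
    total_cost += d * cohort @ cost
    total_qaly += d * cohort @ utility

print(f"discounted cost per patient:  {total_cost:,.0f}")
print(f"discounted QALYs per patient: {total_qaly:.2f}")
```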

  13. Subgroup analyses in randomised controlled trials: cohort study on trial protocols and journal publications.

    Science.gov (United States)

    Kasenda, Benjamin; Schandelmaier, Stefan; Sun, Xin; von Elm, Erik; You, John; Blümle, Anette; Tomonaga, Yuki; Saccilotto, Ramon; Amstutz, Alain; Bengough, Theresa; Meerpohl, Joerg J; Stegert, Mihaela; Olu, Kelechi K; Tikkinen, Kari A O; Neumann, Ignacio; Carrasco-Labra, Alonso; Faulhaber, Markus; Mulla, Sohail M; Mertz, Dominik; Akl, Elie A; Bassler, Dirk; Busse, Jason W; Ferreira-González, Ignacio; Lamontagne, Francois; Nordmann, Alain; Gloy, Viktoria; Raatz, Heike; Moja, Lorenzo; Rosenthal, Rachel; Ebrahim, Shanil; Vandvik, Per O; Johnston, Bradley C; Walter, Martin A; Burnand, Bernard; Schwenkglenks, Matthias; Hemkens, Lars G; Bucher, Heiner C; Guyatt, Gordon H; Briel, Matthias

    2014-07-16

    To investigate the planning of subgroup analyses in protocols of randomised controlled trials and the agreement with corresponding full journal publications. Cohort of protocols of randomised controlled trials and subsequent full journal publications. Six research ethics committees in Switzerland, Germany, and Canada. 894 protocols of randomised controlled trials involving patients approved by participating research ethics committees between 2000 and 2003 and 515 subsequent full journal publications. Of 894 protocols of randomised controlled trials, 252 (28.2%) included one or more planned subgroup analyses. Of those, 17 (6.7%) provided a clear hypothesis for at least one subgroup analysis, 10 (4.0%) anticipated the direction of a subgroup effect, and 87 (34.5%) planned a statistical test for interaction. Industry sponsored trials more often planned subgroup analyses compared with investigator sponsored trials (195/551 (35.4%) v 57/343 (16.6%), P<0.001). Of 515 identified journal publications, 246 (47.8%) reported at least one subgroup analysis. In 81 (32.9%) of the 246 publications reporting subgroup analyses, authors stated that subgroup analyses were prespecified, but this was not supported by 28 (34.6%) corresponding protocols. In 86 publications, authors claimed a subgroup effect, but only 36 (41.9%) corresponding protocols reported a planned subgroup analysis. Subgroup analyses are insufficiently described in the protocols of randomised controlled trials submitted to research ethics committees, and investigators rarely specify the anticipated direction of subgroup effects. More than one third of statements in publications of randomised controlled trials about subgroup prespecification had no documentation in the corresponding protocols. Definitive judgments regarding credibility of claimed subgroup effects are not possible without access to protocols and analysis plans of randomised controlled trials. © The DISCO study group 2014.
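
    For readers unfamiliar with the "statistical test for interaction" that protocols should plan, the sketch below fits a logistic model with a treatment-by-subgroup interaction term on simulated trial data using statsmodels; the variable names and effect sizes are hypothetical.

```python
# Sketch: a prespecified subgroup analysis as a treatment-by-subgroup interaction test.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 2000
df = pd.DataFrame({
    "treatment": rng.integers(0, 2, n),    # 1 = experimental arm
    "subgroup": rng.integers(0, 2, n),     # e.g. 1 = prespecified subgroup
})
# Simulated outcome: treatment works better inside the subgroup (true interaction)
logit = -1.0 - 0.3 * df.treatment - 0.6 * df.treatment * df.subgroup
df["event"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = smf.logit("event ~ treatment * subgroup", data=df).fit(disp=False)
print(model.summary().tables[1])
print("interaction p-value:", model.pvalues["treatment:subgroup"])
```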

  14. Parent-based adolescent sexual health interventions and effect on communication outcomes: a systematic review and meta-analyses.

    Science.gov (United States)

    Santa Maria, Diane; Markham, Christine; Bluethmann, Shirley; Mullen, Patricia Dolan

    2015-03-01

    Parent-based adolescent sexual health interventions aim to reduce sexual risk behaviors by bolstering parental protective behaviors. Few studies of theory use, methods, applications, delivery and outcomes of parent-based interventions have been conducted. A systematic search of databases for the period 1998-2013 identified 28 published trials of U.S. parent-based interventions to examine theory use, setting, reach, delivery mode, dose and effects on parent-child communication. Established coding schemes were used to assess use of theory and describe methods employed to achieve behavioral change; intervention effects were explored in meta-analyses. Most interventions were conducted with minority parents in group sessions or via self-paced activities; interventions averaged seven hours, and most used theory extensively. Meta-analyses found improvements in sexual health communication: Analysis of 11 controlled trials indicated a medium effect on increasing communication (Cohen's d, 0.5), and analysis of nine trials found a large effect on increasing parental comfort with communication (0.7); effects were positive regardless of delivery mode or intervention dose. Intervention participants were 68% more likely than controls to report increased communication and 75% more likely to report increased comfort. These findings point to gaps in the range of programs examined in published trials-for example, interventions for parents of sexual minority youth, programs for custodial grandparents and faith-based services. Yet they provide support for the effectiveness of parent-based interventions in improving communication. Innovative delivery approaches could extend programs' reach, and further research on sexual health outcomes would facilitate the meta-analysis of intervention effectiveness in improving adolescent sexual health behaviors. Copyright © 2015 by the Guttmacher Institute.
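
    As a sketch of the pooling behind summary figures like Cohen's d = 0.5, the code below combines per-trial standardized mean differences with inverse-variance weights and reports Cochran's Q and I² heterogeneity; the effect sizes are invented, not the trials analysed in this review.

```python
# Sketch: fixed-effect inverse-variance pooling of standardized mean differences,
# with Cochran's Q and I^2 heterogeneity. The effect sizes are illustrative only.
import numpy as np
from scipy.stats import chi2

d = np.array([0.42, 0.55, 0.31, 0.60, 0.48])      # per-trial Cohen's d
se = np.array([0.12, 0.15, 0.10, 0.20, 0.14])     # standard errors

w = 1.0 / se ** 2
d_pooled = np.sum(w * d) / np.sum(w)
se_pooled = np.sqrt(1.0 / np.sum(w))

Q = np.sum(w * (d - d_pooled) ** 2)               # Cochran's Q
df_q = len(d) - 1
I2 = max(0.0, (Q - df_q) / Q) * 100               # percent heterogeneity

print(f"pooled d = {d_pooled:.2f} "
      f"(95% CI {d_pooled - 1.96 * se_pooled:.2f} to {d_pooled + 1.96 * se_pooled:.2f})")
print(f"Q = {Q:.2f}, p = {chi2.sf(Q, df_q):.3f}, I^2 = {I2:.0f}%")
```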

  15. Analysing Test-Takers’ Views on a Computer-Based Speaking Test

    Directory of Open Access Journals (Sweden)

    Marian Amengual-Pizarro

    2017-11-01

    Full Text Available This study examines test-takers’ views on a computer-delivered speaking test in order to investigate the aspects they consider most relevant in technology-based oral assessment, and to explore the main advantages and disadvantages computer-based tests may offer as compared to face-to-face speaking tests. A small-scale open questionnaire was administered to 80 test-takers who took the APTIS speaking test at the Universidad de Alcalá in April 2016. Results reveal that examinees believe computer-based tests provide a valid measure of oral competence in English and are considered to be an adequate method for the assessment of speaking. Interestingly, the data suggest that personal characteristics of test-takers seem to play a key role in deciding upon the most suitable and reliable delivery mode.

  16. Environmental risk factors of pregnancy outcomes: a summary of recent meta-analyses of epidemiological studies.

    Science.gov (United States)

    Nieuwenhuijsen, Mark J; Dadvand, Payam; Grellier, James; Martinez, David; Vrijheid, Martine

    2013-01-15

    Various epidemiological studies have suggested associations between environmental exposures and pregnancy outcomes. Some studies have attempted to combine information from various epidemiological studies using meta-analysis. We aimed to describe the methodologies used in these recent meta-analyses of environmental exposures and pregnancy outcomes. Furthermore, we aimed to report their main findings. We conducted a bibliographic search with relevant search terms. We obtained and evaluated 16 recent meta-analyses. The number of studies included in each reported meta-analysis varied greatly, with the largest number of studies available for environmental tobacco smoke. Only a small number of the studies reported having followed meta-analysis guidelines or having used a quality rating system. Generally they tested for heterogeneity and publication bias. Publication bias did not occur frequently. The meta-analyses found statistically significant negative associations between environmental tobacco smoke and stillbirth, birth weight and any congenital anomalies; PM2.5 and preterm birth; outdoor air pollution and some congenital anomalies; indoor air pollution from solid fuel use and stillbirth and birth weight; polychlorinated biphenyls (PCB) exposure and birth weight; disinfection by-products in water and stillbirth, small for gestational age and some congenital anomalies; occupational exposure to pesticides and solvents and some congenital anomalies; and Agent Orange and some congenital anomalies. The number of meta-analyses of environmental exposures and pregnancy outcomes is small and they vary in methodology. They reported statistically significant associations between environmental exposures such as environmental tobacco smoke, air pollution and chemicals and pregnancy outcomes.

  17. Environmental risk factors of pregnancy outcomes: a summary of recent meta-analyses of epidemiological studies

    Directory of Open Access Journals (Sweden)

    Nieuwenhuijsen Mark J

    2013-01-01

    Full Text Available Abstract Background Various epidemiological studies have suggested associations between environmental exposures and pregnancy outcomes. Some studies have attempted to combine information from various epidemiological studies using meta-analysis. We aimed to describe the methodologies used in these recent meta-analyses of environmental exposures and pregnancy outcomes. Furthermore, we aimed to report their main findings. Methods We conducted a bibliographic search with relevant search terms. We obtained and evaluated 16 recent meta-analyses. Results The number of studies included in each reported meta-analysis varied greatly, with the largest number of studies available for environmental tobacco smoke. Only a small number of the studies reported having followed meta-analysis guidelines or having used a quality rating system. Generally they tested for heterogeneity and publication bias. Publication bias did not occur frequently. The meta-analyses found statistically significant negative associations between environmental tobacco smoke and stillbirth, birth weight and any congenital anomalies; PM2.5 and preterm birth; outdoor air pollution and some congenital anomalies; indoor air pollution from solid fuel use and stillbirth and birth weight; polychlorinated biphenyls (PCB) exposure and birth weight; disinfection by-products in water and stillbirth, small for gestational age and some congenital anomalies; occupational exposure to pesticides and solvents and some congenital anomalies; and Agent Orange and some congenital anomalies. Conclusions The number of meta-analyses of environmental exposures and pregnancy outcomes is small and they vary in methodology. They reported statistically significant associations between environmental exposures such as environmental tobacco smoke, air pollution and chemicals and pregnancy outcomes.

  18. Methods in pharmacoepidemiology: a review of statistical analyses and data reporting in pediatric drug utilization studies.

    Science.gov (United States)

    Sequi, Marco; Campi, Rita; Clavenna, Antonio; Bonati, Maurizio

    2013-03-01

    To evaluate the quality of data reporting and statistical methods performed in drug utilization studies in the pediatric population. Drug utilization studies evaluating all drug prescriptions to children and adolescents published between January 1994 and December 2011 were retrieved and analyzed. For each study, information on measures of exposure/consumption, the covariates considered, descriptive and inferential analyses, statistical tests, and methods of data reporting was extracted. An overall quality score was created for each study using a 12-item checklist that took into account the presence of outcome measures, covariates of measures, descriptive measures, statistical tests, and graphical representation. A total of 22 studies were reviewed and analyzed. Of these, 20 studies reported at least one descriptive measure. The mean was the most commonly used measure (18 studies), but only five of these also reported the standard deviation. Statistical analyses were performed in 12 studies, with the chi-square test being the most commonly performed test. Graphs were presented in 14 papers. Sixteen papers reported the number of drug prescriptions and/or packages, and ten reported the prevalence of the drug prescription. The mean quality score was 8 (median 9). Only seven of the 22 studies received a score of ≥10, while four studies received a low score. Overall, only a minority of the studies applied statistical methods and reported data in a satisfactory manner. We therefore conclude that the methodology of drug utilization studies needs to be improved.

  19. Comparison based on energy and exergy analyses of the potential cogeneration efficiencies for fuel cells and other electricity generation devices

    Energy Technology Data Exchange (ETDEWEB)

    Rosen, M A [Ryerson Polytechnical Inst., Toronto, (CA). Dept. of Mechanical Engineering

    1990-01-01

    Comparisons of the potential cogeneration efficiencies are made, based on energy and exergy analyses, for several devices for electricity generation. The investigation considers several types of fuel cell system (Phosphoric Acid, Alkaline, Solid Polymer Electrolyte, Molten Carbonate and Solid Oxide), and several fossil-fuel and nuclear cogeneration systems based on steam power plants. In the analysis, each system is modelled as a device for which fuel and air enter, and electrical- and thermal-energy products and material and thermal-energy wastes exit. The results for all systems considered indicate that exergy analyses should be used when analysing the cogeneration potential of systems for electricity generation, because they weigh the usefulnesses of heat and electricity on equivalent bases. Energy analyses tend to present overly optimistic views of performance. These findings are particularly significant when large fractions of the heat output from a system are utilized for cogeneration. (author).
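
    The central point, that energy analysis overvalues low-temperature heat, follows from the Carnot factor: the exergy of a heat stream Q delivered at temperature T against an environment at T0 is Q(1 - T0/T). A small worked example with made-up numbers rather than the paper's plant data:

```python
# Sketch: energy vs exergy accounting for a cogeneration system.
# Exergy of heat delivered at temperature T (K) against environment T0 (K):
#   Ex_Q = Q * (1 - T0 / T)   (Carnot factor)
T0 = 298.15                      # environment temperature, K

electricity = 30.0               # MW of electrical output
heat = 50.0                      # MW of thermal output
T_heat = 393.15                  # heat delivered at 120 degC
fuel_energy = 120.0              # MW fuel input (energy basis)

energy_eff = (electricity + heat) / fuel_energy
exergy_heat = heat * (1.0 - T0 / T_heat)
# Assumption: fuel exergy is roughly its energy content for hydrocarbon fuels.
exergy_eff = (electricity + exergy_heat) / fuel_energy

print(f"energy (first-law) cogeneration efficiency:  {energy_eff:.2f}")
print(f"exergy of the heat stream: {exergy_heat:.1f} MW")
print(f"exergy (second-law) cogeneration efficiency: {exergy_eff:.2f}")
```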

  20. Teleseism-based Relative Time Corrections for Modern Analyses of Digitized Analog Seismograms

    Science.gov (United States)

    Lee, T. A.; Ishii, M.

    2017-12-01

    With modern-day instruments and seismic networks timed by GPS systems, synchronization of data streams is all but a forgone conclusion. However, during the analog era, when each station had its own clock, comparing data timing from different stations was a far more daunting prospect. Today, with recently developed methods by which analog data can be digitized, having the ability to accurately reconcile the timings of two separate stations would open decades worth of data to modern analyses. For example, one possible and exciting application would be using noise interferometry with digitized analog data in order to investigate changing structural features (on a volcano for example) over a much longer timescale than was previously possible. With this in mind, we introduce a new approach to sync time between stations based on teleseismic arrivals. P-wave arrivals are identified at stations for pairs of earthquakes from the digital and analog eras that have nearly identical distances, locations, and depths. Assuming accurate timing of the modern data, relative time corrections between a pair of stations can then be inferred for the analog data. This method for time correction depends upon the analog stations having modern equivalents, and both having sufficiently long durations of operation to allow for recording of usable teleseismic events. The Hawaii Volcano Observatory (HVO) network is an especially ideal environment for this, as it not only has a large and well-preserved collection of analog seismograms, but also has a long operating history (1912 - present) with many of the older stations having modern equivalents. As such, the scope of this project is to calculate and apply relative time corrections to analog data from two HVO stations, HILB (1919-present) and UWE (1928-present)(HILB now part of Pacific Tsunami network). Further application of this method could be for investigation of the effects of relative clock-drift, that is, the determining factor for how
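
    One way to turn paired P-wave records into a relative clock correction is to cross-correlate the two traces around the arrival and take the lag of the correlation peak. The sketch below does this for synthetic waveforms; it is a schematic of the idea, not the authors' processing chain.

```python
# Sketch: estimate a relative time shift between two seismograms by cross-correlation.
import numpy as np

dt = 0.01                                   # sample interval, s (100 Hz)
t = np.arange(0, 20, dt)
wavelet = np.exp(-((t - 10.0) ** 2) / 0.1) * np.sin(2 * np.pi * 2.0 * t)

true_shift = 0.37                           # clock error of the analog station, s
rng = np.random.default_rng(8)
modern = wavelet + 0.05 * rng.normal(size=t.size)
analog = np.interp(t - true_shift, t, wavelet) + 0.05 * rng.normal(size=t.size)

xcorr = np.correlate(analog, modern, mode="full")
lags = (np.arange(xcorr.size) - (t.size - 1)) * dt
print(f"estimated relative time correction: {lags[np.argmax(xcorr)]:.2f} s")
```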

  1. Molecular Characterization of Five Potyviruses Infecting Korean Sweet Potatoes Based on Analyses of Complete Genome Sequences

    Directory of Open Access Journals (Sweden)

    Hae-Ryun Kwak

    2015-12-01

    Full Text Available Sweet potatoes (Ipomoea batatas L.) are grown extensively in tropical and temperate regions and are important food crops worldwide. In Korea, potyviruses, including Sweet potato feathery mottle virus (SPFMV), Sweet potato virus C (SPVC), Sweet potato virus G (SPVG), Sweet potato virus 2 (SPV2), and Sweet potato latent virus (SPLV), have been detected in sweet potato fields at a high (~95%) incidence. In the present work, complete genome sequences of 18 isolates, representing the five potyviruses mentioned above, were compared with previously reported genome sequences. The complete genomes consisted of 10,081 to 10,830 nucleotides, excluding the poly-A tails. Their genomic organizations were typical of the Potyvirus genus, including one large open reading frame coding for a putative polyprotein. Based on phylogenetic analyses and sequence comparisons, the Korean SPFMV isolates belonged to the strains RC and O with >98% nucleotide sequence identity. Korean SPVC isolates had 99% identity to the Japanese isolate SPVC-Bungo and 70% identity to the SPFMV isolates. The Korean SPVG isolates showed 99% identity to the three previously reported SPVG isolates. Korean SPV2 isolates had 97% identity to the SPV2 GWB-2 isolate from the USA. Korean SPLV isolates had a relatively low (88%) nucleotide sequence identity with the Taiwanese SPLV-TW isolates, and they were phylogenetically distantly related to SPFMV isolates. Recombination analysis revealed that possible recombination events occurred in the P1, HC-Pro and NIa-NIb regions of SPFMV and SPLV isolates, and these regions were identified as hotspots for recombination in the sweet potato potyviruses.

  2. Seismic fragility analyses of nuclear power plant structures based on the recorded earthquake data in Korea

    International Nuclear Information System (INIS)

    Joe, Yang Hee; Cho, Sung Gook

    2003-01-01

    This paper briefly introduces an improved method for evaluating seismic fragilities of components of nuclear power plants in Korea. Engineering characteristics of small magnitude earthquake spectra recorded in the Korean peninsula during the last several years are also discussed in this paper. For the purpose of evaluating the effects of the recorded earthquakes on the seismic fragilities of Korean nuclear power plant structures, several cases of comparative studies have been performed. The study results show that seismic fragility analysis in Korea based on Newmark's spectra might over-estimate the seismic capacities of Korean facilities. (author)
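
    For readers outside the field, a component fragility is commonly written as a lognormal curve, P(failure | a) = Φ(ln(a / A_m) / β), with median capacity A_m and composite logarithmic standard deviation β. The short sketch below evaluates such a curve for illustrative parameter values, not values from the paper.

```python
# Sketch: lognormal seismic fragility curve P(failure | peak ground acceleration).
#   P = Phi( ln(a / A_m) / beta )
import numpy as np
from scipy.stats import norm

A_m = 0.9        # median ground-acceleration capacity, g (illustrative)
beta = 0.4       # composite logarithmic standard deviation (illustrative)

pga = np.array([0.1, 0.2, 0.3, 0.5, 0.7, 1.0])      # peak ground acceleration, g
p_fail = norm.cdf(np.log(pga / A_m) / beta)

for a, p in zip(pga, p_fail):
    print(f"PGA = {a:.1f} g  ->  P(failure) = {p:.3f}")
```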

  3. Ventilation/perfusion SPECT/CT in patients with pulmonary emphysema. Evaluation of software-based analysing.

    Science.gov (United States)

    Schreiter, V; Steffen, I; Huebner, H; Bredow, J; Heimann, U; Kroencke, T J; Poellinger, A; Doellinger, F; Buchert, R; Hamm, B; Brenner, W; Schreiter, N F

    2015-01-01

    The purpose of this study was to evaluate the reproducibility of a new software based analysing system for ventilation/perfusion single-photon emission computed tomography/computed tomography (V/P SPECT/CT) in patients with pulmonary emphysema and to compare it to the visual interpretation. 19 patients (mean age: 68.1 years) with pulmonary emphysema who underwent V/P SPECT/CT were included. Data were analysed by two independent observers in visual interpretation (VI) and by software based analysis system (SBAS). SBAS PMOD version 3.4 (Technologies Ltd, Zurich, Switzerland) was used to assess counts and volume per lung lobe/per lung and to calculate the count density per lung, lobe ratio of counts and ratio of count density. VI was performed using a visual scale to assess the mean counts per lung lobe. Interobserver variability and association for SBAS and VI were analysed using Spearman's rho correlation coefficient. Interobserver agreement correlated highly in perfusion (rho: 0.982, 0.957, 0.90, 0.979) and ventilation (rho: 0.972, 0.924, 0.941, 0.936) for count/count density per lobe and ratio of counts/count density in SBAS. Interobserver agreement correlated clearly for perfusion (rho: 0.655) and weakly for ventilation (rho: 0.458) in VI. SBAS provides more reproducible measures than VI for the relative tracer uptake in V/P SPECT/CTs in patients with pulmonary emphysema. However, SBAS has to be improved for routine clinical use.
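
    The interobserver comparison reduces to rank correlation between the two observers' per-lobe measurements; a minimal sketch with scipy's spearmanr on made-up counts:

```python
# Sketch: interobserver agreement as Spearman rank correlation (synthetic counts).
import numpy as np
from scipy.stats import spearmanr

# Hypothetical counts per lung lobe for a few patients, as read by two observers
observer_1 = np.array([152, 340, 275, 198, 410, 120, 365, 289, 205, 395])
observer_2 = np.array([160, 332, 270, 210, 402, 118, 370, 280, 215, 400])

rho, p_value = spearmanr(observer_1, observer_2)
print(f"Spearman rho = {rho:.3f}, p = {p_value:.4f}")
```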

  4. Sediment Characteristics of Mergui Basin, Andaman Sea based on Multi-proxy Analyses

    Directory of Open Access Journals (Sweden)

    Rina Zuraida

    2018-02-01

    Full Text Available This paper presents the characteristics of sediment from core BS-36 (6°55.85’ S and 96°7.48’ E, 1147.1 m water depth) that was acquired in the Mergui Basin, Andaman Sea. The analyses involved megascopic description, core scanning by multi-sensor core logger, and carbonate content measurement. The purpose of this study is to determine the physical and chemical characteristics of the sediment to infer the depositional environment. The results show that this core can be divided into 5 lithologic units that represent various environmental conditions. The bottom part, Units V and IV, was inferred to have been deposited under suboxic to anoxic bottom conditions combined with high productivity and low precipitation. Unit III was deposited during high precipitation and oxic conditions due to ocean ventilation. In the upper part, Units II and I occurred during higher precipitation, higher carbonate production and suboxic to anoxic conditions. Keywords: sediment characteristics, Mergui Basin, Andaman Sea, suboxic, anoxic, oxic, carbonate content

  5. Association between Adult Height and Risk of Colorectal, Lung, and Prostate Cancer: Results from Meta-analyses of Prospective Studies and Mendelian Randomization Analyses

    Science.gov (United States)

    Khankari, Nikhil K.; Shu, Xiao-Ou; Wen, Wanqing; Kraft, Peter; Lindström, Sara; Peters, Ulrike; Schildkraut, Joellen; Schumacher, Fredrick; Bofetta, Paolo; Risch, Angela; Bickeböller, Heike; Amos, Christopher I.; Easton, Douglas; Gruber, Stephen B.; Haiman, Christopher A.; Hunter, David J.; Chanock, Stephen J.; Pierce, Brandon L.; Zheng, Wei

    2016-01-01

    Background Observational studies examining associations between adult height and risk of colorectal, prostate, and lung cancers have generated mixed results. We conducted meta-analyses using data from prospective cohort studies and further carried out Mendelian randomization analyses, using height-associated genetic variants identified in a genome-wide association study (GWAS), to evaluate the association of adult height with these cancers. Methods and Findings A systematic review of prospective studies was conducted using the PubMed, Embase, and Web of Science databases. Using meta-analyses, results obtained from 62 studies were summarized for the association of a 10-cm increase in height with cancer risk. Mendelian randomization analyses were conducted using summary statistics obtained for 423 genetic variants identified from a recent GWAS of adult height and from a cancer genetics consortium study of multiple cancers that included 47,800 cases and 81,353 controls. For a 10-cm increase in height, the summary relative risks derived from the meta-analyses of prospective studies were 1.12 (95% CI 1.10, 1.15), 1.07 (95% CI 1.05, 1.10), and 1.06 (95% CI 1.02, 1.11) for colorectal, prostate, and lung cancers, respectively. Mendelian randomization analyses showed increased risks of colorectal (odds ratio [OR] = 1.58, 95% CI 1.14, 2.18) and lung cancer (OR = 1.10, 95% CI 1.00, 1.22) associated with each 10-cm increase in genetically predicted height. No association was observed for prostate cancer (OR = 1.03, 95% CI 0.92, 1.15). Our meta-analysis was limited to published studies. The sample size for the Mendelian randomization analysis of colorectal cancer was relatively small, thus affecting the precision of the point estimate. Conclusions Our study provides evidence for a potential causal association of adult height with the risk of colorectal and lung cancers and suggests that certain genetic factors and biological pathways affecting adult height may also affect the
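
    The Mendelian randomization estimate in studies of this kind is often obtained with the inverse-variance weighted (IVW) estimator over per-variant Wald ratios; the sketch below shows the calculation on a handful of invented SNP summary statistics, not the 423 variants used in the paper.

```python
# Sketch: inverse-variance weighted Mendelian randomization estimate from
# per-SNP summary statistics (all numbers are invented placeholders).
import numpy as np

beta_exposure = np.array([0.08, 0.05, 0.11, 0.07, 0.09])      # SNP -> height (cm)
beta_outcome = np.array([0.010, 0.004, 0.015, 0.008, 0.011])  # SNP -> log-odds cancer
se_outcome = np.array([0.004, 0.003, 0.005, 0.004, 0.004])

# Wald ratio per variant and its first-order standard error
ratio = beta_outcome / beta_exposure
ratio_se = se_outcome / np.abs(beta_exposure)

w = 1.0 / ratio_se ** 2
beta_ivw = np.sum(w * ratio) / np.sum(w)        # causal log-odds per cm of height
se_ivw = np.sqrt(1.0 / np.sum(w))

print(f"IVW log-OR per cm: {beta_ivw:.4f} +/- {se_ivw:.4f}")
print(f"OR per 10-cm increase: {np.exp(beta_ivw * 10):.2f}")
```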

  6. Study and realization of a beam analyser of high intensity (10⁶ to 10¹⁰)

    International Nuclear Information System (INIS)

    Perret-Gallix, D.

    1975-01-01

    A beam analyser working under high beam intensity in the range of 10⁶ to 10¹⁰ particles per burst and giving the position, profile and intensity of this beam is studied. The reasons for this study, the principle of measurement, the construction of the hardware and the different tests carried out on the chamber in order to evaluate its main features are described. The analyser is a multi-cellular ionisation chamber, or stripe chamber; each cell, made of a copper stripe (0.25 mm wide) inserted between two high-voltage planes (500 V), forms a small independent ionisation chamber. This system, working under the on-line control of a mini-computer, allows the instantaneous position and profile of the beam to be associated with each event or event group [fr]

  7. Trend analyses in the health behaviour in school-aged children study

    DEFF Research Database (Denmark)

    Schnohr, Christina W; Molcho, Michal; Rasmussen, Mette

    2015-01-01

    ... collecting data from adolescents aged 11-15 years on a broad variety of health determinants and health behaviours. RESULTS: A number of methodological challenges have stemmed from the growth of the HBSC-study, in particular given that the study has a focus on monitoring trends. Some of those challenges are considered. When analysing trends, researchers must be able to assess whether a change in prevalence is an expression of an actual change in the observed outcome, whether it is a result of methodological artefacts, or whether it is due to changes in the conceptualization of the outcome by the respondents. CONCLUSION: The article presents recommendations to take a number of these considerations into account. These considerations imply methodological challenges, which are core issues in undertaking trend analyses.

  8. Optimization of a Centrifugal Boiler Circulating Pump's Casing Based on CFD and FEM Analyses

    Directory of Open Access Journals (Sweden)

    Zhigang Zuo

    2014-04-01

    Full Text Available It is important to evaluate the economic efficiency of boiler circulating pumps in manufacturing process from the manufacturers' point of view. The possibility of optimizing the pump casing with respect to structural pressure integrity and hydraulic performance was discussed. CFD analyses of pump models with different pump casing sizes were firstly carried out for the hydraulic performance evaluation. The effects of the working temperature and the sealing ring on the hydraulic efficiency were discussed. A model with casing diameter of 0.875D40 was selected for further analyses. FEM analyses were then carried out on different combinations of casing sizes, casing wall thickness, and materials, to evaluate its safety related to pressure integrity, with respect to both static and fatigue strength analyses. Two models with forging and cast materials were selected as final results.

  9. Study on the atmospheric component with the scope of analyses on the environmental impact

    International Nuclear Information System (INIS)

    Ferrara, V.; La Camera, F.

    1989-03-01

    This work was carried out at the specific request of the Italian National Department for the Environment and presents the technical approaches and methodologies of analysis and forecasting developed for environmental impact studies concerning the 'atmospheric environment'. The work is presented according to the general items and objectives fixed by the same Department within the wider operative system for the application of environmental impact procedures in Italy. (author)

  10. C4P cross-section libraries for safety analyses with SIMMER and related studies

    International Nuclear Information System (INIS)

    Rineiski, A.; Sinitsa, V.; Gabrielli, F.; Maschek, W.

    2011-01-01

    A code and data system, C4P, is under development at KIT. It includes fine-group master libraries and tools for generating problem-oriented cross-section libraries, primarily for safety studies with the SIMMER code and related analyses. In the paper, the 560-group master library and problem-oriented 40-group and 72-group cross-section libraries, for thermal and fast systems, respectively, are described and their performances are investigated. (author)

  11. Deciphering chicken gut microbial dynamics based on high-throughput 16S rRNA metagenomics analyses.

    Science.gov (United States)

    Mohd Shaufi, Mohd Asrore; Sieo, Chin Chin; Chong, Chun Wie; Gan, Han Ming; Ho, Yin Wan

    2015-01-01

    Chicken gut microbiota has paramount roles in host performance, health and immunity. Understanding the topological differences in gut microbial community composition is crucial to provide knowledge on the functions of each member of the microbiota in the physiological maintenance of the host. Gut microbiota profiling of the chicken was previously performed using culture-dependent and early culture-independent methods, which had limited coverage and accuracy. Advances in technology based on next-generation sequencing (NGS) offer unparalleled coverage and depth in determining microbial gut dynamics. Thus, the aim of this study was to investigate ileal and caecal microbiota development as chickens aged, which is important for future effective gut modulation. Ileal and caecal contents of broiler chickens were extracted from 7-, 14-, 21- and 42-day-old chickens. Genomic DNA was then extracted and amplified based on the V3 hyper-variable region of 16S rRNA. Bioinformatics, ecological and statistical analyses such as Principal Coordinate Analysis (PCoA) were performed in mothur software and plotted using PRIMER 6. Additional analyses for predicted metagenomes were performed through the PICRUSt and STAMP software packages based on Greengenes databases. A distinctive difference in bacterial communities was observed between the ilea and caeca as the chickens aged; the microbial communities in the caeca were more diverse than the ileal communities. Potentially pathogenic bacteria such as Clostridium were elevated as the chickens aged, and the population of beneficial microbes such as Lactobacillus was low at all intervals. On the other hand, based on the predicted metagenomes analysed, clear distinctions in the functions and roles of the gut microbiota, such as gene pathways related to nutrient absorption (e.g. sugar and amino acid metabolism) and bacterial proliferation and colonization (e.g. bacterial motility proteins, two-component system and bacterial secretion system), were
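
    The ordination step can be approximated outside mothur with a Bray-Curtis distance matrix and metric multidimensional scaling as a stand-in for PCoA; the sketch uses a tiny synthetic OTU table, not the study's 16S data.

```python
# Sketch: ordination of gut communities from a Bray-Curtis distance matrix.
# Metric MDS stands in here for the PCoA run in mothur; OTU abundances are synthetic.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

rng = np.random.default_rng(9)
# Rows = samples (ileum/caecum at several ages), columns = OTU relative abundances
otu_table = rng.dirichlet(alpha=np.ones(20), size=8)

bray_curtis = squareform(pdist(otu_table, metric="braycurtis"))
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(bray_curtis)

for i, xy in enumerate(coords):
    print(f"sample_{i}: {np.round(xy, 3)}")
```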

  12. Data Analyses and Modelling for Risk Based Monitoring of Mycotoxins in Animal Feed

    Directory of Open Access Journals (Sweden)

    H.J. (Ine) van der Fels-Klerx

    2018-01-01

    Full Text Available Following legislation, European Member States should have multi-annual control programs for contaminants, such as for mycotoxins, in feed and food. These programs need to be risk based implying the checks are regular and proportional to the estimated risk for animal and human health. This study aimed to prioritize feed products in the Netherlands for deoxynivalenol and aflatoxin B1 monitoring. Historical mycotoxin monitoring results from the period 2007–2016 were combined with data from other sources. Based on occurrence, groundnuts had high priority for aflatoxin B1 monitoring; some feed materials (maize and maize products and several oil seed products) and complete/complementary feed excluding dairy cattle and young animals had medium priority; and all other animal feeds and feed materials had low priority. For deoxynivalenol, maize by-products had a high priority, complete and complementary feed for pigs had a medium priority and all other feed and feed materials a low priority. Also including health consequence estimations showed that feed materials that ranked highest for aflatoxin B1 included sunflower seed and palmkernel expeller/extracts and maize. For deoxynivalenol, maize products were ranked highest, followed by various small grain cereals (products); all other feed materials were of lower concern. Results of this study have proven to be useful in setting up the annual risk based control program for mycotoxins in animal feed and feed materials.

  13. Data Analyses and Modelling for Risk Based Monitoring of Mycotoxins in Animal Feed

    Science.gov (United States)

    van der Fels-Klerx, H.J. (Ine); Adamse, Paulien; Punt, Ans; van Asselt, Esther D.

    2018-01-01

    Following legislation, European Member States should have multi-annual control programs for contaminants, such as for mycotoxins, in feed and food. These programs need to be risk based implying the checks are regular and proportional to the estimated risk for animal and human health. This study aimed to prioritize feed products in the Netherlands for deoxynivalenol and aflatoxin B1 monitoring. Historical mycotoxin monitoring results from the period 2007–2016 were combined with data from other sources. Based on occurrence, groundnuts had high priority for aflatoxin B1 monitoring; some feed materials (maize and maize products and several oil seed products) and complete/complementary feed excluding dairy cattle and young animals had medium priority; and all other animal feeds and feed materials had low priority. For deoxynivalenol, maize by-products had a high priority, complete and complementary feed for pigs had a medium priority and all other feed and feed materials a low priority. Also including health consequence estimations showed that feed materials that ranked highest for aflatoxin B1 included sunflower seed and palmkernel expeller/extracts and maize. For deoxynivalenol, maize products were ranked highest, followed by various small grain cereals (products); all other feed materials were of lower concern. Results of this study have proven to be useful in setting up the annual risk based control program for mycotoxins in animal feed and feed materials. PMID:29373559

  14. Exergy and energy analyses of two different types of PCM based thermal management systems for space air conditioning applications

    International Nuclear Information System (INIS)

    Tyagi, V.V.; Pandey, A.K.; Buddhi, D.; Tyagi, S.K.

    2013-01-01

    Highlights: ► Calcium chloride hexahydrate (CaCl2·6H2O) was used as the PCM in this study. ► Two different encapsulated systems (HDPE based panels and balls) were designed. ► The results for CaCl2·6H2O are very attractive for space air conditioning. ► Energy and exergy analyses were performed for space cooling applications. - Abstract: This communication presents an experimental study of PCM based thermal management systems for space heating and cooling applications using energy and exergy analysis. Two different types of PCM based thermal management system (TMS-I and TMS-II) using calcium chloride hexahydrate as the heat carrier have been designed, fabricated and studied for space heating and cooling applications in a typical climatic zone in India. In the first experimental arrangement, the charging of the PCM was carried out with an air conditioning system while discharging was carried out using an electric heater for both thermal management systems. In the second arrangement, the charging of the PCM was carried out by solar energy and the discharging by circulating the cooler ambient air during the night. In the first experiment, TMS-I was found to be more effective than TMS-II, whereas the reverse was observed in the second experiment for both the charging and discharging processes, in terms of both energetic and exergetic performance.

  15. 14C-analyses of calcite coatings in open fractures from the Klipperaas study site, Southern Sweden

    International Nuclear Information System (INIS)

    Possnert, G.; Tullborg, E.L.

    1989-11-01

    Carbonate samples from open fractures in crystalline rock from the Klipperaas study site have been analysed for their 14C contents using accelerator mass spectrometry. This technique makes it possible to analyse very small carbonate samples (c. 1 mg C). The analyses show low but varying contents of 14C. However, contamination by CO2 has taken place, affecting small samples more than others. Attempts have been made to quantify the contamination and thus evaluate the analyses of the fracture samples. The low 14C values obtained can be due to: 1. an effective retention of 14C by sorption/fractionation forcing 14C onto the calcite surfaces in the near-surface zone, which means that the 14C contribution to the deeper levels is diminished, or 2. a very shallow penetration depth of surface groundwater. The former is suggested as more probable based on evaluations of the hydrochemical conditions and the fracture mineral studies. (10 figs., 3 tabs., 9 refs.) (authors)

  16. Non-localization and localization ROC analyses using clinically based scoring

    Science.gov (United States)

    Paquerault, Sophie; Samuelson, Frank W.; Myers, Kyle J.; Smith, Robert C.

    2009-02-01

    We are investigating the potential for differences in study conclusions when assessing the estimated impact of a computer-aided detection (CAD) system on readers' performance. The data utilized in this investigation were derived from a multi-reader multi-case observer study involving one hundred mammographic background images to which fixed-size and fixed-intensity Gaussian signals were added, generating low- and high-intensity signal sets. The study setting allowed CAD assessment in two situations: when CAD sensitivity was 1) superior or 2) lower than the average reader. Seven readers were asked to review each set in the unaided and CAD-aided reading modes, and to mark and rate their findings. Using these data, we studied the effect on study conclusions of three clinically based receiver operating characteristic (ROC) scoring definitions. These scoring definitions included both location-specific and non-location-specific rules. The results showed agreement in the estimated impact of CAD on overall reader performance. In the study setting where CAD sensitivity is superior to the average reader, the mean difference in AUC between the CAD-aided and unaided reads was 0.049 (95% CI: -0.027, 0.130) for the image scoring definition based on non-location-specific rules, and 0.104 (95% CI: 0.036, 0.174) and 0.090 (95% CI: 0.031, 0.155) for the image scoring definitions based on location-specific rules. The increases in AUC were statistically significant for the location-specific scoring definitions. It was further observed that the variance of these estimates was reduced when using the location-specific scoring definitions compared with the non-location-specific scoring definition. In the study setting where CAD sensitivity is equivalent to or lower than the average reader, the mean differences in AUC are slightly above 0.01 for all image scoring definitions. These increases in AUC were not statistically significant for any of the image scoring definitions

  17. Simultaneous acquisition of dual analyser-based phase contrast X-ray images for small animal imaging

    International Nuclear Information System (INIS)

    Kitchen, Marcus J.; Pavlov, Konstantin M.; Hooper, Stuart B.; Vine, David J.; Siu, Karen K.W.; Wallace, Megan J.; Siew, Melissa L.L.; Yagi, Naoto; Uesugi, Kentaro; Lewis, Rob A.

    2008-01-01

    Analyser-based phase contrast X-ray imaging can provide high-contrast images of biological tissues with exquisite sensitivity to the boundaries between tissues. The phase and absorption information can be extracted by processing multiple images acquired at different analyser orientations. Recording both the transmitted and diffracted beams from a thin Laue analyser crystal can make phase retrieval possible for dynamic systems by allowing full field imaging. This technique was used to image the thorax of a mechanically ventilated newborn rabbit pup using a 25 keV beam from the SPring-8 synchrotron radiation facility. The diffracted image was produced from the (1 1 1) planes of a 50 mm x 40 mm, 100 μm thick Si analyser crystal in the Laue geometry. The beam and analyser were large enough to image the entire chest, making it possible to observe changes in anatomy with high contrast and spatial resolution

  18. Simultaneous acquisition of dual analyser-based phase contrast X-ray images for small animal imaging

    Energy Technology Data Exchange (ETDEWEB)

    Kitchen, Marcus J. [School of Physics, Monash University, Victoria 3800 (Australia)], E-mail: Marcus.Kitchen@sci.monash.edu.au; Pavlov, Konstantin M. [School of Physics, Monash University, Victoria 3800 (Australia); Monash Centre for Synchrotron Science, Monash University, Victoria 3800 (Australia); Physics and Electronics, School of Science and Technology, University of New England, NSW 2351 (Australia)], E-mail: Konstantin.Pavlov@sci.monash.edu.au; Hooper, Stuart B. [Department of Physiology, Monash University, Victoria 3800 (Australia)], E-mail: Stuart.Hooper@med.monash.edu.au; Vine, David J. [School of Physics, Monash University, Victoria 3800 (Australia)], E-mail: David.Vine@sci.monash.edu.au; Siu, Karen K.W. [School of Physics, Monash University, Victoria 3800 (Australia); Monash Centre for Synchrotron Science, Monash University, Victoria 3800 (Australia)], E-mail: Karen.Siu@sci.monash.edu.au; Wallace, Megan J. [Department of Physiology, Monash University, Victoria 3800 (Australia)], E-mail: Megan.Wallace@med.monash.edu.au; Siew, Melissa L.L. [Department of Physiology, Monash University, Victoria 3800 (Australia)], E-mail: Melissa.Siew@med.monash.edu.au; Yagi, Naoto [SPring-8/JASRI, Sayo (Japan)], E-mail: yagi@spring8.or.jp; Uesugi, Kentaro [SPring-8/JASRI, Sayo (Japan)], E-mail: ueken@spring8.or.jp; Lewis, Rob A. [School of Physics, Monash University, Victoria 3800 (Australia); Monash Centre for Synchrotron Science, Monash University, Victoria 3800 (Australia)], E-mail: Rob.Lewis@sync.monash.edu.au

    2008-12-15

    Analyser-based phase contrast X-ray imaging can provide high-contrast images of biological tissues with exquisite sensitivity to the boundaries between tissues. The phase and absorption information can be extracted by processing multiple images acquired at different analyser orientations. Recording both the transmitted and diffracted beams from a thin Laue analyser crystal can make phase retrieval possible for dynamic systems by allowing full field imaging. This technique was used to image the thorax of a mechanically ventilated newborn rabbit pup using a 25 keV beam from the SPring-8 synchrotron radiation facility. The diffracted image was produced from the (1 1 1) planes of a 50 mm x 40 mm, 100 μm thick Si analyser crystal in the Laue geometry. The beam and analyser were large enough to image the entire chest, making it possible to observe changes in anatomy with high contrast and spatial resolution.

  19. Assessing an organizational culture instrument based on the Competing Values Framework: Exploratory and confirmatory factor analyses

    Science.gov (United States)

    Helfrich, Christian D; Li, Yu-Fang; Mohr, David C; Meterko, Mark; Sales, Anne E

    2007-01-01

    Background The Competing Values Framework (CVF) has been widely used in health services research to assess organizational culture as a predictor of quality improvement implementation, employee and patient satisfaction, and team functioning, among other outcomes. CVF instruments generally are presented as well-validated with reliable aggregated subscales. However, only one study in the health sector has been conducted for the express purpose of validation, and that study population was limited to hospital managers from a single geographic locale. Methods We used exploratory and confirmatory factor analyses to examine the underlying structure of data from a CVF instrument. We analyzed cross-sectional data from a work environment survey conducted in the Veterans Health Administration (VHA). The study population comprised all staff in non-supervisory positions. The survey included 14 items adapted from a popular CVF instrument, which measures organizational culture according to four subscales: hierarchical, entrepreneurial, team, and rational. Results Data from 71,776 non-supervisory employees (approximate response rate 51%) from 168 VHA facilities were used in this analysis. Internal consistency of the subscales was moderate to strong (α = 0.68 to 0.85). However, the entrepreneurial, team, and rational subscales had higher correlations across subscales than within, indicating poor divergent properties. Exploratory factor analysis revealed two factors, comprising the ten items from the entrepreneurial, team, and rational subscales loading on the first factor, and two items from the hierarchical subscale loading on the second factor, along with one item from the rational subscale that cross-loaded on both factors. Results from confirmatory factor analysis suggested that the two-subscale solution provides a more parsimonious fit to the data as compared to the original four-subscale model. Conclusion This study suggests that there may be problems applying conventional

  20. Assessing the validity of road safety evaluation studies by analysing causal chains.

    Science.gov (United States)

    Elvik, Rune

    2003-09-01

    This paper discusses how the validity of road safety evaluation studies can be assessed by analysing causal chains. A causal chain denotes the path through which a road safety measure influences the number of accidents. Two cases are examined. One involves chemical de-icing of roads (salting). The intended causal chain of this measure is: spread of salt --> removal of snow and ice from the road surface --> improved friction --> shorter stopping distance --> fewer accidents. A Norwegian study that evaluated the effects of salting on accident rate provides information that describes this causal chain. This information indicates that the study overestimated the effect of salting on accident rate, and suggests that this estimate is influenced by confounding variables the study did not control for. The other case involves a traffic club for children. The intended causal chain in this study was: join the club --> improve knowledge --> improve behaviour --> reduce accident rate. In this case, results are rather messy, which suggests that the observed difference in accident rate between members and non-members of the traffic club is not primarily attributable to membership in the club. The two cases show that by analysing causal chains, one may uncover confounding factors that were not adequately controlled in a study. Lack of control for confounding factors remains the most serious threat to the validity of road safety evaluation studies.

  1. Statistical analyses of incidents on onshore gas transmission pipelines based on PHMSA database

    International Nuclear Information System (INIS)

    Lam, Chio; Zhou, Wenxing

    2016-01-01

    This article reports statistical analyses of the mileage and pipe-related incident data for onshore gas transmission pipelines in the US between 2002 and 2013, collected by the Pipeline and Hazardous Materials Safety Administration (PHMSA) of the US Department of Transportation. The analysis indicates that there are approximately 480,000 km of gas transmission pipelines in the US, approximately 60% of which were more than 45 years old as of 2013. Eighty percent of the pipelines are Class 1 pipelines, and about 20% are Class 2 and 3 pipelines. Third-party excavation, external corrosion, material failure and internal corrosion are found to be the four leading failure causes, responsible for more than 75% of the total incidents. The 12-year average rupture rate equals 3.1 × 10⁻⁵ per km-year for all failure causes combined. External corrosion is the leading cause of ruptures: the 12-year average rupture rate due to external corrosion equals 1.0 × 10⁻⁵ per km-year and is twice the rupture rate due to third-party excavation or material failure. The study provides insights into the current state of gas transmission pipelines in the US and baseline failure statistics for quantitative risk assessments of such pipelines. - Highlights: • Analyze PHMSA pipeline mileage and incident data between 2002 and 2013. • Focus on gas transmission pipelines. • Leading causes for pipeline failures are identified. • Provide baseline failure statistics for risk assessments of gas transmission pipelines.
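
    The baseline failure statistics reported above are exposure-normalised rates. The sketch below illustrates how such a rate per km-year and a simple Poisson-based interval can be computed; the mileage echoes the abstract, while the rupture count and interval method are illustrative assumptions rather than the authors' procedure.

```python
# Minimal sketch: estimate a baseline rupture rate (per km-year) and a simple
# Poisson-based 95% interval from exposure and incident counts.
# The mileage figure echoes the abstract; the rupture count is hypothetical.
import math

mileage_km = 480_000          # onshore gas transmission pipelines (approx., from the abstract)
years = 12                    # 2002-2013 observation window
exposure = mileage_km * years # km-years

ruptures = 180                # hypothetical count of ruptures over the window
rate = ruptures / exposure
# Normal approximation to the Poisson interval (adequate for large counts)
half_width = 1.96 * math.sqrt(ruptures) / exposure
print(f"rupture rate = {rate:.2e} per km-year "
      f"(95% CI {rate - half_width:.2e} to {rate + half_width:.2e})")
```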

  2. White matter disruption in moderate/severe pediatric traumatic brain injury: Advanced tract-based analyses

    Directory of Open Access Journals (Sweden)

    Emily L. Dennis

    2015-01-01

    Full Text Available Traumatic brain injury (TBI) is the leading cause of death and disability in children and can lead to a wide range of impairments. Brain imaging methods such as DTI (diffusion tensor imaging) are uniquely sensitive to the white matter (WM) damage that is common in TBI. However, higher-level analyses using tractography are complicated by the damage and decreased FA (fractional anisotropy) characteristic of TBI, which can result in premature tract endings. We used the newly developed autoMATE (automated multi-atlas tract extraction) method to identify differences in WM integrity. 63 pediatric patients aged 8–19 years with moderate/severe TBI were examined with cross-sectional scanning at one or two time points after injury: a post-acute assessment 1–5 months post-injury and a chronic assessment 13–19 months post-injury. A battery of cognitive function tests was performed in the same time periods. 56 children were examined in the first phase, 28 TBI patients and 28 healthy controls. In the second phase 34 children were studied, 17 TBI patients and 17 controls (27 participants completed both post-acute and chronic phases). We did not find any significant group differences in the post-acute phase. Chronically, we found extensive group differences, mainly for mean and radial diffusivity (MD and RD). In the chronic phase, we found higher MD and RD across a wide range of WM. Additionally, we found correlations between these WM integrity measures and cognitive deficits. This suggests a distributed pattern of WM disruption that continues over the first year following a TBI in children.

  3. SeeSway - A free web-based system for analysing and exploring standing balance data.

    Science.gov (United States)

    Clark, Ross A; Pua, Yong-Hao

    2018-06-01

    Computerised posturography can be used to assess standing balance, and can predict poor functional outcomes in many clinical populations. A key limitation is the disparate signal filtering and analysis techniques, with many methods requiring custom computer programs. This paper discusses the creation of a freely available web-based software program, SeeSway (www.rehabtools.org/seesway), which was designed to provide powerful tools for pre-processing, analysing and visualising standing balance data in an easy to use and platform independent website. SeeSway links an interactive web platform with file upload capability to software systems including LabVIEW, Matlab, Python and R to perform the data filtering, analysis and visualisation of standing balance data. Input data can consist of any signal that comprises an anterior-posterior and medial-lateral coordinate trace such as center of pressure or mass displacement. This allows it to be used with systems including criterion reference commercial force platforms and three dimensional motion analysis, smartphones, accelerometers and low-cost technology such as Nintendo Wii Balance Board and Microsoft Kinect. Filtering options include Butterworth, weighted and unweighted moving average, and discrete wavelet transforms. Analysis methods include standard techniques such as path length, amplitude, and root mean square in addition to less common but potentially promising methods such as sample entropy, detrended fluctuation analysis and multiresolution wavelet analysis. These data are visualised using scalograms, which chart the change in frequency content over time, scatterplots and standard line charts. This provides the user with a detailed understanding of their results, and how their different pre-processing and analysis method selections affect their findings. An example of the data analysis techniques is provided in the paper, with graphical representation of how advanced analysis methods can better discriminate
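
    As an illustration of the kind of pre-processing and summary metrics described above (low-pass filtering, path length, amplitude), here is a minimal sketch for an anterior-posterior/medial-lateral centre-of-pressure trace. The sampling rate, filter order and cut-off are assumptions, not SeeSway defaults, and the input signal is synthetic.

```python
# Minimal sketch of typical standing-balance processing: low-pass filter an
# anterior-posterior / medial-lateral centre-of-pressure trace, then compute
# path length and RMS amplitude. Sampling rate, filter order and cut-off are
# assumptions, not the SeeSway defaults; the trace is synthetic.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 100.0                                   # sampling rate (Hz), assumed
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(0)
ap = 0.5 * np.sin(2 * np.pi * 0.3 * t) + 0.1 * rng.standard_normal(t.size)
ml = 0.3 * np.sin(2 * np.pi * 0.5 * t) + 0.1 * rng.standard_normal(t.size)

b, a = butter(4, 10.0 / (fs / 2), btype="low")   # 4th-order, 10 Hz cut-off (assumed)
ap_f, ml_f = filtfilt(b, a, ap), filtfilt(b, a, ml)

path_length = np.sum(np.hypot(np.diff(ap_f), np.diff(ml_f)))
rms_ap = np.sqrt(np.mean((ap_f - ap_f.mean()) ** 2))
rms_ml = np.sqrt(np.mean((ml_f - ml_f.mean()) ** 2))
print(f"path length = {path_length:.2f}, RMS AP = {rms_ap:.2f}, "
      f"RMS ML = {rms_ml:.2f} (arbitrary units)")
```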

  4. Importance of frequency dependent magnetoresistance measurements in analysing the intrinsicality of magnetodielectric effect: A case study

    Science.gov (United States)

    Rai, Hari Mohan; Saxena, Shailendra K.; Mishra, Vikash; Kumar, Rajesh; Sagdeo, P. R.

    2017-08-01

    Magnetodielectric (MD) materials have attracted considerable attention due to their intriguing physics and potential future applications. However, the intrinsicality of the MD effect is always a major concern in such materials as the MD effect may arise also due to the MR (magnetoresistance) effect. In the present case study, we report an experimental approach to analyse and separate the intrinsic and MR dominated contributions of the MD phenomenon. For this purpose, polycrystalline samples of LaGa1-xAxO3 (A = Mn/Fe) have been prepared by solid state reaction method. The purity of their structural phase (orthorhombic) has been validated by refining the X-ray diffraction data. The RTMD (room temperature MD) response has been recorded over a frequency range of 20 Hz to 10 MHz. In order to analyse the intrinsicality of the MD effect, FDMR (frequency dependent MR) by means of IS (impedance spectroscopy) and dc MR measurements in four probe geometry have been carried out at RT. A significant RTMD effect has been observed in selected Mn/Fe doped LaGaO3 (LGO) compositions. The mechanism of MR free/intrinsic MD effect, observed in Mn/Fe doped LGO, has been understood speculatively in terms of modified cell volume associated with the reorientation/retransformation of spin-coupled Mn/Fe orbitals due to the application of magnetic field. The present analysis suggests that in order to justify the intrinsic/resistive origin of the MD phenomenon, FDMR measurements are more useful than measuring only dc MR or analysing the trends of magnetic field dependent change in the dielectric constant and tanδ. On the basis of the present case study, we propose that IS (FDMR) alone can be used as an effective experimental tool to detect and analyse the resistive and intrinsic parts contributing to the MD phenomenon.

  5. Failure probability analyses for PWSCC in Ni-based alloy welds

    International Nuclear Information System (INIS)

    Udagawa, Makoto; Katsuyama, Jinya; Onizawa, Kunio; Li, Yinsheng

    2015-01-01

    A number of cracks due to primary water stress corrosion cracking (PWSCC) in pressurized water reactors and Ni-based alloy stress corrosion cracking (NiSCC) in boiling water reactors have been detected around Ni-based alloy welds. The causes of crack initiation and growth due to stress corrosion cracking include weld residual stress, operating stress, the materials, and the environment. We have developed the analysis code PASCAL-NP for calculating the failure probability and assessment of the structural integrity of cracked components on the basis of probabilistic fracture mechanics (PFM) considering PWSCC and NiSCC. This PFM analysis code has functions for calculating the incubation time of PWSCC and NiSCC crack initiation, evaluation of crack growth behavior considering certain crack location and orientation patterns, and evaluation of failure behavior near Ni-based alloy welds due to PWSCC and NiSCC in a probabilistic manner. Herein, actual plants affected by PWSCC have been analyzed using PASCAL-NP. Failure probabilities calculated by PASCAL-NP are in reasonable agreement with the detection data. Furthermore, useful knowledge related to leakage due to PWSCC was obtained through parametric studies using this code

  6. Financial and Performance Analyses of Microcontroller Based Solar-Powered Autorickshaw for a Developing Country

    Directory of Open Access Journals (Sweden)

    Abu Raihan Mohammad Siddique

    2016-01-01

    Full Text Available This paper presents a case study examining the economic viability and performance of a microcontroller based solar-powered battery operated autorickshaw (m-SBAR) for developing countries, which is compared with different types of rickshaws such as the pedal rickshaw (PR), battery operated autorickshaw (BAR), and solar-powered battery operated autorickshaw (SBAR) available in Bangladesh. The BAR consists of a rickshaw structure, a battery bank, a battery charge controller, a DC motor driver, and a DC motor, whereas the proposed m-SBAR contains additional components such as a solar panel and a microcontroller based DC motor driver. The complete design considered the local radiation data and load profile of the proposed m-SBAR. The Levelized Cost of Energy (LCOE) analysis, Net Present Worth, payback period, and Benefit-to-Cost Ratio methods have been used to evaluate the financial feasibility and sensitivity of the m-SBAR, grid-powered BAR, and PR. The numerical analysis reveals that the LCOE and Benefit-to-Cost Ratio of the proposed m-SBAR are lower compared to the grid-powered BAR. It has also been found that the microcontroller based DC motor control circuit reduces the battery discharge rate, improves battery life, and controls motor speed efficiently.
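
    A minimal sketch of the financial indicators named in the abstract (LCOE, Net Present Worth, payback period, Benefit-to-Cost Ratio) is shown below. Every cost, lifetime and discount-rate figure is a hypothetical placeholder, not data from the case study.

```python
# Minimal sketch of the financial indicators used in the abstract (LCOE, NPV,
# simple payback, benefit-to-cost ratio). Every number is a hypothetical placeholder.

capital_cost = 900.0        # USD, up-front cost of the solar/battery/controller kit
annual_om = 40.0            # USD/year operation and maintenance
annual_energy_kwh = 1100.0  # kWh/year delivered to the motor
annual_benefit = 350.0      # USD/year fare income minus running costs
discount_rate = 0.08
lifetime_years = 10

def present_value(amount_per_year, rate, years):
    """Present value of a constant annual cash flow."""
    return sum(amount_per_year / (1 + rate) ** n for n in range(1, years + 1))

pv_costs = capital_cost + present_value(annual_om, discount_rate, lifetime_years)
pv_energy = present_value(annual_energy_kwh, discount_rate, lifetime_years)
pv_benefits = present_value(annual_benefit, discount_rate, lifetime_years)

lcoe = pv_costs / pv_energy                             # USD per kWh
npv = pv_benefits - pv_costs
bcr = pv_benefits / pv_costs
payback = capital_cost / (annual_benefit - annual_om)   # simple, undiscounted

print(f"LCOE = {lcoe:.3f} USD/kWh, NPV = {npv:.0f} USD, "
      f"BCR = {bcr:.2f}, payback = {payback:.1f} years")
```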

  7. Augmentation of French grunt diet description using combined visual and DNA-based analyses

    Science.gov (United States)

    Hargrove, John S.; Parkyn, Daryl C.; Murie, Debra J.; Demopoulos, Amanda W.J.; Austin, James D.

    2012-01-01

    Trophic linkages within a coral-reef ecosystem may be difficult to discern in fish species that reside on, but do not forage on, coral reefs. Furthermore, dietary analysis of fish can be difficult in situations where prey is thoroughly macerated, resulting in many visually unrecognisable food items. The present study examined whether the inclusion of a DNA-based method could improve the identification of prey consumed by French grunt, Haemulon flavolineatum, a reef fish that possesses pharyngeal teeth and forages on soft-bodied prey items. Visual analysis indicated that crustaceans were most abundant numerically (38.9%), followed by sipunculans (31.0%) and polychaete worms (5.2%), with a substantial number of unidentified prey (12.7%). For the subset of prey with both visual and molecular data, there was a marked reduction in the number of unidentified sipunculans (visual – 31.1%, combined – 4.4%), unidentified crustaceans (visual – 15.6%, combined – 6.7%), and unidentified taxa (visual – 11.1%, combined – 0.0%). Utilising results from both methodologies resulted in an increased number of prey placed at the family level (visual – 6, combined – 33) and species level (visual – 0, combined – 4). Although more costly than visual analysis alone, our study demonstrated the feasibility of DNA-based identification of visually unidentifiable prey in the stomach contents of fish.

  8. Ongoing Analyses of Rocket Based Combined Cycle Engines by the Applied Fluid Dynamics Analysis Group at Marshall Space Flight Center

    Science.gov (United States)

    Ruf, Joseph H.; Holt, James B.; Canabal, Francisco

    2001-01-01

    This paper presents the status of analyses on three Rocket Based Combined Cycle (RBCC) configurations underway in the Applied Fluid Dynamics Analysis Group (TD64). TD64 is performing computational fluid dynamics (CFD) analysis on a Penn State RBCC test rig, the proposed Draco axisymmetric RBCC engine and the Trailblazer engine. The intent of the analysis on the Penn State test rig is to benchmark the Finite Difference Navier Stokes (FDNS) code for ejector mode fluid dynamics. The Draco analysis was a trade study to determine the ejector mode performance as a function of three engine design variables. The Trailblazer analysis is to evaluate the nozzle performance in scramjet mode. Results to date of each analysis are presented.

  9. Pseudogenes and DNA-based diet analyses: A cautionary tale from a relatively well sampled predator-prey system

    DEFF Research Database (Denmark)

    Dunshea, G.; Barros, N. B.; Wells, R. S.

    2008-01-01

    Mitochondrial ribosomal DNA is commonly used in DNA-based dietary analyses. In such studies, these sequences are generally assumed to be the only version present in DNA of the organism of interest. However, nuclear pseudogenes that display variable similarity to the mitochondrial versions are common in many taxa. The presence of nuclear pseudogenes that co-amplify with their mitochondrial paralogues can lead to several possible confounding interpretations when applied to estimating animal diet. Here, we investigate the occurrence of nuclear pseudogenes in fecal samples taken from bottlenose dolphins (Tursiops truncatus) that were assayed for prey DNA with a universal primer technique. We found pseudogenes in 13 of 15 samples and 1-5 pseudogene haplotypes per sample representing 5-100% of all amplicons produced. The proportion of amplicons that were pseudogenes and the diversity of prey DNA...

  10. The occurrence of Toxocara malaysiensis in cats in China, confirmed by sequence-based analyses of ribosomal DNA.

    Science.gov (United States)

    Li, Ming-Wei; Zhu, Xing-Quan; Gasser, Robin B; Lin, Rui-Qing; Sani, Rehana A; Lun, Zhao-Rong; Jacobs, Dennis E

    2006-10-01

    Non-isotopic polymerase chain reaction (PCR)-based single-strand conformation polymorphism and sequence analyses of the second internal transcribed spacer (ITS-2) of nuclear ribosomal DNA (rDNA) were utilized to genetically characterise ascaridoids from dogs and cats from China by comparison with those from other countries. The study showed that Toxocara canis, Toxocara cati, and Toxascaris leonina from China were genetically the same as those from other geographical origins. Specimens from cats from Guangzhou, China, which were morphologically consistent with Toxocara malaysiensis, were the same genetically as those from Malaysia, with the exception of a polymorphism in the ITS-2 but no unequivocal sequence difference. This is the first report of T. malaysiensis in cats outside of Malaysia (from where it was originally described), supporting the proposal that this species has a broader geographical distribution. The molecular approach employed provides a powerful tool for elucidating the biology, epidemiology, and zoonotic significance of T. malaysiensis.

  11. Comparison study of inelastic analyses for high temperature structure subjected to cyclic creep loading

    International Nuclear Information System (INIS)

    Kim, J. B.; Lee, H. Y.; Lee, J. H.

    2002-01-01

    It is necessary to develop a reliable numerical analysis method to simulate the plasticity and creep behavior of LMR high temperature structures. Since general purpose finite element analysis codes such as ABAQUS and ANSYS provide various models for plastic hardening and creep equations of the Norton power-law type, it is possible to perform separate viscoplasticity analyses. In this study, the high temperature structural analysis program NONSTA-VP, which implements Chaboche's unified viscoplasticity equation in ABAQUS, has been developed, and the viscoplastic response of a 316 SS plate with a circular hole subjected to cyclic creep loading has been analyzed. The results of the separate viscoplasticity analyses and the unified viscoplasticity analysis using NONSTA-VP have been compared, and the results from NONSTA-VP show pronounced stress relaxation and creep behavior during hold time compared to those from the separate viscoplasticity analyses. Also, the conservatism arising from using an elastic approach for creep-fatigue damage analysis is expected to be reduced, since the stress and strain ranges from the unified viscoplasticity analysis are greatly reduced compared to those from the separate viscoplasticity analyses and the elastic analysis.

  12. Cycle O(CY1991) NLS trade studies and analyses report. Book 2, part 2: Propulsion

    Science.gov (United States)

    Cronin, R.; Werner, M.; Bonson, S.; Spring, R.; Houston, R.

    1992-01-01

    This report documents the propulsion system tasks performed in support of the National Launch System (NLS) Cycle O preliminary design activities. The report includes trades and analyses covering the following subjects: (1) Maximum Tank Stretch Study; (2) No LOX Bleed Performance Analysis; (3) LOX Bleed Trade Study; (4) LO2 Tank Pressure Limits; (5) LOX Tank Pressurization System Using Helium; (6) Space Transportation Main Engine (STME) Heat Exchanger Performance; (7) LH2 Passive Recirculation Performance Analysis; (8) LH2 Bleed/Recirculation Study; (9) LH2 Tank Pressure Limits; and (10) LH2 Pressurization System. For each trade study an executive summary and a detailed trade study are provided. For the convenience of the reader, a separate section containing a compilation of only the executive summaries is also provided.

  13. Fossil-based comparative analyses reveal ancient marine ancestry erased by extinction in ray-finned fishes.

    Science.gov (United States)

    Betancur-R, Ricardo; Ortí, Guillermo; Pyron, Robert Alexander

    2015-05-01

    The marine-freshwater boundary is a major biodiversity gradient and few groups have colonised both systems successfully. Fishes have transitioned between habitats repeatedly, diversifying in rivers, lakes and oceans over evolutionary time. However, their history of habitat colonisation and diversification is unclear based on available fossil and phylogenetic data. We estimate ancestral habitats and diversification and transition rates using a large-scale phylogeny of extant fish taxa and one containing a massive number of extinct species. Extant-only phylogenetic analyses indicate freshwater ancestry, but inclusion of fossils reveal strong evidence of marine ancestry in lineages now restricted to freshwaters. Diversification and colonisation dynamics vary asymmetrically between habitats, as marine lineages colonise and flourish in rivers more frequently than the reverse. Our study highlights the importance of including fossils in comparative analyses, showing that freshwaters have played a role as refuges for ancient fish lineages, a signal erased by extinction in extant-only phylogenies. © 2015 John Wiley & Sons Ltd/CNRS.

  14. Conformational determination of [Leu]enkephalin based on theoretical and experimental VA and VCD spectral analyses

    DEFF Research Database (Denmark)

    Abdali, Salim; Jalkanen, Karl J.; Cao, X.

    2004-01-01

    Conformational determination of [Leu]enkephalin in DMSO-d6 is carried out using VA and VCD spectral analyses. Conformational energies, vibrational frequencies and VA and VCD intensities are calculated using DFT at B3LYP/6-31G* level of theory. Comparison between the measured spectra...

  15. UAV-based detection and spatial analyses of periglacial landforms on Demay Point (King George Island, South Shetland Islands, Antarctica)

    Science.gov (United States)

    Dąbski, Maciej; Zmarz, Anna; Pabjanek, Piotr; Korczak-Abshire, Małgorzata; Karsznia, Izabela; Chwedorzewska, Katarzyna J.

    2017-08-01

    High-resolution aerial images allow detailed analyses of periglacial landforms, which is of particular importance in light of climate change and resulting changes in active layer thickness. The aim of this study is to show the possibilities of using UAV-based photography to perform spatial analysis of periglacial landforms on the Demay Point peninsula, King George Island, and hence to supplement previous geomorphological studies of the South Shetland Islands. Photogrammetric flights were performed using a PW-ZOOM fixed-wing unmanned aerial vehicle. Digital elevation models (DEM) and maps of slope and contour lines were prepared in ESRI ArcGIS 10.3 with the Spatial Analyst extension, and three-dimensional visualizations in ESRI ArcScene 10.3 software. Careful interpretation of the orthophoto and DEM allowed us to vectorize polygons of landforms, such as (i) solifluction landforms (solifluction sheets, tongues, and lobes); (ii) scarps, taluses, and a protalus rampart; (iii) patterned ground (hummocks, sorted circles, stripes, nets and labyrinths, and nonsorted nets and stripes); (iv) coastal landforms (cliffs and beaches); (v) landslides and mud flows; and (vi) stone fields and bedrock outcrops. We conclude that geomorphological studies based on commonly accessible aerial and satellite images can underestimate the spatial extent of periglacial landforms and result in incomplete inventories. The PW-ZOOM UAV is well suited to gather detailed geomorphological data and can be used in spatial analysis of periglacial landforms in the Western Antarctic Peninsula region.

  16. Chemical and geotechnical analyses of soil samples from Olkiluoto for studies on sorption in soils

    International Nuclear Information System (INIS)

    Lusa, M.; Aemmaelae, K.; Hakanen, M.; Lehto, J.; Lahdenperae, A.-M.

    2009-05-01

    The safety assessment of disposal of spent nuclear fuel will include an estimate of the behavior of nuclear waste nuclides in the biosphere. As a part of this estimate, the transfer of nuclear waste nuclides in soil and sediments is also to be considered. In this study soil samples were collected from three excavator pits in Olkiluoto and the geotechnical and chemical characteristics of the samples were determined. At a later stage these results will be used in sorption tests, the aim of which is to determine the Kd values for Cs, Tc and I and later for Mo, Nb and Cl. Results of these sorption tests will be reported later. The geotechnical characteristics studied included dry weight and organic matter content as well as grain size distribution and mineralogy. Selective extractions were carried out to study the sorption of cations onto different mineral types. The extractions included five steps in which the cations bound to the exchangeable, carbonate, Fe and Mn oxide, organic matter and residual fractions were determined. ICP-MS analyses were carried out for all fractions; Li, Na, Mg, K, Ca, Cr, Mn, Fe, Co, Ni, Cu, Zn, As, Se, Sr, Mo, Cd, Cs and Pb were determined. In addition, six profiles were taken from the surroundings of two excavator pits for 137Cs determination. Besides the samples taken for soil characterization, supplementary samples were taken from the same layers for the separation of soil water. From the soil water, pH, DOC, anions (F, Cl, NO3, SO4) and cations (Na, Mg, K, Ca, Al, Cr, Mn, Fe, Ni, Cu, Zn, As, S, Cd, Cs, Pb, U) were determined. (orig.)

  17. Pareto frontier analyses based decision making tool for transportation of hazardous waste

    International Nuclear Information System (INIS)

    Das, Arup; Mazumder, T.N.; Gupta, A.K.

    2012-01-01

    Highlights: ► Posteriori method using a multi-objective approach to solve a bi-objective routing problem. ► System optimization (with multiple source–destination pairs) in a capacity constrained network using non-dominated sorting. ► Tools like cost elasticity and angle based focus used to analyze the Pareto frontier and aid stakeholders in making informed decisions. ► A real life case study of the Kolkata Metropolitan Area to explain the workability of the model. - Abstract: Transportation of hazardous wastes through a region poses an immense threat to development along its road network. The risk to the population exposed to such activities has been documented in the past. However, a comprehensive framework for routing hazardous wastes has often been overlooked. A regional Hazardous Waste Management scheme should incorporate a comprehensive framework for hazardous waste transportation, one that involves the various stakeholders in decision making. Hence, a multi-objective approach is required to safeguard the interests of all the concerned stakeholders. The objective of this study is to design a methodology for routing hazardous wastes between the generating units and the disposal facilities through a capacity constrained network. The proposed methodology uses a posteriori method with a multi-objective approach to find non-dominated solutions for a system consisting of multiple origins and destinations. A case study of the transportation of hazardous wastes in the Kolkata Metropolitan Area is also provided to elucidate the methodology.
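
    The core of such a posteriori approach is extracting the non-dominated (Pareto-optimal) set of routes. The sketch below shows that step for a toy bi-objective problem with transport cost and population risk; the candidate routes and their objective values are hypothetical.

```python
# Minimal sketch: extract the Pareto frontier (non-dominated set) of candidate
# routes evaluated on two objectives, transport cost and population risk.
# The route list and objective values are hypothetical.

routes = {                    # name: (cost, risk) -- both to be minimised
    "R1": (120.0, 0.80),
    "R2": (150.0, 0.35),
    "R3": (130.0, 0.60),
    "R4": (170.0, 0.30),
    "R5": (125.0, 0.90),
}

def dominates(a, b):
    """True if solution a is at least as good in both objectives and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

pareto = [name for name, obj in routes.items()
          if not any(dominates(other, obj)
                     for o_name, other in routes.items() if o_name != name)]
print("non-dominated routes:", pareto)   # trade-off candidates for stakeholders
```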

  18. Group analyses of connectivity-based cortical parcellation using repeated k-means clustering.

    Science.gov (United States)

    Nanetti, Luca; Cerliani, Leonardo; Gazzola, Valeria; Renken, Remco; Keysers, Christian

    2009-10-01

    K-means clustering has become a popular tool for connectivity-based cortical segmentation using Diffusion Weighted Imaging (DWI) data. A sometimes ignored issue is, however, that the output of the algorithm depends on the initial placement of starting points, and that different sets of starting points therefore could lead to different solutions. In this study we explore this issue. We apply k-means clustering a thousand times to the same DWI dataset collected in 10 individuals to segment two brain regions: the SMA-preSMA on the medial wall, and the insula. At the level of single subjects, we found that in both brain regions, repeatedly applying k-means indeed often leads to a variety of rather different cortical based parcellations. By assessing the similarity and frequency of these different solutions, we show that approximately 256 k-means repetitions are needed to accurately estimate the distribution of possible solutions. Using nonparametric group statistics, we then propose a method to employ the variability of clustering solutions to assess the reliability with which certain voxels can be attributed to a particular cluster. In addition, we show that the proportion of voxels that can be attributed significantly to either cluster in the SMA and preSMA is relatively higher than in the insula and discuss how this difference may relate to differences in the anatomy of these regions.
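
    The idea of repeating k-means with different random starting points and quantifying the variability of the resulting parcellations can be sketched as follows. The data here are synthetic stand-ins for connectivity profiles, and the number of repetitions and the use of the adjusted Rand index are illustrative choices, not the authors' exact protocol.

```python
# Minimal sketch: run k-means repeatedly with different random starting points
# and quantify how stable the resulting parcellation is (pairwise adjusted Rand
# index). Synthetic 2-cluster data stands in for connectivity profiles.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0.0, 1.0, (100, 5)), rng.normal(1.5, 1.0, (100, 5))])

n_repeats = 50                 # the study used hundreds of repetitions per subject
labelings = [KMeans(n_clusters=2, n_init=1, random_state=i).fit_predict(data)
             for i in range(n_repeats)]

scores = [adjusted_rand_score(labelings[i], labelings[j])
          for i in range(n_repeats) for j in range(i + 1, n_repeats)]
print(f"mean pairwise ARI = {np.mean(scores):.3f} (1.0 = identical solutions)")
```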

  19. A MULTI-AGENT BASED SOCIAL CRM FRAMEWORK FOR EXTRACTING AND ANALYSING OPINIONS

    Directory of Open Access Journals (Sweden)

    ABDELAZIZ EL FAZZIKI

    2017-08-01

    Full Text Available Social media provide a wide space for people from around the world to communicate, share knowledge and personal experiences. They increasingly become an important data source for opinion mining and sentiment analysis, thanks to shared comments and reviews about products and services. Companies are showing a growing interest in harnessing their potential in order to support the setting up of marketing strategies. Despite the importance of sentiment analysis in decision making, there is a lack of social intelligence integration at the level of customer relationship management systems. Thus, social customer relationship management (SCRM) systems have become an interesting research area. However, they need deep analytic techniques to transform the large amount of data ("Big Data") into actionable insights. Such systems also require advanced modelling and data processing methods, and must consider the emerging paradigm of proactive systems. In this paper, we propose an agent based social framework that extracts and consolidates the reviews expressed via social media, in order to help enterprises learn more about customers' opinions toward a particular product or service. To illustrate our approach, we present a case study of Twitter reviews from which we extract opinions and sentiment about a set of products using the SentiGem API. Data extraction, analysis and storage are performed using a framework based on Hadoop MapReduce and HBase.

  20. Scenario-based analyses of energy system development and its environmental implications in Thailand

    International Nuclear Information System (INIS)

    Shrestha, Ram M.; Malla, Sunil; Liyanage, Migara H.

    2007-01-01

    Thailand is one of the fastest growing energy-intensive economies in Southeast Asia. To formulate sound energy policies in the country, it is important to understand the impact of energy use on the environment over the long term. This study examines energy system development and its associated greenhouse gas and local air pollutant emissions under four scenarios in Thailand through the year 2050. The four scenarios involve different growth paths for the economy, population, energy efficiency and the penetration of renewable energy technologies. The paper assesses the changes in primary energy supply mix, sector-wise final energy demand, energy import dependency and CO2, SO2 and NOx emissions under the four scenarios using the end-use based Asia-Pacific Integrated Assessment Model (AIM/Enduse) of Thailand. (author)

  1. Pattern Analyses Reveal Separate Experience-Based Fear Memories in the Human Right Amygdala.

    Science.gov (United States)

    Braem, Senne; De Houwer, Jan; Demanet, Jelle; Yuen, Kenneth S L; Kalisch, Raffael; Brass, Marcel

    2017-08-23

    Learning fear via the experience of contingencies between a conditioned stimulus (CS) and an aversive unconditioned stimulus (US) is often assumed to be fundamentally different from learning fear via instructions. An open question is whether fear-related brain areas respond differently to experienced CS-US contingencies than to merely instructed CS-US contingencies. Here, we contrasted two experimental conditions where subjects were instructed to expect the same CS-US contingencies while only one condition was characterized by prior experience with the CS-US contingency. Using multivoxel pattern analysis of fMRI data, we found CS-related neural activation patterns in the right amygdala (but not in other fear-related regions) that dissociated between whether a CS-US contingency had been instructed and experienced versus merely instructed. A second experiment further corroborated this finding by showing a category-independent neural response to instructed and experienced, but not merely instructed, CS presentations in the human right amygdala. Together, these findings are in line with previous studies showing that verbal fear instructions have a strong impact on both brain and behavior. However, even in the face of fear instructions, the human right amygdala still shows a separable neural pattern response to experience-based fear contingencies. SIGNIFICANCE STATEMENT In our study, we addressed a fundamental problem of the science of human fear learning and memory, namely whether fear learning via experience in humans relies on a neural pathway that can be separated from fear learning via verbal information. Using two new procedures and recent advances in the analysis of brain imaging data, we localized purely experience-based fear processing and memory in the right amygdala, thereby making a direct link between human and animal research. Copyright © 2017 the authors 0270-6474/17/378116-15$15.00/0.
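
    For orientation, a minimal sketch of a multivoxel pattern analysis of the kind referred to above (decoding condition labels from activation patterns with a cross-validated linear classifier) is given below. The data are synthetic and the classifier and cross-validation settings are assumptions, not the study's pipeline.

```python
# Minimal sketch of a multivoxel pattern analysis: train a linear classifier on
# voxel activation patterns and test whether condition labels can be decoded
# above chance with cross-validation. Data are synthetic stand-ins for beta maps.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_trials, n_voxels = 80, 200
labels = np.repeat([0, 1], n_trials // 2)       # e.g. two CS conditions (hypothetical coding)
patterns = rng.standard_normal((n_trials, n_voxels))
patterns[labels == 1, :20] += 0.5               # weak condition-specific signal in a few voxels

scores = cross_val_score(LinearSVC(max_iter=5000), patterns, labels, cv=5)
print(f"decoding accuracy = {scores.mean():.2f} ± {scores.std():.2f} (chance = 0.50)")
```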

  2. Regional analyses of labor markets and demography: a model based Norwegian example.

    Science.gov (United States)

    Stambol, L S; Stolen, N M; Avitsland, T

    1998-01-01

    The authors discuss the regional REGARD model, developed by Statistics Norway to analyze the regional implications of macroeconomic development of employment, labor force, and unemployment. "In building the model, empirical analyses of regional producer behavior in manufacturing industries have been performed, and the relation between labor market development and regional migration has been investigated. Apart from providing a short description of the REGARD model, this article demonstrates the functioning of the model, and presents some results of an application." excerpt

  3. Optimization of a Centrifugal Boiler Circulating Pump's Casing Based on CFD and FEM Analyses

    OpenAIRE

    Zhigang Zuo; Shuhong Liu; Yizhang Fan; Yulin Wu

    2014-01-01

    It is important to evaluate the economic efficiency of boiler circulating pumps in manufacturing process from the manufacturers' point of view. The possibility of optimizing the pump casing with respect to structural pressure integrity and hydraulic performance was discussed. CFD analyses of pump models with different pump casing sizes were firstly carried out for the hydraulic performance evaluation. The effects of the working temperature and the sealing ring on the hydraulic efficiency were...

  4. Hedysarum L. (Fabaceae: Hedysareae) Is Not Monophyletic - Evidence from Phylogenetic Analyses Based on Five Nuclear and Five Plastid Sequences.

    Science.gov (United States)

    Liu, Pei-Liang; Wen, Jun; Duan, Lei; Arslan, Emine; Ertuğrul, Kuddisi; Chang, Zhao-Yang

    2017-01-01

    The legume family (Fabaceae) exhibits a high level of species diversity and evolutionary success worldwide. Previous phylogenetic studies of the genus Hedysarum L. (Fabaceae: Hedysareae) showed that the nuclear and the plastid topologies might be incongruent, and the systematic position of the Hedysarum sect. Stracheya clade was uncertain. In this study, phylogenetic relationships of Hedysarum were investigated based on the nuclear ITS, ETS, PGDH, SQD1, TRPT and the plastid psbA-trnH, trnC-petN, trnL-trnF, trnS-trnG, petN-psbM sequences. Both nuclear and plastid data support two major lineages in Hedysarum: the Hedysarum s.s. clade and the Sartoria clade. In the nuclear tree, Hedysarum is biphyletic with the Hedysarum s.s. clade sister to the Corethrodendron + Eversmannia + Greuteria + Onobrychis clade (the CEGO clade), whereas the Sartoria clade is sister to the genus Taverniera DC. In the plastid tree, Hedysarum is monophyletic and sister to Taverniera. The incongruent position of the Hedysarum s.s. clade between the nuclear and plastid trees may be best explained by a chloroplast capture hypothesis via introgression. The Hedysarum sect. Stracheya clade is resolved as sister to the H. sect. Hedysarum clade in both nuclear and plastid trees, and our analyses support merging Stracheya into Hedysarum. Based on our new evidence from multiple sequences, Hedysarum is not monophyletic, and its generic delimitation needs to be reconsidered.

  5. Hedysarum L. (Fabaceae: Hedysareae) Is Not Monophyletic – Evidence from Phylogenetic Analyses Based on Five Nuclear and Five Plastid Sequences

    Science.gov (United States)

    Liu, Pei-Liang; Wen, Jun; Duan, Lei; Arslan, Emine; Ertuğrul, Kuddisi; Chang, Zhao-Yang

    2017-01-01

    The legume family (Fabaceae) exhibits a high level of species diversity and evolutionary success worldwide. Previous phylogenetic studies of the genus Hedysarum L. (Fabaceae: Hedysareae) showed that the nuclear and the plastid topologies might be incongruent, and the systematic position of the Hedysarum sect. Stracheya clade was uncertain. In this study, phylogenetic relationships of Hedysarum were investigated based on the nuclear ITS, ETS, PGDH, SQD1, TRPT and the plastid psbA-trnH, trnC-petN, trnL-trnF, trnS-trnG, petN-psbM sequences. Both nuclear and plastid data support two major lineages in Hedysarum: the Hedysarum s.s. clade and the Sartoria clade. In the nuclear tree, Hedysarum is biphyletic with the Hedysarum s.s. clade sister to the Corethrodendron + Eversmannia + Greuteria + Onobrychis clade (the CEGO clade), whereas the Sartoria clade is sister to the genus Taverniera DC. In the plastid tree, Hedysarum is monophyletic and sister to Taverniera. The incongruent position of the Hedysarum s.s. clade between the nuclear and plastid trees may be best explained by a chloroplast capture hypothesis via introgression. The Hedysarum sect. Stracheya clade is resolved as sister to the H. sect. Hedysarum clade in both nuclear and plastid trees, and our analyses support merging Stracheya into Hedysarum. Based on our new evidence from multiple sequences, Hedysarum is not monophyletic, and its generic delimitation needs to be reconsidered. PMID:28122062

  6. Quasi-static earthquake cycle simulation based on nonlinear viscoelastic finite element analyses

    Science.gov (United States)

    Agata, R.; Ichimura, T.; Hyodo, M.; Barbot, S.; Hori, T.

    2017-12-01

    To explain earthquake generation processes, simulation methods of earthquake cycles have been studied. For such simulations, the combination of the rate- and state-dependent friction law at the fault plane and the boundary integral method based on Green's function in an elastic half space is widely used (e.g. Hori 2009; Barbot et al. 2012). In this approach, stress change around the fault plane due to crustal deformation can be computed analytically, while the effects of complex physics such as mantle rheology and gravity are generally not taken into account. To consider such effects, we seek to develop an earthquake cycle simulation combining crustal deformation computation based on the finite element (FE) method with the rate- and state-dependent friction law. Since the drawback of this approach is the computational cost associated with obtaining numerical solutions, we adopt a recently developed fast and scalable FE solver (Ichimura et al. 2016), which assumes use of supercomputers, to solve the problem in a realistic time. As in the previous approach, we solve the governing equations consisting of the rate- and state-dependent friction law. In solving the equations, we compute stress changes along the fault plane due to crustal deformation using FE simulation, instead of computing them by superimposing slip response function as in the previous approach. In stress change computation, we take into account nonlinear viscoelastic deformation in the asthenosphere. In the presentation, we will show simulation results in a normative three-dimensional problem, where a circular-shaped velocity-weakening area is set in a square-shaped fault plane. The results with and without nonlinear viscosity in the asthenosphere will be compared. We also plan to apply the developed code to simulate the post-earthquake deformation of a megathrust earthquake, such as the 2011 Tohoku earthquake. Acknowledgment: The results were obtained using the K computer at the RIKEN (Proposal number
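
    For reference, a commonly used form of the rate- and state-dependent friction law (the Dieterich aging-law formulation) is reproduced below; the study may use a different variant or parameterisation.

```latex
% Rate- and state-dependent friction (Dieterich aging-law form); the study may
% use a different variant or parameterisation.
\[
  \mu = \mu_0 + a \ln\!\left(\frac{V}{V_0}\right)
              + b \ln\!\left(\frac{V_0\,\theta}{D_c}\right),
  \qquad
  \frac{d\theta}{dt} = 1 - \frac{V\theta}{D_c},
\]
where $V$ is the slip velocity, $\theta$ the state variable, $D_c$ the characteristic
slip distance, and $a$, $b$, $\mu_0$, $V_0$ are constitutive constants.
```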

  7. Actual situation analyses of rat-run traffic on community streets based on car probe data

    Science.gov (United States)

    Sakuragi, Yuki; Matsuo, Kojiro; Sugiki, Nao

    2017-10-01

    Reducing so-called "rat-run" traffic on community streets has been one of the significant challenges for improving the living environment of neighborhoods. However, it has been difficult to quantitatively grasp the actual situation of rat-run traffic using traditional surveys such as point observations. This study aims to develop a method for extracting rat-run traffic based on car probe data. In addition, based on the rat-run traffic extracted in Toyohashi city, Japan, we analyze its actual situation, such as the time and location distribution of the rat-run trips. As a result, in Toyohashi city, the rate of rat-run route use increases in the peak time period. Regarding the location distribution, rat-run trips pass through a variety of community streets, and there is no great inter-district bias in the routes frequently used for rat-run traffic. Next, we focused on trips passing through one heavily used rat-run route. We found that these drivers likely use the route habitually, because their trips had several commonalities. We also found that they tend to use the rat-run route because it is shorter than the alternative highway route, and that their travel speeds were faster than on the alternative highway route. In conclusion, we confirmed that the proposed method can quantitatively grasp the actual situation and tendencies of rat-run traffic.

  8. Drive-based recording analyses at >800 Gfc/in2 using shingled recording

    International Nuclear Information System (INIS)

    William Cross, R.; Montemorra, Michael

    2012-01-01

    Since the introduction of perpendicular recording, conventional perpendicular scaling has enabled the hard disk drive industry to deliver products ranging from ∼130 to well over 500 Gb/in² in a little over 4 years. The incredible areal density growth spurt enabled by perpendicular recording is now endangered by an inability to effectively balance writeability with erasure effects at the system level. Shingled magnetic recording (SMR) offers an effective means to continue perpendicular areal density growth using conventional heads and tuned media designs. The use of specially designed edge-write head structures (also known as 'corner writers') should further increase the areal density gain potential for shingled recording. In this paper, we demonstrate the drive-based recording performance characteristics of a shingled recording system at areal densities in excess of 800 Gb/in² using a conventional head. Using a production drive base, developmental heads/media and a number of sophisticated analytical routines, we have studied the recording performance of a shingled magnetic recording subsystem. Our observations confirm excellent writeability in excess of 400 ktpi and a perpendicular system with acceptable noise balance, especially at extreme ID and OD skews where the benefits of SMR are quite pronounced. We believe that this demonstration illustrates that SMR is not only capable of productization, but is likely the path of least resistance toward production drive areal densities closer to 1 Tb/in² and beyond. - Research highlights: → Drive-based recording at 805 Gf/in² has been demonstrated using both 95 and 65 mm drive platforms at roughly 430 ktpi and 1.87 Mfci. → Limiting factors for shingled recording include side reading, which is dominated by the reader crosstrack skirt profile, MT10 being a representative metric. → Media jitter and associated DC media SNR further limit areal density, dominated by crosstrack transition curvature, downtrack

  9. Identification of provenance rocks based on EPMA analyses of heavy minerals

    Science.gov (United States)

    Shimizu, M.; Sano, N.; Ueki, T.; Yonaga, Y.; Yasue, K. I.; Masakazu, N.

    2017-12-01

    Information on mountain building is significant in the field of geological disposal of high-level radioactive waste, because it affects the long-term stability of the groundwater flow system. Provenance analysis is one of the effective approaches for understanding the building process of mountains. Chemical compositions of heavy minerals, as well as their chronological data, can be an index for the identification of provenance rocks. Accurate identification requires the measurement of as many grains as possible. In order to achieve an efficient provenance analysis, we developed a method for quick identification of heavy minerals using an Electron Probe Micro Analyzer (EPMA). In this method, heavy mineral grains extracted from a sample were aligned on a glass slide and mounted in resin. The concentrations of 28 elements were measured for 300-500 grains per sample using EPMA. To measure as many grains as possible, we prioritized swiftness of measurement over precision, configuring a measurement time of about 3.5 minutes for each grain. Identification of heavy minerals was based on their chemical composition. We developed a Microsoft® Excel® spreadsheet implementing criteria for mineral identification based on the typical range of chemical compositions of each mineral. Grains whose analytical totals fell outside the acceptable range (up to 110 wt.%) were rejected. The criteria for mineral identification were revised through comparison between mineral identification by optical microscopy and the chemical compositions of grains classified as "unknown minerals". Provenance rocks can be identified based on the abundance ratios of the identified minerals. If no significant difference in abundance ratio was found among source rocks, the chemical composition of specific minerals was used as another index. This method was applied to the sediments of some regions in Japan where provenance rocks had lithological variations but similar formation ages. Consequently, the provenance rocks were identified based on chemical compositions of heavy minerals resistant to
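
    The mineral-identification step described above amounts to checking each grain's measured composition against per-mineral concentration ranges. A minimal sketch is given below; the minerals, elements and ranges are illustrative placeholders, not the study's actual criteria.

```python
# Minimal sketch: assign each analysed grain to a mineral when its measured
# compositions fall inside that mineral's expected ranges; otherwise label it
# "unknown". The criteria and grain data below are illustrative placeholders.

criteria = {
    "zircon":   {"Zr": (40.0, 55.0), "Si": (12.0, 18.0)},
    "ilmenite": {"Fe": (30.0, 40.0), "Ti": (25.0, 35.0)},
}

grains = [                         # element concentrations in wt.%, hypothetical
    {"Zr": 48.2, "Si": 15.1, "Fe": 0.3,  "Ti": 0.1},
    {"Zr": 0.0,  "Si": 0.2,  "Fe": 34.5, "Ti": 30.2},
    {"Zr": 2.1,  "Si": 20.5, "Fe": 8.0,  "Ti": 1.2},
]

def identify(grain):
    for mineral, ranges in criteria.items():
        if all(lo <= grain.get(el, 0.0) <= hi for el, (lo, hi) in ranges.items()):
            return mineral
    return "unknown"

counts = {}
for grain in grains:
    mineral = identify(grain)
    counts[mineral] = counts.get(mineral, 0) + 1
print(counts)   # abundance ratios feed the provenance comparison
```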

  10. Atmospheric radiation environment analyses based-on CCD camera at various mountain altitudes and underground sites

    Directory of Open Access Journals (Sweden)

    Li Cavoli Pierre

    2016-01-01

    The purpose of this paper is to discriminate secondary atmospheric particles and identify muons by measuring the natural radiative environment in atmospheric and underground locations. A CCD camera has been used as a cosmic ray sensor. The Low Noise Underground Laboratory of Rustrel (LSBB, France) gives access to a unique low-noise scientific environment deep enough to ensure screening from the neutron and proton radiative components. Analyses of the charge levels in pixels of the CCD camera induced by radiation events, and cartographies of the charge events versus the hit pixel, are proposed.

  11. UniPrimer: A Web-Based Primer Design Tool for Comparative Analyses of Primate Genomes

    Directory of Open Access Journals (Sweden)

    Nomin Batnyam

    2012-01-01

    Whole genome sequences of various primates have been released due to advanced DNA-sequencing technology. A combination of computational data mining and the polymerase chain reaction (PCR) assay to validate the data is an excellent method for conducting comparative genomics. Thus, designing primers for PCR is an essential procedure for a comparative analysis of primate genomes. Here, we developed and introduced UniPrimer for use in those studies. UniPrimer is a web-based tool that designs PCR- and DNA-sequencing primers. It compares the sequences from six different primates (human, chimpanzee, gorilla, orangutan, gibbon, and rhesus macaque) and designs primers on the conserved region across species. UniPrimer is linked to the RepeatMasker, Primer3Plus, and OligoCalc software tools to produce primers with high accuracy, and to UCSC In-Silico PCR to confirm whether the designed primers work. To test the performance of UniPrimer, we designed primers on sample sequences using UniPrimer and manually designed primers for the same sequences. The comparison of the two processes showed that UniPrimer was more effective than manual work in terms of saving time and reducing errors.
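
    The core step of designing primers on regions conserved across species can be illustrated with a short sketch. This is not UniPrimer's pipeline (which also masks repeats and hands candidates to Primer3Plus); it only shows the idea of scanning an alignment for gap-free, fully conserved windows. The sequences and window length are made up.

```python
# Minimal sketch of the conserved-region search that underlies cross-species primer design.
# Input is a small aligned set of orthologous sequences (one per primate); the sequences
# and window length below are illustrative, not UniPrimer's defaults.

aligned = {
    "human":      "ATGGCGTTAACCGGTTA-CGTAGCT",
    "chimpanzee": "ATGGCGTTAACCGGTTA-CGTAGCT",
    "gorilla":    "ATGGCGTTAACCGGTTACCGTAGCT",
    "orangutan":  "ATGGCGTTAACCGGTTA-CGTAGCT",
    "gibbon":     "ATGGCATTAACCGGTTA-CGTAGCT",
    "macaque":    "ATGGCATTAACCGGTTA-CGTAGCT",
}

def conserved_windows(alignment, window=10):
    """Yield (start, sequence) for alignment windows identical across all species
    and free of gaps -- candidate sites for PCR/sequencing primers."""
    seqs = list(alignment.values())
    length = min(len(s) for s in seqs)
    for start in range(length - window + 1):
        cols = [s[start:start + window] for s in seqs]
        if "-" not in cols[0] and all(c == cols[0] for c in cols):
            yield start, cols[0]

for start, site in conserved_windows(aligned, window=8):
    print(f"conserved window at {start}: {site}")
```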

  12. Analyses of Large Coal-Based SOFCs for High Power Stack Block Development

    Energy Technology Data Exchange (ETDEWEB)

    Recknagle, Kurtis P; Koeppel, Brian J

    2010-10-01

    This report summarizes the numerical modeling and analytical efforts for SOFC stack development performed for the coal-based SOFC program. The stack modeling activities began in 2004, but this report focuses on the most relevant results obtained since August 2008. This includes the latter half of Phase-I and all of Phase-II activities under technical guidance of VPS and FCE. The models developed to predict the thermal-flow-electrochemical behaviors and thermal-mechanical responses of generic planar stacks and towers are described. The effects of cell geometry, fuel gas composition, on-cell reforming, operating conditions, cell performance, seal leak, voltage degradation, boundary conditions, and stack height are studied. The modeling activities to evaluate and achieve technical targets for large stack blocks are described, and results from the latest thermal-fluid-electrochemical and structural models are summarized. Modeling results for stack modifications such as scale-up and component thickness reduction to realize cost reduction are presented. Supporting modeling activities in the areas of cell fabrication and loss of contact are also described.

  13. Theoretical study for a digital transfer function analyser; Etude theorique pour un transferometre digital

    Energy Technology Data Exchange (ETDEWEB)

    Freycenon, J [Commissariat a l'Energie Atomique, Saclay (France). Centre d'Etudes Nucleaires]

    1964-07-01

    This study deals with the harmonic analysis of the instantaneous counting rate of a pulse train. The problem arises when using a fission chamber for reactivity-to-power transfer function measurements by oscillation methods in reactors. The systematic errors due to the sampling process are computed. The integration carried out when sampling the signal modifies the formulae of the Nyquist theorem on spectrum folding. The statistical errors due to the noise are analysed: it is shown that the bandwidth of the spectral window applied to the noise frequency spectrum is equal to the inverse of the time duration of the experiment. A dead time of 25 per cent of the sampling time does not appreciably increase the bandwidth. A new method is then proposed, yielding results very close to those of the Fourier analysis during the experiment. The systematic errors arising from the measuring process are determined, and it is shown that the bandwidth of the corresponding spectral window is still given by the inverse of the time duration of the experiment. (author)
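
    As a rough illustration of the kind of processing discussed (estimating one Fourier component of a sampled counting rate, with a spectral resolution of order 1/T), here is a small numerical sketch. The drive frequency, count rate, and modulation depth are arbitrary and do not correspond to the experiment described.

```python
# Rough sketch: estimate the amplitude and phase of an oscillation at a known drive frequency
# from sampled counts of a pulse train, by projecting the counting-rate samples onto sin/cos
# at that frequency. Numbers are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

f_drive = 0.5          # Hz, oscillator frequency (assumed known)
dt      = 0.1          # s, sampling (integration) interval
T       = 200.0        # s, total experiment duration
t = np.arange(0.0, T, dt)

# Simulated counts per interval: Poisson noise around a sinusoidally modulated rate.
mean_rate = 1.0e4                                  # counts/s
modulation = 0.02 * np.sin(2 * np.pi * f_drive * t + 0.3)
counts = rng.poisson(mean_rate * (1.0 + modulation) * dt)

rate = counts / dt
# Fourier (lock-in style) projection at the drive frequency.
c = (2.0 / len(t)) * np.sum(rate * np.cos(2 * np.pi * f_drive * t))
s = (2.0 / len(t)) * np.sum(rate * np.sin(2 * np.pi * f_drive * t))
amplitude = np.hypot(c, s) / mean_rate             # relative modulation amplitude
phase     = np.arctan2(c, s)                       # rad, relative to the sine reference
print(f"estimated relative amplitude {amplitude:.4f}, phase {phase:.3f} rad")
print(f"frequency resolution ~ 1/T = {1.0/T:.4f} Hz")
```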

  14. Systematics of Plant-Pathogenic and Related Streptomyces Species Based on Phylogenetic Analyses of Multiple Gene Loci

    Science.gov (United States)

    The 10 species of Streptomyces implicated as the etiological agents in scab disease of potatoes or soft rot disease of sweet potatoes are distributed among 7 different phylogenetic clades in analyses based on 16S rRNA gene sequences, but high sequence similarity of this gene among Streptomyces speci...

  15. Phylogenetic study on Shiraia bambusicola by rDNA sequence analyses.

    Science.gov (United States)

    Cheng, Tian-Fan; Jia, Xiao-Ming; Ma, Xiao-Hang; Lin, Hai-Ping; Zhao, Yu-Hua

    2004-01-01

    In this study, 18S rDNA and ITS-5.8S rDNA regions of four Shiraia bambusicola isolates collected from different species of bamboos were amplified by PCR with universal primer pairs NS1/NS8 and ITS5/ITS4, respectively, and sequenced. Phylogenetic analyses were conducted on three selected datasets of rDNA sequences. Maximum parsimony, distance and maximum likelihood criteria were used to infer trees. Morphological characteristics were also observed. The positioning of Shiraia in the order Pleosporales was well supported by bootstrap analysis, in agreement with the placement proposed by Amano (1980) on morphological grounds. We did not find significant inter-host differences among these four isolates from different species of bamboos. From the results of the analyses and comparison of their rDNA sequences, we conclude that Shiraia should be classified into Pleosporales, as Amano (1980) proposed, and suggest that it might be positioned in the family Phaeosphaeriaceae. Copyright 2004 WILEY-VCH Verlag GmbH & Co.

  16. Grid Mapping for Spatial Pattern Analyses of Recurrent Urban Traffic Congestion Based on Taxi GPS Sensing Data

    Directory of Open Access Journals (Sweden)

    Yang Liu

    2017-03-01

    Traffic congestion is one of the most serious problems that impact urban transportation efficiency, especially in big cities. Identifying traffic congestion locations and occurrence patterns is a prerequisite for urban transportation managers in order to take proper countermeasures for mitigating traffic congestion. In this study, the historical GPS sensing data of about 12,000 floating taxi cars in Beijing were used for pattern analyses of recurrent traffic congestion based on the grid mapping method. Through the use of ArcGIS software, 2D and 3D maps of the road network congestion were generated for traffic congestion pattern visualization. The study results showed that three types of traffic congestion patterns were identified, namely: point type, stemming from insufficient capacities at the nodes of the road network; line type, caused by high traffic demand or bottleneck issues in the road segments; and region type, resulting from multiple high-demand expressways merging and connecting to each other. The study illustrated that the proposed method would be effective for discovering traffic congestion locations and patterns and helpful for decision makers to take corresponding traffic engineering countermeasures in order to relieve urban traffic congestion issues.
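
    The grid-mapping step itself is straightforward to sketch: assign each GPS record to a grid cell, aggregate a congestion indicator (here, mean speed) per cell, and flag recurrently slow cells. The cell size, speed threshold, and sample records below are illustrative assumptions, not the study's settings.

```python
# Simplified sketch of the grid-mapping idea for floating-car GPS data.
import numpy as np

# (longitude, latitude, speed_km_h) records, e.g. from taxi GPS logs (invented values)
records = np.array([
    [116.30, 39.90, 12.0],
    [116.31, 39.90,  8.0],
    [116.30, 39.91, 45.0],
    [116.32, 39.92, 50.0],
    [116.31, 39.90,  6.0],
])

lon0, lat0, cell = 116.30, 39.90, 0.01       # grid origin and cell size in degrees
cols = ((records[:, 0] - lon0) / cell).astype(int)
rows = ((records[:, 1] - lat0) / cell).astype(int)

cell_speeds = {}
for r, c, v in zip(rows, cols, records[:, 2]):
    cell_speeds.setdefault((r, c), []).append(v)

congested_threshold = 15.0                   # km/h, illustrative
for (r, c), speeds in sorted(cell_speeds.items()):
    avg = float(np.mean(speeds))
    status = "congested" if avg < congested_threshold else "free-flow"
    print(f"cell ({r},{c}): mean speed {avg:5.1f} km/h -> {status}")
```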

  17. Variability Abstractions: Trading Precision for Speed in Family-Based Analyses

    DEFF Research Database (Denmark)

    Dimovski, Aleksandar; Brabrand, Claus; Wasowski, Andrzej

    2015-01-01

    Family-based (lifted) data-flow analysis for Software Product Lines (SPLs) is capable of analyzing all valid products (variants) without generating any of them explicitly. It takes as input only the common code base, which encodes all variants of a SPL, and produces analysis results corresponding...

  18. Ultrastructure of spermatozoa of spider crabs, family Mithracidae (Crustacea, Decapoda, Brachyura): Integrative analyses based on morphological and molecular data.

    Science.gov (United States)

    Assugeni, Camila de O; Magalhães, Tatiana; Bolaños, Juan A; Tudge, Christopher C; Mantelatto, Fernando L; Zara, Fernando J

    2017-12-01

    Recent studies based on morphological and molecular data provide a new perspective concerning taxonomic aspects of the brachyuran family Mithracidae. These studies proposed a series of nominal changes and indicated that the family is actually represented by a different number and representatives of genera than previously thought. Here, we provide a comparative description of the ultrastructure of spermatozoa and spermatophores of some species of Mithracidae in a phylogenetic context. The ultrastructure of the spermatozoa and spermatophore was observed by scanning and transmission electron microscopy. The most informative morphological characters analysed were thickness of the operculum, shape of the perforatorial chamber and shape and thickness of the inner acrosomal zone. As a framework, we used a topology based on a phylogenetic analysis using mitochondrial data obtained here and from previous studies. Our results indicate that closely related species share a series of morphological characteristics of the spermatozoa. A thick operculum, for example, is a feature observed in species of the genera Amphithrax, Teleophrys, and Omalacantha in contrast to the slender operculum observed in Mithraculus and Mithrax. Amphithrax and Teleophrys have a rhomboid perforatorial chamber, while Mithraculus, Mithrax, and Omalacantha show a wider, deltoid morphology. Furthermore, our results are in agreement with recently proposed taxonomic changes including the separation of the genera Mithrax (previously Damithrax), Amphithrax (previously Mithrax) and Mithraculus, and the synonymy of Mithrax caribbaeus with Mithrax hispidus. Overall, the spermiotaxonomy of these species of Mithracidae represent a novel set of data that corroborates the most recent taxonomic revision of the family and can be used in future taxonomic and phylogenetic studies within this family. © 2017 Wiley Periodicals, Inc.

  19. Surrogacy of progression free survival for overall survival in metastatic breast cancer studies: Meta-analyses of published studies.

    Science.gov (United States)

    Kundu, Madan G; Acharyya, Suddhasatta

    2017-02-01

    PFS is often used as a surrogate endpoint for OS in metastatic breast cancer studies. We have evaluated the association of treatment effect on PFS with a significant HR_OS (and how this association is affected by other factors) in published prospective metastatic breast cancer studies. A systematic literature search in PubMed identified prospective metastatic breast cancer studies. Treatment effects on PFS were determined using the hazard ratio (HR_PFS), the increase in median PFS (ΔMED_PFS) and the % increase in median PFS (%ΔMED_PFS). Diagnostic accuracy of the PFS measures (HR_PFS, ΔMED_PFS and %ΔMED_PFS) in predicting a significant HR_OS was assessed using receiver operating characteristic (ROC) curves and a classification tree approach (CART). Seventy-four cases (i.e., treatment to control comparisons) from 65 individual publications were identified for the analyses. Of these, 16 cases reported a significant treatment effect on HR_OS at the 5% level of significance. The median number of deaths reported in these cases was 153. Areas under the ROC curve (AUC) for the diagnostic measures HR_PFS, ΔMED_PFS and %ΔMED_PFS were 0.69, 0.70 and 0.75, respectively. Classification tree results identified %ΔMED_PFS and the number of deaths as diagnostic measures for a significant HR_OS. Only 7.9% (3/39) of cases with %ΔMED_PFS lower than 48.27% reported a significant HR_OS. There were 7 cases with %ΔMED_PFS of 48.27% or more and a number of deaths of 227 or more - of these, 5 cases reported a significant HR_OS. %ΔMED_PFS was found to be a better diagnostic measure for predicting a significant HR_OS. Our analysis results also suggest that consideration of the total number of deaths may further improve its diagnostic performance. Based on our study results, studies with a 50% improvement in median PFS are more likely to produce a significant HR_OS if the total number of OS events at the time of analysis is 227 or more. Copyright © 2016 Elsevier Inc. All rights reserved.
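
    As a toy illustration of the diagnostic evaluation (ROC analysis of %ΔMED_PFS as a predictor of a significant HR_OS, plus the reported CART-style thresholds), here is a short sketch. The eight cases are fabricated; only the two cut-offs (48.27% and 227 deaths) come from the abstract.

```python
# Illustrative sketch: %ΔMED_PFS as a score for predicting a significant HR_OS,
# evaluated with ROC AUC, plus the reported two-threshold decision rule.
from sklearn.metrics import roc_auc_score

# (pct_increase_median_PFS, n_deaths, significant_HR_OS) -- fabricated mini dataset
cases = [
    (65.0, 310, 1), (55.0, 250, 1), (50.0, 180, 0), (30.0, 400, 0),
    (20.0, 150, 0), (80.0, 500, 1), (10.0, 120, 0), (48.5, 230, 1),
]

scores = [c[0] for c in cases]
labels = [c[2] for c in cases]
print("AUC of %ΔMED_PFS as a predictor:", round(roc_auc_score(labels, scores), 2))

def predicts_significant_os(pct_dmed_pfs, n_deaths,
                            pct_cut=48.27, deaths_cut=227):
    """CART-style rule with the thresholds quoted in the abstract."""
    return pct_dmed_pfs >= pct_cut and n_deaths >= deaths_cut

for pct, deaths, label in cases:
    print(pct, deaths, "->", predicts_significant_os(pct, deaths),
          "(observed:", bool(label), ")")
```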

  20. Using uncertainty and sensitivity analyses in socioecological agent-based models to improve their analytical performance and policy relevance.

    Science.gov (United States)

    Ligmann-Zielinska, Arika; Kramer, Daniel B; Spence Cheruvelil, Kendra; Soranno, Patricia A

    2014-01-01

    Agent-based models (ABMs) have been widely used to study socioecological systems. They are useful for studying such systems because of their ability to incorporate micro-level behaviors among interacting agents, and to understand emergent phenomena due to these interactions. However, ABMs are inherently stochastic and require proper handling of uncertainty. We propose a simulation framework based on quantitative uncertainty and sensitivity analyses to build parsimonious ABMs that serve two purposes: exploration of the outcome space to simulate low-probability but high-consequence events that may have significant policy implications, and explanation of model behavior to describe the system with higher accuracy. The proposed framework is applied to the problem of modeling farmland conservation resulting in land use change. We employ output variance decomposition based on quasi-random sampling of the input space and perform three computational experiments. First, we perform uncertainty analysis to improve model legitimacy, where the distribution of results informs us about the expected value that can be validated against independent data, and provides information on the variance around this mean as well as the extreme results. In our last two computational experiments, we employ sensitivity analysis to produce two simpler versions of the ABM. First, input space is reduced only to inputs that produced the variance of the initial ABM, resulting in a model with output distribution similar to the initial model. Second, we refine the value of the most influential input, producing a model that maintains the mean of the output of initial ABM but with less spread. These simplifications can be used to 1) efficiently explore model outcomes, including outliers that may be important considerations in the design of robust policies, and 2) conduct explanatory analysis that exposes the smallest number of inputs influencing the steady state of the modeled system.
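
    The variance-decomposition step can be illustrated with a crude first-order sensitivity estimator (Si ≈ Var[E(Y|Xi)]/Var(Y)) applied to a stand-in model. The toy function, sample size, and binning scheme below are assumptions for illustration and do not represent the farmland-conservation ABM.

```python
# Sketch of variance-based sensitivity analysis via a simple binning estimator of
# first-order indices. The toy "ABM" is a stand-in function, not the study's model.
import numpy as np

rng = np.random.default_rng(1)

def toy_abm(x):
    """Placeholder model: output depends strongly on x1, weakly on x2, not on x3."""
    x1, x2, x3 = x[:, 0], x[:, 1], x[:, 2]
    return 4.0 * x1 + 0.5 * np.sin(2 * np.pi * x2) + 0.0 * x3 + rng.normal(0, 0.05, len(x1))

n, k = 20000, 3
X = rng.random((n, k))          # stand-in for a quasi-random (e.g. Sobol) design
Y = toy_abm(X)
var_y = Y.var()

def first_order_index(xi, y, bins=40):
    """Variance of the conditional mean E(Y | Xi in bin), divided by Var(Y)."""
    edges = np.quantile(xi, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.searchsorted(edges, xi, side="right") - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / var_y

for i in range(k):
    print(f"S{i+1} ~ {first_order_index(X[:, i], Y):.2f}")
```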

  1. Using uncertainty and sensitivity analyses in socioecological agent-based models to improve their analytical performance and policy relevance.

    Directory of Open Access Journals (Sweden)

    Arika Ligmann-Zielinska

    Agent-based models (ABMs) have been widely used to study socioecological systems. They are useful for studying such systems because of their ability to incorporate micro-level behaviors among interacting agents, and to understand emergent phenomena due to these interactions. However, ABMs are inherently stochastic and require proper handling of uncertainty. We propose a simulation framework based on quantitative uncertainty and sensitivity analyses to build parsimonious ABMs that serve two purposes: exploration of the outcome space to simulate low-probability but high-consequence events that may have significant policy implications, and explanation of model behavior to describe the system with higher accuracy. The proposed framework is applied to the problem of modeling farmland conservation resulting in land use change. We employ output variance decomposition based on quasi-random sampling of the input space and perform three computational experiments. First, we perform uncertainty analysis to improve model legitimacy, where the distribution of results informs us about the expected value that can be validated against independent data, and provides information on the variance around this mean as well as the extreme results. In our last two computational experiments, we employ sensitivity analysis to produce two simpler versions of the ABM. First, input space is reduced only to inputs that produced the variance of the initial ABM, resulting in a model with output distribution similar to the initial model. Second, we refine the value of the most influential input, producing a model that maintains the mean of the output of initial ABM but with less spread. These simplifications can be used to 1) efficiently explore model outcomes, including outliers that may be important considerations in the design of robust policies, and 2) conduct explanatory analysis that exposes the smallest number of inputs influencing the steady state of the modeled system.

  2. Metabolonote: A wiki-based database for managing hierarchical metadata of metabolome analyses

    Directory of Open Access Journals (Sweden)

    Takeshi eAra

    2015-04-01

    Metabolomics—technology for comprehensive detection of small molecules in an organism—lags behind the other omics in terms of publication and dissemination of experimental data. Among the reasons for this are the difficulty of precisely recording information about complicated analytical experiments (metadata), the existence of various databases with their own metadata descriptions, and the low reusability of the published data, resulting in submitters (the researchers who generate the data) being insufficiently motivated. To tackle these issues, we developed Metabolonote, a Semantic MediaWiki-based database designed specifically for managing metabolomic metadata. We also defined a metadata and data description format, called TogoMD, with an ID system that is required for unique access to each level of the tree-structured metadata such as study purpose, sample, analytical method, and data analysis. Separation of the management of metadata from that of data, and permission to attach related information to the metadata, provide advantages for submitters, readers, and database developers. The metadata are enriched with information such as links to comparable data, thereby functioning as a hub of related data resources. They also enhance not only readers' understanding and use of data, but also submitters' motivation to publish the data. The metadata are computationally shared among other systems via APIs, which facilitates the construction of novel databases by database developers. A permission system that allows publication of immature metadata and feedback from readers also helps submitters to improve their metadata. Hence, this aspect of Metabolonote, as a metadata preparation tool, is complementary to high-quality and persistent data repositories such as MetaboLights. A total of 808 metadata records for analyzed data obtained from 35 biological species are currently published. Metabolonote and related tools are available free of cost at http://metabolonote.kazusa.or.jp/.
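
    The tree-structured, ID-addressable metadata described here can be pictured with a small sketch. The nesting (study > sample > analytical method > data analysis) follows the description above, but the ID format and field names are hypothetical, not the actual TogoMD specification.

```python
# Minimal sketch of tree-structured metadata with per-level IDs. IDs and field names
# are hypothetical placeholders, not TogoMD's real scheme.
metadata = {
    "id": "SE1",                      # study-level record (hypothetical ID)
    "title": "Example metabolome study",
    "samples": [
        {
            "id": "SE1_S01",          # sample level
            "organism": "Arabidopsis thaliana",
            "methods": [
                {
                    "id": "SE1_S01_M01",   # analytical-method level
                    "instrument": "LC-MS",
                    "analyses": [
                        {"id": "SE1_S01_M01_D01",      # data-analysis level
                         "software": "in-house pipeline",
                         "data_url": "https://example.org/dataset"},
                    ],
                },
            ],
        },
    ],
}

def resolve(node, target_id):
    """Walk the tree and return the record whose 'id' matches target_id."""
    if node.get("id") == target_id:
        return node
    for value in node.values():
        if isinstance(value, list):
            for child in value:
                found = resolve(child, target_id)
                if found:
                    return found
    return None

print(resolve(metadata, "SE1_S01_M01")["instrument"])   # -> LC-MS
```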

  3. Population genomic analyses based on 1 million SNPs in commercial egg layers.

    Directory of Open Access Journals (Sweden)

    Mahmood Gholami

    Identifying signatures of selection can provide valuable insight into the genes or genomic regions that are or have been under selective pressure, which can lead to a better understanding of genotype-phenotype relationships. A common strategy for selection signature detection is to compare samples from several populations and search for genomic regions with outstanding genetic differentiation. Wright's fixation index, FST, is a useful index for evaluation of genetic differentiation between populations. The aim of this study was to detect selective signatures between different chicken groups based on SNP-wise FST calculation. A total of 96 individuals of three commercial layer breeds and 14 non-commercial fancy breeds were genotyped with three different 600K SNP-chips. After filtering, a total of 1 million SNPs were available for FST calculation. Averages of FST values were calculated for overlapping windows. Comparisons were then conducted between commercial egg layers and non-commercial fancy breeds, as well as between white egg layers and brown egg layers. Comparing non-commercial and commercial breeds resulted in the detection of 630 selective signatures, while 656 selective signatures were detected in the comparison between the commercial egg-layer breeds. Annotation of selection signature regions revealed various genes corresponding to production traits, for which layer breeds were selected. Among them were NCOA1, SREBF2 and RALGAPA1, associated with reproductive traits, broodiness and egg production. Furthermore, several of the detected genes were associated with growth and carcass traits, including POMC, PRKAB2, SPP1, IGF2, CAPN1, TGFb2 and IGFBP2. Our approach demonstrates that including different populations with a specific breeding history can provide a unique opportunity for a better understanding of farm animal selection.
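
    The core computation (a per-SNP FST estimate averaged over overlapping windows, with extreme windows flagged as candidate selection signatures) can be sketched briefly. Hudson's estimator is used here as one common choice; the allele frequencies, sample sizes, and window settings are invented and are not the study's parameters.

```python
# Sketch of SNP-wise FST (Hudson's estimator) averaged over overlapping windows.
import numpy as np

rng = np.random.default_rng(2)
n_snps = 1000
p1 = rng.uniform(0.05, 0.95, n_snps)                       # pop 1 allele frequencies
p2 = np.clip(p1 + rng.normal(0, 0.08, n_snps), 0.01, 0.99) # pop 2, correlated with pop 1
n1, n2 = 96, 96                                            # sampled chromosomes per population

def hudson_fst(p1, p2, n1, n2):
    num = (p1 - p2) ** 2 - p1 * (1 - p1) / (n1 - 1) - p2 * (1 - p2) / (n2 - 1)
    den = p1 * (1 - p2) + p2 * (1 - p1)
    return num / den

fst = hudson_fst(p1, p2, n1, n2)

window, step = 40, 20                                      # SNPs per window, 50% overlap
window_means = [fst[s:s + window].mean()
                for s in range(0, n_snps - window + 1, step)]

cutoff = np.quantile(window_means, 0.99)                   # crude outlier threshold
candidates = [i for i, m in enumerate(window_means) if m >= cutoff]
print("windows flagged as putative selection signatures:", candidates)
```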

  4. Performance Analyses of Counter-Flow Closed Wet Cooling Towers Based on a Simplified Calculation Method

    Directory of Open Access Journals (Sweden)

    Xiaoqing Wei

    2017-02-01

    As one of the most widely used units in water cooling systems, closed wet cooling towers (CWCTs) have two typical counter-flow constructions, in which the spray water flows from the top to the bottom, and the moist air and cooling water flow in the opposite direction vertically (parallel) or horizontally (cross), respectively. This study aims to present a simplified calculation method for conveniently and accurately analyzing the thermal performance of the two types of counter-flow CWCTs, viz. the parallel counter-flow CWCT (PCFCWCT) and the cross counter-flow CWCT (CCFCWCT). A simplified cooling capacity model that includes just two characteristic parameters is developed. The Levenberg–Marquardt method is employed to determine the model parameters by curve fitting of experimental data. Based on the proposed model, the predicted outlet temperatures of the process water are compared with the measurements of a PCFCWCT and a CCFCWCT, respectively, reported in the literature. The results indicate that the predicted values agree well with the experimental data in previous studies. The maximum absolute errors in predicting the process water outlet temperatures are 0.20 and 0.24 °C for the PCFCWCT and CCFCWCT, respectively. These results indicate that the simplified method is reliable for performance prediction of counter-flow CWCTs. Although the flow patterns of the two towers are different, the variation trends of thermal performance are similar to each other under various operating conditions. The inlet air wet-bulb temperature, inlet cooling water temperature, air flow rate, and cooling water flow rate are crucial for determining the cooling capacity of a counter-flow CWCT, while the cooling tower effectiveness is mainly determined by the flow rates of air and cooling water. Compared with the CCFCWCT, the PCFCWCT is much more applicable in a large-scale cooling water system, and the superiority would be amplified when the scale of water
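
    The fitting step (two characteristic parameters determined by Levenberg–Marquardt curve fitting against measurements) can be illustrated with scipy. The functional form, data, and parameter values below are assumptions for illustration; the paper's actual simplified model is not reproduced.

```python
# Sketch of fitting a hypothetical two-parameter cooling-capacity correlation with the
# Levenberg-Marquardt algorithm (scipy.optimize.curve_fit, method='lm').
import numpy as np
from scipy.optimize import curve_fit

def cooling_capacity(X, c, n):
    """Assumed two-parameter form: Q = c * m_air**n * (T_w_in - T_wb_in)."""
    m_air, dT = X
    return c * m_air ** n * dT

# (air mass flow kg/s, inlet water minus inlet wet-bulb temperature K) -> capacity kW
m_air = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5])
dT    = np.array([8.0, 8.5, 9.0, 8.0, 7.5, 9.5])
Q_meas = 2.1 * m_air ** 0.78 * dT + np.random.default_rng(3).normal(0, 0.3, m_air.size)

(c_fit, n_fit), _ = curve_fit(cooling_capacity, (m_air, dT), Q_meas,
                              p0=(1.0, 0.5), method="lm")
print(f"fitted parameters: c = {c_fit:.2f}, n = {n_fit:.2f}")
print("predicted capacities:", np.round(cooling_capacity((m_air, dT), c_fit, n_fit), 2))
```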

  5. Analyses of the soil surface dynamic of South African Kalahari salt pans based on hyperspectral and multitemporal data

    Science.gov (United States)

    Milewski, Robert; Chabrillat, Sabine; Behling, Robert; Mielke, Christian; Schleicher, Anja Maria; Guanter, Luis

    2016-04-01

    The consequences of climate change represent a major threat to sustainable development and growth in Southern Africa. Understanding the impact on the geo- and biosphere is therefore of great importance in this particular region. In this context the Kalahari salt pans (also known as playas or sabkhas) and their peripheral saline and alkaline habitats are an ecosystem of major interest. They are very sensitive to environmental conditions, and thus hydrological, mineralogical and ecological responses to climatic variations can be analysed. Up to now, the soil composition of salt pans in this area has only been assessed mono-temporally and on a coarse regional scale. Furthermore, the dynamics of the salt pans, especially the formation of evaporites, are still uncertain and poorly understood. High spectral resolution remote sensing can estimate the evaporite content and mineralogy of soils based on analyses of the surface reflectance properties within the Visible-Near InfraRed (VNIR, 400-1000 nm) and Short-Wave InfraRed (SWIR, 1000-2500 nm) regions. In these wavelength regions major chemical components of the soil interact with the electromagnetic radiation and produce characteristic absorption features that can be used to derive the properties of interest. Although such techniques are well established at the laboratory and field scale, the potential of current (Hyperion) and upcoming spaceborne sensors such as EnMAP for quantitative mineralogical and salt spectral mapping is still to be demonstrated. Combined with hyperspectral methods, multitemporal remote sensing techniques allow us to derive the recent dynamics of these salt pans and link the mineralogical analysis of the pan surface to major physical processes in these dryland environments. In this study we focus on the analyses of the Namibian Omongwa salt pans based on satellite hyperspectral imagery and multispectral time-series data. First, a change detection analysis is applied using the Iterative

  6. Analyses of integrated aircraft cabin contaminant monitoring network based on Kalman consensus filter.

    Science.gov (United States)

    Wang, Rui; Li, Yanxiao; Sun, Hui; Chen, Zengqiang

    2017-11-01

    Modern civil aircraft use air-ventilation pressurized cabins subject to the limited space. In order to monitor multiple contaminants and overcome the hypersensitivity of a single sensor, the paper constructs an output-correction integrated sensor configuration using sensors with different measurement principles, after comparing it to two other configurations. This proposed configuration works as a node in the distributed wireless sensor network for contaminant monitoring. The corresponding measurement error models of the integrated sensors are also proposed, using the Kalman consensus filter to estimate states and conduct data fusion in order to regulate the single-sensor measurement results. The paper develops a sufficient proof of the stability of the Kalman consensus filter in the presence of system and observation noise, and compares the mean estimation and mean consensus errors of the Kalman consensus filter with those of a local Kalman filter. Numerical examples show the effectiveness of the algorithm. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
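
    A Kalman consensus filter combines each node's local Kalman update with a term that pulls its estimate toward its neighbours' estimates. The scalar sketch below illustrates that idea for a small chain of contaminant sensors; it is a simplified generic form (in the spirit of Olfati-Saber's KCF), not the paper's filter or error models, and all numbers are invented.

```python
# Simplified scalar sketch of a Kalman-consensus style estimator for a small sensor network.
import numpy as np

rng = np.random.default_rng(4)

true_concentration = 5.0          # constant contaminant level (arbitrary units)
n_nodes, n_steps = 4, 50
neighbours = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}   # chain topology

R = np.array([0.5, 1.0, 2.0, 0.8])      # measurement noise variances (node-specific sensors)
x = np.zeros(n_nodes)                    # state estimates
P = np.ones(n_nodes) * 10.0              # estimate variances
gamma = 0.1                              # consensus gain (small, illustrative)

for _ in range(n_steps):
    z = true_concentration + rng.normal(0, np.sqrt(R))   # one noisy reading per node
    x_prior = x.copy()
    for i in range(n_nodes):
        K = P[i] / (P[i] + R[i])                         # local Kalman gain
        consensus = sum(x_prior[j] - x_prior[i] for j in neighbours[i])
        x[i] = x_prior[i] + K * (z[i] - x_prior[i]) + gamma * consensus
        P[i] = (1 - K) * P[i]

print("final node estimates:", np.round(x, 3))
print("spread across nodes :", round(float(x.max() - x.min()), 4))
```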

  7. Toxicity testing and chemical analyses of recycled fibre-based paper for food contact

    DEFF Research Database (Denmark)

    Binderup, Mona-Lise; Pedersen, Gitte Alsing; Vinggaard, Anne

    2002-01-01

    of different qualities as food-contact materials and to perform a preliminary evaluation of their suitability from a safety point of view, and, second, to evaluate the use of different in vitro toxicity tests for screening of paper and board. Paper produced from three different categories of recycled fibres (B... of the paper products were extracted with either 99% ethanol or water. Potential migrants in the extracts were identified and semiquantified by GC-1R-MS or GC-HRMS. In parallel to the chemical analyses, a battery of four different in vitro toxicity tests with different endpoints were applied to the same... was less cytotoxic than the extracts prepared from paper made from recycled fibres, and extracts prepared from C were the most cytotoxic. None of the extracts showed mutagenic activity. No conclusion about the oestrogenic activity could be made, because all extracts were cytotoxic to the test organism (yeast...

  8. A case study of GWE satellite data impact on GLA assimilation analyses of two ocean cyclones

    Science.gov (United States)

    Gallimore, R. G.; Johnson, D. R.

    1986-01-01

    The effects of the Global Weather Experiment (GWE) data obtained on January 18-20, 1979 on Goddard Laboratory for Atmospheres assimilation analyses of simultaneous cyclones in the western Pacific and Atlantic oceans are examined. The ability of satellite data within assimilation models to determine the baroclinic structures of developing extratropical cyclones is evaluated. The impact of the satellite data on the amplitude and phase of the temperature structure within the storm domain, potential energy, and baroclinic growth rate is studied. The GWE data are compared with Data Systems Test results. It is noted that it is necessary to characterize satellite effects on the baroclinic structure of cyclone waves which degrade numerical weather predictions of cyclogenesis.

  9. Analysing Amazonian forest productivity using a new individual and trait-based model (TFS v.1)

    Science.gov (United States)

    Fyllas, N. M.; Gloor, E.; Mercado, L. M.; Sitch, S.; Quesada, C. A.; Domingues, T. F.; Galbraith, D. R.; Torre-Lezama, A.; Vilanova, E.; Ramírez-Angulo, H.; Higuchi, N.; Neill, D. A.; Silveira, M.; Ferreira, L.; Aymard C., G. A.; Malhi, Y.; Phillips, O. L.; Lloyd, J.

    2014-07-01

    Repeated long-term censuses have revealed large-scale spatial patterns in Amazon basin forest structure and dynamism, with some forests in the west of the basin having rates of aboveground biomass production and tree recruitment up to twice as high as forests in the east. Possible causes for this variation could be the climatic and edaphic gradients across the basin and/or the spatial distribution of tree species composition. To help understand the causes of this variation, a new individual-based model of tropical forest growth, designed to take full advantage of the forest census data available from the Amazonian Forest Inventory Network (RAINFOR), has been developed. The model allows for within-stand variations in tree size distribution and key functional traits and between-stand differences in climate and soil physical and chemical properties. It runs at the stand level with four functional traits - leaf dry mass per area (Ma), leaf nitrogen (NL) and phosphorus (PL) content and wood density (DW) varying from tree to tree - in a way that replicates the observed continua found within each stand. We first applied the model to validate canopy-level water fluxes at three eddy covariance flux measurement sites. For all three sites the canopy-level water fluxes were adequately simulated. We then applied the model at seven plots, where intensive measurements of carbon allocation are available. Tree-by-tree multi-annual growth rates generally agreed well with observations for small trees, but with deviations identified for larger trees. At the stand level, simulations at 40 plots were used to explore the influence of climate and soil nutrient availability on the gross (ΠG) and net (ΠN) primary production rates as well as the carbon use efficiency (CU). Simulated ΠG, ΠN and CU were not associated with temperature. On the other hand, all three measures of stand level productivity were positively related to both mean annual precipitation and soil nutrient status

  10. Optimisation of recovery protocols for double-base smokeless powder residues analysed by total vaporisation (TV) SPME/GC-MS.

    Science.gov (United States)

    Sauzier, Georgina; Bors, Dana; Ash, Jordan; Goodpaster, John V; Lewis, Simon W

    2016-09-01

    The investigation of explosive events requires appropriate evidential protocols to recover and preserve residues from the scene. In this study, a central composite design was used to determine statistically validated optimum recovery parameters for double-base smokeless powder residues on steel, analysed using total vaporisation (TV) SPME/GC-MS. It was found that maximum recovery was obtained using isopropanol-wetted swabs stored under refrigerated conditions, then extracted for 15min into acetone on the same day as sample collection. These parameters were applied to the recovery of post-blast residues deposited on steel witness surfaces following a PVC pipe bomb detonation, resulting in detection of all target components across the majority of samples. Higher overall recoveries were obtained from plates facing the sides of the device, consistent with the point of first failure occurring in the pipe body as observed in previous studies. The methodology employed here may be readily applied to a variety of other explosive compounds, and thus assist in establishing 'best practice' procedures for explosive investigations. Copyright © 2016 Elsevier B.V. All rights reserved.
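
    The statistically validated optimisation described here rests on a central composite design and a fitted response surface. The sketch below builds a rotatable three-factor CCD and fits a quadratic model; the factor identities, coded levels, and response values are hypothetical, not the study's experimental data.

```python
# Sketch of a rotatable central composite design for three coded factors and a quadratic
# response-surface fit. All values are hypothetical.
import itertools
import numpy as np

k = 3
alpha = (2 ** k) ** 0.25                    # ~1.682 for rotatability with 3 factors

factorial = np.array(list(itertools.product([-1, 1], repeat=k)), dtype=float)
axial = np.vstack([v * alpha * np.eye(k)[i] for i in range(k) for v in (-1, 1)])
center = np.zeros((4, k))                   # replicated centre points
design = np.vstack([factorial, axial, center])
print("CCD runs:", design.shape[0])         # 8 factorial + 6 axial + 4 centre = 18

# Hypothetical recoveries measured at each run (random here, for illustration only).
y = np.random.default_rng(5).normal(80, 5, design.shape[0])

# Full quadratic model matrix: intercept, linear, squared, and two-factor interaction terms.
X = np.column_stack([np.ones(len(design)), design, design ** 2,
                     design[:, 0] * design[:, 1],
                     design[:, 0] * design[:, 2],
                     design[:, 1] * design[:, 2]])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted quadratic coefficients:", np.round(coef, 2))
```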

  11. Voxel-based analyses of gray/white matter volume and diffusion tensor data in major depression. Presidential award proceedings

    International Nuclear Information System (INIS)

    Abe, Osamu; Yamasue, Hidenori; Kasai, Kiyoto

    2008-01-01

    Previous neuroimaging studies have revealed that frontolimbic dysfunction may contribute to the pathophysiology of major depressive disorder. We used voxel-based analysis to simultaneously elucidate regional changes in gray/white matter volume, mean diffusivity (MD), and fractional anisotropy (FA) in the central nervous system of patients with unipolar major depression. We studied 21 right-handed patients and 42 age- and gender-matched right-handed normal subjects without central nervous system disorders. All image processing and statistical analyses were performed using SPM5 software. Local areas showing significant gray matter volume reduction in depressive patients compared with normal controls were observed in the right parahippocampal gyrus, hippocampus, bilateral middle frontal gyri, bilateral anterior cingulate cortices, left parietal and occipital lobes, and right superior temporal gyrus. Local areas showing increased mean diffusivity in depressive patients were observed in the bilateral parahippocampal gyri, hippocampus, pons, cerebellum, left frontal and temporal lobes, and right frontal lobe. There was no significant difference between the 2 groups for fractional anisotropy and white matter volume in the entire brain. Although there was no local area in which FA and MD were significantly correlated with disease severity, FA tended to correlate negatively with depression days (total accumulated days in depressive state) in the right anterior cingulate and the left frontal white matter (FDR-corrected P=0.055 for both areas). These results suggest that the frontolimbic neural circuit may play an important role in the neuropathology of patients with major depression. (author)

  12. The Neural Bases of Difficult Speech Comprehension and Speech Production: Two Activation Likelihood Estimation (ALE) Meta-Analyses

    Science.gov (United States)

    Adank, Patti

    2012-01-01

    The role of speech production mechanisms in difficult speech comprehension is the subject of on-going debate in speech science. Two Activation Likelihood Estimation (ALE) analyses were conducted on neuroimaging studies investigating difficult speech comprehension or speech production. Meta-analysis 1 included 10 studies contrasting comprehension…

  13. Phylogenetic tree based on complete genomes using fractal and correlation analyses without sequence alignment

    Directory of Open Access Journals (Sweden)

    Zu-Guo Yu

    2006-06-01

    The complete genomes of living organisms have provided much information on their phylogenetic relationships. Similarly, the complete genomes of chloroplasts have helped resolve the evolution of this organelle in photosynthetic eukaryotes. In this review, we describe two algorithms to construct phylogenetic trees based on the theories of fractals and dynamic language using complete genomes. These algorithms were developed by our research group in the past few years. Our distance-based phylogenetic tree of 109 prokaryotes and eukaryotes agrees with the biologists' "tree of life" based on the 16S-like rRNA genes in a majority of basic branchings and most lower taxa. Our phylogenetic analysis also shows that the chloroplast genomes are separated into two major clades corresponding to chlorophytes s.l. and rhodophytes s.l. The interrelationships among the chloroplasts are largely in agreement with the current understanding on chloroplast evolution.

  14. Numerical Analyses of Subsoil-structure Interaction in Original Non-commercial Software based on FEM

    Science.gov (United States)

    Cajka, R.; Vaskova, J.; Vasek, J.

    2018-04-01

    For decades, attention has been paid to the interaction of foundation structures and subsoil and to the development of interaction models. Given that analytical solutions of subsoil-structure interaction can be deduced only for some simple shapes of load, analytical solutions are increasingly being replaced by numerical solutions (e.g. FEM, the finite element method). Numerical analysis provides greater possibilities for taking into account the real factors involved in the subsoil-structure interaction and was also used in this article. This makes it possible to design foundation structures more efficiently and still reliably and securely. Currently there are several software packages that can deal with the interaction of foundations and subsoil. It has been demonstrated that the non-commercial software MKPINTER (created by Cajka) provides results appropriately close to actual measured values. In the MKPINTER software, stress-strain analysis of the elastic half-space is performed by means of Gauss numerical integration and the Jacobian of transformation. Input data for the numerical analysis were obtained from an experimental loading test of a concrete slab. The loading was performed using unique experimental equipment constructed at the Faculty of Civil Engineering, VŠB-TU Ostrava. The purpose of this paper is to compare the resulting deformation of the slab with values observed during the experimental loading test.
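
    As a small, generic illustration of evaluating an elastic half-space solution by Gauss numerical integration (not the MKPINTER code itself), the sketch below integrates the Boussinesq point-load kernel over a uniformly loaded rectangle to get the vertical stress at depth under its centre. Geometry, load, and the number of Gauss points are arbitrary choices.

```python
# Gauss-Legendre evaluation of the Boussinesq solution: vertical stress at depth z under the
# centre of a uniformly loaded rectangle on an elastic half-space.
import numpy as np

q = 100.0            # uniform contact pressure, kPa
B, L = 2.0, 3.0      # slab plan dimensions, m
z = 1.5              # depth of interest below the centre, m
n_gauss = 8          # Gauss points per direction

# Map Gauss-Legendre nodes from [-1, 1] to the loaded area centred at the origin.
xi, wi = np.polynomial.legendre.leggauss(n_gauss)
x = 0.5 * B * xi
y = 0.5 * L * xi
wx = 0.5 * B * wi
wy = 0.5 * L * wi

# Boussinesq point-load kernel: d_sigma_z = 3 q z^3 / (2 pi R^5) dA, with R^2 = x^2 + y^2 + z^2.
sigma_z = 0.0
for a, wa in zip(x, wx):
    for b, wb in zip(y, wy):
        R = np.sqrt(a * a + b * b + z * z)
        sigma_z += wa * wb * 3.0 * q * z ** 3 / (2.0 * np.pi * R ** 5)

print(f"vertical stress at z = {z} m under the centre: {sigma_z:.1f} kPa "
      f"(influence factor {sigma_z / q:.3f})")
```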

  15. Revised age of deglaciation of Lake Emma based on new radiocarbon and macrofossil analyses

    Science.gov (United States)

    Elias, S.A.; Carrara, P.E.; Toolin, L.J.; Jull, A.J.T.

    1991-01-01

    Previous radiocarbon ages of detrital moss fragments in basal organic sediments of Lake Emma indicated that extensive deglaciation of the San Juan Mountains occurred prior to 14,900 yr B.P. (Carrara et al., 1984). Paleoecological analyses of insect and plant macrofossils from these basal sediments cast doubt on the reliability of the radiocarbon ages. Subsequent accelerator radiocarbon dates of insect fossils and wood fragments indicate an early Holocene age, rather than a late Pleistocene age, for the basal sediments of Lake Emma. These new radiocarbon ages suggest that by at least 10,000 yr B.P. deglaciation of the San Juan Mountains was complete. The insect and plant macrofossils from the basal organic sediments indicate a higher-than-present treeline during the early Holocene. The insect assemblages consisted of about 30% bark beetles, which contrasts markedly with the composition of insects from modern lake sediments and modern specimens collected in the Lake Emma cirque, in which bark beetles comprise only about 3% of the assemblages. In addition, in the fossil assemblages there were a number of flightless insect species (not subject to upslope transport by wind) indicative of coniferous forest environments. These insects were likewise absent in the modern assemblage. ?? 1991.

  16. Is autoimmunology a discipline of its own? A big data-based bibliometric and scientometric analyses.

    Science.gov (United States)

    Watad, Abdulla; Bragazzi, Nicola Luigi; Adawi, Mohammad; Amital, Howard; Kivity, Shaye; Mahroum, Naim; Blank, Miri; Shoenfeld, Yehuda

    2017-06-01

    Autoimmunology is a super-specialty of immunology specifically dealing with autoimmune disorders. To assess the extant literature concerning autoimmune disorders, bibliometric and scientometric analyses (namely, research topic/keyword co-occurrence, journal co-citation, citations, and scientific output trends - both crude and normalized - as well as author network, leading author, country, and organization analyses) were carried out using open-source software, namely, VOSviewer and SciCurve. A corpus of 169,519 articles containing the keyword "autoimmunity" was utilized, selecting PubMed/MEDLINE as the bibliographic thesaurus. Six journals were specifically devoted to autoimmune disorders and covered approximately 4.15% of the entire scientific production. Compared with the whole corpus (from 1946 on), these specialized journals were established only a few decades ago. Top countries were the United States, Japan, Germany, United Kingdom, Italy, China, France, Canada, Australia, and Israel. Trending topics are represented by the role of microRNAs (miRNAs) in the etiopathogenesis of autoimmune disorders, the contributions of genetics and of epigenetic modifications, the role of vitamins, management during pregnancy and the impact of gender. New subsets of immune cells have been extensively investigated, with a focus on interleukin production and release and on Th17 cells. Autoimmunology is emerging as a new discipline within immunology, with its own bibliometric properties, an identified scientific community and specifically devoted journals.

  17. Shielding analysis method applied to nuclear ship 'MUTSU' and its evaluation based on experimental analyses

    International Nuclear Information System (INIS)

    Yamaji, Akio; Miyakoshi, Jun-ichi; Iwao, Yoshiaki; Tsubosaka, Akira; Saito, Tetsuo; Fujii, Takayoshi; Okumura, Yoshihiro; Suzuoki, Zenro; Kawakita, Takashi.

    1984-01-01

    Procedures of shielding analysis are described which were used for the shielding modification design of the Nuclear Ship ''MUTSU''. The calculations of the radiation distribution on board were made using Sn codes ANISN and TWOTRAN, a point kernel code QAD and a Monte Carlo code MORSE. The accuracies of these calculations were investigated through the analysis of various shielding experiments: the shield tank experiment of the Nuclear Ship ''Otto Hahn'', the shielding mock-up experiment for ''MUTSU'' performed in JRR-4, the shielding benchmark experiment using the 16 N radiation facility of AERE Harwell and the shielding effect experiment of the ship structure performed in the training ship ''Shintoku-Maru''. The values calculated by the ANISN agree with the data measured at ''Otto Hahn'' within a factor of 2 for fast neutrons and within a factor of 3 for epithermal and thermal neutrons. The γ-ray dose rates calculated by the QAD agree with the measured values within 30% for the analysis of the experiment in JRR-4. The design values for ''MUTSU'' were determined in consequence of these experimental analyses. (author)

  18. Applicability study of deuterium excess in bottled water life cycle analyses

    Directory of Open Access Journals (Sweden)

    Mihael Brenčič

    2014-12-01

    This paper explores the possible use of d-excess in the investigation of bottled water. Based on the data set from Brencic and Vreca's paper (2006, Identification of sources and production processes of bottled waters by stable hydrogen and oxygen isotope ratios), d-excess values were statistically analysed and compared among different bottled water groups and different bottlers. The bottled water life cycle in relation to d-excess values was also theoretically identified. Descriptive statistics and one-way ANOVA showed no significant differences among the groups. Differences were detected in the shape of the empirical distributions. The groups of still and flavoured waters have similar shapes, but sparkling waters differed from the others. Two distinctive groups of bottlers could be discerned. The first group is represented by bottlers with a high range of d-excess (from 7.7 ‰ to 18.6 ‰, with an average of 12.0 ‰), exploring waters originating from aquifers rich in highly mineralised groundwater and relatively high concentrations of CO2 gas. The second group is represented by bottlers using groundwater from relatively shallow aquifers. Their d-excess values have characteristics similar to the local precipitation (from 7.8 ‰ to 14.3 ‰, with an average of 10.3 ‰). More frequent sampling and better knowledge of the production phases are needed to improve the usage of the isotope fingerprint for authentication of bottled waters.
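
    For reference, deuterium excess is computed from the two measured isotope ratios as d = δ2H − 8·δ18O (the standard Dansgaard definition). The short sketch below applies that formula to a few invented samples and averages by group, mirroring the comparison described above.

```python
# Deuterium-excess calculation, d = delta2H - 8 * delta18O, applied to hypothetical samples.
def d_excess(delta2h, delta18o):
    return delta2h - 8.0 * delta18o

# (bottled-water group, delta2H per mil, delta18O per mil) -- invented values
samples = [
    ("still",     -62.0, -9.2),
    ("still",     -58.5, -8.8),
    ("sparkling", -55.0, -8.9),
    ("sparkling", -60.0, -9.6),
]

groups = {}
for group, d2h, d18o in samples:
    groups.setdefault(group, []).append(d_excess(d2h, d18o))

for group, values in groups.items():
    mean = sum(values) / len(values)
    formatted = ", ".join(f"{v:.1f}" for v in values)
    print(f"{group:10s} d-excess values [{formatted}]  mean {mean:.1f} permil")
```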

  19. Handbook of methods for risk-based analyses of technical specifications

    International Nuclear Information System (INIS)

    Samanta, P.K.; Kim, I.S.; Mankamo, T.; Vesely, W.E.

    1994-12-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements are based on deterministic analysis and engineering judgments. Experiences with plant operation indicate that some elements of the requirements are unnecessarily restrictive, while a few may not be conducive to safety. The US Nuclear Regulatory Commission (USNRC) Office of Research has sponsored research to develop systematic risk-based methods to improve various aspects of TS requirements. This handbook summarizes these risk-based methods. The scope of the handbook includes reliability and risk-based methods for evaluating allowed outage times (AOTs), scheduled or preventive maintenance, action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), and management of plant configurations resulting from outages of systems, or components. For each topic, the handbook summarizes analytic methods with data needs, outlines the insights to be gained, lists additional references, and gives examples of evaluations

  20. Handbook of methods for risk-based analyses of technical specifications

    Energy Technology Data Exchange (ETDEWEB)

    Samanta, P.K.; Kim, I.S. [Brookhaven National Lab., Upton, NY (United States); Mankamo, T. [Avaplan Oy, Espoo (Finland); Vesely, W.E. [Science Applications International Corp., Dublin, OH (United States)

    1994-12-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements are based on deterministic analysis and engineering judgments. Experiences with plant operation indicate that some elements of the requirements are unnecessarily restrictive, while a few may not be conducive to safety. The US Nuclear Regulatory Commission (USNRC) Office of Research has sponsored research to develop systematic risk-based methods to improve various aspects of TS requirements. This handbook summarizes these risk-based methods. The scope of the handbook includes reliability and risk-based methods for evaluating allowed outage times (AOTs), scheduled or preventive maintenance, action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), and management of plant configurations resulting from outages of systems, or components. For each topic, the handbook summarizes analytic methods with data needs, outlines the insights to be gained, lists additional references, and gives examples of evaluations.

  1. Group analyses of connectivity-based cortical parcellation using repeated k-means clustering

    NARCIS (Netherlands)

    Nanetti, Luca; Cerliani, Leonardo; Gazzola, Valeria; Renken, Remco; Keysers, Christian

    2009-01-01

    K-means clustering has become a popular tool for connectivity-based cortical segmentation using Diffusion Weighted Imaging (DWI) data. A sometimes ignored issue is, however, that the output of the algorithm depends on the initial placement of starting points, and that different sets of starting
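
    The issue raised here (dependence of k-means output on the initial placement of starting points) is commonly handled by repeating the clustering with different seeds and checking the agreement between runs. The sketch below does exactly that on synthetic "connectivity profiles"; the data, cluster count, and number of repetitions are illustrative.

```python
# Repeated k-means with different random starting points, with run-to-run agreement
# measured by the adjusted Rand index. Synthetic data stand in for DWI connectivity profiles.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(6)
profiles = np.vstack([rng.normal(loc, 0.7, size=(100, 20))        # 3 loose clusters of voxels
                      for loc in (-2.0, 0.0, 2.0)])

labels_per_run = []
for seed in range(10):                                             # repeated runs, new seeds
    km = KMeans(n_clusters=3, init="random", n_init=1, random_state=seed)
    labels_per_run.append(km.fit_predict(profiles))

scores = [adjusted_rand_score(labels_per_run[i], labels_per_run[j])
          for i in range(len(labels_per_run)) for j in range(i + 1, len(labels_per_run))]
print(f"mean pairwise adjusted Rand index over runs: {np.mean(scores):.3f}")
```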

  2. A case study of discordant overlapping meta-analyses: vitamin d supplements and fracture.

    Directory of Open Access Journals (Sweden)

    Mark J Bolland

    BACKGROUND: Overlapping meta-analyses on the same topic are now very common, and discordant results often occur. To explore why discordant results arise, we examined a common topic for overlapping meta-analyses: vitamin D supplements and fracture. METHODS AND FINDINGS: We identified 24 meta-analyses of vitamin D (with or without calcium) and fracture in a PubMed search in October 2013, and analysed a sample of 7 meta-analyses in the highest ranking general medicine journals. We used the AMSTAR tool to assess the quality of the meta-analyses, and compared their methodologies, analytic techniques and results. Applying the AMSTAR tool suggested the meta-analyses were generally of high quality. Despite this, there were important differences in trial selection, data extraction, and analytical methods that were only apparent after detailed assessment. 25 trials were included in at least one meta-analysis. Four meta-analyses included all eligible trials according to the stated inclusion and exclusion criteria, but the other 3 meta-analyses "missed" between 3 and 8 trials, and 2 meta-analyses included apparently ineligible trials. The relative risks used for individual trials differed between meta-analyses for total fracture in 10 of 15 trials, and for hip fracture in 6 of 12 trials, because of different outcome definitions and analytic approaches. The majority of differences (11/16) led to more favourable estimates of vitamin D efficacy compared to estimates derived from unadjusted intention-to-treat analyses using all randomised participants. The conclusions of the meta-analyses were discordant, ranging from strong statements that vitamin D prevents fractures to equally strong statements that vitamin D without calcium does not prevent fractures. CONCLUSIONS: Substantial differences in trial selection, outcome definition and analytic methods between overlapping meta-analyses led to discordant estimates of the efficacy of vitamin D for fracture prevention.

  3. Epidemiology, quality and reporting characteristics of meta-analyses of observational studies published in Chinese journals.

    Science.gov (United States)

    Zhang, Zhe-wen; Cheng, Juan; Liu, Zhuan; Ma, Ji-chun; Li, Jin-long; Wang, Jing; Yang, Ke-hu

    2015-12-07

    The aim of this study was to examine the epidemiological and reporting characteristics as well as the methodological quality of meta-analyses (MAs) of observational studies published in Chinese journals. 5 Chinese databases were searched for MAs of observational studies published from January 1978 to May 2014. Data were extracted into Excel spreadsheets, and the Meta-analysis of Observational Studies in Epidemiology (MOOSE) and Assessment of Multiple Systematic Reviews (AMSTAR) checklists were used to assess reporting characteristics and methodological quality, respectively. A total of 607 MAs were included. Only 52.2% of the MAs assessed the quality of the included primary studies, and the retrieval information was not comprehensive in more than half (85.8%) of the MAs. In addition, 50 (8.2%) MAs did not search any Chinese databases, while 126 (20.8%) studies did not search any English databases. Approximately 41.2% of the MAs did not describe the statistical methods in sufficient detail, and most (95.5%) MAs did not report on conflicts of interest. However, compared with before publication of the MOOSE Checklist, the quality of reporting improved significantly for 20 subitems, and 7 items of the included MAs demonstrated significant improvement after publication of the AMSTAR Checklist. Although many MAs of observational studies have been published in Chinese journals, the reporting quality is questionable. Thus, there is an urgent need to increase the use of reporting guidelines and methodological tools in China; we recommend that Chinese journals adopt the MOOSE and AMSTAR criteria. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  4. Treatment algorithm based on the multivariate survival analyses in patients with advanced hepatocellular carcinoma treated with trans-arterial chemoembolization.

    Directory of Open Access Journals (Sweden)

    Hasmukh J Prajapati

    To develop a treatment algorithm from multivariate survival analyses (MVA) in patients with Barcelona Clinic Liver Cancer (BCLC) stage C (advanced) hepatocellular carcinoma (HCC) treated with trans-arterial chemoembolization (TACE). Consecutive unresectable and non-transplantable patients with advanced HCC, who received DEB TACE, were studied. A total of 238 patients (mean age, 62.4 yrs) was included in the study. Survivals were analyzed according to different parameters from the time of the 1st DEB TACE. Kaplan-Meier and Cox proportional hazards models were used for survival analysis. The SS was constructed from the MVA and named the BCLC C HCC Prognostic (BCHP) staging system (SS). Overall median survival (OS) was 16.2 months. HCC patients with venous thrombosis (VT) of a large vein [main portal vein (PV), right or left PV, hepatic vein, inferior vena cava] (22.7%) versus a small vein (segmental/subsegmental PV) (9.7%) versus no VT had OSs of 6.4 months versus 20 months versus 22.8 months, respectively (p<0.001). On MVA, the significant independent prognostic factors (PFs) of survival were CP class, Eastern Cooperative Oncology Group (ECOG) performance status (PS), single HCC<5 cm, site of VT, metastases, serum creatinine and serum alpha-fetoprotein. Based on these PFs, the BCHP staging system was constructed. The OSs of stages I, II and III were 28.4 months, 11.8 months and 2.4 months, respectively (p<0.001). The treatment plan was proposed according to the different stages. On MVA of patients with advanced HCC treated with TACE, the significant independent prognostic factors (PFs) of survival were CP class, ECOG PS, single HCC<5 cm or others, site of VT, metastases, serum creatinine and serum alpha-fetoprotein. A new BCHP SS was proposed based on the MVA data to identify suitable advanced HCC patients for TACE treatment.
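
    A minimal sketch of this kind of multivariate survival analysis (Cox proportional hazards over candidate prognostic factors) is shown below using the lifelines package. The column names and the ten toy records are hypothetical stand-ins, not the study's data; a small ridge penalty is added only to stabilise the tiny example.

```python
# Cox proportional hazards fit over a few candidate prognostic factors (toy data).
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "os_months":  [6.4, 20.0, 22.8, 9.0, 16.2, 30.1, 4.0, 12.5, 18.0, 7.5],
    "death":      [1,   1,    0,    1,   0,    0,    1,   1,    1,    0],
    "ecog_ps":    [2,   1,    0,    2,   1,    0,    2,   0,    1,    2],
    "large_vt":   [1,   0,    0,    0,   0,    0,    1,   1,    0,    1],   # large-vein thrombosis
    "metastases": [1,   0,    0,    1,   1,    0,    0,   1,    1,    0],
})

cph = CoxPHFitter(penalizer=0.1)        # ridge penalty to stabilise the tiny illustrative sample
cph.fit(df, duration_col="os_months", event_col="death")
cph.print_summary()                     # hazard ratios for each candidate prognostic factor
```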

  5. Data base management study

    Science.gov (United States)

    1976-01-01

    Data base management techniques and applicable equipment are described. Recommendations which will assist potential NASA data users in selecting and using appropriate data base management tools and techniques are presented. Classes of currently available data processing equipment ranging from basic terminals to large minicomputer systems were surveyed as they apply to the needs of potential SEASAT data users. Cost and capabilities projections for this equipment through 1985 were presented. A test of a typical data base management system was described, as well as the results of this test and recommendations to assist potential users in determining when such a system is appropriate for their needs. The representative system tested was UNIVAC's DMS 1100.

  6. Geology of Southern Guinevere Planitia, Venus, based on analyses of Goldstone radar data

    International Nuclear Information System (INIS)

    Arvidson, R.E.; Plaut, J.J.; Jurgens, R.F.; Saunders, R.S.; Slade, M.A.

    1989-01-01

    The ensemble of 41 backscatter images of Venus acquired by the S Band (12.6 cm) Goldstone radar system covers approximately 35 million km2 and includes the equatorial portion of Guinevere Planitia, Navka Planitia, Heng-O Chasma, and Tinatin Planitia, and parts of Devana Chasma and Phoebe Regio. The images and associated altimetry data combine relatively high spatial resolution (1 to 10 km) with small incidence angles (less than 10 deg) for regions not covered by either Venera Orbiter or Arecibo radar data. Systematic analyses of the Goldstone data show that: (1) Volcanic plains dominate, including groups of small volcanic constructs, radar bright flows on a NW-SE arm of Phoebe Regio and on Ushas Mons and circular volcano-tectonic depressions; (2) Some of the regions imaged by Goldstone have high radar cross sections, including the flows on Ushas Mons and the NW-SE arm of Phoebe Regio, and several other unnamed hills, ridged terrains, and plains areas; (3) A 1000 km diameter multiringed structure is observed and appears to have a morphology not observed in Venera data (the northern section corresponds to Heng-O Chasma); (4) A 150 km wide, 2 km deep, 1400 km long rift valley with upturned flanks is located on the western flank of Phoebe Regio and extends into Devana Chasma; (5) A number of structures can be discerned in the Goldstone data, mainly trending NW-SE and NE-SW, directions similar to those discerned in Pioneer-Venus topography throughout the equatorial region; and (6) The abundance of circular and impact features is similar to the plains global average defined from Venera and Arecibo data, implying that the terrain imaged by Goldstone has typical crater retention ages, measured in hundreds of millions of years. The rate of resurfacing is less than or equal to 4 km/Ga.

  7. Intra-specific genetic relationship analyses of Elaeagnus angustifolia based on RP-HPLC biochemical markers

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Elaeagnus angustifolia Linn. has various ecological, medicinal and economic uses. An approach was established using RP-HPLC (reversed-phase high-performance liquid chromatography) to classify and analyse the intra-specific genetic relationships of seventeen populations of E. angustifolia, collected from the Xinjiang areas of China. Chromatograms of alcohol-soluble proteins produced by seventeen populations of E. angustifolia were compared. Each chromatogram of alcohol-soluble proteins came from a single seed of one wild plant only. The results showed that when using a Waters Delta Pak C18, 5 μm particle size reversed phase column (150 mm×3.9 mm), a linear gradient of 25%~60% solvent B with flow rate of 1 ml/min and run time of 67 min, the chromatography yielded optimum separation of E. angustifolia alcohol-soluble proteins. Representative peaks in each population were chosen according to peak area and occurrence in every seed. The converted data on the elution peaks of each population were different and could be used to represent those populations. GSC (genetic similarity coefficients) of 41% to 62% showed a medium degree of genetic diversity among the populations in these eco-areas. Cluster analysis showed that the seventeen populations of E. angustifolia could be divided into six clusters at the GSC=0.535 level and indicated the general and unique biochemical markers of these clusters. We suggest that E. angustifolia distribution in these eco-areas could be classified into six variable species. RP-HPLC was shown to be a rapid, repeatable and reliable method for E. angustifolia classification and identification and for analysis of genetic diversity.
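    A minimal Python sketch of clustering populations from a genetic similarity coefficient (GSC) matrix, as in the cluster analysis described above; the GSC values are randomly generated stand-ins, and UPGMA-style average linkage is an assumption.

```python
# Hedged sketch: convert pairwise similarities to distances, cluster, and cut the
# tree at GSC = 0.535 (the level quoted in the record). GSC values are synthetic.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(1)
n_pop = 17
gsc = rng.uniform(0.41, 0.62, size=(n_pop, n_pop))
gsc = (gsc + gsc.T) / 2          # make symmetric
np.fill_diagonal(gsc, 1.0)

dist = 1.0 - gsc                 # similarity -> distance
tree = linkage(squareform(dist, checks=False), method="average")
clusters = fcluster(tree, t=1.0 - 0.535, criterion="distance")
print("cluster labels:", clusters)
```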

  8. Improving the safety of a body composition analyser based on the PGNAA method

    Energy Technology Data Exchange (ETDEWEB)

    Miri-Hakimabad, Hashem; Izadi-Najafabadi, Reza; Vejdani-Noghreiyan, Alireza; Panjeh, Hamed [FUM Radiation Detection And Measurement Laboratory, Ferdowsi University of Mashhad (Iran, Islamic Republic of)

    2007-12-15

    The {sup 252}Cf and {sup 241}Am-Be radioisotopes are intense neutron emitters that are readily encapsulated in compact, portable and sealed sources. Features such as a high neutron emission flux and a reliable neutron spectrum make these sources suitable for the prompt gamma neutron activation analysis (PGNAA) method. The PGNAA method can be used in medicine for neutron radiography and body chemical composition analysis. However, {sup 252}Cf and {sup 241}Am-Be sources not only generate neutrons but are also intense gamma emitters. Furthermore, the sample in medical applications is a human body, so it may be exposed to bombardment by these gamma-rays. Moreover, accumulation of these high-rate gamma-rays in the detector volume causes simultaneous pulses that pile up and distort the spectra in the region of interest (ROI). In order to remove these disadvantages in a practical way without being concerned about losing the thermal neutron flux, a gamma-ray filter made of Pb must be employed. The paper suggests a relatively safe body chemical composition analyser (BCCA) machine that uses a spherical Pb shield enclosing the neutron source. The gamma-ray shielding effect and the optimum radius of the spherical Pb shield have been investigated using the MCNP-4C code and compared with the unfiltered case, the bare source. Finally, experimental results demonstrate that an optimised gamma-ray shield for the neutron source in a BCCA can effectively reduce the risk of exposure to the {sup 252}Cf and {sup 241}Am-Be sources.
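    A minimal Python sketch (not an MCNP calculation) of the basic trade-off behind the Pb filter: strong exponential gamma attenuation versus a much smaller loss of thermal neutrons; both coefficients below are assumed round numbers for illustration.

```python
# Hedged sketch: transmission through a spherical Pb shell of radius r, using
# assumed attenuation/removal coefficients (illustrative only).
import numpy as np

mu_gamma_pb = 0.7   # assumed gamma linear attenuation coefficient in Pb (1/cm)
sigma_n_pb = 0.03   # assumed effective neutron removal coefficient in Pb (1/cm)

for r in np.linspace(1.0, 8.0, 8):
    gammas_kept = np.exp(-mu_gamma_pb * r)
    neutrons_kept = np.exp(-sigma_n_pb * r)
    print(f"r = {r:4.1f} cm   gammas kept: {gammas_kept:5.3f}   neutrons kept: {neutrons_kept:5.3f}")
```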

  9. Ecogeographical associations between climate and human body composition: analyses based on anthropometry and skinfolds.

    Science.gov (United States)

    Wells, Jonathan C K

    2012-02-01

    In the 19th century, two "ecogeographical rules" were proposed hypothesizing associations of climate with mammalian body size and proportions. Data on human body weight and relative leg length support these rules; however, it is unknown whether such associations are attributable to lean tissue (the heat-producing component) or fat (energy stores). Data on weight, height, and two skinfold thicknesses were obtained from the literature for 137 nonindustrialized populations, providing 145 male and 115 female individual samples. A variety of indices of adiposity and lean mass were analyzed. Preliminary analyses indicated secular increases in skinfolds in men but not women, and associations of age and height with lean mass in both sexes. Decreasing annual temperature was associated with increasing body mass index (BMI), and increasing triceps but not subscapular skinfold. After adjusting for skinfolds, decreasing temperature remained associated with increasing BMI. These results indicate that colder environments favor both greater peripheral energy stores and greater lean mass. Contrasting results for triceps and subscapular skinfolds might be due to adaptive strategies either constraining central adiposity in cold environments to reduce cardiovascular risk, or favoring central adiposity in warmer environments to maintain energetic support of the immune system. Polynesian populations were analyzed separately and contradicted all of the climate trends, indicating support for the hypothesis that they are cold-adapted despite occupying a tropical region. It is unclear whether such associations emerge through natural selection or through trans-generational and life-course plasticity. These findings nevertheless aid understanding of the wide variability in human physique and adiposity.
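    A minimal Python sketch of the kind of regression implied above (BMI on annual temperature, adjusted for skinfolds), run on synthetic data; the variable names and effect sizes are assumptions.

```python
# Hedged sketch: OLS of BMI on temperature adjusting for two skinfolds. Synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 145
df = pd.DataFrame({
    "temperature": rng.uniform(-5, 30, n),   # mean annual temperature (deg C)
    "triceps": rng.uniform(4, 20, n),        # skinfold (mm)
    "subscapular": rng.uniform(5, 22, n),    # skinfold (mm)
})
df["bmi"] = 25 - 0.05 * df["temperature"] + 0.1 * df["triceps"] + rng.normal(0, 1, n)

fit = smf.ols("bmi ~ temperature + triceps + subscapular", data=df).fit()
print(fit.params)
```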

  10. LWR safety studies. Analyses and further assessments relating to the German Risk Assessment Study on Nuclear Power Plants. Vol. 3

    International Nuclear Information System (INIS)

    1983-01-01

    Critical review of the analyses of the German Risk Assessment Study on Nuclear Power Plants (DRS) concerning the reliability of the containment under accident conditions and the conditions of fission product release (transport and distribution in the environment). Main point of interest in this context is an explosion in the steam section and its impact on the containment. Critical comments are given on the models used in the DRS for determining the accident consequences. The analyses made deal with the mathematical models and database for propagation calculations, the methods of dose computation and assessment of health hazards, and the modelling of protective and safety measures. Social impacts of reactor accidents are also considered. (RF) [de]

  11. Gene Set Analyses of Genome-Wide Association Studies on 49 Quantitative Traits Measured in a Single Genetic Epidemiology Dataset

    Directory of Open Access Journals (Sweden)

    Jihye Kim

    2013-09-01

    Full Text Available Gene set analysis is a powerful tool for interpreting a genome-wide association study result and is gaining popularity these days. Comparison of the gene sets obtained for a variety of traits measured from a single genetic epidemiology dataset may give insights into the biological mechanisms underlying these traits. Based on the previously published single nucleotide polymorphism (SNP) genotype data on 8,842 individuals enrolled in the Korea Association Resource project, we performed a series of systematic genome-wide association analyses for 49 quantitative traits of basic epidemiological, anthropometric, or blood chemistry parameters. Each analysis result was subjected to subsequent gene set analyses based on Gene Ontology (GO) terms using gene set analysis software, GSA-SNP, identifying a set of GO terms significantly associated with each trait (pcorr < 0.05). Pairwise comparison of the traits in terms of the semantic similarity in their GO sets revealed surprising cases where phenotypically uncorrelated traits showed high similarity in terms of biological pathways. For example, the pH level was related to 7 other traits that showed low phenotypic correlations with it. A literature survey implies that these traits may be regulated partly by common pathways that involve neuronal or nerve systems.

  12. A bead-based western for high-throughput cellular signal transduction analyses

    Science.gov (United States)

    Treindl, Fridolin; Ruprecht, Benjamin; Beiter, Yvonne; Schultz, Silke; Döttinger, Anette; Staebler, Annette; Joos, Thomas O.; Kling, Simon; Poetz, Oliver; Fehm, Tanja; Neubauer, Hans; Kuster, Bernhard; Templin, Markus F.

    2016-01-01

    Dissecting cellular signalling requires the analysis of large numbers of proteins. The DigiWest approach we describe here transfers the western blot to a bead-based microarray platform. By combining gel-based protein separation with immobilization on microspheres, hundreds of replicas of the initial blot are created, thus enabling the comprehensive analysis of limited material, such as cells collected by laser capture microdissection, and extending traditional western blotting to reach proteomic scales. The combination of molecular weight resolution, sensitivity and signal linearity on an automated platform enables the rapid quantification of hundreds of specific proteins and protein modifications in complex samples. This high-throughput western blot approach allowed us to identify and characterize alterations in cellular signal transduction that occur during the development of resistance to the kinase inhibitor Lapatinib, revealing major changes in the activation state of Ephrin-mediated signalling and a central role for p53-controlled processes. PMID:27659302

  13. Power plant economy of scale and cost trends: further analyses and review of empirical studies

    International Nuclear Information System (INIS)

    Fisher, C.F. Jr.; Paik, S.; Schriver, W.R.

    1986-07-01

    Multiple regression analyses were performed on capital cost data for nuclear and coal-fired power plants in an extension of an earlier study which indicated that nuclear units completed prior to the accident at Three-Mile Island (TMI) have no economy of scale, and that units completed after that event have a weak economy of scale (scaling exponent of about 0.81). The earlier study also indicated that the scaling exponent for coal-fired units is about 0.92, compared with conceptual models which project scaling exponents in a range from about 0.5 to 0.9. Other empirical studies have indicated poor economy of scale, but a large range of cost-size scaling exponents has been reported. In the present study, the results for nuclear units indicate a scaling exponent of about 0.94 (but with no economy of scale for large units), that a first unit costs 17% more than a second unit, that a unit in the South costs 20% less than others, that a unit completed after TMI costs 33% more than one completed before TMI, and that costs are increasing at 9.3% per year. In the present study, the results for coal-fired units indicate a scaling exponent of 0.93 (with better scaling economy in the larger units), that a first unit costs 38.5% more, that a unit in the South costs 10% less, that flue-gas desulfurization units cost 23% more, and that costs are increasing at 4% per year.
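    A minimal Python sketch of how a cost-size scaling exponent and the dummy-variable effects described above could be estimated by multiple regression on log-transformed data; the numbers below are synthetic.

```python
# Hedged sketch: log(cost) = a + b*log(size) + first-unit, region and post-TMI effects.
# The coefficient on np.log(size_mw) estimates the scaling exponent. Synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 80
df = pd.DataFrame({
    "size_mw": rng.uniform(400, 1300, n),
    "first_unit": rng.integers(0, 2, n),
    "south": rng.integers(0, 2, n),
    "post_tmi": rng.integers(0, 2, n),
})
df["cost"] = (2.0 * df["size_mw"] ** 0.9
              * np.exp(0.17 * df["first_unit"] - 0.2 * df["south"] + 0.33 * df["post_tmi"])
              * rng.lognormal(0.0, 0.1, n))

fit = smf.ols("np.log(cost) ~ np.log(size_mw) + first_unit + south + post_tmi", data=df).fit()
print(fit.params)
```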

  14. Critical experiments analyses by using 70 energy group library based on ENDF/B-VI

    Energy Technology Data Exchange (ETDEWEB)

    Tahara, Yoshihisa; Matsumoto, Hideki [Mitsubishi Heavy Industries Ltd., Yokohama (Japan). Nuclear Energy Systems Engineering Center; Huria, H.C.; Ouisloumen, M.

    1998-03-01

    The newly developed 70-group library has been validated by comparing kinf values from the continuous-energy Monte Carlo code MCNP and the two-dimensional spectrum calculation code PHOENIX-CP. The latter code employs a discrete angular flux method based on collision probabilities. The library has also been validated against a large number of critical experiments and numerical benchmarks for assemblies with MOX and Gd fuels. (author)

  15. Application of the CALUX bioassay for epidemiological study. Analyses of Belgian human plasma

    Energy Technology Data Exchange (ETDEWEB)

    Wouwe, N. van; Debacker, N.; Sasse, A. [Scientific Institute of Public Health, Brussels (BE)] (and others)

    2004-09-15

    The CALUX bioassay is a promising screening method for the detection of dioxin-like compounds. Its good sensitivity, low number of false negative results, and good correlations with GC-HRMS TEQ-values for feed and food analyses place this method among the first-line assessment methods. The small amount of sample needed, in addition to these advantages, suggests that the CALUX bioassay could be a good screening method for epidemiological studies. The Belgian epidemiological study concerning the possible effect of the dioxin incident on the body burden of the Belgian population was an opportunity to test this method against the gold-standard reference, GC-HRMS. The first part of this abstract presents epidemiological parameters (sensitivity, specificity, etc.) of the CALUX bioassay using CALUX TEQ-values as estimators of the TEQ-values of the 17 PCDD/Fs. The second part examines epidemiological determinants observed for CALUX and GC-HRMS TEQ-values.
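    A minimal Python sketch of the screening-test parameters mentioned above (sensitivity and specificity of CALUX against GC-HRMS as the reference method); the counts are hypothetical.

```python
# Hedged sketch: sensitivity/specificity from paired CALUX vs. GC-HRMS classifications.
tp, fn, fp, tn = 42, 3, 8, 97   # hypothetical paired results

sensitivity = tp / (tp + fn)    # CALUX-positive among GC-HRMS-positive samples
specificity = tn / (tn + fp)    # CALUX-negative among GC-HRMS-negative samples
print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```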

  16. Family structure and posttraumatic stress reactions: a longitudinal study using multilevel analyses

    Science.gov (United States)

    2011-01-01

    Background There is limited research on the relevance of family structures to the development and maintenance of posttraumatic stress following disasters. We longitudinally studied the effects of marital and parental statuses on posttraumatic stress reactions after the 2004 Southeast Asian tsunami and whether persons in the same households had more shared stress reactions than others. Method The study included a tourist population of 641 Norwegian adult citizens, many of them from families with children. We measured posttraumatic stress symptoms with the Impact of Event Scale-Revised at 6 months and 2 years post-disaster. Analyses included multilevel methods with mixed effects models. Results Results showed that neither marital nor parental status was significantly related to posttraumatic stress. At both assessments, adults living in the same household reported levels of posttraumatic stress that were more similar to one another than adults who were not living together. Between households, disaster experiences were closely related to the variance in posttraumatic stress symptom levels at both assessments. Within households, however, disaster experiences were less related to the variance in symptom level at 2 years than at 6 months. Conclusions These results indicate that adult household members may influence one another's posttraumatic stress reactions as well as their interpretations of the disaster experiences over time. Our findings suggest that multilevel methods may provide important information about family processes after disasters. PMID:22171549
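    A minimal Python sketch of a multilevel (mixed-effects) model with a household random intercept, the kind of analysis described above; the data are synthetic, and the intraclass correlation (ICC) is computed from the fitted variance components.

```python
# Hedged sketch: random-intercept model of a stress score clustered by household.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n_households, members = 200, 3
household = np.repeat(np.arange(n_households), members)
household_effect = np.repeat(rng.normal(0, 0.8, n_households), members)
ies_r = 1.5 + household_effect + rng.normal(0, 1.0, household.size)  # synthetic scores

df = pd.DataFrame({"ies_r": ies_r, "household": household})
m = smf.mixedlm("ies_r ~ 1", data=df, groups=df["household"]).fit()

var_between = m.cov_re.iloc[0, 0]
icc = var_between / (var_between + m.scale)
print("between-household variance:", round(var_between, 2), " ICC:", round(icc, 2))
```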

  17. Wind Power Forecasting Error Frequency Analyses for Operational Power System Studies: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Florita, A.; Hodge, B. M.; Milligan, M.

    2012-08-01

    The examination of wind power forecasting errors is crucial for optimal unit commitment and economic dispatch of power systems with significant wind power penetrations. This scheduling process includes both renewable and nonrenewable generators, and the incorporation of wind power forecasts will become increasingly important as wind fleets constitute a larger portion of generation portfolios. This research considers the Western Wind and Solar Integration Study database of wind power forecasts and numerical actualizations. This database comprises more than 30,000 locations spread over the western United States, with a total wind power capacity of 960 GW. Error analyses for individual sites and for specific balancing areas are performed using the database, quantifying the fit to theoretical distributions through goodness-of-fit metrics. Insights into wind-power forecasting error distributions are established for various levels of temporal and spatial resolution, contrasts made among the frequency distribution alternatives, and recommendations put forth for harnessing the results. Empirical data are used to produce more realistic site-level forecasts than previously employed, such that higher resolution operational studies are possible. This research feeds into a larger work of renewable integration through the links wind power forecasting has with various operational issues, such as stochastic unit commitment and flexible reserve level determination.
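    A minimal Python sketch of fitting candidate distributions to forecast errors and quantifying goodness of fit, in the spirit of the analysis described above; the error sample is synthetic and the Kolmogorov-Smirnov statistic is one assumed choice of metric.

```python
# Hedged sketch: fit normal and Laplace distributions to (synthetic) forecast errors
# and compare KS statistics as a goodness-of-fit measure.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
errors = rng.laplace(loc=0.0, scale=0.05, size=5000)   # hypothetical normalized errors

mu, sigma = stats.norm.fit(errors)
ks_norm, _ = stats.kstest(errors, "norm", args=(mu, sigma))

loc, scale = stats.laplace.fit(errors)
ks_laplace, _ = stats.kstest(errors, "laplace", args=(loc, scale))

print(f"normal fit KS = {ks_norm:.3f}, laplace fit KS = {ks_laplace:.3f}")
```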

  18. Who runs public health? A mixed-methods study combining qualitative and network analyses.

    Science.gov (United States)

    Oliver, Kathryn; de Vocht, Frank; Money, Annemarie; Everett, Martin

    2013-09-01

    Persistent health inequalities encourage researchers to identify new ways of understanding the policy process. Informal relationships are implicated in finding evidence and making decisions for public health policy (PHP), but few studies use specialized methods to identify key actors in the policy process. We combined network and qualitative data to identify the most influential individuals in PHP in a UK conurbation and describe their strategies to influence policy. Network data were collected by asking for nominations of powerful and influential people in PHP (n = 152, response rate 80%), and 23 semi-structured interviews were analysed using a framework approach. The most influential PHP makers in this conurbation were mid-level managers in the National Health Service and local government, characterized by managerial skills: controlling policy processes through gate keeping key organizations, providing policy content and managing selected experts and executives to lead on policies. Public health professionals and academics are indirectly connected to policy via managers. The most powerful individuals in public health are managers, not usually considered targets for research. As we show, they are highly influential through all stages of the policy process. This study shows the importance of understanding the daily activities of influential policy individuals.
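    A minimal Python sketch of the network side of such a study: building a directed nomination network and ranking actors by in-degree centrality; the actors and nominations are invented.

```python
# Hedged sketch: "who is influential" nominations as a directed graph. Names are hypothetical.
import networkx as nx

nominations = [
    ("analyst_a", "manager_x"), ("analyst_b", "manager_x"),
    ("academic_c", "manager_x"), ("manager_x", "director_y"),
    ("analyst_a", "director_y"), ("academic_c", "professor_z"),
]
g = nx.DiGraph()
g.add_edges_from(nominations)

for actor, score in sorted(nx.in_degree_centrality(g).items(), key=lambda kv: -kv[1]):
    print(f"{actor:12s} {score:.2f}")
```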

  19. Family structure and posttraumatic stress reactions: a longitudinal study using multilevel analyses

    Directory of Open Access Journals (Sweden)

    Nygaard Egil

    2011-12-01

    Full Text Available Abstract Background There is limited research on the relevance of family structures to the development and maintenance of posttraumatic stress following disasters. We longitudinally studied the effects of marital and parental statuses on posttraumatic stress reactions after the 2004 Southeast Asian tsunami and whether persons in the same households had more shared stress reactions than others. Method The study included a tourist population of 641 Norwegian adult citizens, many of them from families with children. We measured posttraumatic stress symptoms with the Impact of Event Scale-Revised at 6 months and 2 years post-disaster. Analyses included multilevel methods with mixed effects models. Results Results showed that neither marital nor parental status was significantly related to posttraumatic stress. At both assessments, adults living in the same household reported levels of posttraumatic stress that were more similar to one another than adults who were not living together. Between households, disaster experiences were closely related to the variance in posttraumatic stress symptom levels at both assessments. Within households, however, disaster experiences were less related to the variance in symptom level at 2 years than at 6 months. Conclusions These results indicate that adult household members may influence one another's posttraumatic stress reactions as well as their interpretations of the disaster experiences over time. Our findings suggest that multilevel methods may provide important information about family processes after disasters.

  20. Statistical Analyses of Second Indoor Bio-Release Field Evaluation Study at Idaho National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Amidan, Brett G.; Pulsipher, Brent A.; Matzke, Brett D.

    2009-12-17

    In September 2008 a large-scale testing operation (referred to as the INL-2 test) was performed within a two-story building (PBF-632) at the Idaho National Laboratory (INL). The report “Operational Observations on the INL-2 Experiment” defines the seven objectives for this test and discusses the results and conclusions. This is further discussed in the introduction of this report. The INL-2 test consisted of five tests (events) in which a floor (level) of the building was contaminated with the harmless biological warfare agent simulant Bg and samples were taken in most, if not all, of the rooms on the contaminated floor. After the sampling, the building was decontaminated, and the next test performed. Judgmental samples and probabilistic samples were determined and taken during each test. Vacuum, wipe, and swab samples were taken within each room. The purpose of this report is to study an additional four topics that were not within the scope of the original report. These topics are: 1) assess the quantitative assumptions about the data being normally or log-normally distributed; 2) evaluate differences and quantify the sample to sample variability within a room and across the rooms; 3) perform geostatistical types of analyses to study spatial correlations; and 4) quantify the differences observed between surface types and sampling methods for each scenario and study the consistency across the scenarios. The following four paragraphs summarize the results of each of the four additional analyses. All samples after decontamination came back negative. Because of this, it was not appropriate to determine if these clearance samples were normally distributed. As Table 1 shows, the characterization data consists of values between and inclusive of 0 and 100 CFU/cm2 (100 was the value assigned when the number is too numerous to count). The 100 values are generally much bigger than the rest of the data, causing the data to be right skewed. There are also a significant

  1. Use of results of microbiological analyses for risk-based control of Listeria monocytogenes in marinated broiler legs.

    Science.gov (United States)

    Aarnisalo, Kaarina; Vihavainen, Elina; Rantala, Leila; Maijala, Riitta; Suihko, Maija-Liisa; Hielm, Sebastian; Tuominen, Pirkko; Ranta, Jukka; Raaska, Laura

    2008-02-10

    Microbial risk assessment provides a means of estimating consumer risks associated with food products. The methods can also be applied at the plant level. In this study results of microbiological analyses were used to develop a robust single plant level risk assessment. Furthermore, the prevalence and numbers of Listeria monocytogenes in marinated broiler legs in Finland were estimated. These estimates were based on information on the prevalence, numbers and genotypes of L. monocytogenes in 186 marinated broiler legs from 41 retail stores. The products were from three main Finnish producers, which produce 90% of all marinated broiler legs sold in Finland. The prevalence and numbers of L. monocytogenes were estimated by Monte Carlo simulation using WinBUGS, but the model is applicable to any software featuring standard probability distributions. The estimated mean annual number of L. monocytogenes-positive broiler legs sold in Finland was 7.2 x 10^6, with a 95% credible interval (CI) of 6.7 x 10^6 to 7.7 x 10^6. That would be 34% +/- 1% of the marinated broiler legs sold in Finland. The mean number of L. monocytogenes in marinated broiler legs estimated at the sell-by-date was 2 CFU/g, with a 95% CI of 0-14 CFU/g. Producer-specific L. monocytogenes strains were recovered from the products throughout the year, which emphasizes the importance of characterizing the isolates and identifying strains that may cause problems as part of risk assessment studies. As the levels of L. monocytogenes were low, the risk of acquiring listeriosis from these products proved to be insignificant. Consequently there was no need for a thorough national level risk assessment. However, an approach using worst-case and average point estimates was applied to produce an example of single producer level risk assessment based on limited data. This assessment also indicated that the risk from these products was low. The risk-based approach presented in this work can provide estimation of public health risk
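    A minimal Python sketch of the Monte Carlo estimation described above, written with NumPy rather than WinBUGS; the survey counts and annual sales figure are assumptions chosen only to be broadly consistent with the numbers quoted in the record, and a Beta(1,1) prior is an assumed choice.

```python
# Hedged sketch: posterior for prevalence of positive legs, scaled to annual sales.
import numpy as np

rng = np.random.default_rng(6)
n_sampled, n_positive = 186, 63      # assumed survey counts (~34% positive)
annual_sales = 21_000_000            # assumed number of legs sold per year

prevalence = rng.beta(1 + n_positive, 1 + n_sampled - n_positive, size=100_000)
positive_legs = prevalence * annual_sales

print("mean prevalence:", round(prevalence.mean(), 3))
print("95% CI for positive legs per year:", np.percentile(positive_legs, [2.5, 97.5]).round(-4))
```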

  2. Treatment of visceral leishmaniasis: model-based analyses on the spread of antimony-resistant L. donovani in Bihar, India.

    Directory of Open Access Journals (Sweden)

    Anette Stauch

    Full Text Available BACKGROUND: Pentavalent antimonials have been the mainstay of antileishmanial therapy for decades, but increasing failure rates under antimonial treatment have challenged further use of these drugs in the Indian subcontinent. Experimental evidence has suggested that parasites which are resistant against antimonials have a survival advantage over sensitive ones even in the absence of antimonial treatment. METHODS AND FINDINGS: We use simulation studies based on a mathematical L. donovani transmission model to identify parameters which can explain why treatment failure rates under antimonial treatment increased up to 65% in Bihar between 1980 and 1997. Model analyses suggest that resistance to treatment alone cannot explain the observed treatment failure rates. We explore two hypotheses referring to an increased fitness of antimony-resistant parasites: the additional fitness is (i) disease-related, by causing more clinical cases (higher pathogenicity) or more severe disease (higher virulence), or (ii) transmission-related, by increasing the transmissibility from sand flies to humans or vice versa. CONCLUSIONS: Both hypotheses can potentially explain the Bihar observations. However, increased transmissibility as an explanation appears more plausible because it can occur in the background of asymptomatically transmitted infection whereas disease-related factors would most probably be observable. Irrespective of the cause of fitness, parasites with a higher fitness will finally replace sensitive parasites, even if antimonials are replaced by another drug.
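    A minimal Python sketch (a toy two-strain model, far simpler than the authors' full transmission model) showing how a resistant strain with a modest transmission advantage gradually displaces the sensitive strain; all rates and initial values are assumed.

```python
# Hedged sketch: SIS-type competition between sensitive and resistant strains.
import numpy as np
from scipy.integrate import solve_ivp

beta_s, beta_r, gamma = 0.30, 0.36, 0.25   # assumed transmission/recovery rates (per month)

def two_strain(t, y):
    s, i_s, i_r = y                         # susceptible, infected-sensitive, infected-resistant
    new_s, new_r = beta_s * s * i_s, beta_r * s * i_r
    return [-new_s - new_r + gamma * (i_s + i_r),
            new_s - gamma * i_s,
            new_r - gamma * i_r]

sol = solve_ivp(two_strain, (0, 600), [0.98, 0.015, 0.005], dense_output=True)
for t in np.linspace(0, 600, 7):
    s, i_s, i_r = sol.sol(t)
    print(f"t = {t:5.0f} months   sensitive: {i_s:.4f}   resistant: {i_r:.4f}")
```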

  3. Sensitivity analyses of woody species exposed to air pollution based on ecophysiological measurements.

    Science.gov (United States)

    Wen, Dazhi; Kuang, Yuanwen; Zhou, Guoyi

    2004-01-01

    Air pollution has been a major problem in the Pearl River Delta of south China, particularly during the last two decades. Emissions of air pollutants from industries have already led to damage to natural communities and environments in a wide range of the Delta area. Leaf parameters such as chlorophyll fluorescence, leaf area (LA), dry weight (DW) and leaf mass per area (LMA) have previously been used as specific indexes of environmental stress. This study aims to determine in situ whether the daily variation of chlorophyll fluorescence and other ecophysiological parameters in five seedlings of three woody species, Ilex rotunda, Ficus microcarpa and Machilus chinensis, could be used alone or in combination with other measurements as sensitivity indexes for diagnosis under air pollution stress and, hence, to choose the correct tree species for urban afforestation in the Delta area. Five seedlings of each species were transplanted into pot containers after acclimation under shading conditions. Chlorophyll fluorescence measurements were made in situ with a portable fluorometer (OS-30, Opti-Sciences, USA). Ten random samples of leaves were picked from each species for LA measurements by area-meter (CI-203, CID, Inc., USA). DW was determined after the leaf samples were dried to a constant weight at 65 degrees C. LMA was calculated as the ratio DW/LA. Leaf N content was analyzed according to the Kjeldahl method, and the extraction of pigments was carried out according to Lin et al. The daily mean Fv/Fm (Fv is the variable fluorescence and Fm is the maximum fluorescence) analysis showed that Ilex rotunda and Ficus microcarpa were more resistant to pollution stress, followed by Machilus chinensis, implying that the efficiency of photosystem II in I. rotunda was less affected by air pollutants than in the other two species. Little difference in the daily change of Fv/Fm in I. rotunda between the polluted and the clean site was also observed. However, a relatively large

  4. Eye gaze in intelligent user interfaces gaze-based analyses, models and applications

    CERN Document Server

    Nakano, Yukiko I; Bader, Thomas

    2013-01-01

    Remarkable progress in eye-tracking technologies opened the way to design novel attention-based intelligent user interfaces, and highlighted the importance of better understanding of eye-gaze in human-computer interaction and human-human communication. For instance, a user's focus of attention is useful in interpreting the user's intentions, their understanding of the conversation, and their attitude towards the conversation. In human face-to-face communication, eye gaze plays an important role in floor management, grounding, and engagement in conversation.Eye Gaze in Intelligent User Interfac

  5. Theoretical and Empirical Analyses of an Improved Harmony Search Algorithm Based on Differential Mutation Operator

    Directory of Open Access Journals (Sweden)

    Longquan Yong

    2012-01-01

    Full Text Available Harmony search (HS) is an emerging metaheuristic optimization algorithm. In this paper, an improved harmony search method based on a differential mutation operator (IHSDE) is proposed to deal with optimization problems. Since population diversity plays an important role in the behavior of evolutionary algorithms, the aim of this paper is to calculate the expected population mean and variance of IHSDE from a theoretical viewpoint. Numerical results, compared with those of HSDE and NGHS, show that the IHSDE method has good convergence properties over a test suite of well-known benchmark functions.
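    A minimal Python sketch of a harmony search in which pitch adjustment is replaced by a DE-style differential mutation, tested on the sphere function; this is an illustrative re-implementation of the general idea, not the authors' IHSDE code, and all parameter values are assumptions.

```python
# Hedged sketch: harmony search with a differential-mutation step.
import numpy as np

def sphere(x):
    return float(np.sum(x ** 2))

def ihsde_like(obj, dim=10, bounds=(-5.0, 5.0), hms=20, hmcr=0.9, f_scale=0.5,
               iters=5000, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    memory = rng.uniform(lo, hi, size=(hms, dim))      # harmony memory
    fitness = np.array([obj(h) for h in memory])

    for _ in range(iters):
        new = np.empty(dim)
        for j in range(dim):
            if rng.random() < hmcr:                    # take a value from memory ...
                new[j] = memory[rng.integers(hms), j]
                r1, r2 = rng.integers(hms, size=2)     # ... then perturb it with a
                new[j] += f_scale * (memory[r1, j] - memory[r2, j])  # differential mutation
            else:                                      # or improvise at random
                new[j] = rng.uniform(lo, hi)
        new = np.clip(new, lo, hi)
        f_new = obj(new)
        worst = int(np.argmax(fitness))
        if f_new < fitness[worst]:                     # replace the worst harmony
            memory[worst], fitness[worst] = new, f_new

    best = int(np.argmin(fitness))
    return memory[best], fitness[best]

best_x, best_f = ihsde_like(sphere)
print("best objective value:", best_f)
```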

  6. Gastric inhibitory polypeptide receptor: association analyses for obesity of several polymorphisms in large study groups

    Directory of Open Access Journals (Sweden)

    Rief Winfried

    2009-03-01

    Full Text Available Abstract Background Gastric inhibitory polypeptide (GIP) is postulated to be involved in type 2 diabetes mellitus and obesity. It exerts its function through its receptor, GIPR. We genotyped three GIPR SNPs (rs8111428, rs2302382 and rs1800437) in German families with at least one obese index patient, two case-control studies and two cross-sectional population-based studies. Methods Genotyping was performed by MALDI-TOF, ARMS-PCR and RFLP. The family study: 761 German families with at least one extremely obese child or adolescent (n = 1,041) and both parents (n = 1,522). Case-control study: (a) German obese children (n = 333) and (b) obese adults (n = 987) in comparison to 588 adult lean controls. The two cross-sectional population-based studies: KORA (n = 8,269) and SHIP (n = 4,310). Results We detected over-transmission of the A-allele of rs2302382 in the German families (pTDT-Test = 0.0089). In the combined case-control sample, we estimated an odds ratio of 1.54 (95% CI 1.09-2.19, pCA-Test = 0.014) for homozygotes of the rs2302382 A-allele compared to individuals with no A-allele. A similar trend was found in KORA, where the rs2302382 A-allele led to an increase of 0.12 BMI units (p = 0.136). In SHIP, however, the A-allele of rs2302382 was estimated to contribute an average decrease of 0.27 BMI units (p-value = 0.031). Conclusion Our data suggest a potential relevance of GIPR variants for obesity. However, additional studies are warranted in light of the conflicting results obtained in one of the two population-based studies.

  7. Neural Network-Based Model for Landslide Susceptibility and Soil Longitudinal Profile Analyses

    DEFF Research Database (Denmark)

    Farrokhzad, F.; Barari, Amin; Choobbasti, A. J.

    2011-01-01

    The purpose of this study was to create an empirical model for assessing the landslide risk potential at Savadkouh Azad University, which is located in the rural surroundings of Savadkouh, about 5 km from the city of Pol-Sefid in northern Iran. The soil longitudinal profile of the city of Babol, located 25 km from the Caspian Sea, also was predicted with an artificial neural network (ANN). A multilayer perceptron neural network model was applied to the landslide area and was used to analyze specific elements in the study area that contributed to previous landsliding events. The ANN models were ... studies in landslide susceptibility zonation.

  8. A Derivation of Source-based Kinetics Equation with Time Dependent Fission Kernel for Reactor Transient Analyses

    International Nuclear Information System (INIS)

    Kim, Song Hyun; Woo, Myeong Hyun; Shin, Chang Ho; Pyeon, Cheol Ho

    2015-01-01

    In this study, a new balance equation is proposed to overcome the problems generated by previous methods, using a source-based balance equation; a simple problem is then analyzed with the proposed method. A source-based balance equation with a time-dependent fission kernel was derived to simplify the kinetics equation. To analyze partial variations of reactor characteristics, two representative methods were introduced in previous studies: (1) the quasi-statics method and (2) the multipoint technique. The main idea of the quasi-statics method is to use a low-order approximation for large integration times. To realize the quasi-statics method, the time-dependent flux is first separated into shape and amplitude functions, and the shape function is calculated. The method has good accuracy; however, it can be expensive in terms of calculation cost because the shape function must be fully recalculated to obtain accurate results. To improve the calculation efficiency, the multipoint method was proposed. The multipoint method is based on the classic kinetics equation, using Green's function to analyze the flight probability from region r' to r. These previous methods have been used for reactor kinetics analysis; however, they have some limitations. First, three group variables (r_g, E_g, t_g) must be considered to solve the time-dependent balance equation, which severely limits the application to large-system problems with good accuracy. Second, energy-group neutrons must be used to analyze reactor kinetics problems; in a time-dependent problem, the neutron energy distribution can change with time, which affects the group cross sections and can therefore degrade accuracy. Third, neutrons in one space-time region continually affect other space-time regions, which is not properly considered in the previous methods. Using birth history of the neutron sources

  9. A Derivation of Source-based Kinetics Equation with Time Dependent Fission Kernel for Reactor Transient Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Song Hyun; Woo, Myeong Hyun; Shin, Chang Ho [Hanyang University, Seoul (Korea, Republic of); Pyeon, Cheol Ho [Kyoto University, Osaka (Japan)

    2015-10-15

    In this study, a new balance equation is proposed to overcome the problems generated by previous methods, using a source-based balance equation; a simple problem is then analyzed with the proposed method. A source-based balance equation with a time-dependent fission kernel was derived to simplify the kinetics equation. To analyze partial variations of reactor characteristics, two representative methods were introduced in previous studies: (1) the quasi-statics method and (2) the multipoint technique. The main idea of the quasi-statics method is to use a low-order approximation for large integration times. To realize the quasi-statics method, the time-dependent flux is first separated into shape and amplitude functions, and the shape function is calculated. The method has good accuracy; however, it can be expensive in terms of calculation cost because the shape function must be fully recalculated to obtain accurate results. To improve the calculation efficiency, the multipoint method was proposed. The multipoint method is based on the classic kinetics equation, using Green's function to analyze the flight probability from region r' to r. These previous methods have been used for reactor kinetics analysis; however, they have some limitations. First, three group variables (r{sub g}, E{sub g}, t{sub g}) must be considered to solve the time-dependent balance equation, which severely limits the application to large-system problems with good accuracy. Second, energy-group neutrons must be used to analyze reactor kinetics problems; in a time-dependent problem, the neutron energy distribution can change with time, which affects the group cross sections and can therefore degrade accuracy. Third, neutrons in one space-time region continually affect other space-time regions, which is not properly considered in the previous methods. Using birth history of the

  10. VALUE-BASED MEDICINE AND OPHTHALMOLOGY: AN APPRAISAL OF COST-UTILITY ANALYSES

    Science.gov (United States)

    Brown, Gary C; Brown, Melissa M; Sharma, Sanjay; Brown, Heidi; Smithen, Lindsay; Leeser, David B; Beauchamp, George

    2004-01-01

    ABSTRACT Purpose To ascertain the extent to which ophthalmologic interventions have been evaluated in value-based medicine format. Methods Retrospective literature review. Papers in the healthcare literature utilizing cost-utility analysis were reviewed by researchers at the Center for Value-Based Medicine, Flourtown, Pennsylvania. A literature review of papers addressing the cost-utility analysis of ophthalmologic procedures in the United States over a 12-year period from 1992 to 2003 was undertaken using the National Library of Medicine and EMBASE databases. The cost-utility of ophthalmologic interventions in inflation-adjusted (real) year 2003 US dollars expended per quality-adjusted life-year ($/QALY) was ascertained in all instances. Results A total of 19 papers were found, including a total of 25 interventions. The median cost-utility of ophthalmologic interventions was $5,219/QALY, with a range from $746/QALY to $6.5 million/QALY. Conclusions The majority of ophthalmologic interventions are especially cost-effective by conventional standards. This is because of the substantial value that ophthalmologic interventions confer to patients with eye diseases for the resources expended. PMID:15747756
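    A minimal Python sketch of the underlying arithmetic of a cost-utility ratio in dollars per quality-adjusted life-year; the figures are hypothetical, not taken from the reviewed papers.

```python
# Hedged sketch: $/QALY for a single hypothetical intervention.
incremental_cost = 3_400.0     # extra cost vs. comparator (year-2003 US$), hypothetical
utility_gain = 0.08            # gain on a 0-1 utility scale, hypothetical
years_of_benefit = 12          # duration of the benefit, hypothetical

qalys_gained = utility_gain * years_of_benefit
print(f"cost-utility: ${incremental_cost / qalys_gained:,.0f} per QALY")
```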

  11. Using FOSM-Based Data Worth Analyses to Design Geophysical Surveys to Reduce Uncertainty in a Regional Groundwater Model Update

    Science.gov (United States)

    Smith, B. D.; White, J.; Kress, W. H.; Clark, B. R.; Barlow, J.

    2016-12-01

    Hydrogeophysical surveys have become an integral part of understanding hydrogeological frameworks used in groundwater models. Regional models cover a large area where water well data is, at best, scattered and irregular. Since budgets are finite, priorities must be assigned to select optimal areas for geophysical surveys. For airborne electromagnetic (AEM) geophysical surveys, optimization of mapping depth and line spacing needs to take into account the objectives of the groundwater models. The approach discussed here uses a first-order, second-moment (FOSM) uncertainty analysis which assumes an approximately linear relation between model parameters and observations. This assumption allows FOSM analyses to be applied to estimate the value of increased parameter knowledge for reducing forecast uncertainty. FOSM is used to facilitate optimization of yet-to-be-completed geophysical surveying to reduce model forecast uncertainty. The main objective of geophysical surveying is assumed to be estimating values and spatial variation in hydrologic parameters (i.e. hydraulic conductivity) as well as mapping lower permeability layers that influence the spatial distribution of recharge flux. The proposed data worth analysis was applied to the Mississippi Embayment Regional Aquifer Study (MERAS), which is being updated. The objective of MERAS is to assess the ground-water availability (status and trends) of the Mississippi embayment aquifer system. The study area covers portions of eight states including Alabama, Arkansas, Illinois, Kentucky, Louisiana, Mississippi, Missouri, and Tennessee. The active model grid covers approximately 70,000 square miles, and incorporates some 6,000 miles of major rivers and over 100,000 water wells. In the FOSM analysis, a dense network of pilot points was used to capture uncertainty in hydraulic conductivity and recharge. To simulate the effect of AEM flight lines, the prior uncertainty for hydraulic conductivity and recharge pilots along potential flight lines was
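    A minimal Python sketch of linear (FOSM-style) data worth: the forecast variance is propagated through a posterior parameter covariance, with and without a block of proposed observations; all matrices are random stand-ins for real Jacobians and covariances.

```python
# Hedged sketch: reduction in forecast variance from adding a proposed observation block.
import numpy as np

rng = np.random.default_rng(7)
n_par = 30
J_exist = rng.normal(size=(40, n_par))    # Jacobian of existing observations (synthetic)
J_new = rng.normal(size=(15, n_par))      # Jacobian of proposed AEM-type observations (synthetic)
y = rng.normal(size=n_par)                # sensitivity of the forecast to the parameters

C_prior = np.eye(n_par)                   # prior parameter covariance (assumed)
R_exist = 0.1 * np.eye(J_exist.shape[0])  # observation noise covariances (assumed)
R_new = 0.1 * np.eye(J_new.shape[0])

def forecast_variance(jacobians, noises):
    info = np.linalg.inv(C_prior)
    for J, R in zip(jacobians, noises):
        info = info + J.T @ np.linalg.inv(R) @ J
    C_post = np.linalg.inv(info)          # linearized posterior covariance
    return float(y @ C_post @ y)

base = forecast_variance([J_exist], [R_exist])
with_new = forecast_variance([J_exist, J_new], [R_exist, R_new])
print(f"forecast variance: {base:.3f} -> {with_new:.3f} after adding the proposed data")
```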

  12. Molecular genotyping of Colletotrichum species based on arbitrarily primed PCR, A + T-Rich DNA, and nuclear DNA analyses

    Science.gov (United States)

    Freeman, S.; Pham, M.; Rodriguez, R.J.

    1993-01-01

    Molecular genotyping of Colletotrichum species based on arbitrarily primed PCR, A + T-rich DNA, and nuclear DNA analyses. Experimental Mycology 17, 309-322. Isolates of Colletotrichum were grouped into 10 separate species based on arbitrarily primed PCR (ap-PCR), A + T-rich DNA (AT-DNA) and nuclear DNA banding patterns. In general, the grouping of Colletotrichum isolates by these molecular approaches corresponded to that done by classical taxonomic identification, however, some exceptions were observed. PCR amplification of genomic DNA using four different primers allowed for reliable differentiation between isolates of the 10 species. HaeIII digestion patterns of AT-DNA also distinguished between species of Colletotrichum by generating species-specific band patterns. In addition, hybridization of the repetitive DNA element (GcpR1) to genomic DNA identified a unique set of Pst 1-digested nuclear DNA fragments in each of the 10 species of Colletotrichum tested. Multiple isolates of C. acutatum, C. coccodes, C. fragariae, C. lindemuthianum, C. magna, C. orbiculare, C. graminicola from maize, and C. graminicola from sorghum showed 86-100% intraspecies similarity based on ap-PCR and AT-DNA analyses. Interspecies similarity determined by ap-PCR and AT-DNA analyses varied between 0 and 33%. Three distinct banding patterns were detected in isolates of C. gloeosporioides from strawberry. Similarly, three different banding patterns were observed among isolates of C. musae from diseased banana.

  13. Fuel assemblies mechanical behaviour improvements based on design changes and loading patterns computational analyses

    International Nuclear Information System (INIS)

    Marin, J.; Aullo, M.; Gutierrez, E.

    2001-01-01

    In the past few years, incomplete RCCA insertion (IRI) events have been taking place at some nuclear plants. Large guide thimble distortion, caused by high compressive loads together with irradiation-induced material creep and growth, is considered the primary cause of those events. This disturbing phenomenon is worsened when some fuel assemblies are deformed to the extent that they push the neighbouring fuel assemblies and the distortion is transmitted along the core. In order to better understand this mechanism, ENUSA has developed a methodology based on finite element core simulation to enable assessments of the propensity of a given core loading pattern to propagate distortion along the core. At the same time, the core loading pattern can be decided in interaction with nuclear design to obtain the optimum response from both the nuclear and mechanical points of view, with the objective of progressively attenuating the core distortion. (author)

  14. [The genotype-based haplotype relative risk and transmission disequilibrium test analyses of familial febrile convulsions].

    Science.gov (United States)

    Qi, Y; Wu, X; Guo, Z; Zhang, J; Pan, H; Li, M; Bao, X; Peng, J; Zou, L; Lin, Q

    1999-10-01

    To confirm the linkage of familial febrile convulsions to the short arm of chromosome 6 (6p) or the long arm of chromosome 8 (8q), the authors genotyped the Pst I locus in the coding region of heat shock protein (HSP) 70, the 5' untranslated region of HSP70-1, the 3' untranslated region of HSP70-2, D8S84 and D8S85. The data were processed by the genotype-based haplotype relative risk (GHRR) and transmission disequilibrium test (TDT) methods in PPAP. Some signs of association and disequilibrium between D8S85 and FC were shown by GHRR and TDT. A suspected linkage of familial febrile convulsions to the long arm of chromosome 8 is proposed.

  15. Dugong: a Docker image, based on Ubuntu Linux, focused on reproducibility and replicability for bioinformatics analyses.

    Science.gov (United States)

    Menegidio, Fabiano B; Jabes, Daniela L; Costa de Oliveira, Regina; Nunes, Luiz R

    2018-02-01

    This manuscript introduces and describes Dugong, a Docker image based on Ubuntu 16.04, which automates installation of more than 3500 bioinformatics tools (along with their respective libraries and dependencies), in alternative computational environments. The software operates through a user-friendly XFCE4 graphic interface that allows software management and installation by users not fully familiarized with the Linux command line and provides the Jupyter Notebook to assist in the delivery and exchange of consistent and reproducible protocols and results across laboratories, assisting in the development of open science projects. Source code and instructions for local installation are available at https://github.com/DugongBioinformatics, under the MIT open source license. Contact: Luiz.nunes@ufabc.edu.br.

  16. Space nuclear-power reactor design based on combined neutronic and thermal-fluid analyses

    International Nuclear Information System (INIS)

    Koenig, D.R.; Gido, R.G.; Brandon, D.I.

    1985-01-01

    The design and performance analysis of a space nuclear-power system requires sophisticated analytical capabilities such as those developed during the nuclear rocket propulsion (Rover) program. In particular, optimizing the size of a space nuclear reactor for a given power level requires satisfying the conflicting requirements of nuclear criticality and heat removal. The optimization involves the determination of the coolant void (volume) fraction for which the reactor diameter is a minimum and temperature and structural limits are satisfied. A minimum exists because the critical diameter increases with increasing void fraction, whereas the reactor diameter needed to remove a specified power decreases with void fraction. The purpose of this presentation is to describe and demonstrate our analytical capability for the determination of minimum reactor size. The analysis is based on combining neutronic criticality calculations with OPTION-code thermal-fluid calculations
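    A minimal Python sketch of the size optimization described above: with assumed (purely illustrative) functional forms, the optimum void fraction is where the criticality-limited diameter and the heat-removal-limited diameter cross.

```python
# Hedged sketch: intersection of two assumed diameter constraints vs. coolant void fraction.
import numpy as np
from scipy.optimize import brentq

def critical_diameter(void):             # grows with void fraction (assumed form)
    return 40.0 / (1.0 - void)           # cm

def heat_removal_diameter(void):         # shrinks with void fraction (assumed form)
    return 120.0 * np.sqrt(0.05 / void)  # cm

v_opt = brentq(lambda v: critical_diameter(v) - heat_removal_diameter(v), 0.01, 0.9)
print(f"optimum void fraction ~ {v_opt:.2f}, "
      f"minimum reactor diameter ~ {critical_diameter(v_opt):.0f} cm")
```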

  17. Sensitivity studies for 3-D rod ejection analyses on axial power shape

    Energy Technology Data Exchange (ETDEWEB)

    Park, Min-Ho; Park, Jin-Woo; Park, Guen-Tae; Ryu, Seok-Hee; Um, Kil-Sup; Lee, Jae-Il [KEPCO NF, Daejeon (Korea, Republic of)

    2015-10-15

    The current safety analysis methodology, which uses the point kinetics model combined with numerous conservative assumptions, results in unrealistic predictions of the transient behavior and wastes a large margin in safety analyses, while the safety regulation criteria for reactivity-initiated accidents are becoming stricter. To deal with this, KNF is developing a 3-D rod ejection analysis methodology using the multi-dimensional code coupling system CHASER. The CHASER system couples the three-dimensional core neutron kinetics code ASTRA, the sub-channel analysis code THALES, and the fuel performance analysis code FROST using the message passing interface (MPI). A sensitivity study for 3-D rod ejection analysis on axial power shape (APS) is carried out to survey the tendency of safety parameters with respect to power distributions and to build up a realistic safety analysis methodology while maintaining conservatism. The 3-D rod ejection analysis methodology currently under development with the multi-dimensional core transient analysis code system CHASER was shown to reasonably reflect the conservative assumptions by tuning kinetic parameters.

  18. JUPITER and satellites: Clinical implications of the JUPITER study and its secondary analyses.

    Science.gov (United States)

    Kostapanos, Michael S; Elisaf, Moses S

    2011-07-26

    The Justification for the Use of Statins in Prevention: an Intervention Trial Evaluating Rosuvastatin (JUPITER) study was a real breakthrough in primary cardiovascular disease prevention with statins, since it was conducted in apparently healthy individuals with normal levels of low-density lipoprotein cholesterol (LDL-C) but elevated levels of high-sensitivity C-reactive protein. In JUPITER, rosuvastatin was associated with significant reductions in cardiovascular outcomes as well as in overall mortality compared with placebo. In this paper the most important secondary analyses of the JUPITER trial are discussed, focusing on their novel findings regarding the role of statins in primary prevention. Also, the characteristics of otherwise healthy normocholesterolemic subjects who are anticipated to benefit more from statin treatment in the clinical setting are discussed. These include subjects at "intermediate" or "high" 10-year risk according to the Framingham score and those who exhibit low post-treatment levels of both LDL-C and high-sensitivity C-reactive protein. JUPITER added to our knowledge that statins may be effective drugs in the primary prevention of cardiovascular disease in normocholesterolemic individuals at moderate-to-high risk. Also, statin treatment may reduce the risk of venous thromboembolism and preserve renal function. An increase in physician-reported diabetes represents a major safety concern associated with the use of the most potent statins.

  19. Levee reliability analyses for various flood return periods - a case study in southern Taiwan

    Science.gov (United States)

    Huang, W.-C.; Yu, H.-W.; Weng, M.-C.

    2015-04-01

    In recent years, heavy rainfall conditions have caused disasters around the world. To prevent losses by floods, levees have often been constructed in inundation-prone areas. This study performed reliability analyses for the Chiuliao First Levee in southern Taiwan. The failure-related parameters were the water level, the scouring depth, and the in situ friction angle. Three major failure mechanisms were considered: the slope sliding failure of the levee and the sliding and overturning failures of the retaining wall. When the variability of the in situ friction angle and the scouring depth are considered for various flood return periods, the variations of the factor of safety for the different failure mechanisms show that the retaining wall sliding and overturning failures are more sensitive to the change of the friction angle. When the flood return period is greater than 2 years, the levee could fail with slope sliding for all values of the water level difference. The results of levee stability analysis considering the variability of different parameters could aid engineers in designing the levee cross sections, especially with potential failure mechanisms in mind.
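    A minimal Python sketch of a reliability calculation in this spirit: Monte Carlo sampling of the friction angle and scour depth through a simplified limit-state function; the distributions and force terms are assumptions, not the paper's model.

```python
# Hedged sketch: probability that a simplified sliding factor of safety falls below 1.
import numpy as np

rng = np.random.default_rng(8)
n = 100_000
phi = np.radians(rng.normal(33.0, 3.0, n))           # friction angle (deg), assumed distribution
scour = rng.lognormal(mean=0.0, sigma=0.4, size=n)   # scour depth (m), assumed distribution
water_diff = 2.5                                     # water level difference (m) for one return period

resisting = 220.0 * np.tan(phi)                      # simplified resisting force (kN/m)
driving = 40.0 * water_diff + 25.0 * scour           # simplified driving force (kN/m)
fs = resisting / driving

print("mean factor of safety:", round(fs.mean(), 2))
print("P(FS < 1):", round(float((fs < 1.0).mean()), 3))
```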

  20. Process of Integrating Screening and Detailed Risk-based Modeling Analyses to Ensure Consistent and Scientifically Defensible Results

    International Nuclear Information System (INIS)

    Buck, John W.; McDonald, John P.; Taira, Randal Y.

    2002-01-01

    To support cleanup and closure of these tanks, modeling is performed to understand and predict potential impacts to human health and the environment. Pacific Northwest National Laboratory developed a screening tool for the United States Department of Energy, Office of River Protection that estimates the long-term human health risk, from a strategic planning perspective, posed by potential tank releases to the environment. This tool is being conditioned to more detailed model analyses to ensure consistency between studies and to provide scientific defensibility. Once the conditioning is complete, the system will be used to screen alternative cleanup and closure strategies. The integration of screening and detailed models provides consistent analyses, efficiencies in resources, and positive feedback between the various modeling groups. This approach of conditioning a screening methodology to more detailed analyses provides decision-makers with timely and defensible information and increases confidence in the results on the part of clients, regulators, and stakeholders

  1. Diffraction Studies from Minerals to Organics - Lessons Learned from Materials Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Whitfield, Pamela S [ORNL

    2014-01-01

    In many regards the study of materials and the study of minerals by powder diffraction techniques are complementary, with techniques honed in one field equally applicable to the other. Reflecting the author's background as a long-time materials researcher, many of the examples are of techniques developed for materials analysis applied to minerals. However, in a couple of cases the study of new minerals was the initiation into techniques later used in materials-based studies. Hopefully they will show that the study of new mineral structures can provide opportunities to add new methodologies and approaches to future problems. In keeping with the AXAA, many of the examples have an Australian connection, the materials ranging from organics to battery materials.

  2. Deconvoluting complex tissues for expression quantitative trait locus-based analyses

    DEFF Research Database (Denmark)

    Seo, Ji-Heui; Li, Qiyuan; Fatima, Aquila

    2013-01-01

    Breast cancer genome-wide association studies have pinpointed dozens of variants associated with breast cancer pathogenesis. The majority of risk variants, however, are located outside of known protein-coding regions. Therefore, identifying which genes the risk variants are acting through present...

  3. Construct Validity of the Posttraumatic Stress Disorder Checklist in Cancer Survivors: Analyses Based on Two Samples

    Science.gov (United States)

    DuHamel, Katherine N.; Ostrof, Jamie; Ashman, Teresa; Winkel, Gary; Mundy, Elizabeth A.; Keane, Terence M.; Morasco, Benjamin J.; Vickberg, Suzanne M. J.; Hurley, Karen; Chhabra, Rosy; Scigliano, Eileen; Papadopoulos, Esperanza; Moskowitz, Craig; Redd, William

    2004-01-01

    The measurement of posttraumatic stress disorder (PTSD) is critically important for the identification and treatment of this disorder. The PTSD Checklist (PCL; F. W. Weathers & J. Ford, 1996) is a self-report measure that is increasingly used. In this study, the authors investigated the factorial validity of the PCL with data from 236 cancer…

  4. Modelling and optimization of combined cycle power plant based on exergoeconomic and environmental analyses

    International Nuclear Information System (INIS)

    Ganjehkaviri, A.; Mohd Jaafar, M.N.; Ahmadi, P.; Barzegaravval, H.

    2014-01-01

    This research paper presents a comprehensive thermodynamic model of a combined cycle power plant (CCPP). The effects of economic strategies and design parameters on the plant optimization are also studied. An exergoeconomic analysis is conducted in order to determine the cost of electricity and the cost of exergy destruction. In addition, a comprehensive optimization study is performed to determine the optimal design parameters of the power plant. Next, the effects of variations in economic parameters on the sustainability, carbon dioxide emissions and fuel consumption of the plant are investigated and presented for a typical combined cycle power plant. Changes in economic parameters shift the balance between cash flows and fixed costs of the plant at the optimum point. Moreover, economic strategies greatly limit the maximum reasonable reduction in carbon emissions and fuel consumption. The results showed that at the optimum values the exergy efficiency increases by about 6%, while CO2 emissions decrease by 5.63%. However, the variation in cost was less than 1% because a cost constraint was imposed. In addition, a sensitivity analysis was carried out, and the sensitivity of the optimization process and results to two important parameters is presented and discussed.

  5. The Shortened Raven Standard Progressive Matrices: Item Response Theory-Based Psychometric Analyses and Normative Data

    Science.gov (United States)

    Van der Elst, Wim; Ouwehand, Carolijn; van Rijn, Peter; Lee, Nikki; Van Boxtel, Martin; Jolles, Jelle

    2013-01-01

    The purpose of the present study was to evaluate the psychometric properties of a shortened version of the Raven Standard Progressive Matrices (SPM) under an item response theory framework (the one- and two-parameter logistic models). The shortened Raven SPM was administered to N = 453 cognitively healthy adults aged between 24 and 83 years. The…

  6. A quantitative method to analyse an open answer questionnaire: A case study about the Boltzmann Factor

    International Nuclear Information System (INIS)

    Battaglia, Onofrio Rosario; Di Paola, Benedetto

    2015-01-01

    This paper describes a quantitative method to analyse an open-ended questionnaire. Student responses to a specially designed written questionnaire are quantitatively analysed by non-hierarchical clustering, the k-means method. Through this we can characterise students' behaviour with respect to their ability to formulate explanations for phenomena or processes and/or to use a given model in different contexts. The physics topic is the Boltzmann factor, which allows the students to have a unifying view of different phenomena in different contexts.
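
    The clustering step described above can be sketched with scikit-learn, assuming the open answers have already been coded into a binary matrix (one column per type of explanation); the coding scheme and data below are hypothetical.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical coding: each row is a student, each column a binary flag marking
# whether a given category of explanation appeared in that student's open answer.
rng = np.random.default_rng(0)
coded_answers = rng.integers(0, 2, size=(120, 8)).astype(float)

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(coded_answers)
for k in range(3):
    members = coded_answers[km.labels_ == k]
    print(f"cluster {k}: {len(members)} students, centroid {members.mean(axis=0).round(2)}")
```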

  7. Model-based analyses to compare health and economic outcomes of cancer control: inclusion of disparities.

    Science.gov (United States)

    Goldie, Sue J; Daniels, Norman

    2011-09-21

    Disease simulation models of the health and economic consequences of different prevention and treatment strategies can guide policy decisions about cancer control. However, models that also consider health disparities can identify strategies that improve both population health and its equitable distribution. We devised a typology of cancer disparities that considers types of inequalities among black, white, and Hispanic populations across different cancers and characteristics important for near-term policy discussions. We illustrated the typology in the specific example of cervical cancer using an existing disease simulation model calibrated to clinical, epidemiological, and cost data for the United States. We calculated average reduction in cancer incidence overall and for black, white, and Hispanic women under five different prevention strategies (Strategies A1, A2, A3, B, and C) and estimated average costs and life expectancy per woman, and the cost-effectiveness ratio for each strategy. Strategies that may provide greater aggregate health benefit than existing options may also exacerbate disparities. Combining human papillomavirus vaccination (Strategy A2) with current cervical cancer screening patterns (Strategy A1) resulted in an average reduction of 69% in cancer incidence overall but a 71.6% reduction for white women, 68.3% for black women, and 63.9% for Hispanic women. Other strategies targeting risk-based screening to racial and ethnic minorities reduced disparities among racial subgroups and resulted in more equitable distribution of benefits among subgroups (reduction in cervical cancer incidence, white vs. Hispanic women, 69.7% vs. 70.1%). Strategies that employ targeted risk-based screening and new screening algorithms, with or without vaccination (Strategies B and C), provide excellent value. The most effective strategy (Strategy C) had a cost-effectiveness ratio of $28,200 per year of life saved when compared with the same strategy without

  8. Fatigue Crack Propagation Under Variable Amplitude Loading Analyses Based on Plastic Energy Approach

    Directory of Open Access Journals (Sweden)

    Sofiane Maachou

    2014-04-01

    Full Text Available Plasticity effects at the crack tip have been recognized as the "motor" of crack propagation: the growth of cracks is related to the existence of a crack tip plastic zone, whose formation and intensification is accompanied by energy dissipation. In the current state of knowledge, fatigue crack propagation is modeled using the crack closure concept. The fatigue crack growth behavior of the aluminum alloy 2024 T351 under constant amplitude and variable amplitude loading is analyzed in terms of energy parameters. In the case of VAL (variable amplitude loading) tests, the evolution of the hysteretic energy dissipated per block is shown to be similar to that observed under constant amplitude loading. A linear relationship between the crack growth rate and the hysteretic energy dissipated per block is obtained at high growth rates. For lower growth rates, the relationship between crack growth rate and hysteretic energy dissipated per block can be represented by a power law. In this paper, an analysis of fatigue crack propagation under variable amplitude loading based on an energetic approach is proposed.
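
    The power-law relation between crack growth rate and dissipated energy per block can be fitted by ordinary least squares in log-log space. The data points below are synthetic placeholders, not the 2024 T351 measurements.

```python
import numpy as np

# Synthetic (energy per block, crack growth per block) pairs standing in for test data.
energy = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])              # hysteretic energy Q (J/block)
dadN   = np.array([2e-6, 5e-6, 1.4e-5, 3.5e-5, 9e-5, 2.3e-4])   # growth rate (m/block)

# Fit da/dN = C * Q**m  <=>  log(da/dN) = log C + m * log Q
m, logC = np.polyfit(np.log(energy), np.log(dadN), 1)
print(f"exponent m = {m:.2f}, coefficient C = {np.exp(logC):.2e}")
```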

  9. A method of mounting multiple otoliths for beam-based microchemical analyses

    Science.gov (United States)

    Donohoe, C.J.; Zimmerman, C.E.

    2010-01-01

    Beam-based analytical methods are widely used to measure the concentrations of elements and isotopes in otoliths. These methods usually require that otoliths be individually mounted and prepared to properly expose the desired growth region to the analytical beam. Most analytical instruments, such as LA-ICPMS and ion and electron microprobes, have sample holders that accept only one to six slides or mounts at a time. We describe a method of mounting otoliths that allows for easy transfer of many otoliths to a single mount after they have been prepared. Such an approach increases the number of otoliths that can be analyzed in a single session by reducing the need to open the sample chamber to exchange slides, a particularly time-consuming step on instruments that operate under vacuum. For ion and electron microprobes, the method also greatly reduces the number of slides that must be coated with an electrical conductor prior to analysis. In this method, a narrow strip of cover glass is first glued at one end to a standard microscope slide. The otolith is then mounted in thermoplastic resin on the opposite, free end of the strip. The otolith can then be ground and flipped, if needed, by reheating the mounting medium. After otolith preparation is complete, the cover glass is cut with a scribe to free the otolith, and up to 20 small otoliths can be arranged on a single petrographic slide. © 2010 The Author(s).

  10. Beam transient analyses of Accelerator Driven Subcritical Reactors based on neutron transport method

    Energy Technology Data Exchange (ETDEWEB)

    He, Mingtao; Wu, Hongchun [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049, Shaanxi (China); Zheng, Youqi, E-mail: yqzheng@mail.xjtu.edu.cn [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049, Shaanxi (China); Wang, Kunpeng [Nuclear and Radiation Safety Center, PO Box 8088, Beijing 100082 (China); Li, Xunzhao; Zhou, Shengcheng [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049, Shaanxi (China)

    2015-12-15

    Highlights: • A transport-based kinetics code for Accelerator Driven Subcritical Reactors is developed. • The performance of different kinetics methods adapted to the ADSR is investigated. • The impacts of neutronic parameters deteriorating with fuel depletion are investigated. - Abstract: The Accelerator Driven Subcritical Reactor (ADSR) is almost entirely external-source dominated, since there is no additional reactivity control mechanism in most designs. This paper focuses on beam-induced transients, analysed with an in-house developed dynamic analysis code. The performance of different kinetics methods adapted to the ADSR is investigated, including the point kinetics approximation and space-time kinetics methods. Then, the transient responses to beam trip and beam overpower are calculated and analyzed for an ADSR design dedicated to minor actinide transmutation. The impacts of some safety-related neutronics parameters deteriorating with fuel depletion are also investigated. The results show that the power distribution varying with burnup leads to large differences in temperature response during transients, while the impacts of kinetic parameters and feedback coefficients are much less pronounced. Classification: Core physics.
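
    For orientation, a beam-trip transient of a source-driven subcritical core can be sketched with one-group delayed-neutron point kinetics plus an external source term; this is only the crudest of the kinetics options mentioned above, and the parameter values are illustrative rather than taken from the ADSR design in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

beta, lam, Lam = 0.0065, 0.08, 1.0e-5   # delayed fraction, precursor decay (1/s), generation time (s)
rho, S0 = -0.03, 1.0e5                  # static subcriticality (dk/k), external source strength

def rhs(t, y, source):
    n, c = y
    dn = (rho - beta) / Lam * n + lam * c + source(t)
    dc = beta / Lam * n - lam * c
    return [dn, dc]

# Source-driven equilibrium before the transient: n_eq = -S0 * Lam / rho.
n_eq = -S0 * Lam / rho
c_eq = beta * n_eq / (lam * Lam)

beam = lambda t: 0.0 if 1.0 <= t <= 2.0 else S0     # 1 s beam interruption
sol = solve_ivp(rhs, (0.0, 5.0), [n_eq, c_eq], args=(beam,), method="Radau", max_step=0.01)
i = np.searchsorted(sol.t, 2.0)
print("relative power at end of trip:", sol.y[0][i] / n_eq)
```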

  11. Three-dimensional finite element model for flexible pavement analyses based on field modulus measurements

    International Nuclear Information System (INIS)

    Lacey, G.; Thenoux, G.; Rodriguez-Roa, F.

    2008-01-01

    In accordance with the present development of empirical-mechanistic tools, this paper presents an alternative to traditional analysis methods for flexible pavements, using a three-dimensional finite element formulation based on a linear-elastic perfectly-plastic Drucker-Prager model for the granular soil layers and a linear-elastic stress-strain law for the asphalt layer. From the sensitivity analysis performed, it was found that variations of ±4° in the internal friction angle of the granular soil layers did not significantly affect the analyzed pavement response. On the other hand, a null dilation angle is conservatively proposed for design purposes. The use of a Light Falling Weight Deflectometer is also proposed as an effective and practical tool for on-site elastic modulus determination of granular soil layers. However, the stiffness value obtained from the tested layer should be corrected when the measured peak deflection and the peak force do not occur at the same time. In addition, some practical observations are given to achieve successful field measurements. The importance of using a 3D FE analysis to predict the maximum tensile strain at the bottom of the asphalt layer (related to pavement fatigue) and the maximum vertical compressive strain transmitted to the top of the granular soil layers (related to rutting) is also shown. (author)

  12. Determination of radioactive emission origins based on analyses of isotopic composition

    International Nuclear Information System (INIS)

    Devell, L.

    1987-01-01

    The nature of radioactive emissions can be determined with good precision through gamma spectroscopy of air samples, which means that the type of source of the emission may be identified, e.g. a nuclear weapons test or a nuclear power plant accident. Combined with information on wind trajectories, it is normally possible to determine the time and area of the emission. In this preliminary study, the knowledge of and preparedness for such measurements are described. (L.E.)

  13. The value of area-based analyses of donation patterns for recruitment strategies.

    Science.gov (United States)

    James, Adelbert B; Josephson, Cassandra D; Shaz, Beth H; Schreiber, George B; Hillyer, Christopher D; Roback, John D

    2014-12-01

    Lack of ready access to a donation site may be a potential barrier to blood donation or may influence donation frequency. In this study, we applied geographic analysis to blood donor behavior and the use of different donation sites. The study population consisted of blood donors who gave whole blood in Georgia between 2004 and 2008. The zip code, city, and county of each donor's residence were matched with the addresses of their donation sites. Donors were dichotomized as either nonmetro Atlanta or metro Atlanta residents. Six donation site categories were defined: donation within the same or a different zip code, within the same or a different city, and within the same or a different county. Logistic regression was used to compare donations by zip code, city, and county. The study population consisted of 402,692 blood donors who donated 1,147,442 whole blood units between 2004 and 2008, more than half of whom (56.4%) resided in the metro Atlanta area. The majority of donors were white (75.0%) and female (55.7%). In nonmetro Atlanta, repeat donors were more likely to have donated at fixed sites. Such area-based analyses of donation patterns can inform recruitment strategies. © 2014 AABB.
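
    The comparison of donation-site use by residence lends itself to a logistic regression of the kind mentioned above. The sketch below uses simulated donor records with hypothetical variable names, purely to show the model structure; it is not the Georgia data.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Simulated donor records (hypothetical): residence area, donor status, and whether
# the donation occurred in the donor's own zip code.
rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "metro_atlanta": rng.integers(0, 2, n),
    "repeat_donor":  rng.integers(0, 2, n),
})
logit = -0.2 + 0.5 * df["repeat_donor"] - 0.3 * df["metro_atlanta"]   # assumed effects
df["same_zip_donation"] = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

model = LogisticRegression().fit(df[["metro_atlanta", "repeat_donor"]], df["same_zip_donation"])
print(dict(zip(["metro_atlanta", "repeat_donor"], np.exp(model.coef_).ravel().round(2))))
```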

  14. Distribution of magnetic particulates in a roadside snowpack based on magnetic, microstructural and mineralogical analyses

    Science.gov (United States)

    Bućko, Michał S.; Mattila, Olli-Pekka; Chrobak, Artur; Ziółkowski, Grzegorz; Johanson, Bo; Čuda, Jan; Filip, Jan; Zbořil, Radek; Pesonen, Lauri J.; Leppäranta, Matti

    2013-10-01

    Vehicle traffic is at present one of the major sources of environmental pollution in urban areas. Magnetic parameters are successfully applied in environmental studies to obtain detailed information about the concentrations and quality of iron-bearing minerals. The general aim of this research was to investigate the magnetic, microstructural and mineralogical properties of dust extracted from the roadside snowpack accumulated on the side of an urban highway in northern Helsinki. Vertical snow profiles were taken at different distances (5, 10 and 15 m) from the road edge during the winter season 2010-2011. The temporal distribution of the mass magnetic susceptibility (χ) of the road dust shows that the concentration of magnetic particles increases in the snowpack during winter. The roadside snowpack preserves a large fraction of the magnetic particulate until the late stages of melting, and this could be considered one of the main factors responsible for the resuspension phenomenon observed in Nordic countries. The vertical distribution of χ and the SIRM (saturation isothermal remanent magnetization)/χ ratio may indicate the migration of magnetic particles down the snowpack during melting conditions. Ultrafine to coarse-grained (superparamagnetic to multidomain) magnetite was identified as the primary magnetic mineral in all the studied road dust samples. The examined road dust contains a significant amount of dia/paramagnetic minerals (e.g. quartz, albite, biotite), and the content of magnetite is relatively low (below 1 weight percent, wt%). The roadside snowpack is enriched in anthropogenic particles such as angular and spherical iron oxides, tungsten-rich particles and sodium chloride. This study demonstrates the suitability of snow as an efficient collecting medium for magnetic particulates generated by anthropogenic activities.

  15. Analyses of Research Topics in the Field of Informetrics Based on the Method of Topic Modeling

    OpenAIRE

    Sung-Chien Lin

    2014-01-01

    In this study, we used the approach of topic modeling to uncover the possible structure of research topics in the field of Informetrics, to explore the distribution of the topics over the years, and to compare the core journals. In order to infer the structure of the topics in the field, data on the papers published in the Journal of Informetrics and Scientometrics from 2007 to 2013 were retrieved from the Web of Science database as input for the topic modeling. The results ...
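
    A minimal topic-modeling pass of the sort described above, using latent Dirichlet allocation from scikit-learn on a toy corpus; the documents below stand in for the Web of Science records and are not the study's data.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [  # toy stand-ins for paper titles/abstracts
    "citation analysis of journal impact factors",
    "h-index and researcher productivity indicators",
    "co-authorship networks and collaboration patterns",
    "altmetrics and social media mentions of articles",
    "citation networks and co-citation clustering",
    "journal ranking based on citation indicators",
]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

terms = vec.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    print(f"topic {k}:", ", ".join(terms[weights.argsort()[::-1][:5]]))

doc_topics = lda.transform(X)   # per-document topic mixture, e.g. for tracking topics over years
```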

  16. Structural and functional analyses of human cerebral cortex using a surface-based atlas

    Science.gov (United States)

    Van Essen, D. C.; Drury, H. A.

    1997-01-01

    We have analyzed the geometry, geography, and functional organization of human cerebral cortex using surface reconstructions and cortical flat maps of the left and right hemispheres generated from a digital atlas (the Visible Man). The total surface area of the reconstructed Visible Man neocortex is 1570 cm2 (both hemispheres), approximately 70% of which is buried in sulci. By linking the Visible Man cerebrum to the Talairach stereotaxic coordinate space, the locations of activation foci reported in neuroimaging studies can be readily visualized in relation to the cortical surface. The associated spatial uncertainty was empirically shown to have a radius in three dimensions of approximately 10 mm. Application of this approach to studies of visual cortex reveals the overall patterns of activation associated with different aspects of visual function and the relationship of these patterns to topographically organized visual areas. Our analysis supports a distinction between an anterior region in ventral occipito-temporal cortex that is selectively involved in form processing and a more posterior region (in or near areas VP and V4v) involved in both form and color processing. Foci associated with motion processing are mainly concentrated in a region along the occipito-temporal junction, the ventral portion of which overlaps with foci also implicated in form processing. Comparisons between flat maps of human and macaque monkey cerebral cortex indicate significant differences as well as many similarities in the relative sizes and positions of cortical regions known or suspected to be homologous in the two species.

  17. Back-Analyses of Landfill Instability Induced by High Water Level: Case Study of Shenzhen Landfill

    Directory of Open Access Journals (Sweden)

    Ren Peng

    2016-01-01

    Full Text Available In June 2008, the Shenzhen landfill slope failed. This case is used as an example to study the deformation characteristics and failure mode of a slope induced by high water levels. An integrated monitoring system, including water level gauges, electronic total stations, and inclinometers, was used to monitor the slope failure process. The field measurements suggest that the landfill landslide was caused by a deep slip along the weak interface of the composite liner system at the base of the landfill. The high water level is considered to be the main factor that caused this failure. To calculate the relative interface shear displacements in the geosynthetic multilayer liner system, a series of numerical direct shear tests were carried out. Based on the numerical results, the composite lining system was simplified, and the centrifuge modeling technique was used to quantitatively evaluate the effect of water levels on landfill instability.

  18. Back-Analyses of Landfill Instability Induced by High Water Level: Case Study of Shenzhen Landfill

    Science.gov (United States)

    Peng, Ren; Hou, Yujing; Zhan, Liangtong; Yao, Yangping

    2016-01-01

    In June 2008, the Shenzhen landfill slope failed. This case is used as an example to study the deformation characteristics and failure mode of a slope induced by high water levels. An integrated monitoring system, including water level gauges, electronic total stations, and inclinometers, was used to monitor the slope failure process. The field measurements suggest that the landfill landslide was caused by a deep slip along the weak interface of the composite liner system at the base of the landfill. The high water level is considered to be the main factor that caused this failure. To calculate the relative interface shear displacements in the geosynthetic multilayer liner system, a series of numerical direct shear tests were carried out. Based on the numerical results, the composite lining system was simplified, and the centrifuge modeling technique was used to quantitatively evaluate the effect of water levels on landfill instability. PMID:26771627

  19. Investigation of a wet ethanol operated HCCI engine based on first and second law analyses

    International Nuclear Information System (INIS)

    Khaliq, Abdul; Trivedi, Shailesh K.; Dincer, Ibrahim

    2011-01-01

    are in the HCCI engine (around 89%) followed by fuel vaporizer (4.9%) and catalytic converter (4.5%). → Based on simulation results, it is found that second law efficiency of wet ethanol operated HCCI engine is higher than the pure ethanol fuelled HCCI engine.

  20. LWR safety studies. Analyses and further assessments relating to the German Risk Assessment Study on Nuclear Power Plants. Vol. 1

    International Nuclear Information System (INIS)

    1983-01-01

    This documentation of the activities of the Oeko-Institut is intended to show errors made and limits encountered in the experimental approaches and in results obtained by the work performed under phase A of the German Risk Assessment Study on Nuclear Power Plants (DRS). Concern is expressed and explained relating to the risk definition used in the Study, and the results of other studies relied on; specific problems of methodology are discussed with regard to the value of fault-tree/accident analyses for describing the course of safety-related events, and to the evaluations presented in the DRS. The Markov model is explained as an approach offering alternative solutions. The identification and quantification of common-mode failures is discussed. Origin, quality and methods of assessing the reliability characteristics used in the DRS as well as the statistical models for describing failure scenarios of reactor components and systems are critically reviewed. (RF)

  1. Genetic architecture and bottleneck analyses of Salem Black goat breed based on microsatellite markers

    Directory of Open Access Journals (Sweden)

    A. K. Thiruvenkadan

    2014-09-01

    Full Text Available Aim: The present study was undertaken in the Salem Black goat population for genetic analysis at the molecular level, to exploit the breed for planning sustainable improvement, conservation and utilization, which can subsequently improve the livelihood of its stakeholders. Materials and Methods: Genomic DNA was isolated from blood samples of 50 unrelated Salem Black goats with typical phenotypic features from several villages in the breeding tract, and genetic characterization and bottleneck analysis were carried out using 25 microsatellite markers recommended by the Food and Agriculture Organization, Rome, Italy. The basic measures of genetic variation were computed using bioinformatic software. To evaluate the Salem Black goats for mutation-drift equilibrium, three tests were performed under three different mutation models, viz., the infinite allele model (IAM), stepwise mutation model (SMM) and two-phase model (TPM), and the observed gene diversity (He) and expected equilibrium gene diversity (Heq) were estimated under the different models of microsatellite evolution. Results: The study revealed that the observed number of alleles ranged from 4 (ETH10, ILSTS008) to 17 (BM64444), with a total of 213 alleles and a mean of 10.14±0.83 alleles across loci. The overall observed heterozygosity, expected heterozygosity, inbreeding estimate and polymorphism information content values were 0.631±0.041, 0.820±0.024, 0.233±0.044 and 0.786±0.023, respectively, indicating high genetic diversity. The average observed gene diversity (He) pooled over different markers was 0.829±0.024, and the average expected gene diversities under the IAM, TPM and SMM models were 0.769±0.026, 0.808±0.024 and 0.837±0.020, respectively. The numbers of loci found to exhibit gene diversity excess under the IAM, TPM and SMM models were 18, 17 and 12, respectively. Conclusion: All three statistical tests, viz., the sign test, standardized differences test and Wilcoxon sign rank test, revealed
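
    The diversity statistics quoted above (expected heterozygosity and polymorphism information content) are simple functions of the allele frequencies at each locus; a sketch with hypothetical allele counts follows.

```python
import numpy as np

def expected_heterozygosity(allele_counts):
    """Nei's unbiased expected heterozygosity for one locus from allele copy counts."""
    counts = np.asarray(allele_counts, dtype=float)
    n = counts.sum()                      # number of allele copies sampled
    p = counts / n
    return n / (n - 1.0) * (1.0 - np.sum(p ** 2))

def pic(allele_counts):
    """Polymorphism information content (Botstein et al. 1980)."""
    p = np.asarray(allele_counts, dtype=float)
    p = p / p.sum()
    cross = sum(2.0 * p[i] ** 2 * p[j] ** 2
                for i in range(len(p)) for j in range(i + 1, len(p)))
    return 1.0 - np.sum(p ** 2) - cross

counts = [12, 30, 8, 25, 25]   # hypothetical allele counts at one microsatellite locus
print(round(expected_heterozygosity(counts), 3), round(pic(counts), 3))
```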

  2. A comparative study of cold- and warm-adapted Endonucleases A using sequence analyses and molecular dynamics simulations.

    Directory of Open Access Journals (Sweden)

    Davide Michetti

    Full Text Available The psychrophilic and mesophilic endonucleases A (EndA) from Aliivibrio salmonicida (VsEndA) and Vibrio cholerae (VcEndA) have been studied experimentally in terms of the biophysical properties related to thermal adaptation. The analysis of their static X-ray structures was not sufficient to rationalize the determinants of their adaptive traits at the molecular level. Thus, we used Molecular Dynamics (MD) simulations to compare the two proteins and unveil their structural and dynamical differences. Our simulations did not show a substantial increase in flexibility in the cold-adapted variant on the nanosecond time scale. The only exception is a more rigid C-terminal region in VcEndA, which is ascribable to a cluster of electrostatic interactions and hydrogen bonds, as also supported by MD simulations of the VsEndA mutant variant in which the cluster of interactions was introduced. Moreover, we identified three additional amino acid substitutions through multiple sequence alignment and the analysis of MD-based protein structure networks. In particular, T120V occurs in the proximity of the catalytic residue H80 and alters the interaction with the residue Y43, which belongs to the second coordination sphere of the Mg2+ ion. This makes T120V an amenable candidate for future experimental mutagenesis.

  3. Comparison of Internal Fixations for Distal Clavicular Fractures Based on Loading Tests and Finite Element Analyses

    Directory of Open Access Journals (Sweden)

    Rina Sakai

    2014-01-01

    Full Text Available It is difficult to apply strong and stable internal fixation to a fracture of the distal end of the clavicle because it is unstable, the distal clavicle fragment is small, and the fractured region is near the acromioclavicular joint. In this study, to identify a superior internal fixation method for unstable distal clavicular fracture, we compared three types of internal fixation (tension band wiring, scorpion, and LCP clavicle hook plate). Firstly, loading tests were performed, in which fixations were evaluated using bending stiffness and torsional stiffness as indices, followed by finite element analysis to evaluate fixability using stress and strain as indices. The bending and torsional stiffness were significantly higher in the artificial clavicles fixed with the two types of plate than in those fixed by tension band wiring (P<0.05). No marked stress concentration on the clavicle was noted with the scorpion because the arm plate did not interfere with the acromioclavicular joint, suggesting that favorable shoulder joint function can be achieved. The stability of fixation with the LCP clavicle hook plate and the scorpion was similar, and both plate fixations were stronger than fixation by tension band wiring.

  4. Soft-computing base analyses of the relationship between annoyance and coping with noise and odor.

    Science.gov (United States)

    Botteldooren, Dick; Lercher, Peter

    2004-06-01

    The majority of research on annoyance as an important impact of noise, odor, and other stressors on man, has regarded the person as a passive receptor. It was however recognized that this person is an active participant trying to alter a troubled person-environment relationship or to sustain a desirable one. Coping has to be incorporated. This is of particular importance in changing exposure situations. For large populations a lot of insight can be gained by looking at average effects only. To investigate changes in annoyance and effects of coping, the individual or small group has to be studied. Then it becomes imperative to recognize the inherent vagueness in perception and human behavior. Fortunately, tools have been developed over the past decades that allow doing this in a mathematically precise way. These tools are sometimes referred to by the common label: soft-computing, hence the title of this paper. This work revealed different styles of coping both by blind clustering and by (fuzzy) logical aggregation of different actions reported in a survey. The relationship between annoyance and the intensity of coping it generates was quantified after it was recognized that the possibility for coping is created by the presence of the stressor rather than the actual fact of coping. It was further proven that refinement of this relationship is possible if a person can be identified as a coper. This personal factor can be extracted from a known reaction to one stressor and be used for predicting coping intensity and style in another situation. The effect of coping on a perceived change in annoyance is quantified by a set of fuzzy linguistic rules. This closes the loop that is responsible for at least some of the dynamics of the response to a stressor. This work thus provides all essential building blocks for designing models for annoyance in changing environments.

  5. Analysing Regional Land Surface Temperature Changes by Satellite Data, a Case Study of Zonguldak, Turkey

    Science.gov (United States)

    Sekertekin, A.; Kutoglu, S.; Kaya, S.; Marangoz, A. M.

    2014-12-01

    In recent years, climate change has become one of the most important problems facing the ecological systems of the world. Global warming and climate change are studied by many disciplines all over the world, and Geomatics Engineering contributes to such studies by means of remote sensing, global positioning systems and related techniques. Monitoring Land Surface Temperature (LST) via remote sensing satellites is one of the most important of these contributions to climatology. LST is an important parameter governing the energy balance of the Earth, and several algorithms exist to obtain LST by remote sensing techniques. The most commonly used are the split-window algorithm, the temperature/emissivity separation method, the mono-window algorithm and the single-channel method. For Landsat 5 TM data, three algorithms are generally used: the radiative transfer equation method, the single-channel method and the mono-window algorithm. The radiative transfer equation method is not practical here because atmospheric parameters must be measured in situ during the satellite pass. In this research, the mono-window algorithm was applied to a Landsat 5 TM image, with meteorological data such as humidity and temperature used as inputs to the algorithm. The acquisition date of the image is 28.08.2011 and the study area is Zonguldak, Turkey. High-resolution images were used to investigate the relationships between LST and land cover type. As a result of these analyses, areas with vegetation cover are approximately 5 °C cooler than the city center and arid land. Because different surface types such as reinforced concrete construction, green zones and sandbanks occur together in the city center, LST varies by about 10 °C within it. The temperature around some places in the thermal power plant region of Çatalağzı is about 5 °C higher than in the city center. Sandbanks and agricultural areas show the highest temperatures because of their land cover structure. Thanks to this
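
    A sketch of the processing chain described above for a single Landsat 5 TM band-6 pixel: convert radiance to brightness temperature, then apply the Qin et al. (2001) mono-window equation. The calibration constants and linearisation coefficients are the commonly quoted published values and should be checked against the sensor handbook; the input radiance, emissivity, transmittance and mean atmospheric temperature are illustrative.

```python
import numpy as np

K1, K2 = 607.76, 1260.56   # nominal Landsat 5 TM band-6 constants: W/(m^2 sr um), K

def brightness_temperature(radiance):
    """Top-of-atmosphere brightness temperature from band-6 spectral radiance."""
    return K2 / np.log(K1 / radiance + 1.0)

def mono_window_lst(t_sensor, emissivity, transmittance, t_atm_mean):
    """Mono-window LST (Qin et al. 2001); all temperatures in kelvin.
    a, b are the linearisation coefficients quoted for the 0-70 degC range."""
    a, b = -67.355351, 0.458606
    C = emissivity * transmittance
    D = (1.0 - transmittance) * (1.0 + (1.0 - emissivity) * transmittance)
    return (a * (1.0 - C - D) + (b * (1.0 - C - D) + C + D) * t_sensor - D * t_atm_mean) / C

t6 = brightness_temperature(9.5)                       # hypothetical band-6 radiance
lst = mono_window_lst(t6, emissivity=0.97, transmittance=0.80, t_atm_mean=290.0)
print(f"LST = {lst - 273.15:.1f} degC")
```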

  6. International competence and knowledge studies and attitudes of the Brazilian Management accountant: analyses and reflections

    Directory of Open Access Journals (Sweden)

    Ricardo Lopes Cardoso

    2010-01-01

    Full Text Available The main purpose of this study is to understand what the competences of the management accountant are, to compare them with international studies, and to assess whether there are competences that should be prioritized. This question is motivated by the positions of Hardern (1995), Morgan (1997), the IMA (1996, 1999) and IFAC (2003). The theoretical basis on competences draws on the studies of McClelland (1973, 1998), Boyatzis (1982) and Spencer and Spencer (1993). The research is based on 18 competences covering knowledge, skills and attitudes identified in the accounting literature, which were submitted to 200 management accountants. The data collection instrument presented a Cronbach's alpha of 0.884. From a factor analysis followed by Kruskal-Wallis tests, 12 competences emerged as the most relevant, grouped into 3 factors; of the nine competences shared with international studies, 4 were not considered relevant in the statistical tests and only one should be prioritized. The results demonstrate differences between the competences required of Brazilian management accountants and those required in other countries; the reasons for these differences remain an open question.
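
    Cronbach's alpha, the internal-consistency statistic quoted above, is straightforward to compute from a respondents-by-items score matrix; the ratings below are hypothetical.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents x n_items) matrix of item scores."""
    X = np.asarray(scores, dtype=float)
    k = X.shape[1]
    item_variances = X.var(axis=0, ddof=1)
    total_variance = X.sum(axis=1).var(ddof=1)
    return k / (k - 1.0) * (1.0 - item_variances.sum() / total_variance)

# Hypothetical 5-point ratings from six respondents on four items of the instrument.
ratings = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 5],
    [3, 2, 3, 3],
])
print(round(cronbach_alpha(ratings), 3))
```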

  7. Palaeohydrology of the Southwest Yukon Territory, Canada, based on multiproxy analyses of lake sediment cores from a depth transect

    Science.gov (United States)

    Anderson, L.; Abbott, M.B.; Finney, B.P.; Edwards, M.E.

    2005-01-01

    Lake-level variations at Marcella Lake, a small, hydrologically closed lake in the southwestern Yukon Territory, document changes in effective moisture since the early Holocene. Former water levels, driven by regional palaeohydrology, were reconstructed by multiproxy analyses of sediment cores from four sites spanning shallow to deep water. Marcella Lake today is thermally stratified, being protected from wind by its position in a depression. It is alkaline and undergoes bio-induced calcification. Relative accumulations of calcium carbonate and organic matter at the sediment-water interface depend on the location of the depositional site relative to the thermocline. We relate lake-level fluctuations to down-core stratigraphic variations in composition, geochemistry, sedimentary structures and to the occurrence of unconformities in four cores based on observations of modern limnology and sedimentation processes. Twenty-four AMS radiocarbon dates on macrofossils and pollen provide the lake-level chronology. Prior to 10 000 cal. BP water levels were low, but then they rose to 3 to 4 m below modern levels. Between 7500 and 5000 cal. BP water levels were 5 to 6 m below modern but rose by 4000 cal. BP. Between 4000 and 2000 cal. BP they were higher than modern. During the last 2000 years, water levels were either near or 1 to 2 m below modern levels. Marcella Lake water-level fluctuations correspond with previously documented palaeoenvironmental and palaeoclimatic changes and provide new, independent effective moisture information. The improved geochronology and quantitative water-level estimates are a framework for more detailed studies in the southwest Yukon. © 2005 Edward Arnold (Publishers) Ltd.

  8. Subtypes of familial hemophagocytic lymphohistiocytosis in Japan based on genetic and functional analyses of cytotoxic T lymphocytes.

    Directory of Open Access Journals (Sweden)

    Kozo Nagai

    Full Text Available BACKGROUND: Familial hemophagocytic lymphohistiocytosis (FHL) is a rare disease of infancy or early childhood. To clarify the incidence and subtypes of FHL in Japan, we performed genetic and functional analyses of cytotoxic T lymphocytes (CTLs) in Japanese patients with FHL. DESIGN AND METHODS: Among the Japanese children with hemophagocytic lymphohistiocytosis (HLH) registered at our laboratory, those with more than one of the following findings were eligible for study entry under a diagnosis of FHL: positive for known genetic mutations, a family history of HLH, and impaired CTL-mediated cytotoxicity. Mutations of the newly identified causative gene for FHL5, STXBP2, and the cytotoxicity and degranulation activity of CTLs in FHL patients were analyzed. RESULTS: Among 31 FHL patients who satisfied the above criteria, a PRF1 mutation was detected in 17 (FHL2) and a UNC13D mutation in 10 (FHL3). In 2 other patients, 3 novel mutations of the STXBP2 gene were confirmed (FHL5). Finally, the remaining 2 were classified as having FHL with unknown genetic mutations. In all FHL patients, CTL-mediated cytotoxicity was low or deficient, and degranulation activity was also low or absent except in FHL2 patients. In the 2 patients with unknown genetic mutations, the cytotoxicity and degranulation activity of CTLs appeared to be deficient in one patient and moderately impaired in the other. CONCLUSIONS: FHL can be diagnosed and classified on the basis of CTL-mediated cytotoxicity, degranulation activity, and genetic analysis. Based on the data obtained from functional analysis of CTLs, other unknown gene(s) responsible for FHL remain to be identified.

  9. Period Study and Analyses of 2017 Observations of the Totally Eclipsing, Solar Type Binary, MT Camelopardalis

    Science.gov (United States)

    Faulkner, Danny R.; Samec, Ronald G.; Caton, Daniel B.

    2018-06-01

    We report here on a period study and the analysis of BVRcIc light curves (taken in 2017) of MT Cam (GSC03737-01085), which is a solar type (T ~ 5500K) eclipsing binary. D. Caton observed MT Cam on 05, 14, 15, 16, and 17 December 2017 with the 0.81-m reflector at Dark Sky Observatory. Six times of minimum light were calculated from four primary eclipses and two secondary eclipses:
    HJD I = 2458092.4937±0.0002, 2458102.74600±0.0021, 2458104.5769±0.0002, 2458104.9434±0.0029
    HJD II = 2458103.6610±0.0001, 2458104.7607±0.0020
    Six times of minimum light were also calculated from data taken by Terrell, Gross, and Cooney in their 2016 and 2004 observations (reported in IBVS #6166; TGC, hereafter). In addition, six more times of minimum light were taken from the literature. From all 18 times of minimum light, we determined the following light elements:
    JD Hel Min I = 2458102.7460(4) + 0.36613937(5) E
    We found the orbital period was constant over the 14 years spanning all observations. We note that TGC found a slightly increasing period. However, our results were obtained from a period study rather than a comparison of observations from only two epochs by the Wilson-Devinney (W-D) Program. A BVRcIc Johnson-Cousins filtered simultaneous W-D Program solution gives a mass ratio (0.3385±0.0014) very nearly the same as TGC's (0.347±0.003), and a component temperature difference of only ~40 K. As with TGC, no spot was needed in the modeling. Our modeling (beginning with Binary Maker 3.0 fits) was done without prior knowledge of TGC's. This shows the agreement achieved when independent analyses are done with the W-D code. The present observations were taken 1.8 years later than the last curves by TGC, so some variation is expected. The Roche lobe fill-out of the binary is ~13% and the inclination is ~83.5 degrees. The system is a shallow contact W-type W UMa binary, albeit the amplitudes of the primary and secondary eclipses are very nearly identical. An eclipse duration of ~21
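
    The linear light elements quoted above come from a straight-line fit of the times of minimum against cycle number. The sketch below fits only the four primary minima listed in the abstract; the cycle numbers are inferred here from the quoted period for illustration, and the residuals (O-C values) are what a period-change search would examine.

```python
import numpy as np

# Four primary times of minimum light (HJD) from the 2017 observations, with epoch
# (cycle) numbers assigned here by dividing the time offsets by the quoted period.
E    = np.array([0, 28, 33, 34], dtype=float)
tmin = np.array([2458092.4937, 2458102.7460, 2458104.5769, 2458104.9434])

A = np.vstack([np.ones_like(E), E]).T          # model: T_min = T0 + P * E
(T0, P), *_ = np.linalg.lstsq(A, tmin, rcond=None)
print(f"T0 = {T0:.5f}  P = {P:.8f} d")

oc = tmin - (T0 + P * E)                       # O-C residuals; a trend would indicate period change
print(oc)
```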

  10. Municipal solid waste composition: Sampling methodology, statistical analyses, and case study evaluation

    International Nuclear Information System (INIS)

    Edjabou, Maklawe Essonanawe; Jensen, Morten Bang; Götze, Ramona; Pivnenko, Kostyantyn; Petersen, Claus; Scheutz, Charlotte; Astrup, Thomas Fruergaard

    2015-01-01

    Highlights: • Tiered approach to waste sorting ensures flexibility and facilitates comparison of solid waste composition data. • Food and miscellaneous wastes are the main fractions contributing to the residual household waste. • Separation of food packaging from food leftovers during sorting is not critical for determination of the solid waste composition. - Abstract: Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in literature. This limits both comparability and applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single and multi-family house areas). In total 17 tonnes of waste were sorted into 10–50 waste fractions, organised according to a three-level (tiered approach) facilitating comparison of the waste data between individual sub-areas with different fractionation (waste from one municipality was sorted at “Level III”, e.g. detailed, while the two others were sorted only at “Level I”). The results showed that residual household waste mainly contained food waste (42 ± 5%, mass per wet basis) and miscellaneous combustibles (18 ± 3%, mass per wet basis). The residual household waste generation rate in the study areas was 3–4 kg per person per week. Statistical analyses revealed that the waste composition was independent of variations in the waste generation rate. Both, waste composition and waste generation rates were statistically similar for each of the three municipalities. While the waste generation rates were similar for each of the two housing types (single

  11. Municipal solid waste composition: Sampling methodology, statistical analyses, and case study evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Edjabou, Maklawe Essonanawe, E-mail: vine@env.dtu.dk [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark); Jensen, Morten Bang; Götze, Ramona; Pivnenko, Kostyantyn [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark); Petersen, Claus [Econet AS, Omøgade 8, 2.sal, 2100 Copenhagen (Denmark); Scheutz, Charlotte; Astrup, Thomas Fruergaard [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark)

    2015-02-15

    Highlights: • Tiered approach to waste sorting ensures flexibility and facilitates comparison of solid waste composition data. • Food and miscellaneous wastes are the main fractions contributing to the residual household waste. • Separation of food packaging from food leftovers during sorting is not critical for determination of the solid waste composition. - Abstract: Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in literature. This limits both comparability and applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single and multi-family house areas). In total 17 tonnes of waste were sorted into 10–50 waste fractions, organised according to a three-level (tiered approach) facilitating comparison of the waste data between individual sub-areas with different fractionation (waste from one municipality was sorted at “Level III”, e.g. detailed, while the two others were sorted only at “Level I”). The results showed that residual household waste mainly contained food waste (42 ± 5%, mass per wet basis) and miscellaneous combustibles (18 ± 3%, mass per wet basis). The residual household waste generation rate in the study areas was 3–4 kg per person per week. Statistical analyses revealed that the waste composition was independent of variations in the waste generation rate. Both, waste composition and waste generation rates were statistically similar for each of the three municipalities. While the waste generation rates were similar for each of the two housing types (single

  12. Studies analysing the need for health-related information in Germany - a systematic review.

    Science.gov (United States)

    Pieper, Dawid; Jülich, Fabian; Antoine, Sunya-Lee; Bächle, Christina; Chernyak, Nadja; Genz, Jutta; Eikermann, Michaela; Icks, Andrea

    2015-09-23

    Exploring health-related information needs is necessary to better tailor information. However, there is a lack of systematic knowledge on how and in which groups information needs have been assessed, and which information needs have been identified. We aimed to assess the methodology of the studies used to assess information needs, as well as the topics and extent of health-related information needs and associated factors in Germany. A systematic search was performed in Medline, Embase, PsycINFO, and all databases of the Cochrane Library. All studies investigating health-related information needs in patients, relatives, and the general population in Germany that were published between 2000 and 2012 in German or English were included. Descriptive content analysis was based on predefined categories. We identified 19 studies. Most studies addressed cancer or rheumatic disease. The methods used were highly heterogeneous. Apart from common topics such as treatment, diagnosis, prevention and health promotion, etiology and prognosis, high interest ratings were also found for more specific topics such as complementary and alternative medicine or nutrition. Information needs were notable in all surveyed patient groups, relatives, and samples of the general population. Younger age, shorter duration of illness, poorer health status and higher anxiety and depression scores appeared to be associated with higher information needs. Knowledge about information needs is still scarce. Assuming the importance of comprehensive information to enable people to participate in health-related decisions, further systematic research is required.

  13. Design and development of microcontroller-based clinical chemistry analyser for measurement of various blood biochemistry parameters.

    Science.gov (United States)

    Taneja, S R; Gupta, R C; Kumar, Jagdish; Thariyan, K K; Verma, Sanjeev

    2005-01-01

    The clinical chemistry analyser is a high-performance microcontroller-based photometric biochemical analyser designed to measure various blood biochemistry parameters such as blood glucose, urea, protein, bilirubin, and so forth, and also to measure and observe enzyme activity over time while performing other biochemical tests such as ALT (alanine aminotransferase), amylase, AST (aspartate aminotransferase), and so forth. These tests are of great significance in biochemistry and are used for diagnostic purposes and for classifying various disorders and diseases such as diabetes, liver malfunction, renal diseases, and so forth. An inexpensive clinical chemistry analyser developed by the authors is described in this paper. It is an open system in which any reagent kit available in the market can be used. The system is based on the principle of absorbance-transmittance photometry. The system design is built around an 80C31 microcontroller with RAM, EPROM, and peripheral interface devices. The developed system incorporates a light source, an optical module, interference filters of various wavelengths, a Peltier device for maintaining the required temperature of the mixture in the flow cell, a peristaltic pump for sample aspiration, a graphic LCD for displaying blood parameters, patients' test results and kinetic test graphs, a 40-column mini thermal printer, and a 32-key keyboard for executing various functions. Laboratory tests conducted on the instrument addressed the versatility of the analyser, the flexibility of the software, and the treatment of samples. The prototype was tested and evaluated successfully on over 1000 blood samples for seventeen blood parameters. The evaluation was carried out at the Department of Biochemistry, Government Medical College and Hospital. The test results were found to be comparable with those of other standard instruments.
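
    The photometric core of such an analyser reduces to a few lines: Beer-Lambert absorbance from transmitted intensities, an end-point calculation against a standard for substrates such as glucose, and a rate-times-factor calculation for kinetic enzyme tests. The counts, standard concentration and kit factor below are hypothetical.

```python
import math

def absorbance(i_sample, i_blank):
    """Absorbance from transmitted intensities (Beer-Lambert: A = -log10 T)."""
    return -math.log10(i_sample / i_blank)

def end_point_concentration(a_sample, a_standard, c_standard):
    """End-point method: concentration scales linearly with absorbance."""
    return a_sample / a_standard * c_standard

def kinetic_activity(delta_a_per_min, kit_factor):
    """Kinetic (enzyme) method: activity = rate of absorbance change x kit factor."""
    return delta_a_per_min * kit_factor

a_std = absorbance(6200, 10000)       # hypothetical photometer counts for a standard
a_smp = absorbance(7100, 10000)       # and for a patient sample
print("glucose (mg/dL):", round(end_point_concentration(a_smp, a_std, 100.0), 1))
print("ALT (U/L):", round(kinetic_activity(0.025, 1745.0), 1))   # 1745 is an assumed kit factor
```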

  14. Using meta-analytic path analysis to test theoretical predictions in health behavior: An illustration based on meta-analyses of the theory of planned behavior.

    Science.gov (United States)

    Hagger, Martin S; Chan, Derwin K C; Protogerou, Cleo; Chatzisarantis, Nikos L D

    2016-08-01

    Synthesizing research on social cognitive theories applied to health behavior is an important step in the development of an evidence base of psychological factors as targets for effective behavioral interventions. However, few meta-analyses of research on social cognitive theories in health contexts have conducted simultaneous tests of theoretically-stipulated pattern effects using path analysis. We argue that conducting path analyses of meta-analytic effects among constructs from social cognitive theories is important to test nomological validity, account for mediation effects, and evaluate unique effects of theory constructs independent of past behavior. We illustrate our points by conducting new analyses of two meta-analyses of a popular theory applied to health behaviors, the theory of planned behavior. We conducted meta-analytic path analyses of the theory in two behavioral contexts (alcohol and dietary behaviors) using data from the primary studies included in the original meta-analyses augmented to include intercorrelations among constructs and relations with past behavior missing from the original analysis. Findings supported the nomological validity of the theory and its hypotheses for both behaviors, confirmed important model processes through mediation analysis, demonstrated the attenuating effect of past behavior on theory relations, and provided estimates of the unique effects of theory constructs independent of past behavior. Our analysis illustrates the importance of conducting a simultaneous test of theory-stipulated effects in meta-analyses of social cognitive theories applied to health behavior. We recommend researchers adopt this analytic procedure when synthesizing evidence across primary tests of social cognitive theories in health. Copyright © 2016 Elsevier Inc. All rights reserved.
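
    The meta-analytic path analysis described above amounts to estimating standardized path coefficients from a pooled correlation matrix. A minimal sketch follows; the correlations are hypothetical placeholders for the synthesized theory-of-planned-behavior matrix, and a full analysis would also carry the harmonic-mean sample size and fit statistics.

```python
import numpy as np

# Hypothetical pooled correlations, order: attitude, subjective norm, PBC, intention, behaviour.
R = np.array([
    [1.00, 0.45, 0.35, 0.55, 0.35],
    [0.45, 1.00, 0.30, 0.40, 0.25],
    [0.35, 0.30, 1.00, 0.45, 0.30],
    [0.55, 0.40, 0.45, 1.00, 0.50],
    [0.35, 0.25, 0.30, 0.50, 1.00],
])

def std_betas(R, predictors, outcome):
    """Standardized regression (path) coefficients from a correlation matrix."""
    Rxx = R[np.ix_(predictors, predictors)]
    rxy = R[np.ix_(predictors, [outcome])]
    return np.linalg.solve(Rxx, rxy).ravel()

b_int = std_betas(R, [0, 1, 2], 3)   # attitude, subjective norm, PBC -> intention
b_beh = std_betas(R, [2, 3], 4)      # PBC, intention -> behaviour
print(dict(zip(["att", "sn", "pbc"], b_int.round(3))), dict(zip(["pbc", "int"], b_beh.round(3))))
```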

  15. Candelariella placodizans (Candelariaceae) reported new to mainland China and Taiwan based on morphological, chemical and molecular phylogenetic analyses

    Directory of Open Access Journals (Sweden)

    Lidia Yakovchenko

    2016-06-01

    Full Text Available Candelariella placodizans is newly reported from China. It was collected on exposed rocks with mosses in alpine areas of Taiwan and Yunnan Province, China, at elevations between 3200 and 4400 m. Molecular phylogenetic analyses based on ITS rDNA sequences were also performed to confirm the monophyly of the Chinese populations with respect to already existing sequences of the species, and then further to examine their relationships to other members of the genus. An identification key to all 14 known taxa of Candelariella in China is provided.

  16. Exergy, exergoeconomic and environmental analyses and evolutionary algorithm based multi-objective optimization of combined cycle power plants

    International Nuclear Information System (INIS)

    Ahmadi, Pouria; Dincer, Ibrahim; Rosen, Marc A.

    2011-01-01

    A comprehensive exergy, exergoeconomic and environmental impact analysis and optimization is reported for several combined cycle power plants (CCPPs). In the first part, thermodynamic analyses based on energy and exergy of the CCPPs are performed, and the effect of supplementary firing on the natural gas-fired CCPP is investigated. This step includes the effect of supplementary firing on the performance of the bottoming cycle and on CO2 emissions, and utilizes the first and second laws of thermodynamics. In the second part, a multi-objective optimization is performed to determine the 'best' design parameters, accounting for exergetic, economic and environmental factors. The optimization considers three objective functions: CCPP exergy efficiency, total cost rate of the system products and CO2 emissions of the overall plant. The environmental impact in terms of CO2 emissions is integrated with the exergoeconomic objective function as a new objective function. The results of both exergy and exergoeconomic analyses show that the largest exergy destructions occur in the CCPP combustion chamber, and that increasing the gas turbine inlet temperature decreases the CCPP cost of exergy destruction. The optimization results demonstrate that CO2 emissions are reduced by selecting the best components and using a low fuel injection rate into the combustion chamber. -- Highlights: → Comprehensive thermodynamic modeling of a combined cycle power plant. → Exergy, economic and environmental analyses of the system. → Investigation of the role of multiobjective exergoenvironmental optimization as a tool for more environmentally-benign design.

  17. Using a laser-based CO2 carbon isotope analyser to investigate gas transfer in geological media

    International Nuclear Information System (INIS)

    Guillon, S.; Pili, E.; Agrinier, P.

    2012-01-01

    CO2 stable carbon isotopes are very attractive in environmental research to investigate both natural and anthropogenic carbon sources. Laser-based CO2 carbon isotope analysis provides continuous measurement at high temporal resolution and is a promising alternative to isotope ratio mass spectrometry (IRMS). We performed a thorough assessment of a commercially available CO2 Carbon Isotope Analyser (CCIA DLT-100, Los Gatos Research) that allows in situ measurement of δ13C in CO2. Using a set of reference gases of known CO2 concentration and carbon isotopic composition, we evaluated the precision, long-term stability, temperature sensitivity and concentration dependence of the analyser. Despite the good precision calculated from the Allan variance (5.0 ppm for CO2 concentration and 0.05 per thousand for δ13C at 60 s averaging), real performance is degraded by two main sources of error: temperature sensitivity and the dependence of δ13C on CO2 concentration. Data processing is required to correct for these errors. Following application of these corrections, we achieve an accuracy of 8.7 ppm for CO2 concentration and 1.3 per thousand for δ13C, which is poorer than mass spectrometry performance but still allows field applications. With this portable analyser we measured the CO2 flux degassed from rock in an underground tunnel. The obtained carbon isotopic composition agrees with IRMS measurements and can be used to identify the carbon source. (authors)
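
    The Allan-variance precision figures quoted above can be reproduced from a raw analyser time series with a short routine; the simulated white-noise record below only stands in for real instrument output.

```python
import numpy as np

def allan_deviation(x, dt, taus):
    """Non-overlapping Allan deviation of a regularly sampled signal x (step dt seconds)."""
    x = np.asarray(x, dtype=float)
    devs = []
    for tau in taus:
        m = max(1, int(round(tau / dt)))          # samples per averaging bin
        n_bins = x.size // m
        means = x[: n_bins * m].reshape(n_bins, m).mean(axis=1)
        devs.append(np.sqrt(0.5 * np.mean(np.diff(means) ** 2)))
    return np.array(devs)

rng = np.random.default_rng(0)
co2 = 400.0 + rng.normal(0.0, 0.5, 36000)         # simulated 1 Hz CO2 readings (ppm)
print(allan_deviation(co2, dt=1.0, taus=[1, 10, 60, 300]))   # white noise falls off as tau**-0.5
```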

  18. Structural changes in Parkinson's disease: voxel-based morphometry and diffusion tensor imaging analyses based on 123I-MIBG uptake.

    Science.gov (United States)

    Kikuchi, Kazufumi; Hiwatashi, Akio; Togao, Osamu; Yamashita, Koji; Somehara, Ryo; Kamei, Ryotaro; Baba, Shingo; Yamaguchi, Hiroo; Kira, Jun-Ichi; Honda, Hiroshi

    2017-12-01

    Patients with Parkinson's disease (PD) may exhibit symptoms of sympathetic dysfunction that can be measured using 123I-metaiodobenzylguanidine (MIBG) myocardial scintigraphy. We investigated the relationship between microstructural brain changes and 123I-MIBG uptake in patients with PD using voxel-based morphometry (VBM) and diffusion tensor imaging (DTI) analyses. This retrospective study included 24 patients with PD who underwent 3 T magnetic resonance imaging and 123I-MIBG scintigraphy. They were divided into two groups: 12 MIBG-positive and 12 MIBG-negative cases (10 men and 14 women; age range: 60-81 years, corrected for gender and age). The heart/mediastinum count (H/M) ratio was calculated on anterior planar 123I-MIBG images obtained 4 h post-injection. VBM and DTI were performed to detect structural differences between these two groups. Patients with a low H/M ratio had significantly reduced brain volume in the right inferior frontal gyrus. Patients with low H/M ratios also exhibited significantly lower fractional anisotropy than those with high H/M ratios. Key points: voxel-based morphometry can detect grey matter changes in Parkinson's disease, and diffusion tensor imaging can detect white matter changes in Parkinson's disease.

  19. Implication of the cause of differences in 3D structures of proteins with high sequence identity based on analyses of amino acid sequences and 3D structures.

    Science.gov (United States)

    Matsuoka, Masanari; Sugita, Masatake; Kikuchi, Takeshi

    2014-09-18

    Proteins that share a high sequence homology while exhibiting drastically different 3D structures are investigated in this study. Recently, artificial proteins related to the sequences of the GA and IgG binding GB domains of human serum albumin have been designed. These artificial proteins, referred to as GA and GB, share 98% amino acid sequence identity but exhibit different 3D structures, namely, a 3α bundle versus a 4β + α structure. Discriminating between their 3D structures based on their amino acid sequences is a very difficult problem. In the present work, in addition to using bioinformatics techniques, an analysis based on inter-residue average distance statistics is used to address this problem. Ordinary analyses such as BLAST and conservation analyses alone could not distinguish which structure a given sequence would adopt. However, combining them with the analysis based on inter-residue average distance statistics and our sequence tendency analysis made it possible to infer which parts play an important role in structural formation. The results suggest possible determinants of the different 3D structures for sequences with high sequence identity. The possibility of discriminating between the 3D structures based on the given sequences is also discussed.

  20. Sorption data bases for argillaceous rocks and bentonite for the provisional safety analyses for SGT-E2

    International Nuclear Information System (INIS)

    Baeyens, B.; Thoenen, T.; Bradbury, M. H.; Marques Fernandes, M.

    2014-11-01

    In Stage 1 of the Sectoral Plan for Deep Geological Repositories, four rock types have been identified as being suitable host rocks for a radioactive waste repository, namely, Opalinus Clay for a high-level (HLW) and a low- and intermediate-level (L/ILW) repository, and 'Brauner Dogger', Effingen Member and Helvetic Marls for a L/ILW repository. Sorption data bases (SDBs) for all of these host rocks are required for the provisional safety analyses, including all of the bounding porewater and mineralogical composition combinations. In addition, SDBs are needed for the rock formations lying below Opalinus Clay (lower confining units) and for the bentonite backfill in the HLW repository. In previous work, Bradbury et al. (2010) described a methodology for developing sorption data bases for argillaceous rocks and compacted bentonite. The main factors influencing sorption in such systems are the phyllosilicate mineral content, particularly the 2:1 clay mineral content (illite/smectite/illite-smectite mixed layers), and the water chemistry, which determines the radionuclide species in the aqueous phase. The source sorption data were taken predominantly from measurements on illite (or montmorillonite in the case of bentonite) and converted to the defined conditions in each system considered using a series of so-called conversion factors to take into account differences in mineralogy, in pH and in radionuclide speciation. Finally, a Lab → Field conversion factor was applied to adapt sorption data measured in dispersed systems (batch experiments) to intact rock under in-situ conditions. This methodology to develop sorption data bases has been applied to the selected host rocks, lower confining units and compacted bentonite, taking into account the mineralogical and porewater composition ranges defined. Confidence in the validity and correctness of this methodology has been built up through additional studies: (i) sorption values obtained in the manner
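
    The multiplicative conversion-factor chain described above reduces to a one-line calculation. The sketch below only illustrates the bookkeeping; the factor names follow the text, but every numerical value is a hypothetical placeholder rather than a value from the sorption data bases.

```python
# Sketch of the conversion-factor chain described above: a sorption distribution
# coefficient measured on illite (or montmorillonite) is scaled to in-situ host
# rock conditions. All numerical values are hypothetical placeholders.

def convert_kd(kd_source, cf_mineralogy, cf_ph, cf_speciation, cf_lab_to_field):
    """Apply the multiplicative conversion factors to a source Kd value [m3/kg]."""
    return kd_source * cf_mineralogy * cf_ph * cf_speciation * cf_lab_to_field

kd_illite = 0.5            # measured on illite in batch experiments (placeholder)
kd_host_rock = convert_kd(
    kd_illite,
    cf_mineralogy=0.35,    # lower 2:1 clay content in the host rock than in pure illite
    cf_ph=1.2,             # pH difference between source and in-situ porewater
    cf_speciation=0.8,     # different aqueous speciation in the host-rock porewater
    cf_lab_to_field=0.9,   # batch (dispersed) to intact-rock correction
)
print(f"converted Kd = {kd_host_rock:.3f} m3/kg")
```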

  1. Sorption data bases for argillaceous rocks and bentonite for the provisional safety analyses for SGT-E2

    Energy Technology Data Exchange (ETDEWEB)

    Baeyens, B.; Thoenen, T.; Bradbury, M. H.; Marques Fernandes, M.

    2014-11-15

    In Stage 1 of the Sectoral Plan for Deep Geological Repositories, four rock types have been identified as being suitable host rocks for a radioactive waste repository, namely, Opalinus Clay for a high-level (HLW) and a low- and intermediate-level (L/ILW) repository, and 'Brauner Dogger', Effingen Member and Helvetic Marls for a L/ILW repository. Sorption data bases (SDBs) for all of these host rocks are required for the provisional safety analyses, including all of the bounding porewater and mineralogical composition combinations. In addition, SDBs are needed for the rock formations lying below Opalinus Clay (lower confining units) and for the bentonite backfill in the HLW repository. In previous work, Bradbury et al. (2010) described a methodology for developing sorption data bases for argillaceous rocks and compacted bentonite. The main factors influencing sorption in such systems are the phyllosilicate mineral content, particularly the 2:1 clay mineral content (illite/smectite/illite-smectite mixed layers), and the water chemistry, which determines the radionuclide species in the aqueous phase. The source sorption data were taken predominantly from measurements on illite (or montmorillonite in the case of bentonite) and converted to the defined conditions in each system considered using a series of so-called conversion factors to take into account differences in mineralogy, in pH and in radionuclide speciation. Finally, a Lab → Field conversion factor was applied to adapt sorption data measured in dispersed systems (batch experiments) to intact rock under in-situ conditions. This methodology to develop sorption data bases has been applied to the selected host rocks, lower confining units and compacted bentonite, taking into account the mineralogical and porewater composition ranges defined. Confidence in the validity and correctness of this methodology has been built up through additional studies: (i) sorption values obtained in the manner

  2. Energy and exergy analyses of Photovoltaic/Thermal flat transpired collectors: Experimental and theoretical study

    International Nuclear Information System (INIS)

    Gholampour, Maysam; Ameri, Mehran

    2016-01-01

    Highlights: • A Photovoltaic/Thermal flat transpired collector was theoretically and experimentally studied. • Performance of the PV/Thermal flat transpired plate was evaluated using equivalent thermal, first-law, and second-law efficiencies. • Based on the actual exergy gain, a critical radiation level was defined and its effect was investigated. • As an appropriate tool, equivalent thermal efficiency was used to find the optimum suction velocity and PV coverage percentage. - Abstract: The PV/Thermal flat transpired plate is a kind of air-based hybrid Photovoltaic/Thermal (PV/T) system that concurrently produces both thermal and electrical energy. In order to develop and validate a predictive model and investigate the capabilities of the PV/Thermal flat transpired plate, a prototype was fabricated and tested under outdoor conditions at Shahid Bahonar University of Kerman in Kerman, Iran. To develop the mathematical model, correlations for the Nusselt numbers for the PV panel and the transpired plate were derived using a CFD technique. Good agreement was obtained between measured and simulated values, with the maximum relative root mean square percent deviation (RMSE) being 9.13% and the minimum correlation coefficient (R-squared) 0.92. Based on the critical radiation level defined in terms of the actual exergy gain, it was found that with proper fan and MPPT devices there is no concern about the critical radiation level. To provide a guideline for designers, optimum values for suction velocity and PV coverage percentage under different conditions were obtained using equivalent thermal efficiency as an appropriate tool.
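
    The two agreement metrics quoted above (relative RMSE in percent and the correlation coefficient R-squared) can be reproduced in a few lines. The measured and simulated values below are invented, and the relative-RMSE definition shown is one common choice, not necessarily the exact formula used by the authors.

```python
import numpy as np

# Sketch of the agreement metrics between measured and simulated collector outputs.
# Sample values are invented for illustration only.

measured  = np.array([41.2, 44.8, 47.5, 50.1, 52.9])
simulated = np.array([40.5, 45.6, 48.9, 49.2, 54.1])

# relative root mean square percent deviation (one common definition)
rel_rmse_pct = 100.0 * np.sqrt(np.mean(((simulated - measured) / measured) ** 2))
# coefficient of determination via the squared Pearson correlation
r_squared = np.corrcoef(measured, simulated)[0, 1] ** 2

print(f"relative RMSE = {rel_rmse_pct:.2f} %, R^2 = {r_squared:.3f}")
```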

  3. Analyses of robot systems using fault and event trees: case studies

    International Nuclear Information System (INIS)

    Khodabandehloo, Koorosh

    1996-01-01

    Safety in the use of robotics outside factories or processing plants has become a matter of great international concern. Domestic robots and those intended to assist nurses and surgeons in hospitals are examples of cases where safety and reliability are considered critical. The safe performance of robot systems depends on many factors, including the integrity of the robot's hardware and software, the way it communicates with sensory and other production equipment, the reliable function of the safety features present and the way the robot interacts with its environment. The use of systematic techniques such as Fault and Event Tree analysis to examine the safety and reliability of a given robotic system is presented. Considerable knowledge is needed before the application of such analysis techniques can be translated into safety specifications or indeed 'fail-safe' design features of robotic systems. The skill and understanding required for the formulation of such specifications is demonstrated here based on a number of case studies
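
    As a rough illustration of how fault-tree logic quantifies a robot hazard, the sketch below combines independent basic events through OR and AND gates. The events and probabilities are hypothetical placeholders, not figures from the case studies.

```python
# Minimal fault-tree quantification sketch for a robot safety function, assuming
# independent basic events: an OR gate (any cause defeats the function) and an
# AND gate (both redundant channels must fail). Probabilities are placeholders.

def or_gate(probs):
    """P(top) for an OR gate with independent inputs: 1 - prod(1 - p_i)."""
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def and_gate(probs):
    """P(top) for an AND gate with independent inputs: prod(p_i)."""
    q = 1.0
    for p in probs:
        q *= p
    return q

p_sensor_fail  = 1e-3   # hypothetical per-demand probabilities
p_software_bug = 5e-4
p_guard_a_fail = 2e-3
p_guard_b_fail = 2e-3

p_guard_system  = and_gate([p_guard_a_fail, p_guard_b_fail])          # redundant guards
p_unsafe_motion = or_gate([p_sensor_fail, p_software_bug, p_guard_system])
print(f"P(unsafe motion on demand) ~ {p_unsafe_motion:.2e}")
```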

  4. Disagreements in meta-analyses using outcomes measured on continuous or rating scales: observer agreement study

    DEFF Research Database (Denmark)

    Tendal, Britta; Higgins, Julian P T; Jüni, Peter

    2009-01-01

    difference (SMD), the protocols for the reviews and the trial reports (n=45) were retrieved. DATA EXTRACTION: Five experienced methodologists and five PhD students independently extracted data from the trial reports for calculation of the first SMD result in each review. The observers did not have access … to the reviews but to the protocols, where the relevant outcome was highlighted. The agreement was analysed at both trial and meta-analysis level, pairing the observers in all possible ways (45 pairs, yielding 2025 pairs of trials and 450 pairs of meta-analyses). Agreement was defined as SMDs that differed less … than 0.1 in their point estimates or confidence intervals. RESULTS: The agreement was 53% at trial level and 31% at meta-analysis level. Including all pairs, the median disagreement was SMD=0.22 (interquartile range 0.07-0.61). The experts agreed somewhat more than the PhD students at trial level (61…
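
    The quantity being extracted and the agreement rule can be made concrete with a small sketch: a standardised mean difference (here a simple Cohen's d with pooled SD) computed from two observers' extracted group summaries, compared against the 0.1 threshold on the point estimates. The group summaries are invented, and the review's exact SMD variant (e.g. Hedges' g) may differ.

```python
import math

# Standardised mean difference from two observers' data extractions and the
# "differ by less than 0.1" agreement rule. All group summaries are invented.

def smd(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Cohen's d with a pooled standard deviation."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

obs1 = smd(23.1, 8.2, 40, 27.5, 9.0, 38)   # observer 1's extraction
obs2 = smd(23.1, 8.2, 40, 27.9, 9.4, 38)   # observer 2 read slightly different values

agree = abs(obs1 - obs2) < 0.1
print(f"SMD1 = {obs1:.3f}, SMD2 = {obs2:.3f}, agreement: {agree}")
```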

  5. Manned systems utilization analysis (study 2.1). Volume 3: LOVES computer simulations, results, and analyses

    Science.gov (United States)

    Stricker, L. T.

    1975-01-01

    The LOVES computer program was employed to analyze the geosynchronous portion of NASA's 1973 automated satellite mission model from 1980 to 1990. The objectives of the analyses were: (1) to demonstrate the capability of the LOVES code to provide the depth and accuracy of data required to support the analyses; and (2) to trade off the concept of space servicing automated satellites composed of replaceable modules against the concept of replacing expendable satellites upon failure. The computer code proved to be an invaluable tool in analyzing the logistic requirements of the various test cases required in the tradeoff. The results indicate that the concept of space servicing offers the potential for substantial savings in the cost of operating automated satellite systems.

  6. Models and analyses for inertial-confinement fusion-reactor studies

    International Nuclear Information System (INIS)

    Bohachevsky, I.O.

    1981-05-01

    This report describes models and analyses devised at Los Alamos National Laboratory to determine the technical characteristics of different inertial confinement fusion (ICF) reactor elements required for component integration into a functional unit. We emphasize the generic properties of the different elements rather than specific designs. The topics discussed are general ICF reactor design considerations; reactor cavity phenomena, including the restoration of interpulse ambient conditions; first-wall temperature increases and material losses; reactor neutronics and hydrodynamic blanket response to neutron energy deposition; and analyses of loads and stresses in the reactor vessel walls, including remarks about the generation and propagation of very short wavelength stress waves. A discussion of analytic approaches useful in integrations and optimizations of ICF reactor systems concludes the report

  7. Scientometric analyses of studies on the role of innate variation in athletic performance.

    Science.gov (United States)

    Lombardo, Michael P; Emiah, Shadie

    2014-01-01

    Historical events have produced an ideologically charged atmosphere in the USA surrounding the potential influences of innate variation on athletic performance. We tested the hypothesis that scientific studies of the role of innate variation in athletic performance were less likely to have authors with USA addresses than addresses elsewhere because of this cultural milieu. Using scientometric data collected from 290 scientific papers published in peer-reviewed journals from 2000-2012, we compared the proportion of authors with USA addresses with the proportion listing addresses elsewhere on papers that studied the relationships between athletic performance and (a) prenatal exposure to androgens, as indicated by the ratio between digits 2 and 4, and (b) the genotypes for angiotensin converting enzyme, α-actinin-3, and myostatin; traits often associated with athletic performance. Authors with USA addresses were disproportionately underrepresented on papers about the role of innate variation in athletic performance. We searched NIH and NSF databases for grant proposals solicited or funded from 2000-2012 to determine if the proportion of authors that listed USA addresses was associated with funding patterns. NIH did not solicit grant proposals designed to examine these factors in the context of athletic performance, and neither NIH nor NSF funded grants designed to study these topics. We think the combined effects of a lack of government funding and the avoidance of controversial or non-fundable topics by USA-based scientists are responsible for the observation that authors with USA addresses were underrepresented on scientific papers examining the relationships between athletic performance and innate variation.

  8. Seismic criteria studies and analyses. Quarterly progress report No. 3. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    1975-01-03

    Information is presented concerning the extent to which vibratory motions at the subsurface foundation level might differ from motions at the ground surface and the effects of the various subsurface materials on the overall Clinch River Breeder Reactor site response; seismic analyses of LMFBR type reactors to establish analytical procedures for predicting structure stresses and deformations; and aspects of the current technology regarding the representation of energy losses in nuclear power plants as equivalent viscous damping.

  9. Radiological analyses of France Telecom surge arresters. Study performed for the CGT FAPT Cantal

    International Nuclear Information System (INIS)

    2010-02-01

    This document reports the radiological characterization of various versions of surge arresters used in the past to protect telephone lines against over-voltages. These devices, which use various radioactive materials, were assessed by gamma radiation flux measurements, alpha-beta-gamma count rate measurements, dose rate measurements, gamma spectrometry analyses, tritium emanation tests, radon-222 emanation tests and smear tests. Recommendations are formulated for managing radioactive surge arresters that are still in service.

  10. Selected problems and results of the transient event and reliability analyses for the German safety study

    International Nuclear Information System (INIS)

    Hoertner, H.

    1977-01-01

    For the investigation of the risk of nuclear power plants, loss-of-coolant accidents and transients have to be analyzed. The different functions of the engineered safety features installed to cope with transients are explained. The event tree analysis is carried out for the important transient 'loss of normal onsite power'. Preliminary results of the reliability analyses performed for the quantitative evaluation of this event tree are shown. (orig.) [de
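
    Quantifying an event tree of the kind described above amounts to multiplying the initiating-event frequency along each branch. The sketch below uses a deliberately simplified two-branch tree with hypothetical numbers, not values from the German safety study.

```python
# Simplified event-tree quantification for a transient initiating event:
# each accident sequence frequency is the initiating-event frequency multiplied
# by the success/failure probabilities along its branch. Numbers are placeholders.

ie_frequency = 0.3                      # initiating events per reactor-year (placeholder)
p_emergency_power_fails = 2e-3          # branch-point failure probabilities (placeholders)
p_heat_removal_fails = 5e-3

sequences = {
    "emergency power ok":                 ie_frequency * (1 - p_emergency_power_fails),
    "power fails, heat removal ok":       ie_frequency * p_emergency_power_fails
                                          * (1 - p_heat_removal_fails),
    "power fails, heat removal fails":    ie_frequency * p_emergency_power_fails
                                          * p_heat_removal_fails,
}
for name, freq in sequences.items():
    print(f"{name}: {freq:.2e} per year")
```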

  11. Acquisition, Analyses and Interpretation of fMRI Data: A Study on the Effective Connectivity in Human Primary Auditory Cortices

    International Nuclear Information System (INIS)

    Ahmad Nazlim Yusoff; Mazlyfarina Mohamad; Khairiah Abdul Hamid

    2011-01-01

    A study on the effective connectivity characteristics in auditory cortices was conducted on five healthy Malay male subjects aged 20 to 40 years using functional magnetic resonance imaging (fMRI), statistical parametric mapping (SPM5) and dynamic causal modelling (DCM). A silent imaging paradigm was used to reduce the scanner sound artefacts on functional images. The subjects were instructed to pay attention to the white noise stimulus given binaurally at an intensity level 70 dB above the hearing level for normal people. Functional specialisation was studied using the Matlab-based SPM5 software by means of fixed effects (FFX), random effects (RFX) and conjunction analyses. Individual analyses on all subjects indicate asymmetrical bilateral activation between the left and right auditory cortices in Brodmann areas (BA) 22, 41 and 42, involving the primary and secondary auditory cortices. The three auditory areas in the right and left auditory cortices were selected for the determination of effective connectivity by constructing 9 network models. The effective connectivity was determined for four out of five subjects; the remaining subject was excluded because the BA22 coordinates were located too far from the BA22 coordinates obtained from the group analysis. DCM results showed the existence of effective connectivity between the three selected auditory areas in both auditory cortices. In the right auditory cortex, BA42 is identified as the input centre with unidirectional parallel effective connectivities BA42→BA41 and BA42→BA22. However, for the left auditory cortex, the input is BA41 with unidirectional parallel effective connectivities BA41→BA42 and BA41→BA22. The connectivity between the activated auditory areas suggests the existence of a signal pathway in the auditory cortices even when the subject is listening to noise. (author)

  12. Extreme temperature indices analyses: A case study of five meteorological stations in Peninsular Malaysia

    Science.gov (United States)

    Hasan, Husna; Salleh, Nur Hanim Mohd

    2015-10-01

    Extreme temperature events affect many human and natural systems. Changes in extreme temperature events can be detected and monitored by developing indices based on the extreme temperature data. As an effort to provide an understanding of these changes to the public, a study of extreme temperature indices is conducted at five meteorological stations in Peninsular Malaysia. In this study, changes in the means and extreme events of temperature are assessed and compared using the daily maximum and minimum temperature data for the period 2004 to 2013. The absolute extreme temperature indices TXx, TXn, TNx and TNn provided by the Expert Team on Climate Change Detection and Indices (ETCCDI) are utilized, and linear trends of each index are extracted using a least-squares likelihood method. The results indicate that there exists a significant decreasing trend in the TXx index for the Kota Bharu station and an increasing trend in the TNn index for the Chuping and Kota Kinabalu stations. The comparison between the trends in mean and extreme temperatures shows the same significant tendency for the Kota Bharu and Kuala Terengganu stations.
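
    The index-and-trend calculation described above can be sketched directly: derive annual TXx and TNn values from daily series and fit a least-squares line. The random daily data below merely stand in for the 2004-2013 station records.

```python
import numpy as np

# Derive two ETCCDI absolute indices (annual TXx and TNn) from daily temperature
# series and fit least-squares linear trends. The data are random placeholders.

rng = np.random.default_rng(0)
years = np.arange(2004, 2014)
daily_tmax = {y: 32 + 3 * rng.standard_normal(365) for y in years}
daily_tmin = {y: 23 + 2 * rng.standard_normal(365) for y in years}

txx = np.array([daily_tmax[y].max() for y in years])   # hottest day of each year
tnn = np.array([daily_tmin[y].min() for y in years])   # coldest night of each year

slope_txx, _ = np.polyfit(years, txx, 1)
slope_tnn, _ = np.polyfit(years, tnn, 1)
print(f"TXx trend: {slope_txx:+.3f} degC/yr, TNn trend: {slope_tnn:+.3f} degC/yr")
```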

  13. Model-based performance and energy analyses of reverse osmosis to reuse wastewater in a PVC production site.

    Science.gov (United States)

    Hu, Kang; Fiedler, Thorsten; Blanco, Laura; Geissen, Sven-Uwe; Zander, Simon; Prieto, David; Blanco, Angeles; Negro, Carlos; Swinnen, Nathalie

    2017-11-10

    A pilot-scale reverse osmosis (RO) unit downstream of a membrane bioreactor (MBR) was developed for desalination to reuse wastewater at a PVC production site. The solution-diffusion-film model (SDFM), based on the solution-diffusion model (SDM) and film theory, was proposed to describe rejections of electrolyte mixtures in the MBR effluent, which consists of dominant ions (Na+ and Cl-) and several trace ions (Ca2+, Mg2+, K+ and SO42-). A universal global optimisation method was used to estimate the ion permeability coefficients (B) and mass transfer coefficients (K) in the SDFM. The membrane performance was then evaluated based on the estimated parameters, which demonstrated that the theoretical simulations were in line with the experimental results for the dominant ions. Moreover, an energy analysis model that accounts for the limitation imposed by the thermodynamic restriction was proposed to analyse the specific energy consumption of the pilot-scale RO system in various scenarios.
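
    The core of the solution-diffusion-film description can be sketched for a single ion: solution-diffusion transport through the membrane (permeability B) combined with film-theory concentration polarisation (mass transfer coefficient k), solved here by fixed-point iteration for the permeate concentration. All parameter values are placeholders, not the coefficients estimated in the study.

```python
import math

# Solution-diffusion plus film theory for one ion (a generic sketch, not the
# authors' full SDFM for electrolyte mixtures). Parameter values are placeholders.

def permeate_concentration(c_feed, jv, B, k, iterations=50):
    """Iteratively solve for the permeate concentration Cp [mol/m3].

    Solution-diffusion:  Cp = B*Cm / (Jv + B)
    Film theory:         Cm = Cp + (Cf - Cp) * exp(Jv / k)
    """
    cp = 0.0
    for _ in range(iterations):
        cm = cp + (c_feed - cp) * math.exp(jv / k)   # membrane-wall concentration
        cp = B * cm / (jv + B)                       # permeate concentration
    return cp

cf = 100.0            # feed concentration, mol/m3 (placeholder)
jv = 1.0e-5           # permeate flux, m/s (placeholder)
B  = 2.0e-7           # ion permeability coefficient, m/s (placeholder)
k  = 3.0e-5           # mass transfer coefficient, m/s (placeholder)

cp = permeate_concentration(cf, jv, B, k)
print(f"observed rejection = {1 - cp / cf:.3f}")
```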

  14. An Evaluation Quality Framework for Analysing School-Based Learning (SBL) to Work-Based Learning (WBL) Transition Module

    International Nuclear Information System (INIS)

    Alseddiqi, M; Mishra, R; Pislaru, C

    2012-01-01

    The paper presents the results from a quality framework developed to measure the effectiveness of a new engineering course entitled 'school-based learning (SBL) to work-based learning (WBL) transition module' in the Technical and Vocational Education (TVE) system in Bahrain. The framework is an extended version of existing information quality frameworks with respect to pedagogical and technological contexts. It incorporates specific pedagogical and technological dimensions as required by modern industry in Bahrain. A questionnaire on users' views of the effectiveness of the new transition module was distributed to various stakeholders, including TVE teachers and students. The aim was to obtain critical information for diagnosing, monitoring and evaluating different views and perceptions about the effectiveness of the new module. The analysis categorised the quality dimensions by their relative importance, using the principal component analysis available in SPSS. The analysis clearly identified the most important quality dimensions integrated in the new module for the SBL-to-WBL transition. It was also apparent that the new module contains workplace proficiencies, prepares TVE students for work placement, provides effective teaching and learning methodologies, integrates innovative technology in the process of learning, meets modern industrial needs, and presents a cooperative learning environment for TVE students. From the principal component analysis findings, the percentage of relative importance of each factor and its quality dimensions was calculated; this comparison identified the most important factor as well as the most important quality dimensions. The re-arranged quality dimensions and the extended number of factors also helped refine the extended information quality framework into a revised quality framework.
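
    A minimal sketch of the kind of principal component analysis described above, using a random placeholder response matrix in place of the actual questionnaire data: components are ranked by explained variance, and the loadings indicate which quality dimensions weigh most on each factor.

```python
import numpy as np
from sklearn.decomposition import PCA

# Rank questionnaire quality dimensions by principal component analysis.
# The response matrix is random placeholder data (rows = respondents,
# columns = quality dimensions on a 5-point Likert scale).

rng = np.random.default_rng(1)
responses = rng.integers(1, 6, size=(120, 10)).astype(float)

pca = PCA()
pca.fit(responses)

for i, ratio in enumerate(pca.explained_variance_ratio_[:3], start=1):
    print(f"component {i}: {100 * ratio:.1f} % of variance")
# pca.components_ holds the loadings, i.e. how strongly each quality dimension
# contributes to each extracted factor.
```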

  15. Comprehensive review of genetic association studies and meta-analyses on miRNA polymorphisms and cancer risk.

    Directory of Open Access Journals (Sweden)

    Kshitij Srivastava

    Full Text Available MicroRNAs (miRNAs) are small RNA molecules that regulate the expression of corresponding messenger RNAs (mRNAs). Variations in the level of expression of distinct miRNAs have been observed in the genesis, progression and prognosis of multiple human malignancies. The present study aimed to investigate the association between four highly studied miRNA polymorphisms (mir-146a rs2910164, mir-196a2 rs11614913, mir-149 rs2292832 and mir-499 rs3746444) and cancer risk by using a two-sided meta-analytic approach. An updated meta-analysis based on 53 independent case-control studies consisting of 27573 cancer cases and 34791 controls was performed. The odds ratio (OR) and 95% confidence interval (95% CI) were used to investigate the strength of the association. Overall, the pooled analysis showed that mir-196a2 rs11614913 was associated with a decreased cancer risk (OR = 0.846, P = 0.004, TT vs. CC), while the other miRNA SNPs showed no association with overall cancer risk. Subgroup analyses based on type of cancer and ethnicity were also performed, and the results indicated that there was a strong association between miR-146a rs2910164 and overall cancer risk in the Caucasian population under the recessive model (OR = 1.274, 95% CI = 1.096-1.481, P = 0.002). Stratified analysis by cancer type also associated mir-196a2 rs11614913 with lung and colorectal cancer at the allelic and genotypic level. The present meta-analysis suggests an important role of the mir-196a2 rs11614913 polymorphism in overall cancer risk, especially in the Asian population. Further studies with large sample size are needed to evaluate and confirm this association.
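
    The pooled odds ratios quoted above rest on inverse-variance weighting of per-study estimates on the log scale. The sketch below shows that basic fixed-effect step with invented study values; a full meta-analysis would also assess heterogeneity and random-effects models.

```python
import math

# Fixed-effect (inverse-variance) pooling of per-study odds ratios on the log
# scale. The study values below are invented placeholders.

studies = [    # (OR, lower 95% CI, upper 95% CI) per study
    (0.80, 0.62, 1.03),
    (0.91, 0.70, 1.18),
    (0.78, 0.58, 1.05),
]

weights, log_ors = [], []
for or_, lo, hi in studies:
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)   # SE of log(OR) from the CI
    weights.append(1.0 / se**2)
    log_ors.append(math.log(or_))

pooled_log = sum(w * b for w, b in zip(weights, log_ors)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))
print(f"pooled OR = {math.exp(pooled_log):.3f} "
      f"(95% CI {math.exp(pooled_log - 1.96 * pooled_se):.3f}-"
      f"{math.exp(pooled_log + 1.96 * pooled_se):.3f})")
```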

  16. Comparative physical-chemical characterization of encapsulated lipid-based isotretinoin products assessed by particle size distribution and thermal behavior analyses

    Energy Technology Data Exchange (ETDEWEB)

    Guimaraes, Carla Aiolfi, E-mail: carlaaiolfi@usp.br [Department of Pharmacy, Faculty of Pharmaceutical Sciences, University of Sao Paulo, Sao Paulo, SP 05508-000 (Brazil); Menaa, Farid [Department of Dermatology, School of Medicine Wuerzburg, Wuerzburg 97080 (Germany); Fluorotronics, Inc., 1425 Russ Bvld, San Diego Technology Incubator, San Diego, CA 92101 (United States); Menaa, Bouzid, E-mail: bouzid.menaa@gmail.com [Fluorotronics, Inc., 1425 Russ Bvld, San Diego Technology Incubator, San Diego, CA 92101 (United States); Quenca-Guillen, Joyce S. [Department of Pharmacy, Faculty of Pharmaceutical Sciences, University of Sao Paulo, Sao Paulo, SP 05508-000 (Brazil); Matos, Jivaldo do Rosario [Department of Fundamental Chemistry, Institute of Chemistry, University of Sao Paulo, Sao Paulo, SP 05508-000 (Brazil); Mercuri, Lucildes Pita [Department of Exact and Earth Sciences, Federal University of Sao Paulo, Diadema, SP 09972-270 (Brazil); Braz, Andre Borges [Department of Engineering of Mines and Oil, Polytechnical School, University of Sao Paulo, SP 05508-900 (Brazil); Rossetti, Fabia Cristina [Department of Pharmaceutical Sciences, Faculty of Pharmaceutical Sciences of Ribeirao Preto, University of Sao Paulo, Ribeirao Preto, SP 14015-120 (Brazil); Kedor-Hackmann, Erika Rosa Maria; Santoro, Maria Ines Rocha Miritello [Department of Pharmacy, Faculty of Pharmaceutical Sciences, University of Sao Paulo, Sao Paulo, SP 05508-000 (Brazil)

    2010-06-10

    Isotretinoin is the drug of choice for the management of severe recalcitrant nodular acne. Nevertheless, some of its physical-chemical properties are still poorly known. Hence, the aim of our study was to comparatively evaluate the particle size distribution (PSD) and characterize the thermal behavior of the three encapsulated isotretinoin products in oil suspension (one reference and two generics) commercialized in Brazil. Here, we show that the PSD, estimated by laser diffraction and by polarized light microscopy, differed between the generics and the reference product. However, the thermal behavior of the three products, determined by thermogravimetry (TGA), differential thermal analysis (DTA) and differential scanning calorimetry (DSC), displayed no significant changes, and all three were more thermostable than the isotretinoin standard used as internal control. Thus, our study suggests that PSD analyses of isotretinoin lipid-based formulations should be routinely performed in order to improve their quality and bioavailability.

  17. PALEO-CHANNELS OF SINGKAWANG WATERS WEST KALIMANTAN AND ITS RELATION TO THE OCCURRENCES OF SUB-SEABOTTOM GOLD PLACERS BASED ON STRATA BOX SEISMIC RECORD ANALYSES

    Directory of Open Access Journals (Sweden)

    Hananto Kurnio

    2017-07-01

    Full Text Available Strata box seismic records were used to analyse sub-seabottom paleochannels in Singkawang Waters, West Kalimantan. Based on the analyses, the distribution and patterns of the paleochannels can be identified. The paleochannel in the northern part of the study area is interpreted as a continuation of Recent coastal rivers, while in the southern part the pattern radiates around the cone-shaped morphology of the islands, especially Kabung and Lemukutan Islands. The paleochannels of the study area belong to the northwest Sunda Shelf systems that terminate in the South China Sea. A sequence stratigraphy study was carried out to better understand the sedimentary sequences in the paleochannels; this study is also capable of identifying placer deposits within the channels. The offshore area of Singkawang fulfils the criteria for gold placer accumulation: the existence of primary gold sources, intense chemical and physical weathering liberating gold grains from their source rocks of the Sintang Intrusive, gravity transport involving water media, and stable bedrock and surface conditions. Chemical and physical weathering processes from the Oligocene to the Recent, spanning approximately 36 million years, may have produced accumulations of gold placer on the seafloor. Based on grain size analyses, the study area consists of 43.4% sand, 54.3% silt and 2.3% clay. Petrographic examination of the sample shows about 0.2% gold grains.

  18. How are rescaled range analyses affected by different memory and distributional properties? A Monte Carlo study

    Czech Academy of Sciences Publication Activity Database

    Krištoufek, Ladislav

    2012-01-01

    Roč. 391, č. 17 (2012), s. 4252-4260 ISSN 0378-4371 R&D Projects: GA ČR GA402/09/0965 Grant - others: GA UK(CZ) 118310; SVV(CZ) 261 501 Institutional support: RVO:67985556 Keywords: Rescaled range analysis * Modified rescaled range analysis * Hurst exponent * Long-term memory * Short-term memory Subject RIV: AH - Economics Impact factor: 1.676, year: 2012 http://library.utia.cas.cz/separaty/2012/E/kristoufek-how are rescaled range analyses affected by different memory and distributional properties.pdf
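
    For readers unfamiliar with the method in the title, the following is a generic textbook sketch of classical rescaled range (R/S) analysis and the Hurst exponent estimate it yields; it is not the modified variants examined in the paper, and the input series is simply random noise (so H should come out near 0.5).

```python
import numpy as np

# Classical rescaled range (R/S) analysis: for several window sizes n, average
# R/S over non-overlapping windows and estimate the Hurst exponent as the slope
# of log(R/S) against log(n).

def rs_analysis(x, window_sizes):
    points = []
    for n in window_sizes:
        rs_values = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            dev = np.cumsum(w - w.mean())     # mean-adjusted cumulative deviations
            r = dev.max() - dev.min()          # range of cumulative deviations
            s = w.std(ddof=1)                  # sample standard deviation
            if s > 0:
                rs_values.append(r / s)
        points.append((np.log(n), np.log(np.mean(rs_values))))
    logs_n, logs_rs = zip(*points)
    hurst, _ = np.polyfit(logs_n, logs_rs, 1)
    return hurst

series = np.random.default_rng(2).standard_normal(4096)
print(f"estimated Hurst exponent: {rs_analysis(series, [16, 32, 64, 128, 256, 512]):.3f}")
```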

  19. Psychosocial Factors Related to Lateral and Medial Epicondylitis: Results From Pooled Study Analyses.

    Science.gov (United States)

    Thiese, Matthew S; Hegmann, Kurt T; Kapellusch, Jay; Merryweather, Andrew; Bao, Stephen; Silverstein, Barbara; Tang, Ruoliang; Garg, Arun

    2016-06-01

    The goal was to assess the relationships between psychosocial factors and both medial and lateral epicondylitis after adjustment for personal and job physical exposures. One thousand eight hundred twenty-four participants were included in the pooled analyses. Ten psychosocial factors were assessed. One hundred twenty-one (6.6%) and 34 (1.9%) participants had lateral and medial epicondylitis, respectively. Nine of the psychosocial factors assessed had significant trends or associations with lateral epicondylitis, the largest of which was between physical exhaustion after work and lateral epicondylitis, with an odds ratio of 7.04 (95% confidence interval = 2.02 to 24.51). Eight psychosocial factors had significant trends or relationships with medial epicondylitis, the largest being with mental exhaustion after work, with an odds ratio of 6.51 (95% confidence interval = 1.57 to 27.04). The breadth and strength of these associations after adjustment for confounding factors demonstrate meaningful relationships that need to be further investigated in prospective analyses.
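
    Adjusted odds ratios of the kind reported above are typically obtained from logistic regression of the case indicator on the exposure plus confounders. The sketch below simulates placeholder data and fits such a model; the variable names and effect sizes are invented, not the pooled study data.

```python
import numpy as np
import statsmodels.api as sm

# Logistic regression giving an exposure odds ratio adjusted for confounders.
# The simulated data below are placeholders for illustration only.

rng = np.random.default_rng(3)
n = 1824
exhaustion = rng.integers(0, 2, n)           # physical exhaustion after work (yes/no)
age = rng.normal(45, 10, n)                  # hypothetical personal factor
force = rng.normal(0, 1, n)                  # hypothetical job physical exposure score
logit = -3.0 + 1.2 * exhaustion + 0.01 * (age - 45) + 0.3 * force
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(np.column_stack([exhaustion, age, force]))
fit = sm.Logit(y, X).fit(disp=0)

or_exh = np.exp(fit.params[1])               # index 1 = exhaustion coefficient
ci = np.exp(fit.conf_int()[1])
print(f"adjusted OR for exhaustion: {or_exh:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```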

  20. Precursor analyses - The use of deterministic and PSA based methods in the event investigation process at nuclear power plants

    International Nuclear Information System (INIS)

    2004-09-01

    The efficient feedback of operating experience (OE) is a valuable source of information for improving the safety and reliability of nuclear power plants (NPPs). It is therefore essential to collect information on abnormal events from both internal and external sources. Internal operating experience is analysed to obtain a complete understanding of an event and of its safety implications. Corrective or improvement measures may then be developed, prioritized and implemented in the plant if considered appropriate. Information from external events may also be analysed in order to learn lessons from others' experience and prevent similar occurrences at our own plant. The traditional ways of investigating operational events have been predominantly qualitative. In recent years, a PSA-based method called probabilistic precursor event analysis has been developed, used and applied on a significant scale in many places for a number of plants. The method enables a quantitative estimation of the safety significance of operational events to be incorporated. The purpose of this report is to outline a synergistic process that makes more effective use of operating experience event information by combining the insights and knowledge gained from both approaches, traditional deterministic event investigation and PSA-based event analysis. The PSA-based view on operational events and PSA-based event analysis can support the process of operational event analysis at the following stages of the operational event investigation: (1) Initial screening stage. (It introduces an element of quantitative analysis into the selection process. Quantitative analysis of the safety significance of nuclear plant events can be a very useful measure when it comes to selecting internal and external operating experience information for its relevance.) (2) In-depth analysis. (PSA based event evaluation provides a quantitative measure for judging the significance of operational events, contributors to

  1. Pathways from fertility history to later life health: Results from analyses of the English Longitudinal Study of Ageing

    Directory of Open Access Journals (Sweden)

    Emily Grundy

    2015-01-01

    Full Text Available Background: Previous research shows associations between fertility histories and later life health. The childless, those with large families, and those with a young age at entry to parenthood generally have higher mortality and worse health than parents of two or three children. These associations are hypothesised to reflect a range of biosocial influences, but underlying mechanisms are poorly understood. Objective: To identify pathways from fertility histories to later life health by examining mediation through health-related behaviours, social support and strain, and wealth. Additionally to examine mediation through allostatic load - an indicator of multisystem physical dysregulation, hypothesised to be an outcome of chronic stress. Methods: Associations between fertility histories, mediators, and outcomes were analysed using path models. Data were drawn from the English Longitudinal Study of Ageing. Outcomes studied were a measure of allostatic load based on 9 biomarkers and self-reported long-term illness which limited activities. Results: Early parenthood (Conclusions: In England early parenthood and larger family size are associated with less wealth and poorer health behaviours and this accounts for much of the association with health. At least part of this operates through stress-related physiological dysfunction (allostatic load.

  2. Structural changes in Parkinson's disease. Voxel-based morphometry and diffusion tensor imaging analyses based on {sup 123}I-MIBG uptake

    Energy Technology Data Exchange (ETDEWEB)

    Kikuchi, Kazufumi; Hiwatashi, Akio; Togao, Osamu; Yamashita, Koji; Somehara, Ryo; Kamei, Ryotaro; Baba, Shingo; Honda, Hiroshi [Kyushu University, Department of Clinical Radiology, Graduate School of Medical Sciences, Fukuoka (Japan); Yamaguchi, Hiroo; Kira, Jun-ichi [Kyushu University, Department of Neurology, Graduate School of Medical Sciences, Fukuoka (Japan)

    2017-12-15

    Patients with Parkinson's disease (PD) may exhibit symptoms of sympathetic dysfunction that can be measured using 123I-metaiodobenzylguanidine (MIBG) myocardial scintigraphy. We investigated the relationship between microstructural brain changes and 123I-MIBG uptake in patients with PD using voxel-based morphometry (VBM) and diffusion tensor imaging (DTI) analyses. This retrospective study included 24 patients with PD who underwent 3 T magnetic resonance imaging and 123I-MIBG scintigraphy. They were divided into two groups: 12 MIBG-positive and 12 MIBG-negative cases (10 men and 14 women; age range: 60-81 years, corrected for gender and age). The heart/mediastinum count (H/M) ratio was calculated on anterior planar 123I-MIBG images obtained 4 h post-injection. VBM and DTI were performed to detect structural differences between these two groups. Patients with low H/M ratio had significantly reduced brain volume at the right inferior frontal gyrus (uncorrected p < 0.0001, K > 90). Patients with low H/M ratios also exhibited significantly lower fractional anisotropy than those with high H/M ratios (p < 0.05) at the left anterior thalamic radiation, the left inferior fronto-occipital fasciculus, the left superior longitudinal fasciculus, and the left uncinate fasciculus. VBM and DTI may reveal microstructural changes related to the degree of 123I-MIBG uptake in patients with PD. (orig.)

  3. Structural changes in Parkinson's disease. Voxel-based morphometry and diffusion tensor imaging analyses based on 123I-MIBG uptake

    International Nuclear Information System (INIS)

    Kikuchi, Kazufumi; Hiwatashi, Akio; Togao, Osamu; Yamashita, Koji; Somehara, Ryo; Kamei, Ryotaro; Baba, Shingo; Honda, Hiroshi; Yamaguchi, Hiroo; Kira, Jun-ichi

    2017-01-01

    Patients with Parkinson's disease (PD) may exhibit symptoms of sympathetic dysfunction that can be measured using 123 I-metaiodobenzylguanidine (MIBG) myocardial scintigraphy. We investigated the relationship between microstructural brain changes and 123 I-MIBG uptake in patients with PD using voxel-based morphometry (VBM) and diffusion tensor imaging (DTI) analyses. This retrospective study included 24 patients with PD who underwent 3 T magnetic resonance imaging and 123 I-MIBG scintigraphy. They were divided into two groups: 12 MIBG-positive and 12 MIBG-negative cases (10 men and 14 women; age range: 60-81 years, corrected for gender and age). The heart/mediastinum count (H/M) ratio was calculated on anterior planar 123 I-MIBG images obtained 4 h post-injection. VBM and DTI were performed to detect structural differences between these two groups. Patients with low H/M ratio had significantly reduced brain volume at the right inferior frontal gyrus (uncorrected p < 0.0001, K > 90). Patients with low H/M ratios also exhibited significantly lower fractional anisotropy than those with high H/M ratios (p < 0.05) at the left anterior thalamic radiation, the left inferior fronto-occipital fasciculus, the left superior longitudinal fasciculus, and the left uncinate fasciculus. VBM and DTI may reveal microstructural changes related to the degree of 123 I-MIBG uptake in patients with PD. (orig.)

  4. Dietary supplement use and colorectal cancer risk: A systematic review and meta-analyses of prospective cohort studies

    NARCIS (Netherlands)

    Heine-Bröring, R.C.; Winkels, R.M.; Renkema, J.M.S.; Kragt, L.; Orten-Luiten, van A.C.B.; Tigchelaar, E.F.; Chan, D.S.M.; Norat, T.; Kampman, E.

    2015-01-01

    Use of dietary supplements is rising in countries where colorectal cancer is prevalent. We conducted a systematic literature review and meta-analyses of prospective cohort studies on dietary supplement use and colorectal cancer risk. We identified relevant studies in Medline, Embase and Cochrane up

  5. Investigation of publication bias in meta-analyses of diagnostic test accuracy: a meta-epidemiological study

    NARCIS (Netherlands)

    van Enst, W. Annefloor; Ochodo, Eleanor; Scholten, Rob J. P. M.; Hooft, Lotty; Leeflang, Mariska M.

    2014-01-01

    The validity of a meta-analysis can be understood better in light of the possible impact of publication bias. The majority of the methods to investigate publication bias in terms of small study-effects are developed for meta-analyses of intervention studies, leaving authors of diagnostic test

  6. SieveSifter: a web-based tool for visualizing the sieve analyses of HIV-1 vaccine efficacy trials.

    Science.gov (United States)

    Fiore-Gartland, Andrew; Kullman, Nicholas; deCamp, Allan C; Clenaghan, Graham; Yang, Wayne; Magaret, Craig A; Edlefsen, Paul T; Gilbert, Peter B

    2017-08-01

    Analysis of HIV-1 virions from participants infected in a randomized controlled preventive HIV-1 vaccine efficacy trial can help elucidate mechanisms of partial protection. By comparing the genetic sequence of viruses from vaccine and placebo recipients to the sequence of the vaccine itself, a technique called 'sieve analysis', one can identify functional specificities of vaccine-induced immune responses. We have created an interactive web-based visualization and data access tool for exploring the results of sieve analyses performed on four major preventive HIV-1 vaccine efficacy trials: (i) the HIV Vaccine Trial Network (HVTN) 502/Step trial, (ii) the RV144/Thai trial, (iii) the HVTN 503/Phambili trial and (iv) the HVTN 505 trial. The tool acts simultaneously as a platform for rapid reinterpretation of sieve effects and as a portal for organizing and sharing the viral sequence data. Access to these valuable datasets also enables the development of novel methodology for future sieve analyses. Visualization: http://sieve.fredhutch.org/viz . Source code: https://github.com/nkullman/SIEVE . Data API: http://sieve.fredhutch.org/data . agartlan@fredhutch.org. © The Author(s) 2017. Published by Oxford University Press.

  7. Ancient DNA analyses of museum specimens from selected Presbytis (primate: Colobinae) based on partial Cyt b sequences

    Science.gov (United States)

    Aifat, N. R.; Yaakop, S.; Md-Zain, B. M.

    2016-11-01

    The IUCN Red List of Threatened Species has categorized Malaysian primates from data deficient to critically endangered. Thus, ancient DNA analyses hold great potential to understand the phylogeny, phylogeography and population history of extinct and extant species. Museum samples are one of the alternatives that provide important sources of biological material for a large proportion of ancient DNA studies. In this study, a total of six museum skin samples from the species Presbytis hosei (4 samples) and Presbytis frontata (2 samples), aged between 43 and 124 years, were extracted to obtain DNA. Extraction was done using the QIAGEN QIAamp DNA Investigator Kit, and the ability of this kit to extract DNA from museum skin samples was tested by amplification of a partial Cyt b sequence using species-specific designed primers. Two primer pairs were designed specifically for P. hosei and P. frontata, respectively. These primer pairs proved to be efficient in amplifying 200 bp of the targeted species under the optimized PCR conditions. The performance of the sequences was tested to determine genetic distances within the genus Presbytis in Malaysia. From the analyses, P. hosei is closely related to P. chrysomelas and P. frontata, with values of 0.095 and 0.106, respectively. Cyt b gave clear data for determining relationships among Bornean species. Thus, with the optimized conditions, museum specimens can be used for molecular systematic studies of the Malaysian primates.
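
    Genetic distances such as those quoted above are computed from aligned sequences; the sketch below shows the simplest uncorrected form (a p-distance) on two invented Cyt b fragments. The study's actual distance model is not specified here and may include a substitution-model correction.

```python
# Uncorrected pairwise distance (p-distance) between two aligned sequence
# fragments. The short sequences below are invented placeholders, not real
# Presbytis Cyt b data.

def p_distance(seq1, seq2):
    """Proportion of differing sites, ignoring positions with gaps or Ns."""
    valid = [(a, b) for a, b in zip(seq1, seq2) if a in "ACGT" and b in "ACGT"]
    if not valid:
        raise ValueError("no comparable sites")
    diffs = sum(1 for a, b in valid if a != b)
    return diffs / len(valid)

fragment_1 = "ATGACCAACATTCGAAAAACCCACCCACTA"
fragment_2 = "ATGACCAATATTCGAAAAACTCATCCGCTA"
print(f"p-distance = {p_distance(fragment_1, fragment_2):.3f}")
```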

  8. A fast plasma analyser for the study of the solar wind interaction with Mars

    Science.gov (United States)

    James, Adrian Martin

    This thesis describes the design and development of the FONEMA instrument to be flown aboard the Russian mission to Mars in 1996. Many probes have flown to Mars, yet many mysteries still remain, among them the nature of the interaction of the solar wind with the planetary obstacle. In this thesis I present some of the results from earlier spacecraft and the models of the interaction that they suggest, paying particular attention to the contribution of ion analysers. From these results it will become clear that a fast ion sensor is needed to resolve many of the questions about the magnetosphere of Mars. The FONEMA instrument was designed for this task, making use of a novel electrostatic mirror and particle collimator combined with parallel magnetic and electrostatic fields to resolve the ions into mass and energy bins. Development and production of the individual elements is discussed in detail.

  9. Neutronics-processing interface analyses for the Accelerator Transmutation of Waste (ATW) aqueous-based blanket system

    International Nuclear Information System (INIS)

    Davidson, J.W.; Battat, M.E.

    1993-01-01

    Neutronics-processing interface parameters have large impacts on the neutron economy and transmutation performance of an aqueous-based Accelerator Transmutation of Waste (ATW) system. A detailed assessment of the interdependence of these blanket neutronic and chemical processing parameters has been performed. Neutronic performance analyses require that neutron transport calculations for the ATW blanket systems be fully coupled with the blanket processing and include all neutron absorptions in candidate waste nuclides as well as in fission and transmutation products. The effects of processing rates, flux levels, flux spectra, and external-to-blanket inventories on blanket neutronic performance were determined. In addition, the inventories and isotopics in the various subsystems were also calculated for various actinide and long-lived fission product transmutation strategies

  10. Multi-person and multi-attribute design evaluations using evidential reasoning based on subjective safety and cost analyses

    International Nuclear Information System (INIS)

    Wang, J.; Yang, J.B.; Sen, P.

    1996-01-01

    This paper presents an approach for ranking proposed design options based on subjective safety and cost analyses. Hierarchical system safety analysis is carried out using fuzzy sets and evidential reasoning. This involves safety modelling by fuzzy sets at the bottom level of a hierarchy and safety synthesis by evidential reasoning at higher levels. Fuzzy sets are also used to model the cost incurred for each design option. An evidential reasoning approach is then employed to synthesise the estimates of safety and cost, which are made by multiple designers. The developed approach is capable of dealing with problems of multiple designers, multiple attributes and multiple design options to select the best design. Finally, a practical engineering example is presented to demonstrate the proposed multi-person and multi-attribute design selection approach
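
    As a loose illustration of how several designers' judgements can be synthesised, the sketch below applies Dempster's rule of combination over a small set of safety grades. This is a simpler relative of the evidential reasoning approach referred to above, not the authors' algorithm, and the belief masses are invented.

```python
from itertools import product

# Dempster's rule of combination for two designers' belief assignments over
# safety grades, where 'unknown' stands for the whole frame of discernment.
# Belief masses are invented placeholders; this is not the paper's ER algorithm.

def dempster_combine(m1, m2):
    combined = {}
    conflict = 0.0
    for (a, pa), (b, pb) in product(m1.items(), m2.items()):
        if a == "unknown":
            focal = b                       # full set intersected with b is b
        elif b == "unknown" or a == b:
            focal = a
        else:
            conflict += pa * pb             # disjoint singletons: conflicting mass
            continue
        combined[focal] = combined.get(focal, 0.0) + pa * pb
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

designer_1 = {"good": 0.6, "average": 0.3, "unknown": 0.1}
designer_2 = {"good": 0.5, "average": 0.2, "poor": 0.1, "unknown": 0.2}
print(dempster_combine(designer_1, designer_2))
```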

  11. Formalisation des bases méthodologiques et conceptuelles d'une analyse spatiale des accidents de la route

    Directory of Open Access Journals (Sweden)

    Florence Huguenin Richard

    1999-06-01

    Full Text Available This article lays out the methodological and conceptual foundations of a spatial analysis of road-accident risk. The study of this phenomenon requires a large amount of data describing the different dimensions of an accident, which can be managed in a geographic information system. It also calls for methodological reflection on risk mapping, on observation scales, on the aggregation of qualitative and quantitative data, on the use of statistical methods suited to road risk, and on the integration of space as a factor of insecurity.

  12. Studies and Analyses of Aided Adversarial Decision Making. Phase 2: Research on Human Trust in Automation

    National Research Council Canada - National Science Library

    Llinas, James

    1998-01-01

    .... Given that offensive IW operations may interfere with automated, data-fusion based decision aids, it is necessary to understand how personnel may rely on or trust these aids when appropriate (e.g...

  13. Large-scale genome-wide association studies and meta-analyses of longitudinal change in adult lung function.

    Directory of Open Access Journals (Sweden)

    Wenbo Tang

    Full Text Available Genome-wide association studies (GWAS) have identified numerous loci influencing cross-sectional lung function, but less is known about genes influencing longitudinal change in lung function. We performed GWAS of the rate of change in forced expiratory volume in the first second (FEV1) in 14 longitudinal, population-based cohort studies comprising 27,249 adults of European ancestry using linear mixed effects models, and combined cohort-specific results using fixed effect meta-analysis to identify novel genetic loci associated with longitudinal change in lung function. Gene expression analyses were subsequently performed for the identified genetic loci. As a secondary aim, we estimated the mean rate of decline in FEV1 by smoking pattern, irrespective of genotypes, across these 14 studies using meta-analysis. The overall meta-analysis produced suggestive evidence for association at the novel IL16/STARD5/TMC3 locus on chromosome 15 (P = 5.71 × 10⁻⁷). In addition, meta-analysis using the five cohorts with ≥3 FEV1 measurements per participant identified the novel ME3 locus on chromosome 11 (P = 2.18 × 10⁻⁸) at genome-wide significance. Neither locus was associated with FEV1 decline in two additional cohort studies. We confirmed gene expression of IL16, STARD5, and ME3 in multiple lung tissues. Publicly available microarray data confirmed differential expression of all three genes in lung samples from COPD patients compared with controls. Irrespective of genotypes, the combined estimate for FEV1 decline was 26.9, 29.2 and 35.7 mL/year in never, former, and persistent smokers, respectively. In this large-scale GWAS, we identified two novel genetic loci in association with the rate of change in FEV1 that harbor candidate genes with biologically plausible functional links to lung function.
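
    The meta-analysis step described above combines per-cohort estimates of the SNP effect on FEV1 decline with inverse-variance fixed-effect weights. The sketch below shows that combination with placeholder cohort estimates (beta in mL/year per allele and its standard error).

```python
import math

# Fixed-effect (inverse-variance) meta-analysis of per-cohort SNP effects on
# FEV1 decline. Cohort betas and standard errors are invented placeholders.

cohorts = [     # (beta in mL/year per allele, standard error)
    (-1.8, 0.9),
    (-1.2, 0.7),
    (-2.4, 1.1),
]

weights = [1.0 / se**2 for _, se in cohorts]
beta_fe = sum(w * b for w, (b, _) in zip(weights, cohorts)) / sum(weights)
se_fe = math.sqrt(1.0 / sum(weights))
z = beta_fe / se_fe
p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))   # two-sided normal p-value
print(f"combined beta = {beta_fe:.2f} mL/yr, SE = {se_fe:.2f}, Z = {z:.2f}, P = {p:.2e}")
```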

  14. Shinguards effective in preventing lower leg injuries in football: Population-based trend analyses over 25 years.

    Science.gov (United States)

    Vriend, Ingrid; Valkenberg, Huib; Schoots, Wim; Goudswaard, Gert Jan; van der Meulen, Wout J; Backx, Frank J G

    2015-09-01

    The majority of football injuries are caused by trauma to the lower extremities. Shinguards are considered an important measure in preventing lower leg impact abrasions, contusions and fractures. Given these benefits, the Fédération Internationale de Football Association introduced the shinguard law in 1990, which made wearing shinguards during matches mandatory. This study evaluated the effect of the introduction of the shinguard law for amateur players in the Netherlands in the 1999/2000 football season on the incidence of lower leg injuries. Time trend analyses were performed on injury data covering 25 years of continuous registration (1986-2010). Data were retrieved from a system that records all emergency department treatments in a random, representative sample of Dutch hospitals. All injuries sustained in football by patients aged 6-65 years were included, except for injuries of the Achilles tendon and Weber fractures. Time trends were analysed with multiple regression analyses; a model was fitted consisting of multiple straight lines, each representing a 5-year period. Patients were predominantly males (92%) and treated for fractures (48%) or abrasions/contusions (52%) to the lower leg. The incidence of lower leg football injuries decreased significantly following the introduction of the shinguard law (1996-2000: -20%; 2001-2005: -25%), whereas the incidence of all other football injuries did not. This effect was more prominent at weekends/match days. No gender differences were found. The results show a significant preventive effect of the shinguard law, underlining the relevance of rule changes as a preventive measure and of wearing shinguards during both matches and training sessions. Copyright © 2014 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
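
    The trend model described above fits a separate straight line to each 5-year period. The sketch below reproduces that idea on invented yearly incidence values; the Dutch registry data and the actual regression specification are not reproduced here.

```python
import numpy as np

# Fit a separate least-squares straight line per 5-year period of an injury
# incidence series. The yearly values are invented placeholders.

years = np.arange(1986, 2011)
incidence = np.concatenate([
    np.linspace(30, 31, 5), np.linspace(31, 30, 5),
    np.linspace(30, 24, 5), np.linspace(24, 18, 5), np.linspace(18, 17, 5)])

for start in range(1986, 2011, 5):
    mask = (years >= start) & (years < start + 5)
    slope, intercept = np.polyfit(years[mask], incidence[mask], 1)
    change_pct = 100 * slope * 4 / incidence[mask][0]
    print(f"{start}-{start + 4}: slope {slope:+.2f}/yr ({change_pct:+.0f} % over the period)")
```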

  15. Food web functioning of the benthopelagic community in a deep-sea seamount based on diet and stable isotope analyses

    Science.gov (United States)

    Preciado, Izaskun; Cartes, Joan E.; Punzón, Antonio; Frutos, Inmaculada; López-López, Lucía; Serrano, Alberto

    2017-03-01

    Trophic interactions in the deep-sea fish community of the Galicia Bank seamount (NE Atlantic) were inferred by using stomach contents analyses (SCA) and stable isotope analyses (SIA) of 27 fish species and their main prey items. Samples were collected during three surveys performed in 2009, 2010 and 2011 between 625 and 1800 m depth. Three main trophic guilds were determined using SCA data: pelagic, benthopelagic and benthic feeders. Vertically migrating macrozooplankton and meso-bathypelagic shrimps were identified as playing a key role as pelagic prey for the deep-sea fish community of the Galicia Bank. Habitat overlap was hardly detected; in fact, most coexisting species showed low dietary overlap, indicating a high degree of resource partitioning. A high potential for competition, however, was observed among benthopelagic feeders, i.e. Etmopterus spinax, Hoplostethus mediterraneus and Epigonus telescopus. A significant correlation was found between δ15N and δ13C for all the analysed species. When Trophic Levels (TLs) were calculated for the main fish species using both the SCA and SIA approaches, some discrepancies arose: TLs calculated from SIA were significantly higher than those obtained from SCA, probably indicating a higher consumption of benthic-suprabenthic prey in the previous months. During the summer, food web functioning in the Galicia Bank was more influenced by the assemblages dwelling in the water column than by deep-sea benthos, which was rather scarce in the summer samples. These discrepancies demonstrate the importance of using both approaches, SCA (a snapshot of diet) and SIA (food assimilated over previous months), in trophic studies if an overview of food web dynamics in different compartments of the ecosystem is to be obtained.
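
    The stable-isotope trophic levels mentioned above are conventionally derived from δ15N with a fixed per-step enrichment. The sketch below uses the commonly assumed ~3.4 ‰ enrichment and a baseline at trophic level 2; the δ15N values are illustrative, not measurements from the Galicia Bank survey.

```python
# Trophic level from nitrogen isotopes:
#   TL = baseline_TL + (d15N_consumer - d15N_baseline) / enrichment
# The enrichment of ~3.4 per mil per trophic step is a commonly assumed value,
# and all d15N numbers below are illustrative placeholders.

TROPHIC_ENRICHMENT = 3.4     # per mil per trophic level (commonly assumed)
BASELINE_TL = 2.0            # e.g. a primary consumer used as the isotopic baseline

def trophic_level(d15n_consumer, d15n_baseline):
    return BASELINE_TL + (d15n_consumer - d15n_baseline) / TROPHIC_ENRICHMENT

d15n_baseline = 5.1          # hypothetical zooplankton baseline value
for species, d15n in [("Etmopterus spinax", 11.8), ("Hoplostethus mediterraneus", 10.9)]:
    print(f"{species}: TL ~ {trophic_level(d15n, d15n_baseline):.2f}")
```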

  16. A Study of Two Multi-Element Resonant DC-DC Topologies with Loss Distribution Analyses

    Directory of Open Access Journals (Sweden)

    Yifeng Wang

    2017-09-01

    Full Text Available In this paper, two multi-element resonant DC-DC converters are analyzed in detail. Since their resonant tanks have multiple resonant components, the converters display different resonant characteristics within different operating frequency ranges. Through appropriate design, both proposed converters lower the conversion losses and, at the same time, broaden the voltage gain range: one converter makes full use of the third-order harmonic to deliver active power, thereby raising the effective utilization of the resonant current, while the other minimizes the switching losses of the power switching devices by restricting the input impedance angle of the resonant tank. In addition, the loss distribution is analyzed for the purpose of guiding component design. Finally, two 500 W prototypes were fabricated to verify the theoretical analyses. The results demonstrate that the two proposed converters can achieve a wide voltage gain range with a small frequency deviation, which contributes noticeably to highly efficient conversion. Their peak efficiencies were measured as 95.4% and 95.3%, respectively.

  17. Synthesis, spectroscopic characterization, biological screenings, DNA binding study and POM analyses of transition metal carboxylates

    Science.gov (United States)

    Uddin, Noor; Sirajuddin, Muhammad; Uddin, Nizam; Tariq, Muhammad; Ullah, Hameed; Ali, Saqib; Tirmizi, Syed Ahmed; Khan, Abdur Rehman

    2015-04-01

    This article describes the synthesis of a novel carboxylic acid derivative and its transition metal complexes, and the evaluation of their biological applications. Six carboxylate complexes of the transition metals Zn(II) and Hg(II) have been successfully synthesized and characterized by FT-IR and NMR (1H, 13C). The ligand, HL (4-[(2,6-Diethylphenyl)amino]-4-oxobutanoic acid), was also characterized by single crystal X-ray analysis. Complexation occurs via the oxygen atoms of the carboxylate moiety. FT-IR data show the bidentate nature of the carboxylate moiety of the ligand, as the Δν value in all complexes is less than that of the free ligand. The ligand and its complexes were screened for antifungal and antileishmanial activities. The results showed that the ligand and its complexes are active, with few exceptions. UV-visible spectroscopy and viscometry results reveal that the ligand and its complexes interact with DNA via an intercalative mode of interaction. A new and efficient strategy to identify the pharmacophore and anti-pharmacophore sites in carboxylate derivatives for antibacterial/antifungal activity using Petra, Osiris and Molinspiration (POM) analyses was also carried out.

  18. Identification and characterization of rock slope instabilities in Val Canaria (TI, Switzerland) based on field and DEM analyses

    Science.gov (United States)

    Ponzio, Maria; Pedrazzini, Andrea; Matasci, Battista; Jaboyedoff, Michel

    2013-04-01

    In Alpine areas, rockslides and rock avalanches are common gravitational hazards that potentially endanger people and infrastructure. The aim of this study is to characterize and understand the different factors influencing the distribution of large slope instabilities affecting Val Canaria (southern Switzerland). In particular, the importance of the tectonic and lithological settings as well as the impact of groundwater circulation are investigated in detail. Val Canaria is a SW-NE trending lateral valley that displays potential for large rock slope failures. Located just above one of the main N-S transport corridors (highway, railway) through the Alps, the development of large instabilities in Val Canaria might have dramatic consequences for the main valley downstream. The dominant geological structure of the study area is a major tectonic boundary separating two basement nappes of gneissic lithologies, the Gotthard massif and the Lucomagno nappe, located in the northern and southern parts of the valley respectively. The basement units are separated by the meta-sediments of the Piora syncline, composed of gypsum, dolomitic breccia and fractured calc-mica schists. Along with detailed geological mapping, the use of remote sensing techniques (aerial and terrestrial laser scanning) allows us to propose a multi-disciplinary approach that combines geological mapping and interpretation with periodic monitoring of the most active rockslide areas. A large array of TLS point cloud datasets (first acquisition in 2006) constitutes a notable input for monitoring purposes as well as for structural and rock mass characterization and failure mechanism interpretation. The analyses highlighted that both valley flanks are affected by deep-seated gravitational slope deformation covering a total area of about 8 km2 (corresponding to 40% of the catchment area). The most active area corresponds to the lower part of the valley

  19. Analyses of the studies on cancer-related quality of life published in Korea

    International Nuclear Information System (INIS)

    Lee, Eun Hyun; Park, Hee Boong; Kim, Myung Wook; Kang, Sung Hee; Chun, Mi Son; Lee, Hye Jin; Lee, Won Hee

    2002-01-01

    The purpose of the present study was to analyze and evaluate prior studies published in Korea on cancer-related quality of life, in order to make recommendations for further research. A total of 31 studies were selected from three different databases. The selected studies were analyzed according to 11 criteria, such as site of cancer, domain, independent variable, research design, self/proxy rating, single/battery instrument, translation/back-translation, reliability, validity, scoring, and findings. Of the 31 studies, approximately half were conducted on mixed groups of cancer patients. Many of the studies asserted that the concept of quality of life has a multidimensional attribute. Approximately 30% were longitudinal studies giving information about changes in quality of life. In all studies except one, patients directly rated their level of quality of life. With respect to the questionnaires used for measuring quality of life, most studies did not consider whether their reliability and validity had been established. In addition, when using questionnaires developed in other languages, no studies employed a translation/back-translation technique. All studies used sum or total scoring methods when calculating the level of quality of life. The types of variables tested for their influence on quality of life were quite limited. It is recommended that longitudinal studies be performed, using data collection methods whose validity and reliability have been confirmed, and that studies be conducted to identify new variables influencing quality of life.

  20. Population-based cost-offset analyses for disorder-specific treatment of anorexia nervosa and bulimia nervosa in Germany.

    Science.gov (United States)

    Bode, Katharina; Götz von Olenhusen, Nina Maria; Wunsch, Eva-Maria; Kliem, Sören; Kröger, Christoph

    2017-03-01

    Previous research has shown that anorexia nervosa (AN) and bulimia nervosa (BN) are expensive illnesses to treat. To reduce their economic burden, adequate interventions need to be established. Our objective was to conduct cost-offset analyses for evidence-based treatment of eating disorders using outcome data from a psychotherapy trial involving cognitive behavioral therapy (CBT) and focal psychodynamic therapy (FPT) for AN and a trial involving CBT for BN. Assuming a currently running, ideal healthcare system using a 12-month, prevalence-based approach and varying the willingness to participate in treatment, we investigated whether the potential financial benefits of AN- and BN-related treatment outweigh the therapy costs at the population level. We elaborated on a formula that allows calculating cost-benefit relationships whereby the calculation of the parameters is based on estimates from data of health institutions within the German healthcare system. Additional intangible benefits were calculated with the aid of Quality-Adjusted Life Years. The annual costs of an untreated eating disorder were 2.38 billion EUR for AN and 617.69 million EUR for BN. Independent of the willingness to participate in treatment, the cost-benefit relationships for the treatment remained constant at 2.51 (CBT) and 2.33 (FPT) for AN and 4.05 (CBT) for BN. This consistency implies that for each EUR invested in the treatment, between 2.33 and 4.05 EUR could be saved each year. Our findings suggest that the implementation of evidence-based psychotherapy treatments for AN and BN may achieve substantial cost savings at the population level. © 2017 Wiley Periodicals, Inc.
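    The quoted cost-benefit relationships translate directly into savings per EUR invested; the sketch below only illustrates that arithmetic, using the ratios reported above and a purely hypothetical investment amount.

    ```python
    def savings(invested_eur, cost_benefit_ratio):
        """Gross and net annual savings implied by a cost-benefit ratio (EUR saved per EUR invested)."""
        gross = invested_eur * cost_benefit_ratio
        return gross, gross - invested_eur

    # Ratios as reported in the abstract; the invested amount is illustrative only.
    for label, ratio in [("CBT for AN", 2.51), ("FPT for AN", 2.33), ("CBT for BN", 4.05)]:
        gross, net = savings(1_000_000, ratio)
        print(f"{label}: {gross:,.0f} EUR gross, {net:,.0f} EUR net per 1,000,000 EUR invested")
    ```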

  1. Analyses of the influencing factors of soil microbial functional gene diversity in tropical rainforest based on GeoChip 5.0.

    Science.gov (United States)

    Cong, Jing; Liu, Xueduan; Lu, Hui; Xu, Han; Li, Yide; Deng, Ye; Li, Diqiang; Zhang, Yuguang

    2015-09-01

    To examine soil microbial functional gene diversity and its causative factors in tropical rainforests, we profiled soils using a microarray-based metagenomic tool, GeoChip 5.0. We found high microbial functional gene diversity and distinct soil microbial metabolic potential for biogeochemical processes in the tropical rainforest. Soil available nitrogen was the factor most strongly associated with soil microbial functional gene structure. Here, we mainly describe in detail the experimental design, the data processing, and the soil biogeochemical analyses attached to the study, which was published in BMC Microbiology in 2015; the raw data have been deposited in NCBI's Gene Expression Omnibus (accession number GSE69171).

  2. Clustering structures of large proteins using multifractal analyses based on a 6-letter model and hydrophobicity scale of amino acids

    International Nuclear Information System (INIS)

    Yang Jianyi; Yu Zuguo; Anh, Vo

    2009-01-01

    The Schneider and Wrede hydrophobicity scale of amino acids and the 6-letter model of protein are proposed to study the relationship between the primary structure and the secondary structural classification of proteins. Two kinds of multifractal analyses are performed on the two measures obtained from these two kinds of data on large proteins. Nine parameters from the multifractal analyses are considered to construct the parameter spaces. Each protein is represented by one point in these spaces. A procedure is proposed to separate large proteins in the α, β, α + β and α/β structural classes in these parameter spaces. Fisher's linear discriminant algorithm is used to assess our clustering accuracy on the 49 selected large proteins. Numerical results indicate that the discriminant accuracies are satisfactory. In particular, they reach 100.00% and 84.21% in separating the α proteins from the {β, α + β, α/β} proteins in a parameter space; 92.86% and 86.96% in separating the β proteins from the {α + β, α/β} proteins in another parameter space; 91.67% and 83.33% in separating the α/β proteins from the α + β proteins in the last parameter space.
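    The classification step described above can be approximated with an off-the-shelf linear discriminant, as in the hedged sketch below; the feature matrix and class labels are random placeholders standing in for the nine multifractal parameters and the four structural classes, so this is an outline of the approach rather than the authors' implementation.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    # X: one row per protein, columns = nine multifractal parameters (placeholders here)
    # y: structural class labels
    rng = np.random.default_rng(0)
    X = rng.normal(size=(49, 9))
    y = rng.choice(["alpha", "beta", "alpha+beta", "alpha/beta"], size=49)

    lda = LinearDiscriminantAnalysis()
    accuracy = cross_val_score(lda, X, y, cv=LeaveOneOut()).mean()
    print(f"leave-one-out accuracy on placeholder data: {accuracy:.2f}")
    ```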

  3. Distribution of Prochlorococcus Ecotypes in the Red Sea Basin Based on Analyses of rpoC1 Sequences

    KAUST Repository

    Shibl, Ahmed A.; Haroon, Mohamed; Ngugi, David; Thompson, Luke R.; Stingl, Ulrich

    2016-01-01

    The marine picocyanobacteria Prochlorococcus represent a significant fraction of the global pelagic bacterioplankton community. Specifically, in the surface waters of the Red Sea, they account for around 91% of the phylum Cyanobacteria. Previous work suggested a widespread presence of high-light (HL)-adapted ecotypes in the Red Sea with the occurrence of low-light (LL)-adapted ecotypes at intermediate depths in the water column. To obtain a more comprehensive dataset over a wider biogeographical scope, we used a 454-pyrosequencing approach to analyze the diversity of the Prochlorococcus rpoC1 gene from a total of 113 samples at various depths (up to 500 m) from 45 stations spanning the Red Sea basin from north to south. In addition, we analyzed 45 metagenomes from eight stations using hidden Markov models based on a set of reference Prochlorococcus genomes to (1) estimate the relative abundance of Prochlorococcus based on 16S rRNA gene sequences, and (2) identify and classify rpoC1 sequences as an assessment of the community structure of Prochlorococcus in the northern, central and southern regions of the basin without amplification bias. Analyses of metagenomic data indicated that Prochlorococcus occurs at a relative abundance of around 9% in samples from surface waters (25, 50, 75 m), 3% in intermediate waters (100 m) and around 0.5% in deep-water samples (200–500 m). Results based on rpoC1 sequences using both methods showed that HL II cells dominate surface waters and were also present in deep-water samples. Prochlorococcus communities in intermediate waters (100 m) showed a higher diversity and co-occurrence of low-light and high-light ecotypes. Prochlorococcus communities at each depth range (surface, intermediate, deep sea) did not change significantly over the sampled transects spanning most of the Saudi waters in the Red Sea. Statistical analyses of rpoC1 sequences from metagenomes indicated that the vertical distribution of Prochlorococcus in the water

  5. Sampling strategies for improving tree accuracy and phylogenetic analyses: a case study in ciliate protists, with notes on the genus Paramecium.

    Science.gov (United States)

    Yi, Zhenzhen; Strüder-Kypke, Michaela; Hu, Xiaozhong; Lin, Xiaofeng; Song, Weibo

    2014-02-01

    In order to assess how dataset-selection for multi-gene analyses affects the accuracy of inferred phylogenetic trees in ciliates, we chose five genes and the genus Paramecium, one of the most widely used model protist genera, and compared tree topologies of the single- and multi-gene analyses. Our empirical study shows that: (1) Using multiple genes improves phylogenetic accuracy, even when their one-gene topologies are in conflict with each other. (2) The impact of missing data on phylogenetic accuracy is ambiguous: resolution power and topological similarity, but not number of represented taxa, are the most important criteria of a dataset for inclusion in concatenated analyses. (3) As an example, we tested the three classification models of the genus Paramecium with a multi-gene based approach, and only the monophyly of the subgenus Paramecium is supported. Copyright © 2013 Elsevier Inc. All rights reserved.

  6. Post test analyses of Revisa benchmark based on a creep test at 1100 Celsius degrees performed on a notched tube

    International Nuclear Information System (INIS)

    Fischer, M.; Bernard, A.; Bhandari, S.

    2001-01-01

    In the Euratom 4th Framework Programme of the European Commission, the REVISA project deals with Reactor Vessel Integrity under Severe Accidents. One of the tasks consists of the experimental validation of the models developed in the project. To this end, a benchmark was designed in which the participants use their models to test the results against an experiment. The experiment, called RUPTHER 15, was conducted by the coordinating organisation, CEA (Commissariat a l'Energie Atomique), in France. It is a 'delayed fracture' test on a notched tube. The thermal loading is an axial gradient with a temperature of about 1130 C in the mid-part, and the internal pressure is maintained at 0.8 MPa. This paper presents the results of finite element calculations performed by Framatome-ANP using the SYSTUS code. Two types of analyses were made: one based on the 'time hardening' Norton-Bailey creep law, the other based on the coupled creep/damage Lemaitre-Chaboche model. The purpose of this paper is in particular to show the influence of temperature on the simulation results. At the high temperatures dealt with here, slight errors in the temperature measurements can lead to very large differences in the deformation behaviour. (authors)
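    For reference, the 'time hardening' Norton-Bailey law named above expresses the accumulated creep strain as eps_c = A * sigma^n * t^m; the constants in the sketch below are placeholders, not the material fit used for the RUPTHER 15 simulations.

    ```python
    def norton_bailey_creep_strain(stress_mpa, time_h, A=1e-12, n=5.0, m=0.5):
        """Time-hardening Norton-Bailey creep strain eps_c = A * sigma**n * t**m.
        A, n and m are material constants (placeholder values, not the RUPTHER 15 fit)."""
        return A * stress_mpa ** n * time_h ** m

    print(norton_bailey_creep_strain(stress_mpa=20.0, time_h=10.0))
    ```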

  7. A DNA microarray-based methylation-sensitive (MS)-AFLP hybridization method for genetic and epigenetic analyses.

    Science.gov (United States)

    Yamamoto, F; Yamamoto, M

    2004-07-01

    We previously developed a PCR-based DNA fingerprinting technique named the Methylation Sensitive (MS)-AFLP method, which permits comparative genome-wide scanning of methylation status with a manageable number of fingerprinting experiments. The technique uses the methylation sensitive restriction enzyme NotI in the context of the existing Amplified Fragment Length Polymorphism (AFLP) method. Here we report the successful conversion of this gel electrophoresis-based DNA fingerprinting technique into a DNA microarray hybridization technique (DNA Microarray MS-AFLP). By performing a total of 30 (15 x 2 reciprocal labeling) DNA Microarray MS-AFLP hybridization experiments on genomic DNA from two breast and three prostate cancer cell lines in all pairwise combinations, and Southern hybridization experiments using more than 100 different probes, we have demonstrated that the DNA Microarray MS-AFLP is a reliable method for genetic and epigenetic analyses. No statistically significant differences were observed in the number of differences between the breast-prostate hybridization experiments and the breast-breast or prostate-prostate comparisons.

  8. Measurements and simulations analysing the noise behaviour of grating-based X-ray phase-contrast imaging

    Energy Technology Data Exchange (ETDEWEB)

    Weber, T., E-mail: thomas.weber@physik.uni-erlangen.de [University of Erlangen-Nuremberg, ECAP - Erlangen Center for Astroparticle Physics, Erwin-Rommel-Str. 1, 91058 Erlangen (Germany); Bartl, P.; Durst, J. [University of Erlangen-Nuremberg, ECAP - Erlangen Center for Astroparticle Physics, Erwin-Rommel-Str. 1, 91058 Erlangen (Germany); Haas, W. [University of Erlangen-Nuremberg, ECAP - Erlangen Center for Astroparticle Physics, Erwin-Rommel-Str. 1, 91058 Erlangen (Germany); University of Erlangen-Nuremberg, Pattern Recognition Lab, Martensstr. 3, 91058 Erlangen (Germany); Michel, T.; Ritter, A.; Anton, G. [University of Erlangen-Nuremberg, ECAP - Erlangen Center for Astroparticle Physics, Erwin-Rommel-Str. 1, 91058 Erlangen (Germany)

    2011-08-21

    In the last decades, phase-contrast imaging using a Talbot-Lau grating interferometer has become possible even with a low-brilliance X-ray source. With its potential for increasing soft-tissue contrast, this method is on its way into medical imaging. For this purpose, knowledge of the underlying physics of the technique is necessary. With this paper, we would like to contribute to the understanding of grating-based phase-contrast imaging by presenting measurements and simulations of the noise behaviour of the differential phases. The measurements were done using a microfocus X-ray tube with a hybrid, photon-counting, semiconductor Medipix2 detector. The additional simulations were performed with our in-house developed phase-contrast simulation tool 'SPHINX', which combines both wave and particle contributions of the simulated photons. The results obtained by both methods show the same behaviour: increasing the number of photons leads to a linear decrease of the standard deviation of the phase, whereas the number of phase steps has no influence on the standard deviation if the total number of photons is held constant. Furthermore, the probability density function (pdf) of the reconstructed differential phases was analysed. It turned out that the so-called von Mises distribution is the physically correct pdf, which was also confirmed by measurements. This information advances the understanding of grating-based phase-contrast imaging and can be used to improve image quality.
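    As a small illustration of the distribution named above, the sketch below draws reconstructed differential phases from a von Mises distribution and computes their circular standard deviation; the location and concentration parameters are arbitrary and are not taken from the measurements.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # mu = assumed true differential phase, kappa = concentration (placeholder values)
    mu, kappa = 0.3, 20.0
    phases = rng.vonmises(mu, kappa, size=10_000)

    # Circular standard deviation, a common spread measure for angular data
    R = np.abs(np.mean(np.exp(1j * phases)))
    circular_std = np.sqrt(-2.0 * np.log(R))
    print(f"circular std of sampled phases: {circular_std:.3f} rad")
    ```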

  9. Cost consequences due to reduced ulcer healing times - analyses based on the Swedish Registry of Ulcer Treatment.

    Science.gov (United States)

    Öien, Rut F; Forssell, Henrik; Ragnarson Tennvall, Gunnel

    2016-10-01

    Resource use and costs for topical treatment of hard-to-heal ulcers based on data from the Swedish Registry of Ulcer Treatment (RUT) were analysed in patients recorded in RUT as having healed between 2009 and 2012, in order to estimate potential cost savings from reductions in frequency of dressing changes and healing times. RUT is used to capture areas of improvement in ulcer care and to enable structured wound management by registering patients with hard-to-heal leg, foot and pressure ulcers. Patients included in the registry are treated in primary care, community care, private care, and inpatient hospital care. Cost calculations were based on resource use data on healing time and frequency of dressing changes in Swedish patients with hard-to-heal ulcers who healed between 2009 and 2012. Per-patient treatment costs decreased from SEK38 223 in 2009 to SEK20 496 in 2012, mainly because of shorter healing times. Frequency of dressing changes was essentially the same during these years, varying from 1·4 to 1·6 per week. The total healing time was reduced by 38%. Treatment costs for the management of hard-to-heal ulcers can be reduced with well-developed treatment strategies resulting in shortened healing times as shown in RUT. © 2015 Medicalhelplines.com Inc and John Wiley & Sons Ltd.

  10. Free Vibration Analyses of FGM Thin Plates by Isogeometric Analysis Based on Classical Plate Theory and Physical Neutral Surface

    Directory of Open Access Journals (Sweden)

    Shuohui Yin

    2013-01-01

    The isogeometric analysis (IGA) with nonuniform rational B-splines (NURBS) based on the classical plate theory (CPT) is developed for free vibration analyses of functionally graded material (FGM) thin plates. The objective of this work is to provide an efficient and accurate numerical simulation approach for nonhomogeneous thin plates and shells. Higher-order basis functions can be easily obtained in IGA, so the formulation of the CPT based on IGA can be simplified. For FGM thin plates, the material property gradient in the thickness direction is unsymmetrical about the midplane, so the effects of midplane displacements cannot be ignored, whereas the CPT neglects midplane displacements. To eliminate the effects of midplane displacements without introducing new unknown variables, the physical neutral surface is introduced into the CPT. The approximation of the deflection field and the geometric description are performed using the NURBS basis functions. Compared with the first-order shear deformation theory, the present method has lower memory consumption and higher efficiency. Several numerical results show that the present method yields highly accurate solutions.
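    One common way to locate the physical neutral surface mentioned above is as the modulus-weighted centroid of the thickness, z0 = ∫E(z)·z dz / ∫E(z) dz. The sketch below evaluates this for an assumed power-law modulus profile; the material law and numbers are illustrative assumptions, not those used in the paper.

    ```python
    import numpy as np

    def neutral_surface_offset(e_top, e_bottom, h, p, n_pts=2001):
        """Offset of the physical neutral surface from the midplane of an FGM plate,
        assuming a power-law modulus E(z) = e_bottom + (e_top - e_bottom) * (z/h + 0.5)**p
        for -h/2 <= z <= h/2, and z0 = int(E(z)*z dz) / int(E(z) dz)."""
        z = np.linspace(-h / 2.0, h / 2.0, n_pts)
        e = e_bottom + (e_top - e_bottom) * (z / h + 0.5) ** p
        return np.trapz(e * z, z) / np.trapz(e, z)

    # Ceramic-rich top, metal-rich bottom (illustrative values only)
    print(neutral_surface_offset(e_top=380e9, e_bottom=70e9, h=0.01, p=2.0))
    ```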

  11. Performance study of Ke factors in simplified elastic plastic fatigue analyses with emphasis on thermal cyclic loading

    Energy Technology Data Exchange (ETDEWEB)

    Lang, Hermann, E-mail: hermann.lang@areva.com [AREVA NP GmbH, PEEA-G, Henri-Dunant-Strasse 50, 91058 Erlangen (Germany); Rudolph, Juergen; Ziegler, Rainer [AREVA NP GmbH, PEEA-G, Henri-Dunant-Strasse 50, 91058 Erlangen (Germany)

    2011-08-15

    As code-conforming, fully elastic plastic fatigue analyses are still time consuming, simplified elastic plastic analysis is often applied. This procedure is known to be overly conservative for some conditions due to the applied plastification (penalty) factor Ke. As a consequence, less conservative fully elastic plastic fatigue analyses based on non-linear finite element analyses (FEA), or simplified elastic plastic analyses based on more realistic Ke factors, have to be used for fatigue design. The demand for more realistic Ke factors is thus a requirement of practical fatigue analysis. Different code-based Ke procedures are reviewed in this paper with special regard to their performance under thermal cyclic loading conditions. Other approximation formulae, such as those by Neuber, Seeger/Beste or Kuehnapfel, are not evaluated in this context because they apply to mechanical loading and exclude the thermal cyclic loading conditions typical of power plant operation. Besides the current code-based Ke corrections, the ASME Code Case N-779 (e.g. Adams' proposal) and its modification in ASME Section VIII are considered. Comparison of elastic plastic results with results from the Rules for Nuclear Facility Components and the Rules for Pressure Vessels reveals a considerable overestimation of the usage factor by ASME III and KTA 3201.2 for the examined examples. Usage factors according to RCC-M, Adams (ASME Code Case N-779), ASME VIII (alternative) and EN 13445-3 are essentially comparable and less conservative for these examples. The Kv correction as well as the applied yield criterion (Tresca or von Mises) essentially influence the quality of the more advanced plasticity corrections (e.g. ASME Code Case N-779 and RCC-M). Hence, new proposals are based on a refined Kv correction.
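    For orientation, the sketch below implements one commonly cited form of the ASME III-type Ke correction, with the material parameters m and n set to values typical of austenitic stainless steel purely for illustration; it does not reproduce the refined Kv-based proposals discussed in the paper.

    ```python
    def ke_simplified_elastic_plastic(sn, sm, m=1.7, n=0.3):
        """ASME III-style plastification (penalty) factor Ke.
        sn: primary-plus-secondary stress intensity range, sm: design stress intensity,
        m, n: material parameters from the code tables (illustrative values here)."""
        if sn <= 3.0 * sm:
            return 1.0
        if sn < 3.0 * m * sm:
            return 1.0 + (1.0 - n) / (n * (m - 1.0)) * (sn / (3.0 * sm) - 1.0)
        return 1.0 / n

    print(ke_simplified_elastic_plastic(sn=600.0, sm=120.0))  # stresses in MPa, illustrative
    ```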

  12. Performance study of Ke factors in simplified elastic plastic fatigue analyses with emphasis on thermal cyclic loading

    International Nuclear Information System (INIS)

    Lang, Hermann; Rudolph, Juergen; Ziegler, Rainer

    2011-01-01

    As code-conforming, fully elastic plastic fatigue analyses are still time consuming, simplified elastic plastic analysis is often applied. This procedure is known to be overly conservative for some conditions due to the applied plastification (penalty) factor Ke. As a consequence, less conservative fully elastic plastic fatigue analyses based on non-linear finite element analyses (FEA), or simplified elastic plastic analyses based on more realistic Ke factors, have to be used for fatigue design. The demand for more realistic Ke factors is thus a requirement of practical fatigue analysis. Different code-based Ke procedures are reviewed in this paper with special regard to their performance under thermal cyclic loading conditions. Other approximation formulae, such as those by Neuber, Seeger/Beste or Kuehnapfel, are not evaluated in this context because they apply to mechanical loading and exclude the thermal cyclic loading conditions typical of power plant operation. Besides the current code-based Ke corrections, the ASME Code Case N-779 (e.g. Adams' proposal) and its modification in ASME Section VIII are considered. Comparison of elastic plastic results with results from the Rules for Nuclear Facility Components and the Rules for Pressure Vessels reveals a considerable overestimation of the usage factor by ASME III and KTA 3201.2 for the examined examples. Usage factors according to RCC-M, Adams (ASME Code Case N-779), ASME VIII (alternative) and EN 13445-3 are essentially comparable and less conservative for these examples. The Kv correction as well as the applied yield criterion (Tresca or von Mises) essentially influence the quality of the more advanced plasticity corrections (e.g. ASME Code Case N-779 and RCC-M). Hence, new proposals are based on a refined Kv correction.

  13. Design and Execution of make-like, distributed Analyses based on Spotify’s Pipelining Package Luigi

    Science.gov (United States)

    Erdmann, M.; Fischer, B.; Fischer, R.; Rieger, M.

    2017-10-01

    In high-energy particle physics, workflow management systems are primarily used as tailored solutions in dedicated areas such as Monte Carlo production. However, physicists performing data analyses are usually required to steer their individual workflows manually, which is time-consuming and often leads to undocumented relations between particular workloads. We present a generic analysis design pattern that copes with the sophisticated demands of end-to-end HEP analyses and provides a make-like execution system. It is based on the open-source pipelining package Luigi, which was developed at Spotify and enables the definition of arbitrary workloads, so-called Tasks, and the dependencies between them in a lightweight and scalable structure. Further features are multi-user support, automated dependency resolution and error handling, central scheduling, and status visualization on the web. In addition to already built-in features for remote jobs and file systems like Hadoop and HDFS, we added support for WLCG infrastructure such as LSF and CREAM job submission, as well as remote file access through the Grid File Access Library. Furthermore, we implemented automated resubmission functionality, software sandboxing, and a command line interface with auto-completion for a convenient working environment. For the implementation of a ttH cross section measurement, we created a generic Python interface that provides programmatic access to all external information such as datasets, physics processes, statistical models, and additional files and values. In summary, the setup enables the execution of the entire analysis in a parallelized and distributed fashion with a single command.
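    The make-like dependency pattern described above can be illustrated with a minimal Luigi sketch: a downstream Task declares its upstream Task via requires(), and the scheduler runs only the targets that do not yet exist. Task names, parameters and file paths below are hypothetical and are not part of the authors' framework.

    ```python
    import luigi


    class SelectEvents(luigi.Task):
        """Upstream workload: writes a (hypothetical) selected-events file."""
        dataset = luigi.Parameter()

        def output(self):
            return luigi.LocalTarget(f"data/{self.dataset}_selected.txt")

        def run(self):
            with self.output().open("w") as f:
                f.write("selected events placeholder\n")


    class MakeHistogram(luigi.Task):
        """Downstream workload: runs only after SelectEvents has produced its output."""
        dataset = luigi.Parameter()

        def requires(self):
            return SelectEvents(dataset=self.dataset)

        def output(self):
            return luigi.LocalTarget(f"plots/{self.dataset}_hist.txt")

        def run(self):
            with self.input().open() as fin, self.output().open("w") as fout:
                fout.write(f"histogram built from {len(fin.read())} bytes of input\n")


    if __name__ == "__main__":
        luigi.build([MakeHistogram(dataset="ttH_sample")], local_scheduler=True)
    ```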

  14. Correcting for multivariate measurement error by regression calibration in meta-analyses of epidemiological studies

    DEFF Research Database (Denmark)

    Tybjærg-Hansen, Anne

    2009-01-01

    Within-person variability in measured values of multiple risk factors can bias their associations with disease. The multivariate regression calibration (RC) approach can correct for such measurement error and has been applied to studies in which true values or independent repeat measurements of the risk factors are observed on a subsample. We extend the multivariate RC techniques to a meta-analysis framework where multiple studies provide independent repeat measurements and information on disease outcome. We consider the cases where some or all studies have repeat measurements, and compare study-specific, averaged and empirical Bayes estimates of RC parameters. Additionally, we allow for binary covariates (e.g. smoking status) and for uncertainty and time trends in the measurement error corrections. Our methods are illustrated using a subset of individual participant data from prospective long-term studies...

  15. Applying species-tree analyses to deep phylogenetic histories: challenges and potential suggested from a survey of empirical phylogenetic studies.

    Science.gov (United States)

    Lanier, Hayley C; Knowles, L Lacey

    2015-02-01

    Coalescent-based methods for species-tree estimation are becoming a dominant approach for reconstructing species histories from multi-locus data, with most of the studies examining these methodologies focused on recently diverged species. However, deeper phylogenies, such as the datasets that comprise many Tree of Life (ToL) studies, also exhibit gene-tree discordance. This discord may also arise from the stochastic sorting of gene lineages during the speciation process (i.e., reflecting the random coalescence of gene lineages in ancestral populations). It remains unknown whether guidelines regarding methodologies and numbers of loci established by simulation studies at shallow tree depths translate into accurate species relationships for deeper phylogenetic histories. We address this knowledge gap and specifically identify the challenges and limitations of species-tree methods that account for coalescent variance for deeper phylogenies. Using simulated data with characteristics informed by empirical studies, we evaluate both the accuracy of estimated species trees and the characteristics associated with recalcitrant nodes, with a specific focus on whether coalescent variance is generally responsible for the lack of resolution. By determining the proportion of coalescent genealogies that support a particular node, we demonstrate that (1) species-tree methods account for coalescent variance at deep nodes and (2) mutational variance - not gene-tree discord arising from the coalescent - posed the primary challenge for accurate reconstruction across the tree. For example, many nodes were accurately resolved despite predicted discord from the random coalescence of gene lineages, and nodes with poor support were distributed across a range of depths (i.e., they were not restricted to particularly recent divergences). Given their broad taxonomic scope and large sampling of taxa, deep-level phylogenies pose several potential methodological complications including

  16. Understanding ageing in older Australians: The contribution of the Dynamic Analyses to Optimise Ageing (DYNOPTA) project to the evidence base and policy

    Science.gov (United States)

    Anstey, Kaarin J; Bielak, Allison AM; Birrell, Carole L; Browning, Colette J; Burns, Richard A; Byles, Julie; Kiley, Kim M; Nepal, Binod; Ross, Lesley A; Steel, David; Windsor, Timothy D

    2014-01-01

    Aim To describe the Dynamic Analyses to Optimise Ageing (DYNOPTA) project and illustrate its contributions to understanding ageing through innovative methodology, and investigations on outcomes based on the project themes. DYNOPTA provides a platform and technical expertise that may be used to combine other national and international datasets. Method The DYNOPTA project has pooled and harmonized data from nine Australian longitudinal studies to create the largest available longitudinal dataset (N=50652) on ageing in Australia. Results A range of findings have resulted from the study to date, including methodological advances, prevalence rates of disease and disability, and mapping trajectories of ageing with and without increasing morbidity. DYNOPTA also forms the basis of a microsimulation model that will provide projections of future costs of disease and disability for the baby boomer cohort. Conclusion DYNOPTA contributes significantly to the Australian evidence-base on ageing to inform key social and health policy domains. PMID:22032767

  17. Analyses of non-fatal accidents in an opencast mine by logistic regression model - a case study.

    Science.gov (United States)

    Onder, Seyhan; Mutlu, Mert

    2017-09-01

    Accidents cause major damage to both workers and enterprises in the mining industry. To reduce the number of occupational accidents, these incidents should be properly registered and carefully analysed. This study examines occupational accident records from the opencast coal mines of the Aegean Lignite Enterprise (ELI) of Turkish Coal Enterprises (TKI) in Soma between 2006 and 2011. A total of 231 occupational accidents were analysed. The accident records were categorized into seven groups: area, reason, occupation, part of body, age, shift hour and lost days. The SPSS package was used for the logistic regression analyses, which predicted the probability of a non-fatal accident resulting in more or fewer than 3 lost workdays. The social facilities area of the surface installations, the workshops and the opencast mining areas are the areas with the highest probability of accidents with more than 3 lost workdays, while the reasons with the highest probability for these types of accidents are transporting and manual handling. Additionally, the model was tested on the accidents reported in 2012 for the ELI in Soma and correctly estimated the probability of accidents with lost workdays in 70% of cases.
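    The kind of binary model described above (outcome: more than 3 lost workdays) can be sketched with a regularized logistic regression, as below; the records are synthetic placeholders and the fit is not the authors' SPSS analysis, it only illustrates encoding categorical predictors and predicting an accident-severity probability.

    ```python
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import OneHotEncoder

    rng = np.random.default_rng(1)
    n = 231  # same number of records as in the study; the values themselves are synthetic
    df = pd.DataFrame({
        "area": rng.choice(["social_facilities", "workshop", "opencast"], size=n),
        "reason": rng.choice(["transporting", "manual_handling", "fall", "machinery"], size=n),
        "gt3_lost_days": rng.integers(0, 2, size=n),
    })

    model = make_pipeline(OneHotEncoder(handle_unknown="ignore"),
                          LogisticRegression(max_iter=1000))
    model.fit(df[["area", "reason"]], df["gt3_lost_days"])

    # Predicted probability of >3 lost workdays for a hypothetical accident
    new_case = pd.DataFrame({"area": ["workshop"], "reason": ["manual_handling"]})
    print(model.predict_proba(new_case)[0, 1])
    ```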

  18. Study on mixing behavior in a tee piping and numerical analyses for evaluation of thermal striping

    International Nuclear Information System (INIS)

    Kamide, H.; Igarashi, M.; Kawashima, S.; Kimura, N.; Hayashi, K.

    2009-01-01

    Water experiments were carried out on the thermal-hydraulic aspects of thermal striping in a mixing tee with a main-to-branch diameter ratio of 3. Detailed temperature and velocity fields were measured by a movable thermocouple tree and particle image velocimetry. Flow patterns in the tee were classified into three groups: wall jet, deflecting jet, and impinging jet, each with its own temperature fluctuation profile depending on the momentum ratio between the main and branch pipes. The non-dimensional power spectral density (PSD) of the temperature fluctuation showed a unique profile when the momentum ratio was identical. A numerical simulation based on the finite difference method showed alternating vortex development, similar to a Karman vortex street, behind the jet from the branch pipe in the wall jet case. The prominent frequency of the temperature fluctuation in the calculation corresponded to a Strouhal number of 0.2 based on the branch pipe diameter and was in good agreement with the experimental results. Mixing behavior in the tee was characterized by the relatively large vortex structures defined by the diameters and the velocities in the pipes.
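    The reported Strouhal number translates into a vortex-shedding (and hence temperature-fluctuation) frequency through f = St * U / d. In the sketch below, St = 0.2 is taken from the abstract, while the characteristic velocity and branch diameter are placeholders, since the rig values are not given here.

    ```python
    def shedding_frequency(strouhal, velocity_m_s, diameter_m):
        """Vortex-shedding frequency from the Strouhal relation f = St * U / d."""
        return strouhal * velocity_m_s / diameter_m

    # St from the abstract; velocity and branch diameter are illustrative only
    print(f"{shedding_frequency(0.2, velocity_m_s=1.0, diameter_m=0.05):.1f} Hz")
    ```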

  19. Single camera analyses in studying pattern forming dynamics of player interactions in team sports.

    OpenAIRE

    Duarte, Ricardo; Fernandes, Orlando; Folgado, Hugo; Araújo, Duarte

    2013-01-01

    A network of patterned interactions between players characterises team ball sports. Thus, interpersonal coordination patterns are an important topic in the study of performance in such sports. A very useful method has been the study of inter-individual interactions captured by a single camera filming an extended performance area. The appropriate collection of positional data allows investigating the pattern forming dynamics emerging in different performance sub-phases of team ball sports. Thi...

  20. Cost-effectiveness and harm-benefit analyses of risk-based screening strategies for breast cancer.

    Directory of Open Access Journals (Sweden)

    Ester Vilaprinyo

    The one-size-fits-all paradigm in organized screening of breast cancer is shifting towards a personalized approach. The present study has two objectives: (1) to perform an economic evaluation and to assess the harm-benefit ratios of screening strategies that vary in their intensity and interval ages based on breast cancer risk; and (2) to estimate the gain in terms of cost and harm reductions using risk-based screening with respect to the usual practice. We used a probabilistic model and input data from Spanish population registries and screening programs, as well as from clinical studies, to estimate the benefit, harm, and costs over time of 2,624 screening strategies, uniform or risk-based. We defined four risk groups, low, moderate-low, moderate-high and high, based on breast density, family history of breast cancer and personal history of breast biopsy. The risk-based strategies were obtained by combining the exam periodicity (annual, biennial, triennial and quinquennial), the starting ages (40, 45 and 50 years) and the ending ages (69 and 74 years) in the four risk groups. Incremental cost-effectiveness and harm-benefit ratios were used to select the optimal strategies. Compared to risk-based strategies, the uniform ones result in a much lower benefit for a specific cost. Reductions close to 10% in costs and higher than 20% in false-positive results and overdiagnosed cases were obtained for risk-based strategies. Optimal screening is characterized by quinquennial or triennial periodicities for the low or moderate risk groups and annual periodicity for the high-risk group. Risk-based strategies can reduce harm and costs. It is necessary to develop accurate measures of individual risk and to work on how to implement risk-based screening strategies.

  1. Application of Markowitz model in analysing risk and return: a case study of BSE stock

    Directory of Open Access Journals (Sweden)

    Manas Pandey

    2012-03-01

    In this paper, optimal portfolio formation using real-life data subject to two different constraint sets is attempted. The Markowitz model provides a theoretical framework for the analysis of risk-return choices, with decisions based on the concept of efficient portfolios. Markowitz portfolio analysis gives as output an efficient frontier on which each portfolio is the highest-return portfolio for a specified level of risk, so investors can reduce their risk and maximize their return from the investment. The Markowitz portfolio selections were obtained by solving the portfolio optimization problems to obtain maximum total returns for a given allowable risk level. The analysis gives investors considerable insight into how, when and why to invest in a particular portfolio. It essentially calculates the standard deviation and return for each of the feasible portfolios and identifies the efficient frontier, the boundary of the feasible portfolios of increasing returns.
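    A minimal sketch of the underlying mean-variance problem is given below: minimise portfolio variance subject to a target return and full investment. The expected returns and covariance matrix are invented placeholders, not estimates from BSE data.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Placeholder expected returns and covariance for four hypothetical stocks
    mu = np.array([0.12, 0.10, 0.15, 0.08])
    cov = np.array([[0.10, 0.02, 0.04, 0.00],
                    [0.02, 0.08, 0.01, 0.01],
                    [0.04, 0.01, 0.12, 0.02],
                    [0.00, 0.01, 0.02, 0.06]])

    def min_variance_weights(target_return):
        """Markowitz problem: minimise w'Cw subject to w'mu = target, sum(w) = 1, w >= 0."""
        n = len(mu)
        constraints = [{"type": "eq", "fun": lambda w: w @ mu - target_return},
                       {"type": "eq", "fun": lambda w: w.sum() - 1.0}]
        result = minimize(lambda w: w @ cov @ w, x0=np.full(n, 1.0 / n),
                          bounds=[(0.0, 1.0)] * n, constraints=constraints)
        return result.x

    w = min_variance_weights(0.11)
    print("weights:", np.round(w, 3), "portfolio std:", round(float(np.sqrt(w @ cov @ w)), 4))
    ```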

  2. Chemical and magnetic analyses on tree bark as an effective tool for biomonitoring: A case study in Lisbon (Portugal).

    Science.gov (United States)

    Brignole, Daniele; Drava, Giuliana; Minganti, Vincenzo; Giordani, Paolo; Samson, Roeland; Vieira, Joana; Pinho, Pedro; Branquinho, Cristina

    2018-03-01

    Tree bark has proven to be a reliable tool for biomonitoring deposition of metals from the atmosphere. The aim of the present study was to test if bark magnetic properties can be used as a proxy of the overall metal loads of a tree bark, meaning that this approach can be used to discriminate different effects of pollution on different types of urban site. In this study, the concentrations of As, Cd, Co, Cu, Fe, Mn, Ni, P, Pb, V and Zn were measured by ICP-OES in bark samples of Jacaranda mimosifolia, collected along roads and in urban green spaces in the city of Lisbon (Portugal). Magnetic analyses were also performed on the same bark samples, measuring Isothermal Remanent Magnetization (IRM), Saturation Isothermal Remanent Magnetization (SIRM) and Magnetic Susceptibility (χ). The results confirmed that magnetic analyses can be used as a proxy of the overall load of trace elements in tree bark, and could be used to distinguish different types of urban sites regarding atmospheric pollution. Together with trace element analyses, magnetic analyses could thus be used as a tool to provide high-resolution data on urban air quality and to follow up the success of mitigation actions aiming at decreasing the pollutant load in urban environments. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. The CM SAF SSM/I-based total column water vapour climate data record: methods and evaluation against re-analyses and satellite

    Directory of Open Access Journals (Sweden)

    M. Schröder

    2013-03-01

    The European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) Satellite Application Facility on Climate Monitoring (CM SAF) aims at the provision and sound validation of well documented Climate Data Records (CDRs) in sustained and operational environments. In this study, a total column water vapour path (WVPA) climatology from CM SAF is presented and inter-compared to water vapour data records from various data sources. Based on homogenised brightness temperatures from the Special Sensor Microwave Imager (SSM/I), a climatology of WVPA has been generated within the Hamburg Ocean–Atmosphere Fluxes and Parameters from Satellite (HOAPS) framework. Within a research and operation transition activity, the HOAPS data and operation capabilities have been successfully transferred to the CM SAF, where the complete HOAPS data and processing schemes are hosted in an operational environment. An objective analysis for interpolation, namely kriging, has been applied to the swath-based WVPA retrievals from the HOAPS data set. The resulting climatology consists of daily and monthly mean fields of WVPA over the global ice-free ocean. The temporal coverage ranges from July 1987 to August 2006. After a comparison to the precursor product, the CM SAF SSM/I-based climatology has been comprehensively compared to different types of meteorological analyses from the European Centre for Medium-Range Weather Forecasts (ECMWF: ERA40, ERA INTERIM and operational analyses) and from the Japan Meteorological Agency (JMA–JRA). This inter-comparison shows an overall good agreement between the climatology and the analyses, with daily absolute biases generally smaller than 2 kg m−2. The absolute value of the bias to JRA and ERA INTERIM is typically smaller than 0.5 kg m−2. For the period 1991–2006, the root mean square error (RMSE) for both reanalyses is approximately 2 kg m−2. As SSM/I WVPA and radiances are assimilated into JMA and all ECMWF analyses and

  4. Correcting for multivariate measurement error by regression calibration in meta-analyses of epidemiological studies.

    NARCIS (Netherlands)

    Kromhout, D.

    2009-01-01

    Within-person variability in measured values of multiple risk factors can bias their associations with disease. The multivariate regression calibration (RC) approach can correct for such measurement error and has been applied to studies in which true values or independent repeat measurements of the

  5. Psychiatric genome-wide association study analyses implicate neuronal, immune and histone pathways

    NARCIS (Netherlands)

    O'Dushlaine, Colm; Rossin, Lizzy; Lee, Phil H.; Duncan, Laramie; Parikshak, Neelroop N.; Newhouse, Stephen; Ripke, Stephan; Neale, Benjamin M.; Purcell, Shaun M.; Posthuma, Danielle; Nurnberger, John I.; Lee, S. Hong; Faraone, Stephen V.; Perlis, Roy H.; Mowry, Bryan J.; Thapar, Anita; Goddard, Michael E.; Witte, John S.; Absher, Devin; Agartz, Ingrid; Akil, Huda; Amin, Farooq; Andreassen, Ole A.; Anjorin, Adebayo; Anney, Richard; Anttila, Verneri; Arking, Dan E.; Asherson, Philip; Azevedo, Maria H.; Backlund, Lena; Badner, Judith A.; Bailey, Anthony J.; Banaschewski, Tobias; Barchas, Jack D.; Barnes, Michael R.; Barrett, Thomas B.; Bass, Nicholas; Battaglia, Agatino; Bauer, Michael; Bayes, Monica; Bellivier, Frank; Bergen, Sarah E.; Berrettini, Wade; Betancur, Catalina; Bettecken, Thomas; Biederman, Joseph; Binder, Elisabeth B.; Bruggeman, Richard; Nolen, Willem A.; Penninx, Brenda W.

    Genome-wide association studies (GWAS) of psychiatric disorders have identified multiple genetic associations with such disorders, but better methods are needed to derive the underlying biological mechanisms that these signals indicate. We sought to identify biological pathways in GWAS data from

  6. Psychiatric genome-wide association study analyses implicate neuronal, immune and histone pathways

    DEFF Research Database (Denmark)

    O'Dushlaine, Colm; Rossin, Lizzy; Lee, Phil H.

    2015-01-01

    Genome-wide association studies (GWAS) of psychiatric disorders have identified multiple genetic associations with such disorders, but better methods are needed to derive the underlying biological mechanisms that these signals indicate. We sought to identify biological pathways in GWAS data from ...

  7. Higher Education Business Management Staff and the MBA: A Small Study Analysing Intrinsic and Extrinsic Benefits

    Science.gov (United States)

    Gander, Michelle

    2015-01-01

    Higher education is a key sector for the United Kingdom contributing over £70 billion of output. It functions in an increasingly complex operating, regulatory, and legislative environment that has led to an increased need for effective nonacademic business managers. This study evaluates the benefits of a specialist master of business…

  8. Solar power satellite system definition study. Volume 7, phase 1: SPS and rectenna systems analyses

    Science.gov (United States)

    1979-01-01

    A systems definition study of the solar power satellite systems is presented. The design and power distribution of the rectenna system is discussed. The communication subsystem and thermal control characteristics are described and a failure analysis performed on the systems is reported.

  9. Rasch Analyses of Very Low Food Security among Households and Children in the Three City Study.

    Science.gov (United States)

    Moffitt, Robert A; Ribar, David C

    2016-04-01

    The longitudinal Three City Study of low-income families with children measures food hardships using fewer questions and some different questions from the standard U.S. instrument for measuring food security, the Household Food Security Survey Module (HFSSM) in the Current Population Survey (CPS). We utilize a Rasch measurement model to identify thresholds of very low food security among households and very low food security among children in the Three City Study that are comparable to thresholds from the HFSSM. We also use the Three City Study to empirically investigate the determinants of food insecurity and of these specific food insecurity outcomes, estimating a multivariate behavioral Rasch model that is adapted to address longitudinal data. The estimation results indicate that participation in the Supplemental Nutrition Assistance Program and the Temporary Assistance for Needy Families program reduce food insecurity, while poverty and disability among caregivers increase it. Besides its longitudinal structure, the Three City Study measures many more characteristics about households than the CPS. Our estimates reveal that financial assistance through social networks and a household's own financial assets reduce food insecurity, while its outstanding loans increase insecurity.
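    For reference, the dichotomous Rasch model used in such analyses gives the probability that a household affirms an item of severity b as P = exp(theta - b) / (1 + exp(theta - b)), and food-security thresholds correspond to cut-points on the theta scale. The parameter values in the sketch below are placeholders, not the calibrated HFSSM item severities.

    ```python
    import numpy as np

    def rasch_prob(theta, b):
        """Dichotomous Rasch model: probability that a household with latent
        food-insecurity level theta affirms an item of severity b."""
        return 1.0 / (1.0 + np.exp(-(theta - b)))

    # Placeholder item severities and household parameter (illustration only)
    item_severities = np.array([-1.0, 0.0, 1.2, 2.5])
    theta = 1.0
    print(np.round(rasch_prob(theta, item_severities), 3))
    ```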

  11. Meta-analyses of genome-wide association studies identify multiple loci associated with pulmonary function

    NARCIS (Netherlands)

    D.B. Hancock (Dana); M. Eijgelsheim (Mark); J.B. Wilk (Jemma); S.A. Gharib (Sina); L.R. Loehr (Laura); K. Marciante (Kristin); N. Franceschini (Nora); Y.M.T.A. van Durme; T.H. Chen; R.G. Barr (Graham); M.B. Schabath (Matthew); D.J. Couper (David); G.G. Brusselle (Guy); B.M. Psaty (Bruce); P. Tikka-Kleemola (Päivi); J.I. Rotter (Jerome); A.G. Uitterlinden (André); A. Hofman (Albert); N.M. Punjabi (Naresh); F. Rivadeneira Ramirez (Fernando); A.C. Morrison (Alanna); P.L. Enright (Paul); K.E. North (Kari); S.R. Heckbert (Susan); T. Lumley (Thomas); B.H.Ch. Stricker (Bruno); G.T. O'Connor (George); S.J. London (Stephanie)

    2010-01-01

    textabstractSpirometric measures of lung function are heritable traits that reflect respiratory health and predict morbidity and mortality. We meta-analyzed genome-wide association studies for two clinically important lung-function measures: forced expiratory volume in the first second (FEV1) and

  12. Psychiatric genome-wide association study analyses implicate neuronal, immune and histone pathways

    NARCIS (Netherlands)

    O'Dushlaine, Colm; Rossin, Lizzy; Lee, Phil H.; Duncan, Laramie; Parikshak, Neelroop N.; Newhouse, Stephen; Ripke, Stephan; Neale, Benjamin M.; Purcell, Shaun M.; Posthuma, Danielle; Nurnberger, John I.; Lee, S. Hong; Faraone, Stephen V.; Perlis, Roy H.; Mowry, Bryan J.; Thapar, Anita; Goddard, Michael E.; Witte, John S.; Absher, Devin; Agartz, Ingrid; Akil, Huda; Amin, Farooq; Andreassen, Ole A.; Anjorin, Adebayo; Anney, Richard; Anttila, Verneri; Arking, Dan E.; Asherson, Philip; Azevedo, Maria H.; Backlund, Lena; Badner, Judith A.; Bailey, Anthony J.; Banaschewski, Tobias; Barchas, Jack D.; Barnes, Michael R.; Barrett, Thomas B.; Bass, Nicholas; Battaglia, Agatino; Bauer, Michael; Bayés, Mònica; Bellivier, Frank; Bergen, Sarah E.; Berrettini, Wade; Betancur, Catalina; Bettecken, Thomas; Biederman, Joseph; Binder, Elisabeth B.; Black, Donald W.; de Haan, Lieuwe; Linszen, Don H.

    2015-01-01

    Genome-wide association studies (GWAS) of psychiatric disorders have identified multiple genetic associations with such disorders, but better methods are needed to derive the underlying biological mechanisms that these signals indicate. We sought to identify biological pathways in GWAS data from

  13. The shaping of environmental concern in product chains: analysing Danish case studies on environmental aspects in product chain relations

    DEFF Research Database (Denmark)

    Forman, Marianne; Hansen, Anne Grethe; Jørgensen, Michael Søgaard

    ... indirect demand for greening activities. The analysis shows the co-construction of environmental concerns and demands, companies' environmental practices and technological developments, and their stabilisation in the supply chain. The case studies also point to how the greening of frontrunners might make ... the systems of production, consumption, knowledge and regulation are discussed. The role of boundary objects is discussed with eco-labelling as a case. The role of, and the impact on, the product chain relations are analysed as part of these mechanisms. From the case studies, green innovations in the product chain, which the case company represents, are identified. Direct customer and regulatory demands, as well as indirect societal and regulatory demands, are mapped, and their role in product chain greening analysed. The case studies point to the importance of customer demand, regulation and potentially ...

  14. Gr/gr deletions on Y-chromosome correlate with male infertility: an original study, meta-analyses, and trial sequential analyses

    Science.gov (United States)

    Bansal, Sandeep Kumar; Jaiswal, Deepika; Gupta, Nishi; Singh, Kiran; Dada, Rima; Sankhwar, Satya Narayan; Gupta, Gopal; Rajender, Singh

    2016-02-01

    We analyzed the AZFc region of the Y-chromosome for complete (b2/b4) and distinct partial deletions (gr/gr, b1/b3, b2/b3) in 822 infertile and 225 proven fertile men. We observed complete AZFc deletions in 0.97% and partial deletions in 6.20% of the cases. Among partial deletions, the frequency of gr/gr deletions was the highest (5.84%). The comparison of partial deletion data between cases and controls suggested a significant association of the gr/gr deletions with infertility (P = 0.0004); however, the other partial deletions did not correlate with infertility. In the cohort analysis, men with gr/gr deletions had a relatively poor sperm count (54.20 ± 57.45 million/ml) in comparison to those without deletions (72.49 ± 60.06 million/ml), though the difference was not statistically significant (P = 0.071). Meta-analysis also suggested that gr/gr deletions are significantly associated with male infertility risk (OR = 1.821, 95% CI = 1.39-2.37, P < 0.001). We also performed trial sequential analyses that strengthened the evidence for an overall significant association of gr/gr deletions with the risk of male infertility. Another meta-analysis suggested a significant association of the gr/gr deletions with low sperm count. In conclusion, gr/gr deletions show a strong correlation with male infertility risk and low sperm count, particularly in Caucasian populations.
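    The association measures quoted above follow standard 2x2-table arithmetic for an odds ratio and its Wald confidence interval, sketched below with illustrative counts only (the pooled cell counts are not reproduced in this abstract).

    ```python
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """Odds ratio and Wald 95% CI from a 2x2 table:
        a, b = deletion carriers among cases and controls;
        c, d = non-carriers among cases and controls."""
        or_ = (a * d) / (b * c)
        se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lower = math.exp(math.log(or_) - z * se_log)
        upper = math.exp(math.log(or_) + z * se_log)
        return or_, lower, upper

    # Illustrative counts only (not the pooled meta-analysis data)
    print(odds_ratio_ci(a=48, b=8, c=774, d=217))
    ```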

  15. Case Study Analyses of the Impact of Flipped Learning in Teaching Programming Robots

    Directory of Open Access Journals (Sweden)

    Majlinda Fetaji

    2016-11-01

    The focus of this research study was to investigate the benefits of the flipped learning pedagogy for student learning in classes on programming robots, and to assess whether it has any advantages over traditional teaching methods in computer science. Learners' attitudes, motivation, and effectiveness when using a flipped classroom were assessed and compared with a traditional classroom. The research questions investigated were: "What kind of problems can we face when we teach robotics classes with traditional methods?" and "If we apply the flipped learning method, can we solve these problems?". To analyse all this, a case study experiment was carried out, and insights as well as recommendations are presented.

  16. Comparing direct image and wavelet transform-based approaches to analysing remote sensing imagery for predicting wildlife distribution

    NARCIS (Netherlands)

    Murwira, A.; Skidmore, A.K.

    2010-01-01

    In this study we tested the ability to predict the probability of elephant (Loxodonta africana) presence in an agricultural landscape of Zimbabwe based on three methods of measuring the spatial heterogeneity in vegetation cover, where vegetation cover was measured using the Landsat Thematic Mapper

  17. Accounting for undetected compounds in statistical analyses of mass spectrometry 'omic studies.

    Science.gov (United States)

    Taylor, Sandra L; Leiserowitz, Gary S; Kim, Kyoungmi

    2013-12-01

    Mass spectrometry is an important high-throughput technique for profiling small molecular compounds in biological samples and is widely used to identify potential diagnostic and prognostic compounds associated with disease. Commonly, the data generated by mass spectrometry contain many missing values, which arise when a compound is absent from a sample or is present at a concentration below the detection limit. Several strategies are available for statistically analyzing data with missing values. The accelerated failure time (AFT) model assumes that all missing values result from censoring below a detection limit. Under a mixture model, missing values can result from a combination of censoring and the absence of a compound. We compared the power and estimation properties of a mixture model with those of an AFT model. Based on simulated data, we found the AFT model to have greater power to detect differences in means and point-mass proportions between groups. However, the AFT model yielded biased estimates, with the bias increasing as the proportion of observations in the point mass increased, whereas estimates were unbiased with the mixture model except when all missing observations came from censoring. These findings suggest using the AFT model for hypothesis testing and the mixture model for estimation. We demonstrated this approach through application to glycomics data of serum samples from women with ovarian cancer and matched controls.
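
    To make the distinction concrete, the sketch below simulates intensities containing both true absences and values censored below a detection limit, and fits a censoring-only (Tobit/AFT-style) mean by maximum likelihood. All data, the detection limit and the 20% absence proportion are invented for illustration; this is not the authors' code, and a full mixture model would additionally estimate the point-mass probability.

      import numpy as np
      from scipy import stats, optimize

      rng = np.random.default_rng(1)
      n, limit = 500, 1.0                      # sample size and assumed detection limit (log scale)
      truly_absent = rng.random(n) < 0.2       # assumed 20% of samples genuinely lack the compound
      log_conc = rng.normal(loc=1.5, scale=1.0, size=n)
      observed = np.where(~truly_absent & (log_conc >= limit), log_conc, np.nan)

      detected = observed[~np.isnan(observed)]
      n_missing = int(np.isnan(observed).sum())

      def tobit_negloglik(params):
          # Censored-normal likelihood: every missing value is treated as censored below the limit.
          mu, log_sigma = params
          sigma = np.exp(log_sigma)
          ll_detected = stats.norm.logpdf(detected, mu, sigma).sum()
          ll_censored = n_missing * stats.norm.logcdf((limit - mu) / sigma)
          return -(ll_detected + ll_censored)

      fit = optimize.minimize(tobit_negloglik, x0=[detected.mean(), 0.0])
      print("AFT/Tobit-style mean estimate:", round(fit.x[0], 3))
      print("Naive mean of detected values:", round(detected.mean(), 3))
      # A mixture model would additionally estimate the probability of true absence
      # (the point mass) instead of attributing every missing value to censoring.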

  18. Lessons for public health campaigns from analysing commercial food marketing success factors: a case study

    OpenAIRE

    Aschemann-Witzel, Jessica; Perez-Cueto, Federico JA; Niedzwiedzka, Barbara; Verbeke, Wim; Bech-Larsen, Tino

    2012-01-01

    Background Commercial food marketing has considerably shaped consumer food choice behaviour. Meanwhile, public health campaigns for healthier eating have had limited impact to date. Social marketing suggests that successful commercial food marketing campaigns can provide useful lessons for public sector activities. The aim of the present study was to empirically identify food marketing success factors that, using the social marketing approach, could help improve public health campaig...

  19. Detection of T790M, the acquired resistance EGFR mutation, by tumor biopsy versus noninvasive blood-based analyses

    Science.gov (United States)

    Sundaresan, Tilak K.; Sequist, Lecia V.; Heymach, John V.; Riely, Gregory J.; Jänne, Pasi A.; Koch, Walter H.; Sullivan, James P.; Fox, Douglas B.; Maher, Robert; Muzikansky, Alona; Webb, Andrew; Tran, Hai T.; Giri, Uma; Fleisher, Martin; Yu, Helena A.; Wei, Wen; Johnson, Bruce E.; Barber, Thomas A.; Walsh, John R.; Engelman, Jeffrey A.; Stott, Shannon L.; Kapur, Ravi; Maheswaran, Shyamala; Toner, Mehmet

    2015-01-01

    Purpose The T790M gatekeeper mutation in the Epidermal Growth Factor Receptor (EGFR) is acquired by some EGFR-mutant non-small cell lung cancers (NSCLC) as they become resistant to selective tyrosine kinase inhibitors (TKIs). As third generation EGFR TKIs that overcome T790M-associated resistance become available, noninvasive approaches to T790M detection will become critical to guide management. Experimental Design As part of a multi-institutional Stand-Up-To-Cancer collaboration, we performed an exploratory analysis of 40 patients with EGFR-mutant tumors progressing on EGFR TKI therapy. We compared the T790M genotype from tumor biopsies with analysis of simultaneously collected circulating tumor cells (CTC) and circulating tumor DNA (ctDNA). Results T790M genotypes were successfully obtained in 30 (75%) tumor biopsies, 28 (70%) CTC samples and 32 (80%) ctDNA samples. The resistance-associated mutation was detected in 47–50% of patients using each of the genotyping assays, with concordance among them ranging from 57–74%. While CTC- and ctDNA-based genotyping were each unsuccessful in 20–30% of cases, the two assays together enabled genotyping in all patients with an available blood sample, and they identified the T790M mutation in 14 (35%) patients in whom the concurrent biopsy was negative or indeterminate. Conclusion Discordant genotypes between tumor biopsy and blood-based analyses may result from technological differences, as well as sampling different tumor cell populations. The use of complementary approaches may provide the most complete assessment of each patient’s cancer, which should be validated in predicting response to T790M-targeted inhibitors. PMID:26446944
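
    As a back-of-the-envelope illustration of how assay concordance of this kind is computed, the sketch below compares made-up T790M calls across the three sample types, counting agreement only where both assays returned a result. The genotype calls are entirely hypothetical and are not the study's data.

      # Hypothetical T790M calls per patient; None = assay failed / indeterminate.
      calls = {
          "biopsy": ["pos", "neg", None,  "pos", "neg", "pos"],
          "ctc":    ["pos", "neg", "neg", None,  "pos", "pos"],
          "ctdna":  ["pos", "pos", "neg", "pos", "neg", None],
      }

      def concordance(assay_a, assay_b):
          # Compare only patients with a result from both assays.
          pairs = [(x, y) for x, y in zip(calls[assay_a], calls[assay_b])
                   if x is not None and y is not None]
          return sum(x == y for x, y in pairs) / len(pairs)

      for a, b in [("biopsy", "ctc"), ("biopsy", "ctdna"), ("ctc", "ctdna")]:
          print(f"{a} vs {b}: {concordance(a, b):.0%} concordant")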

  20. In situ analyses of Ag speciation in tissues of cucumber and wheat using synchrotron-based X-ray absorption spectroscopy

    Data.gov (United States)

    U.S. Environmental Protection Agency — In situ analyses of Ag speciation in tissues of cucumber and wheat using synchrotron-based X-ray absorption spectroscopy showing spectral fitting and linear...

  1. Clinical Presentation and Microbial Analyses of Contact Lens Keratitis; an Epidemiologic Study

    Directory of Open Access Journals (Sweden)

    Seyed Ahmad Rasoulinejad

    2014-09-01

    Introduction: Microbial keratitis is an infective process of the cornea with potentially serious visual consequences. Contact lenses are a major cause of microbial keratitis in developed countries, especially among young people. The purpose of the present study was therefore to evaluate the frequency and microbiological characteristics of contact lens keratitis (CLK) in patients referred to the emergency department (ED) of teaching hospitals in Babol, Iran. Methods: This is a cross-sectional study of all patients with contact lens induced corneal ulcers admitted to the teaching hospitals of Babol, Iran, from 2011 to 2013. An ophthalmologist examined patients with the slit lamp and noted their clinical features (including pain, redness, foreign body sensation, chemosis, epiphora, blurred vision, discomfort, photophobia, discharge, ocular redness and swelling). All suspected infectious corneal ulcers were scraped for microbial culture and two slides were prepared. Data were analyzed using SPSS software, version 18.0. Results: A total of 14 patients (17 eyes) were recruited into the study (100% female). The patients' ages ranged from 16 to 37 years (mean age 21.58 ± 7.23 years). The most prevalent clinical signs were pain and redness. Three samples were reported as sterile. The most common isolated causative organisms were Pseudomonas aeruginosa (78.6%), Staphylococcus aureus (14.3%) and Enterobacter (7.1%). Treatment outcome was excellent in 23.5%, good in 47.1%, and poor in 29.4% of cases. Conclusion: Improper lens wear and care, as well as a lack of awareness about the importance of aftercare visits, have been identified as potential risk factors for corneal ulcers among contact lens wearers. Training and increasing awareness of adequate lens care and disinfection practices, consulting an ophthalmologist, and frequent replacement of contact lens storage cases would greatly help reduce the risk of microbial keratitis.

  2. Analyses of the performance of the ASTRID-like TRU burners in regional scenario studies - 5136

    International Nuclear Information System (INIS)

    Vezzoni, B.; Gabrielli, F.; Rineiski, A.

    2015-01-01

    In the past, large sodium fast reactor systems (earlier CAPRA/CADRA, later ESFR and ESFR-like systems) and accelerator driven systems (ADS-EFIT) were considered and extensively studied in Europe for managing MAs/Pu within regional or national scenario studies. After the ASTRID system was proposed in France, ASTRID-like burners could be considered as further options to be investigated. Low conversion ratio (CR) ASTRID-like burner cores (1200 MWth) have been considered at KIT by introducing a few modifications with respect to the original French ASTRID design. These modifications keep the main characteristics of the system (e.g. thermal power) almost unchanged and avoid a strong deterioration of safety parameters (such as the sodium void effect) after the introduction of large amounts of Pu (more than 20%) and MAs (2-12%) in the fuel. These cores have already been studied at KIT for phase-out scenarios. A constant energy production case, relevant for a European or another regional scenario, is considered in the paper. Cases with different shares (from 10 to 30%) of ASTRID-like burners in the nuclear energy fleet are compared. The results show that the ASTRID-like burners allow the use of all TRU compositions foreseen in the fuel cycle with a proper choice of the MA-to-Pu ratios and of the U/TRU fractions, in both phase-out and ongoing nuclear energy utilization conditions. The results show that a mixed fleet composed of 11% burners and 89% ESFRs is able to stabilize the MAs in the cycle; the same stabilization is obtained with a fleet composed of 33% burners in combination with LWRs only.

  3. Absorption of selected radionuclides. Analysis of a literature study. Resorption ausgewaehlter Radionuklide. Analyse einer Literaturstudie

    Energy Technology Data Exchange (ETDEWEB)

    Roedler, H D; Kraus, H M

    1979-12-01

    In October 1978, the Institut fuer Energie- und Umweltforschung Heidelberg e.V. published a contribution to part 26 of the model study of radio-ecology at Biblis under the title 'Estimation of the absorption of radionuclides from the gastrointestinal tract in the blood'. Using this contribution as an example, a critical analysis shows how a selective presentation of the information contained in various scientific publications and other literature can give uncritical readers the impression that all statements made are scientifically well founded.

  4. Lessons for public health campaigns from analysing commercial food marketing success factors: a case study

    Science.gov (United States)

    2012-01-01

    Background Commercial food marketing has considerably shaped consumer food choice behaviour. Meanwhile, public health campaigns for healthier eating have had limited impact to date. Social marketing suggests that successful commercial food marketing campaigns can provide useful lessons for public sector activities. The aim of the present study was to empirically identify food marketing success factors that, using the social marketing approach, could help improve public health campaigns to promote healthy eating. Methods In this case-study analysis, 27 recent and successful commercial food and beverage marketing cases were purposively sampled from different European countries. The cases involved different consumer target groups, product categories, company sizes and marketing techniques. The analysis focused on cases of relatively healthy food types, and nutrition and health-related aspects in the communication related to the food. Visual as well as written material was gathered, complemented by semi-structured interviews with 12 food market trend experts and 19 representatives of food companies and advertising agencies. Success factors were identified by a group of experts who reached consensus through discussion structured by a card sorting method. Results Six clusters of success factors emerged from the analysis and were labelled as "data and knowledge", "emotions", "endorsement", "media", "community" and "why and how". Each cluster subsumes two or three success factors and is illustrated by examples. In total, 16 factors were identified. It is argued that the factors "nutritional evidence", "trend awareness", "vertical endorsement", "simple naturalness" and "common values" are of particular importance in the communication of health with regard to food. Conclusions The present study identified critical factors for the success of commercial food marketing campaigns related to the issue of nutrition and health, which are possibly transferable to the public health

  5. Lessons for public health campaigns from analysing commercial food marketing success factors: a case study.

    Science.gov (United States)

    Aschemann-Witzel, Jessica; Perez-Cueto, Federico J A; Niedzwiedzka, Barbara; Verbeke, Wim; Bech-Larsen, Tino

    2012-02-21

    Commercial food marketing has considerably shaped consumer food choice behaviour. Meanwhile, public health campaigns for healthier eating have had limited impact to date. Social marketing suggests that successful commercial food marketing campaigns can provide useful lessons for public sector activities. The aim of the present study was to empirically identify food marketing success factors that, using the social marketing approach, could help improve public health campaigns to promote healthy eating. In this case-study analysis, 27 recent and successful commercial food and beverage marketing cases were purposively sampled from different European countries. The cases involved different consumer target groups, product categories, company sizes and marketing techniques. The analysis focused on cases of relatively healthy food types, and nutrition and health-related aspects in the communication related to the food. Visual as well as written material was gathered, complemented by semi-structured interviews with 12 food market trend experts and 19 representatives of food companies and advertising agencies. Success factors were identified by a group of experts who reached consensus through discussion structured by a card sorting method. Six clusters of success factors emerged from the analysis and were labelled as "data and knowledge", "emotions", "endorsement", "media", "community" and "why and how". Each cluster subsumes two or three success factors and is illustrated by examples. In total, 16 factors were identified. It is argued that the factors "nutritional evidence", "trend awareness", "vertical endorsement", "simple naturalness" and "common values" are of particular importance in the communication of health with regard to food. The present study identified critical factors for the success of commercial food marketing campaigns related to the issue of nutrition and health, which are possibly transferable to the public health sector. Whether or not a particular

  6. Study of parameters important to soil-structure interaction in seismic analyses of nuclear power plants

    International Nuclear Information System (INIS)

    Nelson, T.A.

    1983-12-01

    The development of state-of-the-art techniques for analyzing the effects of soil-structure interaction (SSI) on structures during earthquakes is outlined. Emphasis is placed on methods to account for energy dissipation as a result of both wave propagation away from the structure's foundation and hysteretic soil response. Solution techniques are grouped into two major types: substructure methods, which break the problem into a series of steps; and direct methods, which analyze the soil-structure model in one step. In addition to theoretical and historical development of SSI methodology, case studies are presented illustrating the application of these solution techniques. 94 references

  7. Wind Energy Applications for Municipal Water Services: Opportunities, Situation Analyses, and Case Studies; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Flowers, L.; Miner-Nordstrom, L.

    2006-01-01

    As communities grow, greater demands are placed on water supplies, wastewater services, and the electricity needed to power the growing water services infrastructure. Water is also a critical resource for thermoelectric power plants. Future population growth in the United States is therefore expected to heighten competition for water resources. Many parts of the United States with increasing water stresses also have significant wind energy resources. Wind power is the fastest-growing electric generation source in the United States and is decreasing in cost to be competitive with thermoelectric generation. Wind energy can offer communities in water-stressed areas the option of economically meeting increasing energy needs without increasing demands on valuable water resources. Wind energy can also provide targeted energy production to serve critical local water-system needs. The research presented in this report describes a systematic assessment of the potential for wind power to support water utility operation, with the objective to identify promising technical applications and water utility case study opportunities. The first section describes the current situation that municipal providers face with respect to energy and water. The second section describes the progress that wind technologies have made in recent years to become a cost-effective electricity source. The third section describes the analysis employed to assess potential for wind power in support of water service providers, as well as two case studies. The report concludes with results and recommendations.

  8. Adjusted Analyses in Studies Addressing Therapy and Harm: Users' Guides to the Medical Literature.

    Science.gov (United States)

    Agoritsas, Thomas; Merglen, Arnaud; Shah, Nilay D; O'Donnell, Martin; Guyatt, Gordon H

    2017-02-21

    Observational studies almost always have bias because prognostic factors are unequally distributed between patients exposed or not exposed to an intervention. The standard approach to dealing with this problem is adjusted or stratified analysis. Its principle is to use measurement of risk factors to create prognostically homogeneous groups and to combine effect estimates across groups. The purpose of this Users' Guide is to introduce readers to fundamental concepts underlying adjustment as a way of dealing with prognostic imbalance and to the basic principles and relative trustworthiness of various adjustment strategies. One alternative to the standard approach is propensity analysis, in which groups are matched according to the likelihood of membership in exposed or unexposed groups. Propensity methods can deal with multiple prognostic factors, even if there are relatively few patients having outcome events. However, propensity methods do not address other limitations of traditional adjustment: investigators may not have measured all relevant prognostic factors (or not accurately), and unknown factors may bias the results. A second approach, instrumental variable analysis, relies on identifying a variable associated with the likelihood of receiving the intervention but not associated with any prognostic factor or with the outcome (other than through the intervention); this could mimic randomization. However, as with assumptions of other adjustment approaches, it is never certain if an instrumental variable analysis eliminates bias. Although all these approaches can reduce the risk of bias in observational studies, none replace the balance of both known and unknown prognostic factors offered by randomization.
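
    The sketch below illustrates the propensity idea on simulated data (not a real clinical example, and not code from the Users' Guide): exposure is made to depend on measured prognostic factors, a logistic model estimates each patient's propensity of exposure, and exposed patients are greedily matched 1:1 to unexposed patients with the closest score before outcomes are compared.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      n = 1000
      age = rng.normal(60, 10, n)
      severity = rng.normal(0, 1, n)
      # Exposure depends on prognosis (confounding by indication); the true effect is -0.3.
      p_exposure = 1 / (1 + np.exp(-(0.03 * (age - 60) + 0.8 * severity)))
      exposed = rng.random(n) < p_exposure
      outcome = 0.02 * (age - 60) + 0.5 * severity - 0.3 * exposed + rng.normal(0, 1, n)

      # Step 1: estimate the propensity score from the measured prognostic factors.
      X = np.column_stack([age, severity])
      ps = LogisticRegression().fit(X, exposed).predict_proba(X)[:, 1]

      # Step 2: greedy 1:1 nearest-neighbour matching on the propensity score.
      treated, controls = np.where(exposed)[0], np.where(~exposed)[0]
      used, pairs = set(), []
      for i in treated:
          available = [c for c in controls if c not in used]
          if not available:
              break
          j = min(available, key=lambda c: abs(ps[i] - ps[c]))
          used.add(j)
          pairs.append((i, j))

      # Step 3: compare outcomes within matched pairs; this should land roughly near the true -0.3.
      diff = np.mean([outcome[i] - outcome[j] for i, j in pairs])
      print("Matched estimate of the exposure effect:", round(diff, 3))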

  9. Optimization study on structural analyses for the J-PARC mercury target vessel

    Science.gov (United States)

    Guan, Wenhai; Wakai, Eiichi; Naoe, Takashi; Kogawa, Hiroyuki; Wakui, Takashi; Haga, Katsuhiro; Takada, Hiroshi; Futakawa, Masatoshi

    2018-06-01

    The mercury target vessel of the spallation neutron source at the Japan Proton Accelerator Research Complex (J-PARC) is used for various materials science studies, and work is underway to achieve stable operation at 1 MW. This is very important for enhancing the structural integrity and durability of the target vessel, which is being developed for 1 MW operation. In the present study, to reduce thermal stress and relax stress concentrations more effectively in the existing target vessel at J-PARC, an optimization approach called the Taguchi method (TM) was applied to the thermo-mechanical analysis. The ribs and their related parameters, as well as the thicknesses of the mercury vessel and shrouds, were selected as important design parameters for this investigation. According to the analytical results for the 18 model types designed using the TM, the optimal design was determined. It is characterized by discrete ribs and a thicker vessel wall than the current design. The maximum thermal stresses in the mercury vessel and the outer shroud were reduced by 14% and 15%, respectively. Furthermore, it was indicated that variations in rib width, left/right rib intervals, and shroud thickness could influence the maximum thermal stress. It is therefore concluded that the TM was useful for optimizing the structure of the target vessel and reducing the thermal stress with a small number of calculation cases.
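
    The core of a Taguchi-style evaluation is an analysis of means over an orthogonal array: for each design parameter, the responses are averaged per level and the level giving the best (here, lowest) response is selected. The sketch below uses a small L9 array with invented factor names, levels and stress values purely to illustrate the bookkeeping; it is not the 18-case J-PARC design.

      import numpy as np

      # Each run: (rib_width level, rib_interval level, shroud_thickness level) -> max stress in MPa.
      runs = [
          ((1, 1, 1), 320), ((1, 2, 2), 300), ((1, 3, 3), 290),
          ((2, 1, 2), 310), ((2, 2, 3), 285), ((2, 3, 1), 305),
          ((3, 1, 3), 295), ((3, 2, 1), 315), ((3, 3, 2), 288),
      ]
      factors = ["rib_width", "rib_interval", "shroud_thickness"]

      # Analysis of means: average the response at each level of each factor,
      # then pick the level with the lowest mean stress ("smaller is better").
      for f_idx, name in enumerate(factors):
          level_means = {
              level: np.mean([stress for levels, stress in runs if levels[f_idx] == level])
              for level in (1, 2, 3)
          }
          best = min(level_means, key=level_means.get)
          print(name, {k: round(v, 1) for k, v in level_means.items()}, "-> best level:", best)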

  10. PCR-RFLP analyses for studying the diversity of GH and Pit-1 genes in Slovak Simmental cattle

    Directory of Open Access Journals (Sweden)

    Anna Trakovická

    2013-10-01

    The aim of this study was to evaluate the diversity of the growth hormone (GH) and pituitary-specific transcription factor (Pit-1) genes in a population of 353 Slovak Simmental cows. The analyses were based on detection of the single nucleotide polymorphisms GH/AluI and Pit-1/HinfI. A polymorphic site of the GH gene (AluI) has been linked to differences in circulating metabolites, metabolic hormones and milk yield. Bovine Pit-1 is responsible for pituitary development and for the expression of hormone-secreting genes, including the GH gene. The Pit-1/HinfI locus has been associated with growth, milk production and reproduction performance in cattle. Samples of genomic DNA were analyzed by the PCR-RFLP method. Digestion of GH gene PCR products with the restriction enzyme AluI revealed alleles L and V with frequencies of 0.695 and 0.305, respectively. Digestion of Pit-1 gene PCR products with the enzyme HinfI revealed alleles A (0.249) and B (0.751). The most frequent genotypes were heterozygous LV (0.47) for the GH gene and homozygous BB (0.56) for the Pit-1 gene. The observed heterozygosity, effective allele numbers and polymorphism information content of the GH/AluI and Pit-1/HinfI bovine loci were 0.42/0.37, 1.73/1.59 and 0.33/0.30, respectively. The moderate polymorphism information content of the loci was also reflected in the relatively high observed homozygosity in the population (0.58/0.63). Keywords: cattle, growth hormone, leptin, PCR, Pit-1, polymorphism.
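
    The diversity statistics quoted above follow directly from the allele frequencies. The sketch below applies the standard expected-heterozygosity, effective-allele-number and PIC formulas to the reported GH/AluI frequencies (L = 0.695, V = 0.305); it is an illustration of the formulas, not code from the study, and it reproduces the reported 1.73 and 0.33 values and a heterozygosity of about 0.42.

      def diversity_stats(freqs):
          he = 1 - sum(p ** 2 for p in freqs)        # expected heterozygosity
          ne = 1 / sum(p ** 2 for p in freqs)        # effective number of alleles
          pic = he - sum(                            # polymorphism information content (Botstein)
              2 * (freqs[i] * freqs[j]) ** 2
              for i in range(len(freqs)) for j in range(i + 1, len(freqs))
          )
          return round(he, 2), round(ne, 2), round(pic, 2)

      # GH/AluI allele frequencies from the abstract: L = 0.695, V = 0.305.
      print(diversity_stats([0.695, 0.305]))   # approximately (0.42, 1.74, 0.33)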

  11. Thermogravimetric analyses and mineralogical study of polymer modified mortar with silica fume

    Directory of Open Access Journals (Sweden)

    Alessandra Etuko Feuzicana de Souza Almeida

    2006-09-01

    Mineral and organic additions are often used in mortars to improve their properties. This paper presents a microstructural investigation of the effects of a styrene-acrylic polymer and silica fume on the mineralogical composition of high-early-strength Portland cement pastes after 28 days of hydration. Thermogravimetry and derivative thermogravimetry were used to study the interaction between the polymer and the cement, as well as the extent of the pozzolanic reaction in the mortars with silica fume. Differential scanning calorimetry and X-ray diffraction were used to investigate cement hydration and the effect of the additions. The results showed that the addition of silica fume and polymer reduces portlandite formation due to the delay of Portland cement hydration and the pozzolanic reaction.

  12. Municipal solid waste composition: Sampling methodology, statistical analyses, and case study evaluation

    DEFF Research Database (Denmark)

    Edjabou, Vincent Maklawe Essonanawe; Jensen, Morten Bang; Götze, Ramona

    2015-01-01

    Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in the literature. This limits both the comparability and the applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single and multi-family house areas). In total, 17 tonnes of waste were sorted into 10-50 waste fractions, organised according to a three-level (tiered) approach facilitating comparison of the waste data between individual sub-areas with different fractionation (waste...
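
    As a minimal illustration of the kind of bookkeeping such a characterisation involves, the sketch below aggregates sorted sample masses into waste-fraction percentages per sub-area. The areas, fractions and masses are invented and do not come from the study.

      # Sorted sample masses in kg (all values invented).
      samples = [
          {"area": "A", "fraction": "food waste", "kg": 120.0},
          {"area": "A", "fraction": "paper",      "kg": 60.0},
          {"area": "A", "fraction": "plastic",    "kg": 20.0},
          {"area": "B", "fraction": "food waste", "kg": 90.0},
          {"area": "B", "fraction": "paper",      "kg": 80.0},
          {"area": "B", "fraction": "plastic",    "kg": 30.0},
      ]

      area_totals, fraction_totals = {}, {}
      for s in samples:
          key = (s["area"], s["fraction"])
          area_totals[s["area"]] = area_totals.get(s["area"], 0.0) + s["kg"]
          fraction_totals[key] = fraction_totals.get(key, 0.0) + s["kg"]

      # Composition of each sub-area as mass percentages of its sorted waste.
      for (area, fraction), kg in sorted(fraction_totals.items()):
          print(f"sub-area {area}, {fraction}: {100 * kg / area_totals[area]:.1f}%")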

  13. Multi-path transportation futures study: vehicle characterization and scenario analyses.

    Energy Technology Data Exchange (ETDEWEB)

    Plotkin, S. E.; Singh, M. K.; Energy Systems; TA Engineering; ORNL

    2009-12-03

    Projecting the future role of advanced drivetrains and fuels in the light vehicle market is inherently difficult, given the uncertainty (and likely volatility) of future oil prices, inadequate understanding of likely consumer response to new technologies, the relative infancy of several important new technologies with inevitable future changes in their performance and costs, and the importance - and uncertainty - of future government marketplace interventions (e.g., new regulatory standards or vehicle purchase incentives). This Multi-Path Transportation Futures (MP) Study has attempted to improve our understanding of this future role by examining several scenarios of vehicle costs, fuel prices, government subsidies, and other key factors. These are projections, not forecasts, in that they try to answer a series of 'what if' questions without assigning probabilities to most of the basic assumptions.

  14. The Evaluation of Bivariate Mixed Models in Meta-analyses of Diagnostic Accuracy Studies with SAS, Stata and R.

    Science.gov (United States)

    Vogelgesang, Felicitas; Schlattmann, Peter; Dewey, Marc

    2018-05-01

    Meta-analyses require a thoroughly planned procedure to obtain unbiased overall estimates. From a statistical point of view, not only model selection but also model implementation in the software affects the results. The present simulation study investigates the accuracy of different implementations of general and generalized bivariate mixed models in SAS (using proc mixed, proc glimmix and proc nlmixed), Stata (using gllamm, xtmelogit and midas) and R (using reitsma from package mada and glmer from package lme4). Both models incorporate the relationship between sensitivity and specificity - the two outcomes of interest in meta-analyses of diagnostic accuracy studies - using random effects. Model performance is compared in nine meta-analytic scenarios reflecting the combination of three meta-analysis sizes (89, 30 and 10 studies) with three pairs of sensitivity/specificity values (97%/87%; 85%/75%; 90%/93%). The evaluation of accuracy in terms of bias, standard error and mean squared error reveals that all implementations of the generalized bivariate model calculate sensitivity and specificity estimates with deviations of less than two percentage points. In contrast, proc mixed, which together with reitsma implements the general bivariate mixed model proposed by Reitsma, showed convergence problems. The random effect parameters were in general underestimated. This study shows that flexibility and simplicity of model specification, together with convergence robustness, should influence implementation recommendations, as the accuracy in terms of bias was acceptable in all implementations using the generalized approach.
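
    For orientation, the sketch below shows the data-preparation step that the compared bivariate implementations share: per-study logit-transformed sensitivity and specificity with their approximate within-study variances (here with a 0.5 continuity correction). The 2x2 counts are invented, and the closing comment only indicates, rather than implements, the random-effects stage.

      import math

      # Per study: (true positives, false negatives, false positives, true negatives).
      studies = [(45, 5, 8, 92), (30, 10, 12, 88), (60, 3, 20, 117)]

      def logit_pair(tp, fn, fp, tn, cc=0.5):
          # Continuity correction, then logit sensitivity/specificity and their variances.
          tp, fn, fp, tn = tp + cc, fn + cc, fp + cc, tn + cc
          logit_se = math.log(tp / fn)
          logit_sp = math.log(tn / fp)
          var_se = 1 / tp + 1 / fn
          var_sp = 1 / tn + 1 / fp
          return logit_se, logit_sp, var_se, var_sp

      for s in studies:
          print(logit_pair(*s))
      # A bivariate random-effects model (e.g. reitsma in R's mada, or glmer on the
      # binomial counts) would then estimate the mean logits and their between-study
      # covariance, from which summary sensitivity and specificity are back-transformed.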

  15. XRF analyses for the study of painting technique and degradation on frescoes by Beato Angelico: first results

    International Nuclear Information System (INIS)

    Mazzinghi, A.

    2014-01-01

    Beato Angelico was one of the most important Italian painters of the Renaissance period; in particular, he was a master of the so-called 'buon fresco' technique for mural paintings. A wide diagnostic campaign with X-ray fluorescence (XRF) analyses has been carried out on three masterworks painted by Beato Angelico in the San Marco monastery in Florence: the 'Crocifissione con Santi', the 'Annunciazione' and the 'Madonna delle Ombre'. The latter is painted by mixing fresco and secco techniques, which makes it of particular interest for studying two different painting techniques of the same artist. The study therefore focused on the characterization of the painting palette, and hence the painting techniques, used by Beato Angelico. Moreover, the conservators were interested in the study of degradation processes and old restoration treatments. Our analyses were carried out by means of the XRF spectrometer developed at the LABEC laboratory of the Istituto Nazionale di Fisica Nucleare in Florence (Italy). XRF is indeed especially suited for this kind of study, allowing multi-elemental, non-destructive and non-invasive analyses in a short time with portable instruments. In this paper, the first results of the XRF analysis are presented.