WorldWideScience

Sample records for base study analyses

  1. Audio-visual perception of 3D cinematography: an fMRI study using condition-based and computation-based analyses.

    Directory of Open Access Journals (Sweden)

    Akitoshi Ogawa

    Full Text Available The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard "condition-based" designs, as well as "computational" methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-sources signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli.

  2. Audio-visual perception of 3D cinematography: an fMRI study using condition-based and computation-based analyses.

    Science.gov (United States)

    Ogawa, Akitoshi; Bordier, Cecile; Macaluso, Emiliano

    2013-01-01

    The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard "condition-based" designs, as well as "computational" methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-sources signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli.
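The computation-based approach described above reduces, at its core, to regressing each voxel's time series on a time-varying stimulus feature. A minimal single-voxel sketch in Python, where the feature values, noise level, and effect size are synthetic stand-ins rather than the paper's actual disparity or sound-complexity regressors:

```python
import random

random.seed(0)

# Synthetic time-varying stimulus feature (stand-in for, e.g., frame-by-frame disparity)
n_scans = 200
feature = [random.gauss(0, 1) for _ in range(n_scans)]

# Synthetic voxel time series: a scaled copy of the feature plus scanner noise
true_beta = 2.5
bold = [true_beta * f + random.gauss(0, 0.5) for f in feature]

# Simple least-squares slope: how strongly activity co-varies with the feature
f_mean = sum(feature) / n_scans
b_mean = sum(bold) / n_scans
beta = sum((f - f_mean) * (b - b_mean) for f, b in zip(feature, bold)) \
       / sum((f - f_mean) ** 2 for f in feature)
```

In a real SPM-style analysis the regressor would first be convolved with a haemodynamic response function and the fit repeated at every voxel; this sketch shows only the covariation step.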

  3. The Influence of Study-Level Inference Models and Study Set Size on Coordinate-Based fMRI Meta-Analyses

    Directory of Open Access Journals (Sweden)

    Han Bossier

    2018-01-01

    Full Text Available Given the increasing number of neuroimaging studies, there is a growing need to summarize published results. Coordinate-based meta-analyses use the locations of statistically significant local maxima, possibly with the associated effect sizes, to aggregate studies. In this paper, we investigate the influence of key characteristics of a coordinate-based meta-analysis on (1) the balance between false and true positives and (2) the activation reliability of the outcome from a coordinate-based meta-analysis. More particularly, we consider the influence of the chosen group-level model at the study level [fixed effects, ordinary least squares (OLS), or mixed effects models], the type of coordinate-based meta-analysis [Activation Likelihood Estimation (ALE), which only uses peak locations, versus fixed effects and random effects meta-analyses, which take into account both peak location and height], and the number of studies included in the analysis (from 10 to 35). To do this, we apply a resampling scheme on a large dataset (N = 1,400) to create a test condition and compare this with an independent evaluation condition. The test condition corresponds to subsampling participants into studies and combining these using meta-analyses. The evaluation condition corresponds to a high-powered group analysis. We observe the best performance when using mixed effects models in individual studies combined with a random effects meta-analysis. Moreover, the performance increases with the number of studies included in the meta-analysis. When peak height is not taken into consideration, we show that the popular ALE procedure is a good alternative in terms of the balance between type I and II errors. However, it requires more studies compared to other procedures in terms of activation reliability. Finally, we discuss the differences, interpretations, and limitations of our results.
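The random-effects meta-analysis that performed best here pools study-level effect sizes while allowing for between-study variance. A minimal DerSimonian–Laird sketch in Python; the peak effect sizes and within-study variances below are invented for illustration:

```python
# Hypothetical study-level peak effect sizes and their within-study variances
effects = [0.40, 0.55, 0.30, 0.62, 0.45]
variances = [0.010, 0.020, 0.015, 0.010, 0.020]

# Fixed-effect weights and pooled estimate
w = [1.0 / v for v in variances]
theta_f = sum(wi * e for wi, e in zip(w, effects)) / sum(w)

# DerSimonian-Laird estimate of the between-study variance tau^2
q = sum(wi * (e - theta_f) ** 2 for wi, e in zip(w, effects))
df = len(effects) - 1
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)

# Random-effects weights and pooled estimate
w_star = [1.0 / (v + tau2) for v in variances]
theta_r = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
```

When tau2 is positive, the random-effects weights flatten toward equality, which is why heterogeneous study sets pull the pooled estimate away from the precision-dominated fixed-effect value.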

  4. The effect of English-language restriction on systematic review-based meta-analyses: a systematic review of empirical studies.

    Science.gov (United States)

    Morrison, Andra; Polisena, Julie; Husereau, Don; Moulton, Kristen; Clark, Michelle; Fiander, Michelle; Mierzwinski-Urban, Monika; Clifford, Tammy; Hutton, Brian; Rabb, Danielle

    2012-04-01

    The English language is generally perceived to be the universal language of science. However, the exclusive reliance on English-language studies may not represent all of the evidence. Excluding languages other than English (LOE) may introduce a language bias and lead to erroneous conclusions. We conducted a comprehensive literature search using bibliographic databases and grey literature sources. Studies were eligible for inclusion if they measured the effect of excluding randomized controlled trials (RCTs) reported in LOE from systematic review-based meta-analyses (SR/MA) for one or more outcomes. None of the included studies found major differences between summary treatment effects in English-language restricted meta-analyses and LOE-inclusive meta-analyses. Findings differed about the methodological and reporting quality of trials reported in LOE. The precision of pooled estimates improved with the inclusion of LOE trials. Overall, we found no evidence of a systematic bias from the use of language restrictions in systematic review-based meta-analyses in conventional medicine. Further research is needed to determine the impact of language restriction on systematic reviews in particular fields of medicine.

  5. Coalescent-based genome analyses resolve the early branches of the euarchontoglires.

    Directory of Open Access Journals (Sweden)

    Vikas Kumar

    Full Text Available Despite numerous large-scale phylogenomic studies, certain parts of the mammalian tree are extraordinarily difficult to resolve. We used the coding regions from 19 completely sequenced genomes to study the relationships within the super-clade Euarchontoglires (Primates, Rodentia, Lagomorpha, Dermoptera and Scandentia), because the placement of Scandentia within this clade is controversial. The difficulty in resolving this issue is due to the short time spans between the early divergences of Euarchontoglires, which may cause incongruent gene trees. The conflict in the data can be depicted by network analyses, and the contentious relationships are best reconstructed by coalescent-based analyses. This method is expected to be superior to analyses of concatenated data in reconstructing a species tree from numerous gene trees. The total concatenated dataset used to study the relationships in this group comprises 5,875 protein-coding genes (9,799,170 nucleotides) from all orders except Dermoptera (flying lemurs). Reconstruction of the species tree from 1,006 gene trees using coalescent models placed Scandentia as sister group to the primates, which is in agreement with maximum likelihood analyses of concatenated nucleotide sequence data. Additionally, both analytical approaches favoured the tarsier as sister taxon to Anthropoidea, thus belonging to the haplorrhine clade. When divergence times are short, such as in radiations over periods of a few million years, even genome-scale analyses struggle to resolve phylogenetic relationships. On these short branches, processes such as incomplete lineage sorting and possibly hybridization occur, making it preferable to base phylogenomic analyses on coalescent methods.
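For three taxa, the multispecies coalescent predicts that the species-tree topology is the most frequent rooted gene-tree topology, which is why a vote over many gene trees can resolve short branches that concatenation struggles with. A toy Python sketch; the topology counts are invented, and only the total of 1,006 gene trees comes from the study:

```python
from collections import Counter

# Hypothetical rooted-triplet topologies observed across 1,006 gene trees.
# Under the multispecies coalescent, the species-tree topology is the single
# most frequent gene-tree topology for three taxa, so a majority vote over
# gene trees is a consistent estimator in this simple case.
gene_trees = (
    ["((Scandentia,Primates),Rodentia)"] * 420
    + ["((Scandentia,Rodentia),Primates)"] * 300
    + ["((Primates,Rodentia),Scandentia)"] * 286
)
species_tree, count = Counter(gene_trees).most_common(1)[0]
```

The near-equal counts of the two minority topologies are the signature of incomplete lineage sorting on a short internal branch; a strongly skewed imbalance would instead suggest hybridization or model violation.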

  6. New insights into survival trend analyses in cancer population-based studies: the SUDCAN methodology.

    Science.gov (United States)

    Uhry, Zoé; Bossard, Nadine; Remontet, Laurent; Iwaz, Jean; Roche, Laurent

    2017-01-01

    The main objective of the SUDCAN study was to compare, for 15 cancer sites, the trends in net survival and excess mortality rates from cancer 5 years after diagnosis between six European Latin countries (Belgium, France, Italy, Portugal, Spain and Switzerland). The data were extracted from the EUROCARE-5 database. The study period ranged from 6 years (Portugal, 2000-2005) to 18 years (Switzerland, 1989-2007). Trend analyses were carried out separately for each country and cancer site; the number of cases ranged from 1500 to 104 000. We developed an original flexible excess-rate modelling strategy that accounts for the continuous effects of age, year of diagnosis, time since diagnosis and their interactions. Nineteen models were constructed; they differed in the modelling of the effect of the year of diagnosis in terms of linearity, proportionality and interaction with age. The final model was chosen according to the Akaike Information Criterion. The fit was assessed graphically by comparing model estimates versus nonparametric (Pohar-Perme) net survival estimates. Out of the 90 analyses carried out, the effect of the year of diagnosis on the excess mortality rate depended on age in 61 and was nonproportional in 64; it was nonlinear in 27 out of the 75 analyses where this effect was considered. The model fit was overall satisfactory. We successfully analysed 15 cancer sites in six countries. The refined methodology proved necessary for detailed trend analyses. It is hoped that three-dimensional parametric modelling will be used more widely in net survival trend studies, as it has major advantages over stratified analyses.
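Model choice by the Akaike Information Criterion, as used here, trades goodness of fit against parameter count: AIC = 2k − 2 log L, smaller is better. A Python sketch with invented log-likelihoods and parameter counts for three candidate effect-of-year specifications (the nineteen actual SUDCAN models are not reproduced):

```python
def aic(log_lik, n_params):
    """Akaike Information Criterion: 2k - 2*log(L); lower is better."""
    return 2 * n_params - 2 * log_lik

# Hypothetical candidate models: (maximized log-likelihood, number of parameters)
candidates = {
    "linear, proportional": (-1520.4, 8),
    "nonlinear, proportional": (-1511.0, 12),
    "nonlinear, age interaction": (-1503.2, 20),
}
scores = {name: aic(ll, k) for name, (ll, k) in candidates.items()}
best = min(scores, key=scores.get)
```

Note how the richest model fits best in log-likelihood yet loses on AIC: its eight extra parameters cost more than the fit improvement is worth, which is exactly the over-parameterization the criterion guards against.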

  7. Pathway-based analyses.

    Science.gov (United States)

    Kent, Jack W

    2016-02-03

    New technologies for acquisition of genomic data, while offering unprecedented opportunities for genetic discovery, also impose severe burdens of interpretation and penalties for multiple testing. The Pathway-based Analyses Group of the Genetic Analysis Workshop 19 (GAW19) sought reduction of multiple-testing burden through various approaches to aggregation of high-dimensional data in pathways informed by prior biological knowledge. Experimental methods tested included the use of "synthetic pathways" (random sets of genes) to estimate power and false-positive error rate of methods applied to simulated data; data reduction via independent components analysis, single-nucleotide polymorphism (SNP)-SNP interaction, and use of gene sets to estimate genetic similarity; and general assessment of the efficacy of prior biological knowledge to reduce the dimensionality of complex genomic data. The work of this group explored several promising approaches to managing high-dimensional data, with the caveat that these methods are necessarily constrained by the quality of external bioinformatic annotation.
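The "synthetic pathways" idea can be sketched directly: score random gene sets of the same size as a real pathway to build an empirical null distribution, then read off where a candidate pathway falls in it. A toy Python version with simulated gene-level scores, not GAW19 data:

```python
import random

random.seed(1)

# Hypothetical gene-level association scores (e.g. -log10 p-values)
genes = {f"g{i}": random.expovariate(1.0) for i in range(1000)}

def pathway_score(gene_set):
    """Aggregate gene-level scores over a pathway (here: the mean)."""
    return sum(genes[g] for g in gene_set) / len(gene_set)

def null_scores(size, n_draws=2000):
    """'Synthetic pathways': random gene sets of the same size build the null."""
    names = list(genes)
    return [pathway_score(random.sample(names, size)) for _ in range(n_draws)]

# Stand-in for a real 25-gene pathway; in practice this comes from annotation
observed = pathway_score(random.sample(list(genes), 25))
null = null_scores(25)
p_empirical = sum(s >= observed for s in null) / len(null)
```

Because the null sets share the observed set's size, the resulting p-value automatically accounts for the aggregation statistic's size-dependence, which is the point of the synthetic-pathway calibration.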

  8. A web-based endpoint adjudication system for interim analyses in clinical trials.

    Science.gov (United States)

    Nolen, Tracy L; Dimmick, Bill F; Ostrosky-Zeichner, Luis; Kendrick, Amy S; Sable, Carole; Ngai, Angela; Wallace, Dennis

    2009-02-01

    A data monitoring committee (DMC) is often employed to assess trial progress and review safety data and efficacy endpoints throughout a trial. Interim analyses performed for the DMC should use data that are as complete and verified as possible. Such analyses are complicated when data verification involves subjective study endpoints or requires clinical expertise to determine each subject's status with respect to the study endpoint. Therefore, procedures are needed to obtain adjudicated data for interim analyses in an efficient manner. In the past, methods for handling such data included using locally reported results as surrogate endpoints, adjusting analysis methods for unadjudicated data, or simply performing the adjudication as rapidly as possible. These methods all have inadequacies that make their sole usage suboptimal. For a study of prophylaxis for invasive candidiasis, adjudication of both study eligibility criteria and clinical endpoints prior to two interim analyses was required. Because the study was expected to enroll at a moderate rate and the sponsor required adjudicated endpoints to be used for interim analyses, an efficient process for adjudication was required. We created a web-based endpoint adjudication system (WebEAS) that allows for expedited review by the endpoint adjudication committee (EAC). This system automatically identifies when a subject's data are complete, creates a subject profile from the study data, and assigns EAC reviewers. The reviewers use the WebEAS to review the subject profile and submit their completed review form. The WebEAS then compares the reviews, assigns an additional review as a tiebreaker if needed, and stores the adjudicated data. The study for which this system was originally built was administratively closed after 10 months with only 38 subjects enrolled. The adjudication process was finalized and the WebEAS system activated prior to study closure. Some website accessibility issues presented initially. However
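The adjudication flow described (two independent reviews, a third review only on disagreement) is simple to state in code. A schematic Python version; this is an illustration of the logic, not the actual WebEAS implementation:

```python
def adjudicate(review_a, review_b, tiebreak):
    """Return the adjudicated endpoint: agreement stands, disagreement goes to a tiebreaker."""
    if review_a == review_b:
        return review_a
    return tiebreak()  # third EAC reviewer consulted only when needed

# Agreement: the shared verdict is stored directly, no tiebreaker call
verdict = adjudicate("endpoint met", "endpoint met", lambda: "endpoint not met")

# Disagreement: the tiebreaker's verdict decides
tiebroken = adjudicate("endpoint met", "endpoint not met", lambda: "endpoint not met")
```

Deferring the tiebreaker behind a callable means the expensive third review is only ever requested on actual disagreement, mirroring how the system assigns the additional reviewer.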

  9. GIS-based Approaches to Catchment Area Analyses of Mass Transit

    DEFF Research Database (Denmark)

    Andersen, Jonas Lohmann Elkjær; Landex, Alex

    2009-01-01

    Catchment area analyses of stops or stations are used to investigate the potential number of travelers to public transportation. These analyses are considered a strong decision tool in the planning process of mass transit, especially railroads. Catchment area analyses are GIS-based buffer and overlay analyses with different approaches depending on the desired level of detail. A simple but straightforward approach to implement is the Circular Buffer Approach, where catchment areas are circular. A more detailed approach is the Service Area Approach, where catchment areas are determined by a street network search to simulate the actual walking distances. A refinement of the Service Area Approach is to implement additional time resistance in the network search to simulate obstacles in the walking environment. This paper reviews and compares the different GIS-based catchment area approaches, their level...
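The Circular Buffer Approach reduces to a point-in-radius test in projected coordinates, whereas the Service Area Approach would replace the straight-line distance with a network walking distance. A minimal Python sketch of the circular variant; station and residence coordinates are hypothetical, in metres:

```python
import math

def circular_catchment(stop, points, radius_m):
    """Circular Buffer Approach: keep points within straight-line radius of a stop."""
    sx, sy = stop
    return [p for p in points if math.hypot(p[0] - sx, p[1] - sy) <= radius_m]

# Hypothetical residences in projected (metre) coordinates around a station at the origin
residences = [(100, 50), (400, 300), (700, 100), (-200, -550), (900, 900)]
inside = circular_catchment((0, 0), residences, 600)
```

A Service Area implementation would swap `math.hypot` for a shortest-path query on the street network; since walking routes are never shorter than the straight line, the network-based catchment is always a subset of the circular one for the same cutoff.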

  10. Analyser-based phase contrast image reconstruction using geometrical optics.

    Science.gov (United States)

    Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A

    2007-07-21

    Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 μm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser.
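The symmetric Pearson type VII profile used to fit the rocking curve interpolates between a Lorentzian (m = 1) and, in the limit m → ∞, a Gaussian. One common parameterization, written so that the curve falls to half its peak at half the FWHM from the centre, is sketched below; this is a standard form of the function, not necessarily the exact parameterization fitted in the paper:

```python
def pearson_vii(x, x0, fwhm, m, peak=1.0):
    """Symmetric Pearson type VII profile.

    m = 1 gives a Lorentzian; m -> infinity tends to a Gaussian. The factor
    (2**(1/m) - 1) makes fwhm the true full width at half maximum.
    """
    half_width = fwhm / 2.0
    return peak * (1.0 + ((x - x0) / half_width) ** 2 * (2.0 ** (1.0 / m) - 1.0)) ** (-m)

# Sanity checks of the shape: unit peak at the centre, half the peak at x0 + fwhm/2
centre_value = pearson_vii(1.0, x0=1.0, fwhm=1.0, m=3.0)
half_max_value = pearson_vii(1.5, x0=1.0, fwhm=1.0, m=3.0)
```

In practice the four parameters (x0, fwhm, m, peak) would be fitted to the measured rocking curve by nonlinear least squares, with the fitted curve's slope then used for phase retrieval.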

  11. Automatic image-based analyses using a coupled quadtree-SBFEM/SCM approach

    Science.gov (United States)

    Gravenkamp, Hauke; Duczek, Sascha

    2017-10-01

    Quadtree-based domain decomposition algorithms offer an efficient option to create meshes for automatic image-based analyses. Without introducing hanging nodes the scaled boundary finite element method (SBFEM) can directly operate on such meshes by only discretizing the edges of each subdomain. However, the convergence of a numerical method that relies on a quadtree-based geometry approximation is often suboptimal due to the inaccurate representation of the boundary. To overcome this problem a combination of the SBFEM with the spectral cell method (SCM) is proposed. The basic idea is to treat each uncut quadtree cell as an SBFEM polygon, while all cut quadtree cells are computed employing the SCM. This methodology not only reduces the required number of degrees of freedom but also avoids a two-dimensional quadrature in all uncut quadtree cells. Numerical examples including static, harmonic, modal and transient analyses of complex geometries are studied, highlighting the performance of this novel approach.
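A quadtree decomposition of a pixel image recursively splits any inhomogeneous square into four children; in the proposed scheme each uncut (homogeneous) cell would become an SBFEM polygon, while cut cells are handed to the SCM. A minimal Python sketch of the splitting step on a tiny binary image (the image and stopping rule are illustrative, not the paper's algorithm):

```python
def quadtree(img, x, y, size, min_size=1):
    """Recursively split a square region of a pixel image until cells are homogeneous.

    Returns a list of (x, y, size, value) leaf cells. Homogeneous leaves
    correspond to uncut cells (SBFEM polygons); leaves forced out at min_size
    would be the cut cells handled by the spectral cell method.
    """
    block = [row[x:x + size] for row in img[y:y + size]]
    flat = [v for row in block for v in row]
    if size <= min_size or len(set(flat)) == 1:
        return [(x, y, size, flat[0])]
    half = size // 2
    cells = []
    for dx, dy in ((0, 0), (half, 0), (0, half), (half, half)):
        cells.extend(quadtree(img, x + dx, y + dy, half, min_size))
    return cells

# 4x4 binary image with one homogeneous quadrant of zeros
img = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [1, 1, 1, 1],
    [1, 1, 1, 1],
]
cells = quadtree(img, 0, 0, 4)
```

The root is inhomogeneous, so it splits once; all four quadrants are then homogeneous, so the decomposition stops with four cells and no hanging nodes inside any cell.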

  12. Reduction and technical simplification of testing protocol for walking based on repeatability analyses: An Interreg IVa pilot study

    Directory of Open Access Journals (Sweden)

    Nejc Sarabon

    2010-12-01

    Full Text Available The aim of this study was to define the most appropriate gait measurement protocols to be used in our future studies in the Mobility in Ageing project. A group of young healthy volunteers took part in the study. Each subject carried out a 10-metre walking test at five different speeds (preferred, very slow, very fast, slow, and fast). Each walking speed was repeated three times, making a total of 15 trials, which were carried out in a random order. Each trial was simultaneously analysed by three observers using three different technical approaches: a stopwatch, photo cells and an electronic kinematic dress. In analysing the repeatability of the trials, the results showed that of the five self-selected walking speeds, three (preferred, very fast, and very slow) had a significantly higher repeatability of the average walking velocity, step length and cadence than the other two speeds. Additionally, the data showed that one of the three technical methods for gait assessment has better metric characteristics than the other two. In conclusion, based on repeatability and technical and organizational simplification, this study helped us to successfully define a simple and reliable walking test to be used in the main study of the project.
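Repeatability across the three repeats of each speed can be summarized with a coefficient of variation, lower meaning more repeatable. A Python sketch with invented walking velocities that merely illustrate the reported pattern (preferred, very fast, and very slow more repeatable than slow and fast):

```python
import statistics

def cv_percent(values):
    """Coefficient of variation (%) across repeated trials: lower = more repeatable."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical average walking velocities (m/s), three repeats per self-selected speed
trials = {
    "very slow": [0.62, 0.63, 0.61],
    "slow":      [0.95, 1.08, 0.88],
    "preferred": [1.31, 1.30, 1.32],
    "fast":      [1.70, 1.55, 1.85],
    "very fast": [2.05, 2.07, 2.04],
}
repeatability = {speed: round(cv_percent(v), 1) for speed, v in trials.items()}
```

With this kind of summary, the protocol-reduction decision becomes a comparison of CVs per speed and per measurement device, retaining only the conditions that fall below an acceptable variability threshold.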

  13. Meta-Analyses of Human Cell-Based Cardiac Regeneration Therapies

    DEFF Research Database (Denmark)

    Gyöngyösi, Mariann; Wojakowski, Wojciech; Navarese, Eliano P

    2016-01-01

    In contrast to multiple publication-based meta-analyses involving clinical cardiac regeneration therapy in patients with recent myocardial infarction, a recently published meta-analysis based on individual patient data reported no effect of cell therapy on left ventricular function or clinical...

  14. Analyser-based phase contrast image reconstruction using geometrical optics

    International Nuclear Information System (INIS)

    Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A

    2007-01-01

    Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 μm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser

  15. THE GOAL OF VALUE-BASED MEDICINE ANALYSES: COMPARABILITY. THE CASE FOR NEOVASCULAR MACULAR DEGENERATION

    Science.gov (United States)

    Brown, Gary C.; Brown, Melissa M.; Brown, Heidi C.; Kindermann, Sylvia; Sharma, Sanjay

    2007-01-01

    Purpose To evaluate the comparability of articles in the peer-reviewed literature assessing the (1) patient value and (2) cost-utility (cost-effectiveness) associated with interventions for neovascular age-related macular degeneration (ARMD). Methods A search was performed in the National Library of Medicine database of 16 million peer-reviewed articles using the key words cost-utility, cost-effectiveness, value, verteporfin, pegaptanib, laser photocoagulation, ranibizumab, and therapy. All articles that used an outcome of quality-adjusted life-years (QALYs) were studied in regard to (1) percent improvement in quality of life, (2) utility methodology, (3) utility respondents, (4) types of costs included (eg, direct healthcare, direct nonhealthcare, indirect), (5) cost bases (eg, Medicare, National Health Service in the United Kingdom), and (6) study cost perspective (eg, government, societal, third-party insurer). To qualify as a value-based medicine analysis, the patient value had to be measured using the outcome of the QALYs conferred by respective interventions. As with value-based medicine analyses, patient-based time tradeoff utility analysis had to be utilized, patient utility respondents were necessary, and direct medical costs were used. Results Among 21 cost-utility analyses performed on interventions for neovascular macular degeneration, 15 (71%) met value-based medicine criteria. The 6 others (29%) were not comparable owing to (1) varying utility methodology, (2) varying utility respondents, (3) differing costs utilized, (4) differing cost bases, and (5) varying study perspectives. Among value-based medicine studies, laser photocoagulation confers a 4.4% value gain (improvement in quality of life) for the treatment of classic subfoveal choroidal neovascularization. 
Intravitreal pegaptanib confers a 5.9% value gain (improvement in quality of life) for classic, minimally classic, and occult subfoveal choroidal neovascularization, and photodynamic therapy
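The value-based medicine outcome used in this study is straightforward to compute once utilities are in hand: percent value gain is the relative utility improvement, and cost-utility is cost per QALY gained. A Python sketch with hypothetical utilities, cost, and time horizon (not the paper's figures, and undiscounted for simplicity, whereas formal analyses discount future QALYs and costs):

```python
def value_gain(u_before, u_after):
    """Percent improvement in quality of life from an intervention (utility scale 0-1)."""
    return 100.0 * (u_after - u_before) / u_before

def cost_per_qaly(cost, u_before, u_after, years):
    """Cost-utility as cost per QALY gained, undiscounted sketch."""
    qalys_gained = (u_after - u_before) * years
    return cost / qalys_gained

# Hypothetical intervention: utility rises from 0.80 to 0.84 over a 10-year horizon
gain = value_gain(0.80, 0.84)
ratio = cost_per_qaly(12000.0, 0.80, 0.84, 10)
```

The comparability problem the authors describe lives entirely in the inputs to these two functions: if studies elicit utilities with different methods or respondents, or count different cost categories, the resulting percentages and ratios cannot be compared across interventions.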

  16. The goal of value-based medicine analyses: comparability. The case for neovascular macular degeneration.

    Science.gov (United States)

    Brown, Gary C; Brown, Melissa M; Brown, Heidi C; Kindermann, Sylvia; Sharma, Sanjay

    2007-01-01

    To evaluate the comparability of articles in the peer-reviewed literature assessing the (1) patient value and (2) cost-utility (cost-effectiveness) associated with interventions for neovascular age-related macular degeneration (ARMD). A search was performed in the National Library of Medicine database of 16 million peer-reviewed articles using the key words cost-utility, cost-effectiveness, value, verteporfin, pegaptanib, laser photocoagulation, ranibizumab, and therapy. All articles that used an outcome of quality-adjusted life-years (QALYs) were studied in regard to (1) percent improvement in quality of life, (2) utility methodology, (3) utility respondents, (4) types of costs included (eg, direct healthcare, direct nonhealthcare, indirect), (5) cost bases (eg, Medicare, National Health Service in the United Kingdom), and (6) study cost perspective (eg, government, societal, third-party insurer). To qualify as a value-based medicine analysis, the patient value had to be measured using the outcome of the QALYs conferred by respective interventions. As with value-based medicine analyses, patient-based time tradeoff utility analysis had to be utilized, patient utility respondents were necessary, and direct medical costs were used. Among 21 cost-utility analyses performed on interventions for neovascular macular degeneration, 15 (71%) met value-based medicine criteria. The 6 others (29%) were not comparable owing to (1) varying utility methodology, (2) varying utility respondents, (3) differing costs utilized, (4) differing cost bases, and (5) varying study perspectives. Among value-based medicine studies, laser photocoagulation confers a 4.4% value gain (improvement in quality of life) for the treatment of classic subfoveal choroidal neovascularization. 
Intravitreal pegaptanib confers a 5.9% value gain (improvement in quality of life) for classic, minimally classic, and occult subfoveal choroidal neovascularization, and photodynamic therapy with verteporfin confers

  17. Register-based studies of healthcare costs

    DEFF Research Database (Denmark)

    Kruse, Marie; Christiansen, Terkel

    2011-01-01

    Introduction: The aim of this paper is to provide an overview and a few examples of how national registers are used in analyses of healthcare costs in Denmark. Research topics: The paper focuses on health economic analyses based on register data. For the sake of simplicity, the studies are divided into three main categories: economic evaluations of healthcare interventions, cost-of-illness analyses, and other analyses such as assessments of healthcare productivity. Conclusion: We examined a number of studies using register-based data on healthcare costs. Use of register-based data renders...

  18. Failure assessments of corroded pipelines with axial defects using stress-based criteria: Numerical studies and verification analyses

    International Nuclear Information System (INIS)

    Chiodo, Mario S.G.; Ruggieri, Claudio

    2009-01-01

    Conventional procedures used to assess the integrity of corroded piping systems with axial defects generally employ simplified failure criteria based upon a plastic collapse failure mechanism incorporating the tensile properties of the pipe material. These methods establish acceptance criteria for defects based on limited experimental data for low-strength structural steels, which do not necessarily address specific requirements for the high-grade steels currently used. For these cases, failure assessments may be overly conservative or show significant scatter in their predictions, which may lead to unnecessary repair or replacement of in-service pipelines. Motivated by these observations, this study examines the applicability of a stress-based criterion based upon plastic instability analysis to predict the failure pressure of corroded pipelines with axial defects. A central focus is to gain additional insight into the effects of defect geometry and material properties on the attainment of a local limit load to support the development of stress-based burst strength criteria. The work provides an extensive body of results which lend further support to adopting failure criteria for corroded pipelines based upon ligament instability analyses. A verification study conducted on burst testing of large-diameter pipe specimens with different defect lengths shows the effectiveness of a stress-based criterion using local ligament instability in burst pressure predictions, even though the adopted burst criterion exhibits a potential dependence on defect geometry and possibly on the material's strain hardening capacity. Overall, the results presented here suggest that the use of stress-based criteria based upon plastic instability analysis of the defect ligament is a valid engineering tool for integrity assessments of pipelines with axial corroded defects.
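For context on the simplified criteria the study improves upon, the most widely used screening estimate for an axially corroded pipe is the ASME B31G-type plastic-collapse formula: the intact-pipe collapse pressure is reduced by a ligament factor with a Folias bulging correction. A Python sketch; the pipe and defect dimensions are hypothetical, and this is the classical screening formula, not the paper's ligament-instability finite element analysis:

```python
import math

def b31g_burst_pressure(smys, diameter, thickness, depth, length):
    """B31G-type failure-pressure estimate for an axial corrosion defect.

    smys: specified minimum yield strength (MPa); diameter, thickness: pipe
    geometry (mm); depth, length: defect depth and axial length (mm).
    Valid for the short-defect regime length**2/(diameter*thickness) <= 20.
    """
    flow_stress = 1.1 * smys  # B31G flow-stress definition
    folias = math.sqrt(1.0 + 0.8 * length ** 2 / (diameter * thickness))
    d_over_t = depth / thickness
    ligament = (1.0 - (2.0 / 3.0) * d_over_t) / (1.0 - (2.0 / 3.0) * d_over_t / folias)
    return flow_stress * (2.0 * thickness / diameter) * ligament

# Hypothetical X60-grade pipe: SMYS 414 MPa, D = 508 mm, t = 8 mm,
# with a defect 4 mm deep and 200 mm long
p_fail = b31g_burst_pressure(414.0, 508.0, 8.0, 4.0, 200.0)
p_intact = b31g_burst_pressure(414.0, 508.0, 8.0, 0.0, 200.0)
```

The abstract's point is precisely that such yield-strength-based screening can be overly conservative or scattered for high-grade steels, motivating the local ligament-instability criterion studied instead.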

  19. Failure assessments of corroded pipelines with axial defects using stress-based criteria: Numerical studies and verification analyses

    Energy Technology Data Exchange (ETDEWEB)

    Chiodo, Mario S.G. [Department of Naval Architecture and Ocean Engineering, University of Sao Paulo, Av. Prof. Mello Moraes, 2231 (PNV-EPUSP), Sao Paulo, SP 05508-030 (Brazil); Ruggieri, Claudio [Department of Naval Architecture and Ocean Engineering, University of Sao Paulo, Av. Prof. Mello Moraes, 2231 (PNV-EPUSP), Sao Paulo, SP 05508-030 (Brazil)], E-mail: claudio.ruggieri@poli.usp.br

    2009-02-15

    Conventional procedures used to assess the integrity of corroded piping systems with axial defects generally employ simplified failure criteria based upon a plastic collapse failure mechanism incorporating the tensile properties of the pipe material. These methods establish acceptance criteria for defects based on limited experimental data for low-strength structural steels, which do not necessarily address specific requirements for the high-grade steels currently used. For these cases, failure assessments may be overly conservative or show significant scatter in their predictions, which may lead to unnecessary repair or replacement of in-service pipelines. Motivated by these observations, this study examines the applicability of a stress-based criterion based upon plastic instability analysis to predict the failure pressure of corroded pipelines with axial defects. A central focus is to gain additional insight into the effects of defect geometry and material properties on the attainment of a local limit load to support the development of stress-based burst strength criteria. The work provides an extensive body of results which lend further support to adopting failure criteria for corroded pipelines based upon ligament instability analyses. A verification study conducted on burst testing of large-diameter pipe specimens with different defect lengths shows the effectiveness of a stress-based criterion using local ligament instability in burst pressure predictions, even though the adopted burst criterion exhibits a potential dependence on defect geometry and possibly on the material's strain hardening capacity. Overall, the results presented here suggest that the use of stress-based criteria based upon plastic instability analysis of the defect ligament is a valid engineering tool for integrity assessments of pipelines with axial corroded defects.

  20. A bromine-based dichroic X-ray polarization analyser

    CERN Document Server

    Collins, S P; Brown, S D; Thompson, P

    2001-01-01

    We have demonstrated the advantages offered by dichroic X-ray polarization filters for linear polarization analysis, and describe such a device, based on a dibromoalkane/urea inclusion compound. The polarizer has been successfully tested by analysing the polarization of magnetic diffraction from holmium.

  1. FY01 Supplemental Science and Performance Analyses, Volume 1: Scientific Bases and Analyses, Part 1 and 2

    International Nuclear Information System (INIS)

    Dobson, David

    2001-01-01

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S and ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S and ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S and ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054] [DIRS 124754]). 
By making the large amount of information developed on Yucca Mountain available in stages, the DOE intends to provide the public and interested parties with time to review the available materials and to formulate

  2. Features of Computer-Based Decision Aids: Systematic Review, Thematic Synthesis, and Meta-Analyses

    Science.gov (United States)

    Syrowatka, Ania; Krömker, Dörthe; Meguerditchian, Ari N; Tamblyn, Robyn

    2016-01-01

    Background Patient information and education, such as decision aids, are gradually moving toward online, computer-based environments. Considerable research has been conducted to guide content and presentation of decision aids. However, given the relatively new shift to computer-based support, little attention has been given to how multimedia and interactivity can improve upon paper-based decision aids. Objective The first objective of this review was to summarize published literature into a proposed classification of features that have been integrated into computer-based decision aids. Building on this classification, the second objective was to assess whether integration of specific features was associated with higher-quality decision making. Methods Relevant studies were located by searching MEDLINE, Embase, CINAHL, and CENTRAL databases. The review identified studies that evaluated computer-based decision aids for adults faced with preference-sensitive medical decisions and reported quality of decision-making outcomes. A thematic synthesis was conducted to develop the classification of features. Subsequently, meta-analyses were conducted based on standardized mean differences (SMD) from randomized controlled trials (RCTs) that reported knowledge or decisional conflict. Further subgroup analyses compared pooled SMDs for decision aids that incorporated a specific feature to other computer-based decision aids that did not incorporate the feature, to assess whether specific features improved quality of decision making. Results Of 3541 unique publications, 58 studies met the target criteria and were included in the thematic synthesis. The synthesis identified six features: content control, tailoring, patient narratives, explicit values clarification, feedback, and social support. A subset of 26 RCTs from the thematic synthesis was used to conduct the meta-analyses. 
As expected, computer-based decision aids performed better than usual care or alternative aids; however

  3. Features of Computer-Based Decision Aids: Systematic Review, Thematic Synthesis, and Meta-Analyses.

    Science.gov (United States)

    Syrowatka, Ania; Krömker, Dörthe; Meguerditchian, Ari N; Tamblyn, Robyn

    2016-01-26

    Patient information and education, such as decision aids, are gradually moving toward online, computer-based environments. Considerable research has been conducted to guide content and presentation of decision aids. However, given the relatively new shift to computer-based support, little attention has been given to how multimedia and interactivity can improve upon paper-based decision aids. The first objective of this review was to summarize published literature into a proposed classification of features that have been integrated into computer-based decision aids. Building on this classification, the second objective was to assess whether integration of specific features was associated with higher-quality decision making. Relevant studies were located by searching MEDLINE, Embase, CINAHL, and CENTRAL databases. The review identified studies that evaluated computer-based decision aids for adults faced with preference-sensitive medical decisions and reported quality of decision-making outcomes. A thematic synthesis was conducted to develop the classification of features. Subsequently, meta-analyses were conducted based on standardized mean differences (SMD) from randomized controlled trials (RCTs) that reported knowledge or decisional conflict. Further subgroup analyses compared pooled SMDs for decision aids that incorporated a specific feature to other computer-based decision aids that did not incorporate the feature, to assess whether specific features improved quality of decision making. Of 3541 unique publications, 58 studies met the target criteria and were included in the thematic synthesis. The synthesis identified six features: content control, tailoring, patient narratives, explicit values clarification, feedback, and social support. A subset of 26 RCTs from the thematic synthesis was used to conduct the meta-analyses. 
As expected, computer-based decision aids performed better than usual care or alternative aids; however, some features performed better than
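The pooled effect sizes behind such meta-analyses are conventionally computed by inverse-variance weighting of the per-trial SMDs. A minimal sketch, using invented trial values rather than numbers from the review:

```python
import math

def pooled_smd(trials):
    """Fixed-effect inverse-variance pooling of standardized mean
    differences; trials is a list of (smd, variance) pairs."""
    weights = [1.0 / var for _, var in trials]
    # weighted average of the per-trial SMDs
    pooled = sum(w * d for (d, _), w in zip(trials, weights)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))          # standard error of the pooled SMD
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# three hypothetical trials as (SMD, variance) -- illustrative values only
smd, ci = pooled_smd([(0.50, 0.04), (0.30, 0.02), (0.45, 0.05)])
print(round(smd, 3))  # → 0.384
```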

  4. Model-based Recursive Partitioning for Subgroup Analyses

    OpenAIRE

    Seibold, Heidi; Zeileis, Achim; Hothorn, Torsten

    2016-01-01

    The identification of patient subgroups with differential treatment effects is the first step towards individualised treatments. A current draft guideline by the EMA discusses potentials and problems in subgroup analyses and formulates challenges for the development of appropriate statistical procedures for the data-driven identification of patient subgroups. We introduce model-based recursive partitioning as a procedure for the automated detection of patient subgroups that are identifiable by...

  5. How and for whom does web-based acceptance and commitment therapy work? Mediation and moderation analyses of web-based ACT for depressive symptoms.

    Science.gov (United States)

    Pots, Wendy T M; Trompetter, Hester R; Schreurs, Karlein M G; Bohlmeijer, Ernst T

    2016-05-23

    Acceptance and Commitment Therapy (ACT) has been demonstrated to be effective in reducing depressive symptoms. However, little is known about how and for whom therapeutic change occurs, specifically in web-based interventions. This study focuses on the mediators, moderators and predictors of change during a web-based ACT intervention. Data from 236 adults from the general population with mild to moderate depressive symptoms, randomized to either web-based ACT (n = 82) or one of two control conditions (web-based Expressive Writing (EW; n = 67) and a waiting list (n = 87)), were analysed. Single and multiple mediation analyses were performed using PROCESS, and exploratory linear regression analyses were conducted, to examine mediators, moderators and predictors of pre- to post-treatment and follow-up change in depressive symptoms. The treatment effect of ACT versus the waiting list was mediated by psychological flexibility and two mindfulness facets. The treatment effect of ACT versus EW was not significantly mediated. The moderator analyses demonstrated that the effects of web-based ACT did not vary according to baseline patient characteristics when compared to both control groups. However, higher baseline depressive symptoms and positive mental health and lower baseline anxiety were identified as predictors of outcome across all conditions. Similar results were found at follow-up. The findings of this study corroborate the evidence that psychological flexibility and mindfulness are distinct process mechanisms that mediate the effects of web-based ACT interventions. The results indicate that there are no restrictions on the allocation of a web-based ACT intervention and that web-based ACT can work for different subpopulations. Netherlands Trial Register NTR2736. Registered 6 February 2011.

  6. Novel citation-based search method for scientific literature: application to meta-analyses.

    Science.gov (United States)

    Janssens, A Cecile J W; Gwinn, M

    2015-10-13

    Finding eligible studies for meta-analysis and systematic reviews relies on keyword-based searching as the gold standard, despite its inefficiency. Searching based on direct citations is not sufficiently comprehensive. We propose a novel strategy that ranks articles on their degree of co-citation with one or more "known" articles before reviewing their eligibility. In two independent studies, we aimed to reproduce the results of literature searches for sets of published meta-analyses (n = 10 and n = 42). For each meta-analysis, we extracted co-citations for the randomly selected 'known' articles from the Web of Science database, counted their frequencies and screened all articles with a score above a selection threshold. In the second study, we extended the method by retrieving direct citations for all selected articles. In the first study, we retrieved 82% of the studies included in the meta-analyses while screening only 11% as many articles as were screened for the original publications. Articles that we missed were published in non-English languages, published before 1975, published very recently, or available only as conference abstracts. In the second study, we retrieved 79% of included studies while screening half the original number of articles. Citation searching appears to be an efficient and reasonably accurate method for finding articles similar to one or more articles of interest for meta-analysis and reviews.
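The co-citation ranking described above is simple enough to sketch in code. The sketch below is a plausible reading of the method, not the authors' implementation, and all article identifiers are invented:

```python
from collections import Counter

def rank_by_cocitation(known_articles, citing_papers, threshold=2):
    """Rank candidate articles by how often they are cited together with
    one or more 'known' articles, then keep those at or above a
    selection threshold for eligibility screening."""
    scores = Counter()
    for refs in citing_papers:
        refs = set(refs)
        if refs & known_articles:                 # cites at least one known article
            for art in sorted(refs - known_articles):
                scores[art] += 1                  # co-cited with a known article
    # candidates at or above the threshold, highest co-citation count first
    return [a for a, s in scores.most_common() if s >= threshold]

# toy example: 'A' is the single known article
papers = [["A", "B", "C"], ["A", "B"], ["D", "E"], ["A", "C"]]
print(rank_by_cocitation({"A"}, papers))  # → ['B', 'C']
```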

  7. Model-Based Recursive Partitioning for Subgroup Analyses.

    Science.gov (United States)

    Seibold, Heidi; Zeileis, Achim; Hothorn, Torsten

    2016-05-01

    The identification of patient subgroups with differential treatment effects is the first step towards individualised treatments. A current draft guideline by the EMA discusses potentials and problems in subgroup analyses and formulates challenges for the development of appropriate statistical procedures for the data-driven identification of patient subgroups. We introduce model-based recursive partitioning as a procedure for the automated detection of patient subgroups that are identifiable by predictive factors. The method starts with a model for the overall treatment effect as defined for the primary analysis in the study protocol and uses measures for detecting parameter instabilities in this treatment effect. The procedure produces a segmented model with differential treatment parameters corresponding to each patient subgroup. The subgroups are linked to predictive factors by means of a decision tree. The method is applied to the search for subgroups of patients suffering from amyotrophic lateral sclerosis that differ with respect to their Riluzole treatment effect, the only currently approved drug for this disease.
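Model-based recursive partitioning itself detects subgroups with score-based parameter-instability tests (implemented, e.g., in the R partykit package). The toy sketch below only illustrates the underlying idea, that splitting on a covariate can expose a differential treatment effect, using a naive exhaustive scan rather than the actual MOB procedure:

```python
import numpy as np

def best_subgroup_split(x, treat, outcome):
    """Scan cut points of one covariate and return the cut where the
    estimated treatment effect (mean treated minus mean control) differs
    most between the two sides, plus that difference."""
    def effect(mask):
        t = outcome[mask & (treat == 1)]
        c = outcome[mask & (treat == 0)]
        return t.mean() - c.mean()
    best = None
    for cut in np.unique(x)[2:-2]:               # keep a few points on each side
        diff = abs(effect(x < cut) - effect(x >= cut))
        if best is None or diff > best[1]:
            best = (cut, diff)
    return best

# deterministic toy data: treatment helps (by 2.0) only when x > 0.5
x = np.arange(400) / 399.0
treat = np.arange(400) % 2
outcome = treat * (x > 0.5) * 2.0
cut, diff = best_subgroup_split(x, treat, outcome)
print(round(cut, 3), diff)  # → 0.501 2.0
```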

  8. In service monitoring based on fatigue analyses, possibilities and limitations

    International Nuclear Information System (INIS)

    Dittmar, S.; Binder, F.

    2004-01-01

    German LWR reactors are equipped with monitoring systems which are to enable a comparison of real transients with load case catalogues and fatigue catalogues for fatigue analyses. The accuracy of this information depends on the accuracy of the measurements, on the consideration of parameters influencing fatigue (medium, component surface, component size, etc.), and on the accuracy of the load analyses. The contribution attempts a critical evaluation, also in view of the fact that real fatigue damage is often impossible to quantify on the basis of fatigue analyses at a later stage. The effects of the consideration or non-consideration of various influencing factors are discussed, as well as the consequences of the scatter of the material characteristics on which the analyses are based. Possible measures to be taken in operational monitoring are derived. (orig.) [de

  9. How distributed processing produces false negatives in voxel-based lesion-deficit analyses.

    Science.gov (United States)

    Gajardo-Vidal, Andrea; Lorca-Puls, Diego L; Crinion, Jennifer T; White, Jitrachote; Seghier, Mohamed L; Leff, Alex P; Hope, Thomas M H; Ludersdorfer, Philipp; Green, David W; Bowman, Howard; Price, Cathy J

    2018-07-01

    In this study, we hypothesized that if the same deficit can be caused by damage to one or another part of a distributed neural system, then voxel-based analyses might miss critical lesion sites because preservation of each site will not be consistently associated with preserved function. The first part of our investigation used voxel-based multiple regression analyses of data from 359 right-handed stroke survivors to identify brain regions where lesion load is associated with picture naming abilities after factoring out variance related to object recognition, semantics and speech articulation so as to focus on deficits arising at the word retrieval level. A highly significant lesion-deficit relationship was identified in left temporal and frontal/premotor regions. Post-hoc analyses showed that damage to either of these sites caused the deficit of interest in less than half the affected patients (76/162 = 47%). After excluding all patients with damage to one or both of the identified regions, our second analysis revealed a new region, in the anterior part of the left putamen, which had not been previously detected because many patients had the deficit of interest after temporal or frontal damage that preserved the left putamen. The results illustrate how (i) false negative results arise when the same deficit can be caused by different lesion sites; (ii) some of the missed effects can be unveiled by adopting an iterative approach that systematically excludes patients with lesions to the areas identified in previous analyses, (iii) statistically significant voxel-based lesion-deficit mappings can be driven by a subset of patients; (iv) focal lesions to the identified regions are needed to determine whether the deficit of interest is the consequence of focal damage or much more extensive damage that includes the identified region; and, finally, (v) univariate voxel-based lesion-deficit mappings cannot, in isolation, be used to predict outcome in other patients

  10. Orbitrap-based mass analyser for in-situ characterization of asteroids: ILMA, Ion Laser Mass Analyser

    Science.gov (United States)

    Briois, C.; Cotti, H.; Thirkell, L.; French Space Orbitrap Consortium [Aradj, K.; Bouabdellah, A.; Boukrara, A.; Carrasco, N.; Chalumeau, G.; Chapelon, O.; Colin, F.; Coll, P.; Engrand, C.; Grand, N.; Kukui, A.; Lebreton, J.-P.; Pennanech, C.; Szopa, C.; Thissen, R.; Vuitton, V.; Zapf, P.]; Makarov, A.

    2014-07-01

    For about a decade, the boundaries between comets and carbonaceous asteroids have been fading [1,2]. No doubt the Rosetta mission should bring a new wealth of data on the composition of comets. But as promising as it may look, the mass resolving power of the mass spectrometers onboard (so far the best on a space mission) will only be able to partially account for the diversity of chemical structures present. ILMA (Ion-Laser Mass Analyser) is a new-generation high-mass-resolution LDI-MS (Laser Desorption-Ionization Mass Spectrometer) instrument concept using the Orbitrap technique, which has been developed in the frame of the two Marco Polo & Marco Polo-R proposals to the ESA Cosmic Vision programme. Flagged by ESA as an instrument concept of interest for the mission in 2012, it has been under study for a few years in the frame of a Research and Technology (R&T) development programme between five French laboratories (LPC2E, IPAG, LATMOS, LISA, CSNSM) [3,4], partly funded by the French Space Agency (CNES). The work is undertaken in close collaboration with the Thermo Fisher Scientific company, which commercialises Orbitrap-based laboratory instruments. The R&T activities currently concentrate on the core elements of the Orbitrap analyser that are required to reach a sufficient maturity level for allowing design studies of future space instruments. A prototype is under development at LPC2E, and a mass resolution (m/Δm FWHM) of 100,000 has been obtained at m/z = 150 for a background pressure of 10^{-8} mbar. ILMA would be a key instrument to measure the molecular, elemental and isotopic composition of objects such as carbonaceous asteroids, comets, or other bodies devoid of atmosphere such as the surface of an icy satellite, the Moon, or Mercury.

  11. PCA-based algorithm for calibration of spectrophotometric analysers of food

    International Nuclear Information System (INIS)

    Morawski, Roman Z; Miekina, Andrzej

    2013-01-01

    Spectrophotometric analysers of food, being instruments for determination of the composition of food products and ingredients, are today of growing importance for the food industry, as well as for food distributors and consumers. Their metrological performance significantly depends on the numerical performance of the available means for spectrophotometric data processing, in particular the means for calibration of the analysers. In this paper, a new algorithm for this purpose is proposed, viz. an algorithm using principal component analysis (PCA). It is almost as efficient as PLS-based calibration algorithms, but much simpler.
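The abstract does not give the algorithm's details, but a generic PCA-based (principal component regression) calibration of the kind described, projecting centred spectra onto the leading principal components and regressing concentration on the resulting scores, can be sketched as:

```python
import numpy as np

def pcr_calibrate(spectra, concentrations, n_components=3):
    """Fit a principal-component-regression calibration model mapping
    spectra (n_samples x n_wavelengths) to concentrations (n_samples,)."""
    X_mean = spectra.mean(axis=0)
    y_mean = concentrations.mean()
    Xc = spectra - X_mean
    # principal components via SVD of the centred spectra
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    loadings = Vt[:n_components].T           # (n_wavelengths x k)
    scores = Xc @ loadings                   # (n_samples x k) PC scores
    # least-squares regression of centred concentration on the scores
    coefs, *_ = np.linalg.lstsq(scores, concentrations - y_mean, rcond=None)
    def predict(new_spectra):
        return (new_spectra - X_mean) @ loadings @ coefs + y_mean
    return predict

# synthetic check: 40 mixture spectra of 3 pure components, 50 wavelengths
rng = np.random.default_rng(0)
T = rng.normal(size=(40, 3))                 # latent component amounts
P = rng.normal(size=(3, 50))                 # pure-component spectra
X = T @ P
y = T @ np.array([1.0, -2.0, 0.5])           # concentration, linear in components
predict = pcr_calibrate(X, y, n_components=3)
print(np.allclose(predict(X), y))  # → True
```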

  12. CrusView: a Java-based visualization platform for comparative genomics analyses in Brassicaceae species.

    Science.gov (United States)

    Chen, Hao; Wang, Xiangfeng

    2013-09-01

    In plants and animals, chromosomal breakage and fusion events based on conserved syntenic genomic blocks lead to conserved patterns of karyotype evolution among species of the same family. However, karyotype information has not been well utilized in genomic comparison studies. We present CrusView, a Java-based bioinformatic application utilizing Standard Widget Toolkit/Swing graphics libraries and a SQLite database for performing visualized analyses of comparative genomics data in Brassicaceae (crucifer) plants. Compared with similar software and databases, one of the unique features of CrusView is its integration of karyotype information when comparing two genomes. This feature allows users to perform karyotype-based genome assembly and karyotype-assisted genome synteny analyses with preset karyotype patterns of the Brassicaceae genomes. Additionally, CrusView is a local program, which gives its users high flexibility when analyzing unpublished genomes and allows users to upload self-defined genomic information so that they can visually study the associations between genome structural variations and genetic elements, including chromosomal rearrangements, genomic macrosynteny, gene families, high-frequency recombination sites, and tandem and segmental duplications between related species. This tool will greatly facilitate karyotype, chromosome, and genome evolution studies using visualized comparative genomics approaches in Brassicaceae species. CrusView is freely available at http://www.cmbb.arizona.edu/CrusView/.

  13. Age and gender effects on normal regional cerebral blood flow studied using two different voxel-based statistical analyses

    International Nuclear Information System (INIS)

    Pirson, A.S.; George, J.; Krug, B.; Vander Borght, T.; Van Laere, K.; Jamart, J.; D'Asseler, Y.; Minoshima, S.

    2009-01-01

    Fully automated analysis programs are increasingly applied to aid in the reading of regional cerebral blood flow (rCBF) SPECT studies, and they are increasingly based on the comparison of the patient study with a normal database. In this study, we evaluate the ability of Three-Dimensional Stereotactic Surface Projection (3D-SSP) to isolate effects of age and gender in a previously studied normal population. The results were also compared with those obtained using Statistical Parametric Mapping (SPM99). Methods: Eighty-nine 99mTc-ECD SPECT studies performed in carefully screened healthy volunteers (46 females, 43 males; age 20-81 years) were analysed using 3D-SSP. A multivariate analysis based on the general linear model was performed with regions as intra-subject factor, gender as inter-subject factor and age as covariate. Results: Both age and gender had a significant interaction effect with regional tracer uptake. An age-related decline (p < 0.001) was found in the anterior cingulate gyrus, left frontal association cortex and left insula. Bilateral occipital association and left primary visual cortical uptake showed a significant relative increase with age (p < 0.001). Concerning the gender effect, women showed higher uptake (p < 0.01) in the parietal and right sensorimotor cortices. An age-by-gender interaction (p < 0.01) was only found in the left medial frontal cortex. The results were consistent with those obtained with SPM99. Conclusion: 3D-SSP analysis of normal rCBF variability is consistent with the literature and other automated voxel-based techniques, which highlight the effects of both age and gender. (authors)

  14. Analysing task design and students' responses to context-based problems through different analytical frameworks

    Science.gov (United States)

    Broman, Karolina; Bernholt, Sascha; Parchmann, Ilka

    2015-05-01

    Background: Context-based learning approaches are used to enhance students' interest in, and knowledge about, science. According to different empirical studies, students' interest is improved by applying these more non-conventional approaches, while effects on learning outcomes are less coherent. Hence, further insights are needed into the structure of context-based problems in comparison to traditional problems, and into students' problem-solving strategies. Therefore, a suitable framework is necessary, both for the analysis of tasks and strategies. Purpose: The aim of this paper is to explore traditional and context-based tasks as well as students' responses to exemplary tasks to identify a suitable framework for future design and analyses of context-based problems. The paper discusses different established frameworks and applies the Higher-Order Cognitive Skills/Lower-Order Cognitive Skills (HOCS/LOCS) taxonomy and the Model of Hierarchical Complexity in Chemistry (MHC-C) to analyse traditional tasks and students' responses. Sample: Upper secondary students (n=236) at the Natural Science Programme, i.e. possible future scientists, are investigated to explore learning outcomes when they solve chemistry tasks, both more conventional as well as context-based chemistry problems. Design and methods: A typical chemistry examination test has been analysed, first the test items in themselves (n=36), and thereafter 236 students' responses to one representative context-based problem. Content analysis using HOCS/LOCS and MHC-C frameworks has been applied to analyse both quantitative and qualitative data, allowing us to describe different problem-solving strategies. Results: The empirical results show that both frameworks are suitable to identify students' strategies, mainly focusing on recall of memorized facts when solving chemistry test items. Almost all test items were also assessing lower order thinking. The combination of frameworks with the chemistry syllabus has been

  15. Basing assessment and treatment of problem behavior on behavioral momentum theory: Analyses of behavioral persistence.

    Science.gov (United States)

    Schieltz, Kelly M; Wacker, David P; Ringdahl, Joel E; Berg, Wendy K

    2017-08-01

    The connection, or bridge, between applied and basic behavior analysis has been long-established (Hake, 1982; Mace & Critchfield, 2010). In this article, we describe how clinical decisions can be based more directly on behavioral processes and how basing clinical procedures on behavioral processes can lead to improved clinical outcomes. As a case in point, we describe how applied behavior analyses of maintenance, and specifically the long-term maintenance of treatment effects related to problem behavior, can be adjusted and potentially enhanced by basing treatment on Behavioral Momentum Theory. We provide a brief review of the literature including descriptions of two translational studies that proposed changes in how differential reinforcement of alternative behavior treatments are conducted based on Behavioral Momentum Theory. We then describe current clinical examples of how these translations are continuing to impact the definitions, designs, analyses, and treatment procedures used in our clinical practice. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Advanced exergy-based analyses applied to a system including LNG regasification and electricity generation

    Energy Technology Data Exchange (ETDEWEB)

    Morosuk, Tatiana; Tsatsaronis, George; Boyano, Alicia; Gantiva, Camilo [Technische Univ. Berlin (Germany)

    2012-07-01

    Liquefied natural gas (LNG) will contribute more in the future than in the past to the overall energy supply in the world. The paper discusses the application of advanced exergy-based analyses to a recently developed LNG-based cogeneration system. These analyses include advanced exergetic, advanced exergoeconomic, and advanced exergoenvironmental analyses in which thermodynamic inefficiencies (exergy destruction), costs, and environmental impacts have been split into avoidable and unavoidable parts. With the aid of these analyses, the potentials for improving the thermodynamic efficiency and for reducing the overall cost and the overall environmental impact are revealed. The objectives of this paper are to demonstrate (a) the potential for generating electricity while regasifying LNG and (b) some of the capabilities associated with advanced exergy-based methods. The most important subsystems and components are identified, and suggestions for improving them are made. (orig.)
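In advanced exergetic analysis, the unavoidable part of a component's exergy destruction is conventionally obtained by evaluating the component under its best achievable ("unavoidable") operating conditions; the avoidable part is the remainder. A minimal sketch of that split, with invented numbers:

```python
def split_exergy_destruction(e_p, e_d, unavoidable_ratio):
    """Split a component's exergy destruction E_D into unavoidable and
    avoidable parts. unavoidable_ratio is (E_D/E_P) evaluated with the
    component operating under its best achievable conditions."""
    e_d_un = e_p * unavoidable_ratio    # E_D^UN = E_P * (E_D/E_P)^UN
    e_d_av = e_d - e_d_un               # the remainder is avoidable
    return e_d_un, e_d_av

# hypothetical compressor: product exergy 500 kW, exergy destruction
# 120 kW, and (E_D/E_P)^UN = 0.15 under best achievable conditions
un, av = split_exergy_destruction(500.0, 120.0, 0.15)
print(un, av)  # → 75.0 45.0
```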

  17. SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.

    Science.gov (United States)

    Chu, Annie; Cui, Jenny; Dinov, Ivo D

    2009-03-01

    The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test, and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is ongoing and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for most
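As an illustration of the kind of test SOCR Analyses implements, the Wilcoxon rank sum test can be sketched with its large-sample normal approximation (a generic textbook implementation assuming no ties, not SOCR's Java code):

```python
from statistics import NormalDist

def rank_sum_test(x, y):
    """Two-sample Wilcoxon rank sum test using the large-sample normal
    approximation (no tie correction)."""
    n1, n2 = len(x), len(y)
    pooled = sorted(x + y)
    ranks = {v: r + 1 for r, v in enumerate(pooled)}   # assumes distinct values
    W = sum(ranks[v] for v in x)                       # rank sum of sample x
    mu = n1 * (n1 + n2 + 1) / 2                        # mean of W under H0
    sigma = (n1 * n2 * (n1 + n2 + 1) / 12) ** 0.5      # sd of W under H0
    z = (W - mu) / sigma
    p = 2 * (1 - NormalDist().cdf(abs(z)))             # two-sided p-value
    return W, z, p

x = [1.1, 2.3, 3.5, 4.2]
y = [5.0, 6.1, 7.4, 8.2]
W, z, p = rank_sum_test(x, y)   # W == 10: x occupies the lowest four ranks
print(W, round(p, 4))
```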

  18. Benefits of Exercise Training For Computer-Based Staff: A Meta Analyses

    Directory of Open Access Journals (Sweden)

    Mothna Mohammed

    2017-04-01

    Background: Office workers sit down to work for approximately 8 hours a day and, as a result, many of them do not have enough time for any form of physical exercise. This can lead to musculoskeletal discomforts, especially low back pain; recently, many researchers have focused on home/office-based exercise training for the prevention/treatment of low back pain among this population. Objective: This meta-analysis paper discusses the latest suggested exercises for office workers based on the mechanisms and theories behind low back pain among office workers. Method: The author collected relevant papers published previously on the subject. Google Scholar, Scopus, and PubMed were used as sources to find the articles. Only articles that were published using the same methodology, including office workers, musculoskeletal discomforts, low back pain, and exercise training keywords, were selected. Studies that failed to report sufficient sample statistics, or lacked a substantial review of past academic scholarship and/or clear methodologies, were excluded. Results: Limited evidence regarding the prevention of, and treatment methods for, musculoskeletal discomfort, especially in the low back, among office workers is available. The findings showed that training exercises had a significant effect (p<0.05) on low back pain discomfort scores and decreased pain levels in response to office-based exercise training. Conclusion: Office-based exercise training can affect pain/discomfort scores among office workers through positive effects on the flexibility and strength of muscles. As such, it should be suggested to occupational therapists as a practical way of treating/preventing low back pain among office workers.

  19. The Seismic Reliability of Offshore Structures Based on Nonlinear Time History Analyses

    International Nuclear Information System (INIS)

    Hosseini, Mahmood; Karimiyani, Somayyeh; Ghafooripour, Amin; Jabbarzadeh, Mohammad Javad

    2008-01-01

    Given past earthquake damage to offshore structures, which are vital to the oil and gas industries, it is important that their seismic design be performed with very high reliability. Accepting Nonlinear Time History Analyses (NLTHA) as the most reliable seismic analysis method, this paper studies an offshore platform of jacket type with a height of 304 feet, a deck of 96 feet by 94 feet, and a weight of 290 million pounds. At first, some Push-Over Analyses (POA) were performed to identify the more critical members of the jacket, based on the range of their plastic deformations. Then NLTHA were performed using the 3-component accelerograms of 100 earthquakes, covering a wide range of frequency content, and normalized to three Peak Ground Acceleration (PGA) levels of 0.3 g, 0.65 g, and 1.0 g. Using the results of the NLTHA, the damage and rupture probabilities of the critical members were studied to assess the reliability of the jacket structure. Because different structural members of the jacket have different effects on the stability of the platform, an ''importance factor'' was considered for each critical member based on its location and orientation in the structure, and the reliability of the whole structure was then obtained by combining the reliabilities of the critical members, each with its specific importance factor

  20. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic prog...

  1. Simple Crosscutting Concerns Are Not So Simple : Analysing Variability in Large-Scale Idioms-Based Implementations

    NARCIS (Netherlands)

    Bruntink, M.; Van Deursen, A.; d’Hondt, M.; Tourwé, T.

    2007-01-01

    This paper describes a method for studying idioms-based implementations of crosscutting concerns, and our experiences with it in the context of a real-world, large-scale embedded software system. In particular, we analyse a seemingly simple concern, tracing, and show that it exhibits significant

  2. Seismic risk analyses in the German Risk Study, phase B

    International Nuclear Information System (INIS)

    Hosser, D.; Liemersdorf, H.

    1991-01-01

    The paper discusses some aspects of the seismic risk part of the German Risk Study for Nuclear Power Plants, Phase B. The first, simplified analyses in Phase A of the study allowed only a rough classification of structures and systems of the PWR reference plant according to their seismic risk contribution. These studies were extended in Phase B using improved models for the dynamic analyses of buildings, structures and components, as well as for the probabilistic analyses of seismic loading, failure probabilities and event trees. The methodology for deriving probabilistic seismic load descriptions is explained and compared with the methods in Phase A of the study and in other studies. Some details of the linear and nonlinear dynamic analyses of structures are reported in order to demonstrate the influence of different assumptions for material behaviour and failure criteria. The probabilistic structural and event tree analyses are discussed with respect to distribution assumptions, acceptable simplifications and model uncertainties. Some results for the PWR reference plant are given. (orig.)

  3. Quantitative metagenomic analyses based on average genome size normalization

    DEFF Research Database (Denmark)

    Frank, Jeremy Alexander; Sørensen, Søren Johannes

    2011-01-01

    provide not just a census of the community members but direct information on metabolic capabilities and potential interactions among community members. Here we introduce a method for the quantitative characterization and comparison of microbial communities based on the normalization of metagenomic data...... marine sources using both conventional small-subunit (SSU) rRNA gene analyses and our quantitative method to calculate the proportion of genomes in each sample that are capable of a particular metabolic trait. With both environments, to determine what proportion of each community they make up and how......). These analyses demonstrate how genome proportionality compares to SSU rRNA gene relative abundance and how factors such as average genome size and SSU rRNA gene copy number affect sampling probability and therefore both types of community analysis....
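
The average-genome-size normalization this abstract describes can be sketched as follows; the numbers, the function name and the single-copy assumption are illustrative, not taken from the paper:

```python
def fraction_of_genomes_with_gene(bp_on_gene, gene_length_bp,
                                  total_bp, avg_genome_size_bp):
    """
    Estimate what proportion of genomes in a metagenome carry a gene,
    assuming the gene is single-copy where present and that reads sample
    the community uniformly.
    """
    gene_coverage = bp_on_gene / gene_length_bp          # gene copies sampled
    genome_equivalents = total_bp / avg_genome_size_bp   # genomes sampled
    return gene_coverage / genome_equivalents

# 4 Gbp of reads from a community averaging 4 Mbp genomes = 1000 genome
# equivalents; 0.6 Mbp aligning to a 1.5 kbp gene = 400 copies sampled,
# i.e. roughly 40% of genomes carry the trait.
frac = fraction_of_genomes_with_gene(6e5, 1.5e3, 4e9, 4e6)
```

Dividing by genome equivalents rather than raw read counts is what removes the bias from communities with larger average genomes, and the analogous correction for SSU rRNA gene copy number applies to the conventional census the abstract compares against.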

  4. Analyses of Crime Patterns in NIBRS Data Based on a Novel Graph Theory Clustering Method: Virginia as a Case Study

    Directory of Open Access Journals (Sweden)

    Peixin Zhao

    2014-01-01

    Full Text Available This paper suggests a novel clustering method for analyzing National Incident-Based Reporting System (NIBRS) data, including the determination of correlations between different crime types, the development of a likelihood index for crimes to occur in a jurisdiction, and the clustering of jurisdictions based on crime type. The method was tested using 2005 assault data from 121 jurisdictions in Virginia. The analyses of these data show that certain crime types are correlated with one another, and that different crime parameters are correlated with different crime types. The analyses also show that certain jurisdictions within Virginia share certain crime patterns. This information assists with constructing a pattern for a specific crime type and can be used to determine whether a jurisdiction may be more likely to see this type of crime occur in its area.
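
The crime-type correlation step underlying such analyses can be sketched with a plain Pearson coefficient over per-jurisdiction counts; the counts below are invented, not drawn from the NIBRS data:

```python
def pearson_r(xs, ys):
    """Pearson correlation between two equally long sequences of counts."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-jurisdiction counts of two crime types; a value near 1
# indicates the types co-vary across jurisdictions
assaults  = [120, 95, 300, 40, 210]
robberies = [30, 22, 80, 9, 55]
r = pearson_r(assaults, robberies)
```

A matrix of such pairwise correlations is a natural input for the graph-theoretic clustering of jurisdictions that the paper proposes.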

  5. A Server-Client-Based Graphical Development Environment for Physics Analyses (VISPA)

    International Nuclear Information System (INIS)

    Bretz, H-P; Erdmann, M; Fischer, R; Hinzmann, A; Klingebiel, D; Komm, M; Müller, G; Rieger, M; Steffens, J; Steggemann, J; Urban, M; Winchen, T

    2012-01-01

    The Visual Physics Analysis (VISPA) project provides a graphical development environment for data analysis. It addresses the typical development cycle of (re-)designing, executing, and verifying an analysis. We present the new server-client-based web application of the VISPA project to perform physics analyses via a standard internet browser. This enables individual scientists to work with a large variety of devices including touch screens, and teams of scientists to share, develop, and execute analyses on a server via the web interface.

  6. Comparative Analyses of Zebrafish Anxiety-Like Behavior Using Conflict-Based Novelty Tests.

    Science.gov (United States)

    Kysil, Elana V; Meshalkina, Darya A; Frick, Erin E; Echevarria, David J; Rosemberg, Denis B; Maximino, Caio; Lima, Monica Gomes; Abreu, Murilo S; Giacomini, Ana C; Barcellos, Leonardo J G; Song, Cai; Kalueff, Allan V

    2017-06-01

    Modeling of stress and anxiety in adult zebrafish (Danio rerio) is increasingly utilized in neuroscience research and central nervous system (CNS) drug discovery. Representing the most commonly used zebrafish anxiety models, the novel tank test (NTT) focuses on zebrafish diving in response to potentially threatening stimuli, whereas the light-dark test (LDT) is based on fish scototaxis (innate preference for dark vs. bright areas). Here, we systematically evaluate the utility of these two tests, combining meta-analyses of published literature with comparative in vivo behavioral and whole-body endocrine (cortisol) testing. Overall, the NTT and LDT behaviors demonstrate a generally good cross-test correlation in vivo, whereas meta-analyses of published literature show that both tests have similar sensitivity to zebrafish anxiety-like states. Finally, NTT evokes higher levels of cortisol, likely representing a more stressful procedure than LDT. Collectively, our study reappraises NTT and LDT for studying anxiety-like states in zebrafish, and emphasizes their developing utility for neurobehavioral research. These findings can help optimize drug screening procedures by choosing more appropriate models for testing anxiolytic or anxiogenic drugs.

  7. Genome-based comparative analyses of Antarctic and temperate species of Paenibacillus.

    Directory of Open Access Journals (Sweden)

    Melissa Dsouza

    Full Text Available Antarctic soils represent a unique environment characterised by extremes of temperature, salinity, elevated UV radiation, and low nutrient and water content. Despite the harshness of this environment, members of 15 bacterial phyla have been identified in soils of the Ross Sea Region (RSR). However, the survival mechanisms and ecological roles of these phyla are largely unknown. The aim of this study was to investigate whether strains of Paenibacillus darwinianus owe their resilience to substantial genomic changes. For this, genome-based comparative analyses were performed on three P. darwinianus strains, isolated from gamma-irradiated RSR soils, together with nine temperate, soil-dwelling Paenibacillus spp. The genome of each strain was sequenced to over 1,000-fold coverage, then assembled into contigs totalling approximately 3 Mbp per genome. Based on the occurrence of essential, single-copy genes, genome completeness was estimated at approximately 88%. Genome analysis revealed between 3,043 and 3,091 protein-coding sequences (CDSs), primarily associated with two-component systems, sigma factors, transporters, sporulation and genes induced by cold-shock, oxidative and osmotic stresses. These comparative analyses provide an insight into the metabolic potential of P. darwinianus, revealing potential adaptive mechanisms for survival in Antarctic soils. However, a large proportion of these mechanisms were also identified in temperate Paenibacillus spp., suggesting that these mechanisms are beneficial for growth and survival in a range of soil environments. These analyses have also revealed that the P. darwinianus genomes contain significantly fewer CDSs and have a lower paralogous content. Notwithstanding the incompleteness of the assemblies, the large differences in genome sizes, determined by the number of genes in paralogous clusters and the CDS content, are indicative of genome content scaling. Finally, these sequences are a resource for further

  8. FY01 Supplemental Science and Performance Analysis: Volume 1, Scientific Bases and Analyses

    International Nuclear Information System (INIS)

    Bodvarsson, G.S.; Dobson, David

    2001-01-01

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S and ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S and ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S and ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054 [DIRS 124754

  9. FY01 Supplemental Science and Performance Analysis: Volume 1,Scientific Bases and Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Bodvarsson, G.S.; Dobson, David

    2001-05-30

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S&ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S&ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S&ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054 [DIRS 124754

  10. CrusView: A Java-Based Visualization Platform for Comparative Genomics Analyses in Brassicaceae Species

    Science.gov (United States)

    Chen, Hao; Wang, Xiangfeng

    2013-01-01

    In plants and animals, chromosomal breakage and fusion events based on conserved syntenic genomic blocks lead to conserved patterns of karyotype evolution among species of the same family. However, karyotype information has not been well utilized in genomic comparison studies. We present CrusView, a Java-based bioinformatic application utilizing Standard Widget Toolkit/Swing graphics libraries and a SQLite database for performing visualized analyses of comparative genomics data in Brassicaceae (crucifer) plants. Compared with similar software and databases, one of the unique features of CrusView is its integration of karyotype information when comparing two genomes. This feature allows users to perform karyotype-based genome assembly and karyotype-assisted genome synteny analyses with preset karyotype patterns of the Brassicaceae genomes. Additionally, CrusView is a local program, which gives its users high flexibility when analyzing unpublished genomes and allows users to upload self-defined genomic information so that they can visually study the associations between genome structural variations and genetic elements, including chromosomal rearrangements, genomic macrosynteny, gene families, high-frequency recombination sites, and tandem and segmental duplications between related species. This tool will greatly facilitate karyotype, chromosome, and genome evolution studies using visualized comparative genomics approaches in Brassicaceae species. CrusView is freely available at http://www.cmbb.arizona.edu/CrusView/. PMID:23898041

  11. Devising a New Model of Demand-Based Learning Integrated with Social Networks and Analyses of its Performance

    Directory of Open Access Journals (Sweden)

    Bekim Fetaji

    2018-02-01

    Full Text Available The focus of this research study is to devise a new model for demand-based learning integrated with social networks such as Facebook and Twitter. The study investigates this by reviewing the published literature and carrying out a case study analysis of the new model's practical implementation. The study focuses on analyzing demand-based learning and investigating how it can be improved by devising a specific model that incorporates social network use. Statistical analyses of the questionnaire results, addressing the research questions and hypotheses, showed that there is a need to introduce new models into the teaching process. The originality lies in the introduction of the social login approach to an educational environment; the contribution is the development of a demand-based web application that aims to modernize the educational pattern of communication, introduce the social login approach, and increase knowledge transfer as well as improve learners' performance and skills. Insights and recommendations are provided, supported with argument, and discussed.

  12. Physical characterization of biomass-based pyrolysis liquids. Application of standard fuel oil analyses

    Energy Technology Data Exchange (ETDEWEB)

    Oasmaa, A; Leppaemaeki, E; Koponen, P; Levander, J; Tapola, E [VTT Energy, Espoo (Finland). Energy Production Technologies

    1998-12-31

    The main purpose of the study was to test the applicability to pyrolysis liquids of standard fuel oil methods developed for petroleum-based fuels. In addition, research on sampling, homogeneity, stability, miscibility and corrosivity was carried out. The standard methods have been tested on several different pyrolysis liquids. Recommendations on sampling, sample size and small modifications of the standard methods are presented. In general, most of the methods can be used as such, but the accuracy of the analysis can be improved by minor modifications. Fuel oil analyses not suitable for pyrolysis liquids have been identified. Homogeneity of the liquids is the most critical factor in accurate analysis. The presence of air bubbles may disturb several analyses. Sample preheating and prefiltration should be avoided when possible: the former may cause changes in the composition and structure of the pyrolysis liquid, and the latter may remove part of the organic material along with the particles. The size of the sample should be determined on the basis of the homogeneity and the water content of the liquid. The basic analyses of the Technical Research Centre of Finland (VTT) include water, pH, solids, ash, Conradson carbon residue, heating value, CHN, density, viscosity, pour point, flash point, and stability. Additional analyses are carried out when needed. (orig.) 53 refs.

  13. Unconscious analyses of visual scenes based on feature conjunctions.

    Science.gov (United States)

    Tachibana, Ryosuke; Noguchi, Yasuki

    2015-06-01

    To efficiently process a cluttered scene, the visual system analyzes statistical properties or regularities of visual elements embedded in the scene. It is controversial, however, whether those scene analyses could also work for stimuli unconsciously perceived. Here we show that our brain performs the unconscious scene analyses not only using a single featural cue (e.g., orientation) but also based on conjunctions of multiple visual features (e.g., combinations of color and orientation information). Subjects foveally viewed a stimulus array (duration: 50 ms) where 4 types of bars (red-horizontal, red-vertical, green-horizontal, and green-vertical) were intermixed. Although a conscious perception of those bars was inhibited by a subsequent mask stimulus, the brain correctly analyzed the information about color, orientation, and color-orientation conjunctions of those invisible bars. The information of those features was then used for the unconscious configuration analysis (statistical processing) of the central bars, which induced a perceptual bias and illusory feature binding in visible stimuli at peripheral locations. While statistical analyses and feature binding are normally 2 key functions of the visual system to construct coherent percepts of visual scenes, our results show that a high-level analysis combining those 2 functions is correctly performed by unconscious computations in the brain. (c) 2015 APA, all rights reserved.

  14. Process for carrying out analyses based on concurrent reactions

    Energy Technology Data Exchange (ETDEWEB)

    Glover, J S; Shepherd, B P

    1980-01-03

    The invention refers to a process for carrying out analyses based on concurrent reactions. A portion of the compound to be analysed is subjected, together with a standard quantity of the compound in labelled form, to a common reaction with a standard quantity of a reagent, which must be less than the sum of the two parts of the reacting compound. The parts of the labelled reaction compound and the labelled final compound resulting from the competition are separated in a tube (e.g. by centrifuging) after a forced phase change (precipitation, absorption, etc.), and the radioactivity of the two phases in contact is measured separately. The shielded measuring device developed for this purpose, suitable for centrifuge tubes of known dimensions, is also included in the patent claims. As an example of the applications of the method, the insulin concentration of a defined serum is measured (radioimmunoassay).
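
The competition the patent relies on can be sketched with an idealized equal-affinity binding model; the function names and quantities are hypothetical, and real assays use an empirical standard curve rather than this closed form:

```python
def bound_fraction_of_label(label, unlabeled, sites):
    """
    Idealized competitive binding: labeled and unlabeled compound compete
    with equal affinity for a limited reagent (sites < label + unlabeled,
    per the process described above). Returns the fraction of the label
    that ends up in the bound phase.
    """
    return sites / (label + unlabeled)

def unlabeled_from_measurement(frac_bound, label, sites):
    """Invert the model: recover the unknown amount from the bound fraction."""
    return sites / frac_bound - label

# With 1 unit of labeled insulin, 0.5 units of binding sites and an unknown
# sample of 3 units, 1/8 of the label binds; inverting the measured
# bound/free ratio recovers the unknown concentration.
frac = bound_fraction_of_label(1.0, 3.0, 0.5)
unknown = unlabeled_from_measurement(frac, 1.0, 0.5)
```

The more unlabeled analyte is present, the smaller the bound radioactivity, which is exactly the signal the two-phase measurement in the tube captures.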

  15. Grey literature in meta-analyses.

    Science.gov (United States)

    Conn, Vicki S; Valentine, Jeffrey C; Cooper, Harris M; Rantz, Marilyn J

    2003-01-01

    In meta-analysis, researchers combine the results of individual studies to arrive at cumulative conclusions. Meta-analysts sometimes include "grey literature" in their evidential base, which includes unpublished studies and studies published outside widely available journals. Because grey literature is a source of data that might not employ peer review, critics have questioned the validity of its data and the results of meta-analyses that include it. To examine evidence regarding whether grey literature should be included in meta-analyses and strategies to manage grey literature in quantitative synthesis. This article reviews evidence on whether the results of studies published in peer-reviewed journals are representative of results from broader samplings of research on a topic as a rationale for inclusion of grey literature. Strategies to enhance access to grey literature are addressed. The most consistent and robust difference between published and grey literature is that published research is more likely to contain results that are statistically significant. Effect size estimates of published research are about one-third larger than those of unpublished studies. Unfunded and small sample studies are less likely to be published. Yet, importantly, methodological rigor does not differ between published and grey literature. Meta-analyses that exclude grey literature likely (a) over-represent studies with statistically significant findings, (b) inflate effect size estimates, and (c) provide less precise effect size estimates than meta-analyses including grey literature. Meta-analyses should include grey literature to fully reflect the existing evidential base and should assess the impact of methodological variations through moderator analysis.

  16. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic...... programs and evaluated using incremental tabled evaluation, a technique for efficiently updating memo tables in response to changes in facts and rules. The approach has been implemented and integrated into the Eclipse IDE. Our measurements show that this technique is effective for automatically

  17. A Systematic Review of Cardiovascular Outcomes-Based Cost-Effectiveness Analyses of Lipid-Lowering Therapies.

    Science.gov (United States)

    Wei, Ching-Yun; Quek, Ruben G W; Villa, Guillermo; Gandra, Shravanthi R; Forbes, Carol A; Ryder, Steve; Armstrong, Nigel; Deshpande, Sohan; Duffy, Steven; Kleijnen, Jos; Lindgren, Peter

    2017-03-01

    Previous reviews have evaluated economic analyses of lipid-lowering therapies using lipid levels as surrogate markers for cardiovascular disease. However, drug approval and health technology assessment agencies have stressed that surrogates should only be used in the absence of clinical endpoints. The aim of this systematic review was to identify and summarise the methodologies, weaknesses and strengths of economic models based on atherosclerotic cardiovascular disease event rates. Cost-effectiveness evaluations of lipid-lowering therapies using cardiovascular event rates in adults with hyperlipidaemia were sought in Medline, Embase, Medline In-Process, PubMed and NHS EED and conference proceedings. Search results were independently screened, extracted and quality checked by two reviewers. Searches until February 2016 retrieved 3443 records, from which 26 studies (29 publications) were selected. Twenty-two studies evaluated secondary prevention (four also assessed primary prevention), two considered only primary prevention and two included mixed primary and secondary prevention populations. Most studies (18) based treatment-effect estimates on single trials, although more recent evaluations deployed meta-analyses (5/10 over the last 10 years). Markov models (14 studies) were most commonly used and only one study employed discrete event simulation. Models varied particularly in terms of health states and treatment-effect duration. No studies used a systematic review to obtain utilities. Most studies took a healthcare perspective (21/26) and sourced resource use from key trials instead of local data. Overall, reporting quality was suboptimal. This review reveals methodological changes over time, but reporting weaknesses remain, particularly with respect to transparency of model reporting.
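
The Markov cohort structure most of the reviewed evaluations used can be sketched in a few lines; all transition probabilities, utilities and the discount rate below are invented for illustration, not taken from any reviewed study:

```python
# Hypothetical 3-state Markov cohort model (event-free, post-event, dead)
# of the kind used in cardiovascular cost-effectiveness analyses.

STATES = ["event_free", "post_event", "dead"]
P = {   # annual transition probabilities (each row sums to 1)
    "event_free": {"event_free": 0.96, "post_event": 0.03, "dead": 0.01},
    "post_event": {"event_free": 0.00, "post_event": 0.90, "dead": 0.10},
    "dead":       {"event_free": 0.00, "post_event": 0.00, "dead": 1.00},
}
UTILITY = {"event_free": 0.85, "post_event": 0.70, "dead": 0.00}

def discounted_qalys(years, discount=0.035):
    """Run the cohort forward, accumulating discounted QALYs."""
    dist = {"event_free": 1.0, "post_event": 0.0, "dead": 0.0}
    total = 0.0
    for t in range(years):
        total += sum(dist[s] * UTILITY[s] for s in STATES) / (1 + discount) ** t
        dist = {s2: sum(dist[s1] * P[s1][s2] for s1 in STATES) for s2 in STATES}
    return total
```

Running two such cohorts (treated vs. untreated transition probabilities) and dividing the incremental cost by the incremental QALYs yields the cost-effectiveness ratio these studies report.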

  18. Parametric analyses of summative scores may lead to conflicting inferences when comparing groups: A simulation study.

    Science.gov (United States)

    Khan, Asaduzzaman; Chien, Chi-Wen; Bagraith, Karl S

    2015-04-01

    To investigate whether using a parametric statistic in comparing groups leads to different conclusions when using summative scores from rating scales compared with using their corresponding Rasch-based measures. A Monte Carlo simulation study was designed to examine between-group differences in the change scores derived from summative scores from rating scales, and those derived from their corresponding Rasch-based measures, using 1-way analysis of variance. The degree of inconsistency between the 2 scoring approaches (i.e. summative and Rasch-based) was examined, using varying sample sizes, scale difficulties and person ability conditions. This simulation study revealed scaling artefacts that could arise from using summative scores rather than Rasch-based measures for determining the changes between groups. The group differences in the change scores were statistically significant for summative scores under all test conditions and sample size scenarios. However, none of the group differences in the change scores were significant when using the corresponding Rasch-based measures. This study raises questions about the validity of the inference on group differences of summative score changes in parametric analyses. Moreover, it provides a rationale for the use of Rasch-based measures, which can allow valid parametric analyses of rating scale data.
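
The parametric comparison at the heart of this simulation can be sketched with a hand-rolled one-way ANOVA F statistic; the scores below are invented summative ratings, not the study's simulated data:

```python
def one_way_f(groups):
    """One-way ANOVA F statistic for a list of samples (lists of scores)."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical summative change scores for two groups; the paper's point is
# that running this same test on Rasch-calibrated measures instead of raw
# totals can lead to a different conclusion.
f = one_way_f([[4, 5, 6, 5, 4], [6, 7, 6, 8, 7]])
```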

  19. A Study for Visual Realism of Designed Pictures on Computer Screens by Investigation and Brain-Wave Analyses.

    Science.gov (United States)

    Wang, Lan-Ting; Lee, Kun-Chou

    2016-08-01

    In this article, the visual realism of designed pictures on computer screens is studied through investigation and brain-wave analyses. Practical electroencephalogram (EEG) measurements are always time-varying and fluctuating, so conventional statistical techniques are not adequate for their analysis. This study proposes a new scheme based on "fingerprinting" to analyze the EEG. Fingerprinting is a technique of probabilistic pattern recognition used in electrical engineering, much like the identification of human fingerprints in a criminal investigation. The goal of this study was to assess whether subjective preference for pictures could be manifested physiologically in EEG fingerprinting analyses. The most important advantage of the fingerprinting technique is that it does not require accurate measurement; instead, it uses probabilistic classification. Participants' preference for pictures can be assessed using fingerprinting analyses of physiological EEG measurements. © The Author(s) 2016.

  20. Analyser-based x-ray imaging for biomedical research

    International Nuclear Information System (INIS)

    Suortti, Pekka; Keyriläinen, Jani; Thomlinson, William

    2013-01-01

    Analyser-based imaging (ABI) is one of the several phase-contrast x-ray imaging techniques being pursued at synchrotron radiation facilities. With advancements in compact source technology, there is a possibility that ABI will become a clinical imaging modality. This paper presents the history of ABI as it has developed from its laboratory source to synchrotron imaging. The fundamental physics of phase-contrast imaging is presented both in a general sense and specifically for ABI. The technology is dependent on the use of perfect crystal monochromator optics. The theory of the x-ray optics is developed and presented in a way that will allow optimization of the imaging for specific biomedical systems. The advancement of analytical algorithms to produce separate images of the sample absorption, refraction angle map and small-angle x-ray scattering is detailed. Several detailed applications to biomedical imaging are presented to illustrate the broad range of systems and body sites studied preclinically to date: breast, cartilage and bone, soft tissue and organs. Ultimately, the application of ABI in clinical imaging will depend partly on the availability of compact sources with sufficient x-ray intensity comparable with that of the current synchrotron environment. (paper)

  1. Limitations of Species Delimitation Based on Phylogenetic Analyses: A Case Study in the Hypogymnia hypotrypa Group (Parmeliaceae, Ascomycota).

    Directory of Open Access Journals (Sweden)

    Xinli Wei

    Full Text Available Delimiting species boundaries among closely related lineages often requires a range of independent data sets and analytical approaches. Similar to other organismal groups, robust species circumscriptions in fungi are increasingly investigated within an empirical framework. Here we attempt to delimit species boundaries in a closely related clade of lichen-forming fungi endemic to Asia, the Hypogymnia hypotrypa group (Parmeliaceae). In the current classification, the Hypogymnia hypotrypa group includes two species: H. hypotrypa and H. flavida, which are separated based on distinctive reproductive modes, the former producing soredia, which are absent in the latter. We reexamined the relationship between these two species using phenotypic characters and molecular sequence data (ITS, GPD, and MCM7 sequences) to address species boundaries in this group. In addition to morphological investigations, we used Bayesian clustering to identify potential genetic groups in the H. hypotrypa/H. flavida clade. We also used a variety of empirical, sequence-based species delimitation approaches, including: the "Automatic Barcode Gap Discovery" (ABGD), the Poisson tree process model (PTP), the General Mixed Yule Coalescent (GMYC), and the multispecies coalescent approach BPP. Different species delimitation scenarios were compared using Bayes factor delimitation analysis, in addition to comparisons of pairwise genetic distances and pairwise fixation indices (FST). The majority of the species delimitation analyses implemented in this study failed to support H. hypotrypa and H. flavida as distinct lineages, as did the Bayesian clustering analysis. However, strong support for the evolutionary independence of H. hypotrypa and H. flavida was inferred using BPP and further supported by Bayes factor delimitation. In spite of rigorous morphological comparisons and a wide range of sequence-based approaches to delimit species, species boundaries in the H. hypotrypa group remain uncertain
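
The pairwise genetic distances such comparisons rest on can be sketched as a simple uncorrected p-distance over aligned sequences; the toy fragments below are invented, not from the study's ITS, GPD or MCM7 data:

```python
def p_distance(seq_a, seq_b):
    """Uncorrected p-distance: proportion of differing aligned sites,
    ignoring positions where either sequence has a gap."""
    pairs = [(a, b) for a, b in zip(seq_a, seq_b) if a != "-" and b != "-"]
    return sum(a != b for a, b in pairs) / len(pairs)

# Two toy fragments differing at 2 of 8 comparable sites -> distance 0.25
d = p_distance("ACGTACGT", "ACGAACGA")
```

Low between-"species" distances relative to within-species variation are one of the signals that led most of the delimitation methods here to reject the two-species hypothesis.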

  2. Individual-based analyses reveal limited functional overlap in a coral reef fish community.

    Science.gov (United States)

    Brandl, Simon J; Bellwood, David R

    2014-05-01

    Detailed knowledge of a species' functional niche is crucial for the study of ecological communities and processes. The extent of niche overlap, functional redundancy and functional complementarity is of particular importance if we are to understand ecosystem processes and their vulnerability to disturbances. Coral reefs are among the most threatened marine systems, and anthropogenic activity is changing the functional composition of reefs. The loss of herbivorous fishes is particularly concerning as the removal of algae is crucial for the growth and survival of corals. Yet, the foraging patterns of the various herbivorous fish species are poorly understood. Using a multidimensional framework, we present novel individual-based analyses of species' realized functional niches, which we apply to a herbivorous coral reef fish community. In calculating niche volumes for 21 species, based on their microhabitat utilization patterns during foraging, and computing functional overlaps, we provide a measurement of functional redundancy or complementarity. Complementarity is the inverse of redundancy and is defined as less than 50% overlap in niche volumes. The analyses reveal extensive complementarity with an average functional overlap of just 15.2%. Furthermore, the analyses divide herbivorous reef fishes into two broad groups. The first group (predominantly surgeonfishes and parrotfishes) comprises species feeding on exposed surfaces and predominantly open reef matrix or sandy substrata, resulting in small niche volumes and extensive complementarity. In contrast, the second group consists of species (predominantly rabbitfishes) that feed over a wider range of microhabitats, penetrating the reef matrix to exploit concealed surfaces of various substratum types. These species show high variation among individuals, leading to large niche volumes, more overlap and less complementarity. These results may have crucial consequences for our understanding of herbivorous processes on
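The overlaps above were computed in a multidimensional niche-volume framework; as a simpler hedged illustration of the underlying idea, Pianka's symmetric index quantifies overlap between two species' resource-use proportions (the species vectors below are invented for illustration, not data from the study):

```python
import math

def pianka_overlap(p, q):
    """Pianka's niche overlap index for two species' resource-use
    proportions: 0 = no shared use, 1 = identical use."""
    num = sum(pi * qi for pi, qi in zip(p, q))
    den = math.sqrt(sum(pi ** 2 for pi in p) * sum(qi ** 2 for qi in q))
    return num / den

# hypothetical microhabitat-use proportions over four substratum types
surgeonfish = [0.7, 0.2, 0.1, 0.0]   # concentrated on exposed surfaces
rabbitfish = [0.1, 0.2, 0.3, 0.4]    # spread across concealed microhabitats
overlap = pianka_overlap(surgeonfish, rabbitfish)
```

Under the record's convention, an overlap below 0.5 would be read as complementarity rather than redundancy.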

  3. Fracture analyses of WWER reactor pressure vessels

    International Nuclear Information System (INIS)

    Sievers, J.; Liu, X.

    1997-01-01

    In this paper, the methodology of fracture assessment based on finite element (FE) calculations is first described and compared with simplified methods. The FE-based methodology was verified by analyses of large scale thermal shock experiments in the framework of the international comparative study FALSIRE (Fracture Analyses of Large Scale Experiments) organized by GRS and ORNL. Furthermore, selected results from fracture analyses of different WWER type RPVs with postulated cracks under different loading transients are presented. 11 refs, 13 figs, 1 tab

  4. Fracture analyses of WWER reactor pressure vessels

    Energy Technology Data Exchange (ETDEWEB)

    Sievers, J; Liu, X [Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Koeln (Germany)

    1997-09-01

    In this paper, the methodology of fracture assessment based on finite element (FE) calculations is first described and compared with simplified methods. The FE-based methodology was verified by analyses of large scale thermal shock experiments in the framework of the international comparative study FALSIRE (Fracture Analyses of Large Scale Experiments) organized by GRS and ORNL. Furthermore, selected results from fracture analyses of different WWER type RPVs with postulated cracks under different loading transients are presented. 11 refs, 13 figs, 1 tab.

  5. A Versatile Software Package for Inter-subject Correlation Based Analyses of fMRI

    Directory of Open Access Journals (Sweden)

    Jukka-Pekka Kauppi

    2014-01-01

    Full Text Available In the inter-subject correlation (ISC) based analysis of functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects in the corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modelling the stimulus, and thus ISC is a potential method for analyzing fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC-based approach for analyzing complex fMRI data, no generic software tools have been made available for this purpose, limiting widespread use of ISC-based analysis techniques in the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC-based analyses. Many advanced computations, such as comparison of ISCs between different stimuli, time-window ISC, and inter-subject phase synchronization, are supported by the toolbox. The analyses are coupled with resampling-based statistical inference. The ISC-based analyses are data and computation intensive, and the ISC Toolbox is equipped with mechanisms to execute the parallel computations in a cluster environment automatically, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox and its implementation, and demonstrate possible uses of the toolbox by summarizing selected example applications. We also report computation time experiments using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces the computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/.

  6. A versatile software package for inter-subject correlation based analyses of fMRI.

    Science.gov (United States)

    Kauppi, Jukka-Pekka; Pajula, Juha; Tohka, Jussi

    2014-01-01

    In the inter-subject correlation (ISC) based analysis of functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects in the corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modeling the stimulus, and thus ISC is a potential method for analyzing fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC-based approach for analyzing complex fMRI data, no generic software tools have been made available for this purpose, limiting widespread use of ISC-based analysis techniques in the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC-based analyses. Many advanced computations, such as comparison of ISCs between different stimuli, time-window ISC, and inter-subject phase synchronization, are supported by the toolbox. The analyses are coupled with resampling-based statistical inference. The ISC-based analyses are data and computation intensive, and the ISC Toolbox is equipped with mechanisms to execute the parallel computations in a cluster environment automatically, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine, or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox and its implementation, and demonstrate possible uses of the toolbox by summarizing selected example applications. We also report computation time experiments using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces the computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/.
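The core ISC computation described in both records, pairwise correlation of subjects' time series at corresponding brain locations, can be sketched with NumPy as follows. This is an illustrative reimplementation of the basic idea, not code from the ISC Toolbox:

```python
import numpy as np

def isc_map(data):
    """Mean pairwise inter-subject correlation per voxel.

    data: array of shape (subjects, voxels, timepoints).
    Returns an array of shape (voxels,) holding the average Pearson
    correlation over all subject pairs at each voxel."""
    data = np.asarray(data, dtype=float)
    n_subj, n_vox, n_time = data.shape
    # z-score each time series so a scaled dot product equals Pearson r
    z = data - data.mean(axis=-1, keepdims=True)
    z /= z.std(axis=-1, keepdims=True)
    corr_sum = np.zeros(n_vox)
    n_pairs = 0
    for i in range(n_subj):
        for j in range(i + 1, n_subj):
            corr_sum += (z[i] * z[j]).sum(axis=-1) / n_time
            n_pairs += 1
    return corr_sum / n_pairs
```

The toolbox adds the resampling-based inference and cluster parallelization on top of this primitive.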

  7. Uptake of systematic reviews and meta-analyses based on individual participant data in clinical practice guidelines: descriptive study

    NARCIS (Netherlands)

    Vale, C.L.; Rydzewska, L.H.; Rovers, M.M.; Emberson, J.R.; Gueyffier, F.; Stewart, L.A.

    2015-01-01

    OBJECTIVE: To establish the extent to which systematic reviews and meta-analyses of individual participant data (IPD) are being used to inform the recommendations included in published clinical guidelines. DESIGN: Descriptive study. SETTING: Database maintained by the Cochrane IPD Meta-analysis

  8. 14C-analyses of calcite coatings in open fractures from the Klipperaas study site, Southern Sweden

    International Nuclear Information System (INIS)

    Possnert, G.; Tullborg, E.L.

    1989-11-01

    Carbonate samples from open fractures in crystalline rock from the Klipperaas study site have been analysed for their 14C contents using accelerator mass spectrometry. This technique makes it possible to analyse very small carbonate samples (c. 1 mg C). The analyses show low but varying contents of 14C. However, contamination by CO2 has taken place, affecting small samples more than others. Attempts have been made to quantify the contamination and thus evaluate the analyses of the fracture samples. The obtained low 14C values can be due to: (1) an effective retention of 14C by sorption/fractionation forcing 14C onto the calcite surfaces in the near-surface zone, which means that the 14C contribution to the deeper levels is diminished, or (2) a very shallow penetration depth of surface groundwater. The former is suggested as more probable based on evaluations of the hydrochemical conditions and the fracture mineral studies. (10 figs., 3 tabs., 9 refs.) (authors)
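The attempt to quantify the contamination can be illustrated with a simple two-component mixing correction: if a fixed mass of modern CO2 is picked up during sample preparation, the true sample activity can be back-calculated from the measured one. This mixing model is a generic sketch, not the authors' exact procedure, and the parameter values in the test are hypothetical:

```python
def corrected_activity(measured_pmc, sample_mg_c, contam_mg_c, contam_pmc=100.0):
    """Correct a measured 14C activity (percent modern carbon, pMC) for a
    fixed mass of modern-CO2 contamination added during preparation.

    measured_pmc: activity measured on the mixed sample
    sample_mg_c:  carbon mass of the true sample (mg C)
    contam_mg_c:  carbon mass of the contaminant (mg C)
    contam_pmc:   activity of the contaminant (100 pMC = fully modern)
    """
    total = sample_mg_c + contam_mg_c
    return (measured_pmc * total - contam_pmc * contam_mg_c) / sample_mg_c
```

Because the contaminant mass is roughly constant, small samples (c. 1 mg C) are shifted far more than large ones, which matches the pattern the record describes.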

  9. Issues and approaches in risk-based aging analyses of passive components

    International Nuclear Information System (INIS)

    Uryasev, S.P.; Samanta, P.K.; Vesely, W.E.

    1994-01-01

    In previous NRC-sponsored work, a general methodology was developed to quantify the risk contributions from aging components at nuclear plants. The methodology allowed Probabilistic Risk Analyses (PRAs) to be modified to incorporate age-dependent component failure rates, and also aging maintenance models, to evaluate and prioritize the aging contributions from active components using the linear aging failure rate model and empirical component aging rates. In the present paper, this methodology is extended to passive components (for example, pipes, heat exchangers, and the vessel). The analyses of passive components raise issues different from those of active components. Here, we specifically focus on three aspects that need to be addressed in risk-based aging prioritization of passive components.
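The linear aging failure rate model mentioned above takes the form lambda(t) = lambda0 + a*t. A minimal sketch of the rate and its time integral (the expected number of failures over an interval) follows; the parameter values in the test are illustrative, not empirical aging rates from the study:

```python
def linear_aging_rate(t, lam0, a):
    """Linear aging failure rate model: lambda(t) = lambda0 + a * t,
    where lam0 is the baseline rate and a the aging acceleration."""
    return lam0 + a * t

def expected_failures(T, lam0, a):
    """Integral of lambda(t) over [0, T]: lam0*T + a*T^2/2, the expected
    failure count for a non-repaired component under the linear model."""
    return lam0 * T + 0.5 * a * T ** 2
```

Age-dependent PRA modifications of the kind the record describes substitute such time-varying rates for the constant failure rates in the baseline PRA.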

  10. Ventilation/perfusion SPECT/CT in patients with pulmonary emphysema. Evaluation of software-based analysing.

    Science.gov (United States)

    Schreiter, V; Steffen, I; Huebner, H; Bredow, J; Heimann, U; Kroencke, T J; Poellinger, A; Doellinger, F; Buchert, R; Hamm, B; Brenner, W; Schreiter, N F

    2015-01-01

    The purpose of this study was to evaluate the reproducibility of a new software-based analysis system for ventilation/perfusion single-photon emission computed tomography/computed tomography (V/P SPECT/CT) in patients with pulmonary emphysema, and to compare it to visual interpretation. 19 patients (mean age: 68.1 years) with pulmonary emphysema who underwent V/P SPECT/CT were included. Data were analysed by two independent observers using visual interpretation (VI) and a software-based analysis system (SBAS). For SBAS, PMOD version 3.4 (PMOD Technologies Ltd, Zurich, Switzerland) was used to assess counts and volume per lung lobe/per lung and to calculate the count density per lung and lobe, the ratio of counts, and the ratio of count density. VI was performed using a visual scale to assess the mean counts per lung lobe. Interobserver variability and association for SBAS and VI were analysed using Spearman's rho correlation coefficient. Interobserver agreement correlated highly in perfusion (rho: 0.982, 0.957, 0.90, 0.979) and ventilation (rho: 0.972, 0.924, 0.941, 0.936) for count/count density per lobe and ratio of counts/count density in SBAS. Interobserver agreement correlated clearly for perfusion (rho: 0.655) and weakly for ventilation (rho: 0.458) in VI. SBAS provides more reproducible measures than VI for the relative tracer uptake in V/P SPECT/CT in patients with pulmonary emphysema. However, SBAS has to be improved for routine clinical use.
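The quantities reported above (count density per region, interobserver agreement via Spearman's rho) can be sketched as follows. The rank-correlation implementation is a minimal version without tie correction, and none of this is the PMOD software; the numbers in the test are invented:

```python
import numpy as np

def count_density(counts, volume_ml):
    """Tracer counts per unit volume for a lung or lobe."""
    return counts / volume_ml

def spearman_rho(x, y):
    """Spearman's rank correlation between two observers' measurements
    (no tie correction; assumes distinct values)."""
    def ranks(v):
        order = np.argsort(v)
        r = np.empty(len(v))
        r[order] = np.arange(1, len(v) + 1)
        return r
    rx, ry = ranks(np.asarray(x, float)), ranks(np.asarray(y, float))
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx * ry).sum() / np.sqrt((rx ** 2).sum() * (ry ** 2).sum()))
```

Identical rankings by the two observers give rho = 1, the "high interobserver agreement" regime the record reports for SBAS.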

  11. Secondary Data Analyses of Subjective Outcome Evaluation Data Based on Nine Databases

    Directory of Open Access Journals (Sweden)

    Daniel T. L. Shek

    2012-01-01

    Full Text Available The purpose of this study was to evaluate the effectiveness of the Tier 1 Program of the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) in Hong Kong by analyzing 1,327 school-based program reports submitted by program implementers. In each report, program implementers were invited to write down five conclusions based on an integration of the subjective outcome evaluation data collected from the program participants and program implementers. Secondary data analyses were carried out by aggregating nine databases, with 14,390 meaningful units extracted from 6,618 conclusions. Results showed that most of the conclusions were positive in nature. The findings generally showed that the workers perceived the program and program implementers positively, and they also pointed out that the program could promote holistic development of the program participants in societal, familial, interpersonal, and personal aspects. However, difficulties encountered during program implementation (2.15%) and recommendations for improvement (16.26%) were also reported. In conjunction with the evaluation findings based on other strategies, the present study suggests that the Tier 1 Program of the Project P.A.T.H.S. is beneficial to the holistic development of the program participants.

  12. Engineering design and exergy analyses for combustion gas turbine based power generation system

    International Nuclear Information System (INIS)

    Sue, D.-C.; Chuang, C.-C.

    2004-01-01

    This paper presents the engineering design and theoretical exergetic analyses of combustion gas turbine based power generation systems. Exergy analysis is performed based on the first and second laws of thermodynamics. The results show that exergy analyses for a steam cycle system predict the plant efficiency more precisely. The plant efficiency for partial load operation is lower than for full load operation. Increasing the pinch points will decrease the combined cycle plant efficiency. The engineering design is based on inlet air cooling and natural gas preheating to increase the net power output and efficiency. To evaluate the energy utilization, one combined cycle unit and one cogeneration system, consisting of gas turbine generators, heat recovery steam generators, and one steam turbine generator with steam extracted for process use, have been analyzed. The analytical results are used for engineering design and component selection.
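An exergy analysis of the kind described rests on the specific flow exergy of a stream relative to the dead state, ex = (h - h0) - T0(s - s0). A minimal sketch with this standard formula follows; the property values in the test are hypothetical, not from the paper's plant:

```python
def specific_flow_exergy(h, s, h0, s0, T0):
    """Specific flow exergy of a stream relative to the dead state.

    h, h0: specific enthalpies (kJ/kg); s, s0: specific entropies
    (kJ/(kg*K)); T0: dead-state temperature (K). Kinetic and potential
    terms are neglected."""
    return (h - h0) - T0 * (s - s0)

def exergy_efficiency(exergy_out, exergy_in):
    """Second-law (exergetic) efficiency of a component or plant."""
    return exergy_out / exergy_in
```

Summing such stream exergies across a component gives its exergy destruction, which is what makes second-law analysis a sharper efficiency predictor than a first-law balance alone.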

  13. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    Science.gov (United States)

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    The coming deluge of genome data presents significant challenges for storing and processing large-scale genome data, easy access to biomedical analysis tools, and efficient data sharing and retrieval. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. Operational Satellite-based Surface Oil Analyses (Invited)

    Science.gov (United States)

    Streett, D.; Warren, C.

    2010-12-01

    During the Deepwater Horizon spill, NOAA imagery analysts in the Satellite Analysis Branch (SAB) issued more than 300 near-real-time satellite-based oil spill analyses. These analyses were used by the oil spill response community for planning, issuing surface oil trajectories and tasking assets (e.g., oil containment booms, skimmers, overflights). SAB analysts used both Synthetic Aperture Radar (SAR) and high resolution visible/near IR multispectral satellite imagery as well as a variety of ancillary datasets. Satellite imagery used included ENVISAT ASAR (ESA), TerraSAR-X (DLR), Cosmo-Skymed (ASI), ALOS (JAXA), Radarsat (MDA), ENVISAT MERIS (ESA), SPOT (SPOT Image Corp.), Aster (NASA), MODIS (NASA), and AVHRR (NOAA). Ancillary datasets included ocean current information, wind information, location of natural oil seeps and a variety of in situ oil observations. The analyses were available as jpegs, pdfs, shapefiles and Google Earth KML files, and also on a variety of websites including Geoplatform and ERMA. From the very first analysis, issued just 5 hours after the rig sank, through the final analysis, issued in August, the complete archive is still publicly available on the NOAA/NESDIS website http://www.ssd.noaa.gov/PS/MPS/deepwater.html SAB personnel also served as the Deepwater Horizon International Disaster Charter Project Manager (at the official request of the USGS). The Project Manager’s primary responsibility was to acquire and oversee the processing and dissemination of satellite data generously donated by numerous private companies and nations in support of the oil spill response, including some of the imagery described above. SAB has begun to address a number of goals that will improve our routine oil spill response as well as help assure that we are ready for the next spill of national significance. We hope to (1) secure a steady, abundant and timely stream of suitable satellite imagery even in the absence of large-scale emergencies such as

  15. Risk-based analyses in support of California hazardous site remediation

    International Nuclear Information System (INIS)

    Ringland, J.T.

    1995-08-01

    The California Environmental Enterprise (CEE) is a joint program of the Department of Energy (DOE), Lawrence Livermore National Laboratory, Lawrence Berkeley Laboratory, and Sandia National Laboratories. Its goal is to make DOE laboratory expertise accessible to hazardous site cleanups in the state. This support might involve working directly with parties responsible for individual cleanups, or it might involve working with the California Environmental Protection Agency to develop tools that would be applicable across a broad range of sites. As part of its initial year's activities, the CEE supported a review to examine where laboratory risk and risk-based systems analysis capabilities might be most effectively applied. To this end, this study draws the following observations. The labs have a clear role in analyses supporting the demonstration and transfer of laboratory characterization or remediation technologies. The labs may have opportunities in developing broadly applicable analysis tools and computer codes for problems such as site characterization or efficient management of resources. Analysis at individual sites, separate from supporting lab technologies or prototyping general tools, may be appropriate only in limited circumstances. In any of these roles, the labs' capabilities extend beyond health risk assessment to the broader areas of risk management and risk-based systems analysis

  16. Airway management education: simulation based training versus non-simulation based training-A systematic review and meta-analyses.

    Science.gov (United States)

    Sun, Yanxia; Pan, Chuxiong; Li, Tianzuo; Gan, Tong J

    2017-02-01

    Simulation-based training (SBT) has become a standard for medical education. However, the efficacy of simulation-based training in airway management education remains unclear. The aim of this study was to evaluate all published evidence comparing the effectiveness of SBT for airway management versus non-simulation-based training (NSBT) on learner and patient outcomes. A systematic review with meta-analyses was used. Data were derived from PubMed, EMBASE, CINAHL, Scopus, the Cochrane Controlled Trials Register and the Cochrane Database of Systematic Reviews from inception to May 2016. Published comparative trials that evaluated the effect of SBT on airway management training compared with NSBT were considered. Effect sizes with 95% confidence intervals (CI) were calculated for outcome measures. Seventeen eligible studies were included. SBT was associated with improved behavior performance [standardized mean difference (SMD): 0.30, 95% CI: 0.06 to 0.54] in comparison with NSBT. However, the benefits of SBT were not seen in time-skill (SMD: -0.13, 95% CI: -0.82 to 0.52), written examination score (SMD: 0.39, 95% CI: -0.09 to 0.86) or success rate of procedure completion on patients [relative risk (RR): 1.26, 95% CI: 0.96 to 1.66]. SBT may not be superior to NSBT for airway management training.
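Pooled SMDs with 95% CIs like those above are typically obtained by inverse-variance weighting of the individual study estimates. A minimal fixed-effect sketch follows (the review itself may have used a random-effects model; the study values in the test are placeholders, not data from the review):

```python
import math

def pool_smd(smds, variances):
    """Fixed-effect inverse-variance pooled SMD with a 95% CI.

    smds:      per-study standardized mean differences
    variances: per-study sampling variances of the SMDs
    Returns (pooled_smd, (ci_low, ci_high))."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * d for w, d in zip(weights, smds)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)
```

A CI that crosses zero, as for the time-skill and written-examination outcomes above, is read as no demonstrated benefit.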

  17. Methods for analysing cardiovascular studies with repeated measures

    NARCIS (Netherlands)

    Cleophas, T. J.; Zwinderman, A. H.; van Ouwerkerk, B. M.

    2009-01-01

    Background. Repeated measurements in a single subject are generally more similar than unrepeated measurements in different subjects. Unrepeated analyses of repeated data cause underestimation of the treatment effects. Objective. To review methods adequate for the analysis of cardiovascular studies

  18. Safety assessment of historical masonry churches based on pre-assigned kinematic limit analysis, FE limit and pushover analyses

    International Nuclear Information System (INIS)

    Milani, Gabriele; Valente, Marco

    2014-01-01

    This study presents some results of a comprehensive numerical analysis of three masonry churches damaged by the recent Emilia-Romagna (Italy) seismic events that occurred in May 2012. The numerical study comprises: (a) pushover analyses conducted with a commercial code, standard nonlinear material models and two different horizontal load distributions; (b) FE kinematic limit analyses performed using a non-commercial software based on a preliminary homogenization of the masonry materials and a subsequent limit analysis with triangular elements and interfaces; (c) kinematic limit analyses conducted in agreement with the Italian code and based on the a priori assumption of preassigned failure mechanisms, where the masonry material is considered unable to withstand tensile stresses. All models are capable of giving information on the active failure mechanism and the base shear at failure, which, if properly made non-dimensional with the weight of the structure, also gives an indication of the horizontal peak ground acceleration causing the collapse of the church. The results obtained from all three models indicate that the collapse is usually due to the activation of partial mechanisms (apse, façade, lateral walls, etc.). Moreover, the horizontal peak ground acceleration associated with the collapse is markedly lower than that required in that seismic zone by the Italian code for ordinary buildings. These outcomes highlight that structural upgrading interventions would be extremely beneficial for the considerable reduction of the seismic vulnerability of such kind of historical structures
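The non-dimensionalization described, base shear at failure divided by the weight of the structure, read as a collapse acceleration in units of g, is a one-line computation; the numbers in the test are illustrative only, not values from the three churches:

```python
def collapse_acceleration_g(base_shear_at_failure_kN, weight_kN):
    """Non-dimensional base shear V/W at failure, interpreted as the
    horizontal peak ground acceleration (in units of g) that activates
    the collapse mechanism."""
    return base_shear_at_failure_kN / weight_kN
```

Comparing this value against the code-required seismic demand for the site is what supports the record's conclusion that the churches fall well short of the requirement for ordinary buildings.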

  19. Safety assessment of historical masonry churches based on pre-assigned kinematic limit analysis, FE limit and pushover analyses

    Energy Technology Data Exchange (ETDEWEB)

    Milani, Gabriele, E-mail: milani@stru.polimi.it; Valente, Marco, E-mail: milani@stru.polimi.it [Department of Architecture, Built Environment and Construction Engineering (ABC), Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milan (Italy)

    2014-10-06

    This study presents some results of a comprehensive numerical analysis of three masonry churches damaged by the recent Emilia-Romagna (Italy) seismic events that occurred in May 2012. The numerical study comprises: (a) pushover analyses conducted with a commercial code, standard nonlinear material models and two different horizontal load distributions; (b) FE kinematic limit analyses performed using a non-commercial software based on a preliminary homogenization of the masonry materials and a subsequent limit analysis with triangular elements and interfaces; (c) kinematic limit analyses conducted in agreement with the Italian code and based on the a priori assumption of preassigned failure mechanisms, where the masonry material is considered unable to withstand tensile stresses. All models are capable of giving information on the active failure mechanism and the base shear at failure, which, if properly made non-dimensional with the weight of the structure, also gives an indication of the horizontal peak ground acceleration causing the collapse of the church. The results obtained from all three models indicate that the collapse is usually due to the activation of partial mechanisms (apse, façade, lateral walls, etc.). Moreover, the horizontal peak ground acceleration associated with the collapse is markedly lower than that required in that seismic zone by the Italian code for ordinary buildings. These outcomes highlight that structural upgrading interventions would be extremely beneficial for the considerable reduction of the seismic vulnerability of such kind of historical structures.

  20. Conducting Meta-Analyses Based on p Values

    Science.gov (United States)

    van Aert, Robbie C. M.; Wicherts, Jelte M.; van Assen, Marcel A. L. M.

    2016-01-01

    Because of overwhelming evidence of publication bias in psychology, techniques to correct meta-analytic estimates for such bias are greatly needed. The methodology on which the p-uniform and p-curve methods are based has great promise for providing accurate meta-analytic estimates in the presence of publication bias. However, in this article, we show that in some situations, p-curve behaves erratically, whereas p-uniform may yield implausible estimates of negative effect size. Moreover, we show that (and explain why) p-curve and p-uniform result in overestimation of effect size under moderate-to-large heterogeneity and may yield unpredictable bias when researchers employ p-hacking. We offer hands-on recommendations on applying and interpreting results of meta-analyses in general and p-uniform and p-curve in particular. Both methods as well as traditional methods are applied to a meta-analysis on the effect of weight on judgments of importance. We offer guidance for applying p-uniform or p-curve using R and a user-friendly web application for applying p-uniform. PMID:27694466
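Both p-uniform and p-curve build on the same primitive: statistically significant p values, rescaled by the significance threshold, are uniformly distributed on (0, 1) when the true effect is zero. A hedged sketch of that building block follows (this is the conditional-p idea only, not the authors' full estimators, and the p values in the test are invented):

```python
import math

def conditional_p_values(p_values, alpha=0.05):
    """pp-values: significant p values rescaled to (0, 1). Under the null
    of no effect (and selective publication of significant results) these
    are uniform; under a true effect they pile up near zero."""
    return [p / alpha for p in p_values if p < alpha]

def fisher_statistic(pp_values):
    """Fisher's method statistic, -2 * sum(ln pp): large values signal
    pp-values concentrated near zero, i.e. evidence of a nonzero effect."""
    return -2.0 * sum(math.log(pp) for pp in pp_values)
```

p-uniform extends this by searching for the effect size at which the conditional p values become uniform, which is where the estimation problems discussed in the record arise.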

  1. Vocational Teachers and Professionalism - A Model Based on Empirical Analyses

    DEFF Research Database (Denmark)

    Duch, Henriette Skjærbæk; Andreasen, Karen E

    Vocational Teachers and Professionalism - A Model Based on Empirical Analyses. Several theorists have developed models to illustrate the processes of adult learning and professional development (e.g. Illeris, Argyris, Engeström; Wahlgren & Aarkorg, Kolb and Wenger). Models can sometimes be criticized… emphasis on the adult employee, the organization, its surroundings as well as other contextual factors. Our concern is adult vocational teachers attending a pedagogical course and teaching at vocational colleges. The aim of the paper is to discuss different models and develop a model concerning teachers… at vocational colleges based on empirical data in a specific context, a vocational teacher-training course in Denmark. By offering a basis and concepts for analysis of practice, such a model is meant to support the development of vocational teachers' professionalism in courses and in organizational contexts…

  2. Cost-of-illness studies and cost-effectiveness analyses in anxiety disorders: a systematic review.

    Science.gov (United States)

    Konnopka, Alexander; Leichsenring, Falk; Leibing, Eric; König, Hans-Helmut

    2009-04-01

    To review cost-of-illness studies (COI) and cost-effectiveness analyses (CEA) conducted for anxiety disorders. Based on a database search in Pubmed, PsychINFO and NHS EED, studies were classified according to various criteria. Cost data were inflated and converted to 2005 US-$ purchasing power parities (PPP). We finally identified 20 COI and 11 CEA, of which most concentrated on panic disorder (PD) and generalized anxiety disorder (GAD). Differing inclusion of cost categories limited the comparability of the COI. PD and GAD tended to show higher direct costs per case, but lower direct costs per inhabitant, than social and specific phobias. Different measures of effectiveness severely limited the comparability of the CEA. Overall, the CEA analysed 26 therapeutic or interventional strategies, mostly compared to standard treatment, 8 of them resulting in better effectiveness and lower costs than the comparator. Anxiety disorders cause considerable costs. More research on phobias, more standardised inclusion of cost categories in COI, and wider use of comparable effectiveness measures (like QALYs) in CEA are needed.

  3. A Game-based Corpus for Analysing the Interplay between Game Context and Player Experience

    DEFF Research Database (Denmark)

    Shaker, Noor; Yannakakis, Georgios N.; Asteriadis, Stylianos

    2011-01-01

    Recognizing players' affective state while playing video games has been the focus of many recent research studies. In this paper we describe the process that has been followed to build a corpus based on game events and recorded video sessions from human players while playing Super Mario Bros. We… present different types of information that have been extracted from game context, player preferences and perception of the game, as well as user features, automatically extracted from video recordings. We run a number of initial experiments to analyse players' behavior while playing video games as a case…

  4. Deciphering chicken gut microbial dynamics based on high-throughput 16S rRNA metagenomics analyses.

    Science.gov (United States)

    Mohd Shaufi, Mohd Asrore; Sieo, Chin Chin; Chong, Chun Wie; Gan, Han Ming; Ho, Yin Wan

    2015-01-01

    Chicken gut microbiota has paramount roles in host performance, health and immunity. Understanding the topological difference in gut microbial community composition is crucial to provide knowledge on the functions of each member of the microbiota in the physiological maintenance of the host. Gut microbiota profiling of the chicken was previously performed using culture-dependent and early culture-independent methods, which had limited coverage and accuracy. Advances in technology based on next-generation sequencing (NGS) offer unparalleled coverage and depth in determining microbial gut dynamics. Thus, the aim of this study was to investigate ileal and caecal microbiota development as the chicken aged, which is important for future effective gut modulation. Ileal and caecal contents of broiler chicken were extracted from 7, 14, 21 and 42-day old chicken. Genomic DNA was then extracted and amplified based on the V3 hyper-variable region of 16S rRNA. Bioinformatics, ecological and statistical analyses such as Principal Coordinate Analysis (PCoA) were performed in the mothur software and plotted using PRIMER 6. Additional analyses of predicted metagenomes were performed through the PICRUSt and STAMP software packages based on Greengenes databases. A distinctive difference in bacterial communities was observed between ilea and caeca as the chicken aged (P < 0.05). The microbial communities in the caeca were more diverse in comparison to the ileal communities. Potentially pathogenic bacteria such as Clostridium were elevated as the chicken aged, and the population of beneficial microbes such as Lactobacillus was low at all intervals. On the other hand, based on the predicted metagenomes analysed, clear distinctions in the functions and roles of the gut microbiota, such as gene pathways related to nutrient absorption (e.g. sugar and amino acid metabolism) and bacterial proliferation and colonization (e.g. bacterial motility proteins, two-component system and bacterial secretion system), were
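Diversity contrasts like the ileum-versus-caecum comparison above are commonly summarized with the Shannon index computed from taxon counts; a minimal sketch (illustrative only, not the study's exact ecological pipeline):

```python
import math

def shannon_diversity(counts):
    """Shannon diversity index H' from OTU/taxon counts (natural log).
    Higher values indicate more, and more evenly distributed, taxa."""
    total = sum(counts)
    proportions = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in proportions)
```

A caecal sample spread evenly over many genera yields a higher H' than an ileal sample dominated by one genus, which is the "more diverse caeca" pattern the record reports.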

  5. Complementary Exploratory and Confirmatory Factor Analyses of the French WISC-V: Analyses Based on the Standardization Sample.

    Science.gov (United States)

    Lecerf, Thierry; Canivez, Gary L

    2017-12-28

    Interpretation of the French Wechsler Intelligence Scale for Children-Fifth Edition (French WISC-V; Wechsler, 2016a) is based on a 5-factor model including Verbal Comprehension (VC), Visual Spatial (VS), Fluid Reasoning (FR), Working Memory (WM), and Processing Speed (PS). Evidence for the French WISC-V factorial structure was established exclusively through confirmatory factor analyses (CFAs). However, as recommended by Carroll (1995), Reise (2012), and Brown (2015), factorial structure should derive from both exploratory factor analysis (EFA) and CFA. The first goal of this study was to examine the factorial structure of the French WISC-V using EFA. The 15 French WISC-V primary and secondary subtest scaled scores intercorrelation matrix was used, and factor extraction criteria suggested from 1 to 4 factors. To disentangle the contributions of first- and second-order factors, the Schmid and Leiman (1957) orthogonalization transformation (SLT) was applied. Overall, no EFA evidence for 5 factors was found. Results indicated that the g factor accounted for about 67% of the common variance and that the contributions of the first-order factors were weak (3.6 to 11.9%). CFA was used to test numerous alternative models. Results indicated that bifactor models produced better fit to these data than higher-order models. Consistent with previous studies, findings suggested dominance of the general intelligence factor and that users should thus emphasize the Full Scale IQ (FSIQ) when interpreting the French WISC-V. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
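
The Schmid-Leiman orthogonalization mentioned above can be sketched numerically. The loadings below are invented for illustration (they are not French WISC-V estimates); only the transformation itself follows Schmid and Leiman (1957): subtest loadings on g are the product of first- and second-order loadings, and residualized group-factor loadings are scaled by the square root of the second-order uniqueness.

```python
import numpy as np

# Illustrative Schmid-Leiman transformation. First-order pattern loadings of
# 6 hypothetical subtests on 2 group factors, plus the loadings of those two
# factors on a second-order g factor. All numbers are made up.
first_order = np.array([
    [0.8, 0.0],
    [0.7, 0.0],
    [0.6, 0.0],
    [0.0, 0.8],
    [0.0, 0.7],
    [0.0, 0.6],
])
second_order = np.array([0.9, 0.8])  # loadings of factors 1-2 on g

# SL orthogonalization: general loadings and residualized group loadings
g_loadings = first_order @ second_order              # subtest loadings on g
group_loadings = first_order * np.sqrt(1 - second_order**2)

# Proportion of common variance explained by g vs. the group factors
common = (g_loadings**2).sum() + (group_loadings**2).sum()
pct_g = (g_loadings**2).sum() / common
print(f"g accounts for {pct_g:.1%} of the common variance")
```

With these invented loadings g dominates the common variance, which is the kind of decomposition the study reports (about 67% for the real data).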

  6. Association between Adult Height and Risk of Colorectal, Lung, and Prostate Cancer : Results from Meta-analyses of Prospective Studies and Mendelian Randomization Analyses

    NARCIS (Netherlands)

    Khankari, Nikhil K.; Shu, Xiao Ou; Wen, Wanqing; Kraft, Peter; Lindström, Sara; Peters, Ulrike; Schildkraut, Joellen; Schumacher, Fredrick; Bofetta, Paolo; Risch, Angela; Bickeböller, Heike; Amos, Christopher I.; Easton, Douglas; Eeles, Rosalind A.; Gruber, Stephen B.; Haiman, Christopher A.; Hunter, David J.; Chanock, Stephen J.; Pierce, Brandon L.; Zheng, Wei; Blalock, Kendra; Campbell, Peter T.; Casey, Graham; Conti, David V.; Edlund, Christopher K.; Figueiredo, Jane; James Gauderman, W.; Gong, Jian; Green, Roger C.; Harju, John F.; Harrison, Tabitha A.; Jacobs, Eric J.; Jenkins, Mark A.; Jiao, Shuo; Li, Li; Lin, Yi; Manion, Frank J.; Moreno, Victor; Mukherjee, Bhramar; Raskin, Leon; Schumacher, Fredrick R.; Seminara, Daniela; Severi, Gianluca; Stenzel, Stephanie L.; Thomas, Duncan C.; Hopper, John L.; Southey, Melissa C.; Makalic, Enes; Schmidt, Daniel F.; Fletcher, Olivia; Peto, Julian; Gibson, Lorna; dos Santos Silva, Isabel; Ahsan, Habib; Whittemore, Alice; Waisfisz, Quinten; Meijers-Heijboer, Hanne; Adank, Muriel; van der Luijt, Rob B.; Uitterlinden, Andre G.; Hofman, Albert; Meindl, Alfons; Schmutzler, Rita K.; Müller-Myhsok, Bertram; Lichtner, Peter; Nevanlinna, Heli; Muranen, Taru A.; Aittomäki, Kristiina; Blomqvist, Carl; Chang-Claude, Jenny; Hein, Rebecca; Dahmen, Norbert; Beckman, Lars; Crisponi, Laura; Hall, Per; Czene, Kamila; Irwanto, Astrid; Liu, Jianjun; Easton, Douglas F.; Turnbull, Clare; Rahman, Nazneen; Eeles, Rosalind; Kote-Jarai, Zsofia; Muir, Kenneth; Giles, Graham; Neal, David; Donovan, Jenny L.; Hamdy, Freddie C.; Wiklund, Fredrik; Gronberg, Henrik; Haiman, Christopher; Schumacher, Fred; Travis, Ruth; Riboli, Elio; Hunter, David; Gapstur, Susan; Berndt, Sonja; Chanock, Stephen; Han, Younghun; Su, Li; Wei, Yongyue; Hung, Rayjean J.; Brhane, Yonathan; McLaughlin, John; Brennan, Paul; McKay, James D.; Rosenberger, Albert; Houlston, Richard S.; Caporaso, Neil; Teresa Landi, Maria; Heinrich, Joachim; Wu, Xifeng; Ye, Yuanqing; Christiani, 
David C.

    2016-01-01

    Background: Observational studies examining associations between adult height and risk of colorectal, prostate, and lung cancers have generated mixed results. We conducted meta-analyses using data from prospective cohort studies and further carried out Mendelian randomization analyses, using

  7. The MAFLA (Mississippi, Alabama, Florida) Study, Grain Size Analyses

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The MAFLA (Mississippi, Alabama, Florida) Study was funded by NOAA as part of the Outer Continental Shelf Program. Dr. L.J. Doyle produced grain size analyses in the...

  8. Simultaneous acquisition of dual analyser-based phase contrast X-ray images for small animal imaging

    International Nuclear Information System (INIS)

    Kitchen, Marcus J.; Pavlov, Konstantin M.; Hooper, Stuart B.; Vine, David J.; Siu, Karen K.W.; Wallace, Megan J.; Siew, Melissa L.L.; Yagi, Naoto; Uesugi, Kentaro; Lewis, Rob A.

    2008-01-01

    Analyser-based phase contrast X-ray imaging can provide high-contrast images of biological tissues with exquisite sensitivity to the boundaries between tissues. The phase and absorption information can be extracted by processing multiple images acquired at different analyser orientations. Recording both the transmitted and diffracted beams from a thin Laue analyser crystal can make phase retrieval possible for dynamic systems by allowing full field imaging. This technique was used to image the thorax of a mechanically ventilated newborn rabbit pup using a 25 keV beam from the SPring-8 synchrotron radiation facility. The diffracted image was produced from the (1 1 1) planes of a 50 mm x 40 mm, 100 μm thick Si analyser crystal in the Laue geometry. The beam and analyser were large enough to image the entire chest, making it possible to observe changes in anatomy with high contrast and spatial resolution

  9. Simultaneous acquisition of dual analyser-based phase contrast X-ray images for small animal imaging

    Energy Technology Data Exchange (ETDEWEB)

    Kitchen, Marcus J. [School of Physics, Monash University, Victoria 3800 (Australia)], E-mail: Marcus.Kitchen@sci.monash.edu.au; Pavlov, Konstantin M. [School of Physics, Monash University, Victoria 3800 (Australia); Monash Centre for Synchrotron Science, Monash University, Victoria 3800 (Australia); Physics and Electronics, School of Science and Technology, University of New England, NSW 2351 (Australia)], E-mail: Konstantin.Pavlov@sci.monash.edu.au; Hooper, Stuart B. [Department of Physiology, Monash University, Victoria 3800 (Australia)], E-mail: Stuart.Hooper@med.monash.edu.au; Vine, David J. [School of Physics, Monash University, Victoria 3800 (Australia)], E-mail: David.Vine@sci.monash.edu.au; Siu, Karen K.W. [School of Physics, Monash University, Victoria 3800 (Australia); Monash Centre for Synchrotron Science, Monash University, Victoria 3800 (Australia)], E-mail: Karen.Siu@sci.monash.edu.au; Wallace, Megan J. [Department of Physiology, Monash University, Victoria 3800 (Australia)], E-mail: Megan.Wallace@med.monash.edu.au; Siew, Melissa L.L. [Department of Physiology, Monash University, Victoria 3800 (Australia)], E-mail: Melissa.Siew@med.monash.edu.au; Yagi, Naoto [SPring-8/JASRI, Sayo (Japan)], E-mail: yagi@spring8.or.jp; Uesugi, Kentaro [SPring-8/JASRI, Sayo (Japan)], E-mail: ueken@spring8.or.jp; Lewis, Rob A. [School of Physics, Monash University, Victoria 3800 (Australia); Monash Centre for Synchrotron Science, Monash University, Victoria 3800 (Australia)], E-mail: Rob.Lewis@sync.monash.edu.au

    2008-12-15

    Analyser-based phase contrast X-ray imaging can provide high-contrast images of biological tissues with exquisite sensitivity to the boundaries between tissues. The phase and absorption information can be extracted by processing multiple images acquired at different analyser orientations. Recording both the transmitted and diffracted beams from a thin Laue analyser crystal can make phase retrieval possible for dynamic systems by allowing full field imaging. This technique was used to image the thorax of a mechanically ventilated newborn rabbit pup using a 25 keV beam from the SPring-8 synchrotron radiation facility. The diffracted image was produced from the (1 1 1) planes of a 50 mm x 40 mm, 100 {mu}m thick Si analyser crystal in the Laue geometry. The beam and analyser were large enough to image the entire chest, making it possible to observe changes in anatomy with high contrast and spatial resolution.

  10. RELAP5 analyses and support of Oconee-1 PTS studies

    International Nuclear Information System (INIS)

    Charlton, T.R.

    1983-01-01

    The integrity of a reactor vessel during a severe overcooling transient with primary system pressurization is a current safety concern and has been identified as Unresolved Safety Issue (USI) A-49 by the US Nuclear Regulatory Commission (NRC). Resolution of USI A-49, denoted Pressurized Thermal Shock (PTS), is being examined by the US NRC-sponsored PTS integration study. In support of this study, the Idaho National Engineering Laboratory (INEL) has performed RELAP5/MOD1.5 thermal-hydraulic analyses of selected overcooling transients. These transient analyses were performed for the Oconee-1 pressurized water reactor (PWR), which is a Babcock and Wilcox-designed nuclear steam supply system.

  11. Finite strain analyses of deformations in polymer specimens

    DEFF Research Database (Denmark)

    Tvergaard, Viggo

    2016-01-01

    Analyses of the stress and strain state in test specimens or structural components made of polymer are discussed. This includes the Izod impact test, based on full 3D transient analyses. Also a long thin polymer tube under internal pressure has been studied, where instabilities develop, such as b...

  12. An Integrated Software Suite for Surface-based Analyses of Cerebral Cortex

    Science.gov (United States)

    Van Essen, David C.; Drury, Heather A.; Dickson, James; Harwell, John; Hanlon, Donna; Anderson, Charles H.

    2001-01-01

    The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database. PMID:11522765

  13. An integrated software suite for surface-based analyses of cerebral cortex

    Science.gov (United States)

    Van Essen, D. C.; Drury, H. A.; Dickson, J.; Harwell, J.; Hanlon, D.; Anderson, C. H.

    2001-01-01

    The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database.

  14. [Clinical research XXIII. From clinical judgment to meta-analyses].

    Science.gov (United States)

    Rivas-Ruiz, Rodolfo; Castelán-Martínez, Osvaldo D; Pérez-Rodríguez, Marcela; Palacios-Cruz, Lino; Noyola-Castillo, Maura E; Talavera, Juan O

    2014-01-01

    Systematic reviews (SRs) are studies designed to answer clinical questions on the basis of original articles. A meta-analysis (MTA) is the mathematical analysis of an SR. These analyses are divided into two groups: those that evaluate measured results of quantitative variables (for example, the body mass index, BMI) and those that evaluate qualitative variables (for example, whether a patient is alive or dead, or cured or not). Quantitative variables are generally analysed with the mean difference, and qualitative variables can be analysed using several measures: odds ratio (OR), relative risk (RR), absolute risk reduction (ARR) and hazard ratio (HR). These analyses are represented through forest plots, which allow the evaluation of each individual study as well as the heterogeneity between studies and the overall effect of the intervention. These analyses are mainly based on Student's t test and chi-squared. To make appropriate decisions based on an MTA, it is important to understand the characteristics of the statistical methods in order to avoid misinterpretations.
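
The qualitative-variable measures named above come straight from a 2x2 trial table. A minimal sketch, with invented counts (10/100 events in the treatment arm vs. 20/100 in the control arm):

```python
# Effect measures from a hypothetical 2x2 table; all numbers are illustrative.
events_treat, n_treat = 10, 100   # events / total in treatment arm
events_ctrl, n_ctrl = 20, 100     # events / total in control arm

risk_treat = events_treat / n_treat
risk_ctrl = events_ctrl / n_ctrl

rr = risk_treat / risk_ctrl                       # relative risk
arr = risk_ctrl - risk_treat                      # absolute risk reduction
odds_treat = events_treat / (n_treat - events_treat)
odds_ctrl = events_ctrl / (n_ctrl - events_ctrl)
or_ = odds_treat / odds_ctrl                      # odds ratio

print(f"RR={rr:.2f}, ARR={arr:.2f}, OR={or_:.2f}")
```

A meta-analysis then pools such per-study estimates (usually on the log scale for RR and OR) before plotting them in a forest plot.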

  15. Fossil-based comparative analyses reveal ancient marine ancestry erased by extinction in ray-finned fishes.

    Science.gov (United States)

    Betancur-R, Ricardo; Ortí, Guillermo; Pyron, Robert Alexander

    2015-05-01

    The marine-freshwater boundary is a major biodiversity gradient and few groups have colonised both systems successfully. Fishes have transitioned between habitats repeatedly, diversifying in rivers, lakes and oceans over evolutionary time. However, their history of habitat colonisation and diversification is unclear based on available fossil and phylogenetic data. We estimate ancestral habitats and diversification and transition rates using a large-scale phylogeny of extant fish taxa and one containing a massive number of extinct species. Extant-only phylogenetic analyses indicate freshwater ancestry, but inclusion of fossils reveal strong evidence of marine ancestry in lineages now restricted to freshwaters. Diversification and colonisation dynamics vary asymmetrically between habitats, as marine lineages colonise and flourish in rivers more frequently than the reverse. Our study highlights the importance of including fossils in comparative analyses, showing that freshwaters have played a role as refuges for ancient fish lineages, a signal erased by extinction in extant-only phylogenies. © 2015 John Wiley & Sons Ltd/CNRS.

  16. Improving Control Room Design and Operations Based on Human Factors Analyses, or How Much Human Factors Upgrade is Enough?

    Energy Technology Data Exchange (ETDEWEB)

    Higgins, J.C.; O'Hara, J.M.; Almeida, P.

    2002-09-19

    The Jose Cabrera nuclear power plant is a one-loop Westinghouse pressurized water reactor. In the control room, the displays and controls used by operators for the emergency operating procedures are distributed on front and back panels. This configuration contributed to risk in the probabilistic safety assessment where important operator actions are required. This study was undertaken to evaluate the impact of the design on crew performance and plant safety and to develop design improvements. Five potential effects were identified. Then NUREG-0711 [1] programmatic human factors analyses were conducted to systematically evaluate the control room layout to determine if there was evidence of the potential effects. These analyses included operating experience review, PSA review, task analyses, and walkthrough simulations. Based on the results of these analyses, a variety of control room modifications were identified. From the alternatives, a selection was made that provided a reasonable balance between performance, risk and economics, and modifications were made to the plant.

  17. Association between Adult Height and Risk of Colorectal, Lung, and Prostate Cancer: Results from Meta-analyses of Prospective Studies and Mendelian Randomization Analyses

    Science.gov (United States)

    Khankari, Nikhil K.; Shu, Xiao-Ou; Wen, Wanqing; Kraft, Peter; Lindström, Sara; Peters, Ulrike; Schildkraut, Joellen; Schumacher, Fredrick; Bofetta, Paolo; Risch, Angela; Bickeböller, Heike; Amos, Christopher I.; Easton, Douglas; Gruber, Stephen B.; Haiman, Christopher A.; Hunter, David J.; Chanock, Stephen J.; Pierce, Brandon L.; Zheng, Wei

    2016-01-01

    Background: Observational studies examining associations between adult height and risk of colorectal, prostate, and lung cancers have generated mixed results. We conducted meta-analyses using data from prospective cohort studies and further carried out Mendelian randomization analyses, using height-associated genetic variants identified in a genome-wide association study (GWAS), to evaluate the association of adult height with these cancers. Methods and Findings: A systematic review of prospective studies was conducted using the PubMed, Embase, and Web of Science databases. Using meta-analyses, results obtained from 62 studies were summarized for the association of a 10-cm increase in height with cancer risk. Mendelian randomization analyses were conducted using summary statistics obtained for 423 genetic variants identified from a recent GWAS of adult height and from a cancer genetics consortium study of multiple cancers that included 47,800 cases and 81,353 controls. For a 10-cm increase in height, the summary relative risks derived from the meta-analyses of prospective studies were 1.12 (95% CI 1.10, 1.15), 1.07 (95% CI 1.05, 1.10), and 1.06 (95% CI 1.02, 1.11) for colorectal, prostate, and lung cancers, respectively. Mendelian randomization analyses showed increased risks of colorectal (odds ratio [OR] = 1.58, 95% CI 1.14, 2.18) and lung cancer (OR = 1.10, 95% CI 1.00, 1.22) associated with each 10-cm increase in genetically predicted height. No association was observed for prostate cancer (OR = 1.03, 95% CI 0.92, 1.15). Our meta-analysis was limited to published studies. The sample size for the Mendelian randomization analysis of colorectal cancer was relatively small, thus affecting the precision of the point estimate.
Conclusions: Our study provides evidence for a potential causal association of adult height with the risk of colorectal and lung cancers and suggests that certain genetic factors and biological pathways affecting adult height may also affect the
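
The core of a summary-statistics Mendelian randomization analysis like the one above is an inverse-variance weighted (IVW) average of per-variant Wald ratios. A minimal sketch with invented numbers for three variants (the actual study used 423 height-associated variants):

```python
import numpy as np

# Invented GWAS summary statistics for three SNPs.
beta_exposure = np.array([0.10, 0.08, 0.12])    # SNP effects on height
beta_outcome = np.array([0.012, 0.007, 0.015])  # SNP effects on cancer (log OR)
se_outcome = np.array([0.004, 0.003, 0.005])    # standard errors of the above

wald = beta_outcome / beta_exposure             # per-variant causal estimates
weights = (beta_exposure / se_outcome) ** 2     # inverse-variance weights
beta_ivw = np.sum(wald * weights) / np.sum(weights)
print(f"IVW causal estimate (log OR per unit of height): {beta_ivw:.3f}")
```

Exponentiating the pooled log OR (scaled to a 10-cm increment) yields an OR comparable to the ones reported in the abstract.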

  18. Historical Weathering Based on Chemical Analyses of Two Spodosols in Southern Sweden

    International Nuclear Information System (INIS)

    Melkerud, Per-Arne; Bain, Derek C.; Olsson, Mats T.

    2003-01-01

    Chemical weathering losses were calculated for two conifer stands in relation to ongoing studies on liming effects and ash amendments on chemical status, soil solution chemistry and soil genesis. Weathering losses were based on elemental depletion trends in soil profiles since deglaciation and exposure to the weathering environment. Gradients in total geochemical composition were assumed to reflect alteration over time. The study sites were Horroed and Hassloev in southern Sweden, both located on sandy loamy Weichselian till at altitudes of 85 and 190 m a.s.l., respectively. Aliquots from volume-determined samples from a number of soil levels were fused with lithium metaborate, dissolved in HNO3, and analysed by ICP-AES. Results indicated the highest cumulative weathering losses at Hassloev. The weathering losses for the elements are in the following order: Si > Al > K > Na > Ca > Mg. Total annual losses for Ca+Mg+K+Na, expressed in mmol_c m^-2 yr^-1, amounted to c. 28 and 58 at Horroed and Hassloev, respectively. Variations between study sites could not be explained by differences in bulk density, geochemistry or mineralogy. The accumulated weathering losses since deglaciation were larger in the uppermost 15 cm than in deeper B horizons for most elements studied.

  19. Simulation-based Investigations of Electrostatic Beam Energy Analysers

    CERN Document Server

    Pahl, Hannes

    2015-01-01

    An energy analyser is needed to measure the beam energy profile behind the REX-EBIS at ISOLDE. The device should be able to operate with an accuracy of 1 V at voltages up to 30 kV. In order to find a working concept for an electrostatic energy analyser different designs were evaluated with simulations. A spherical device and its design issues are presented. The potential deformation effects of grids at high voltages and their influence on the energy resolution were investigated. First tests were made with a grid-free ring electrode device and show promising results.

  20. Analysing organic transistors based on interface approximation

    International Nuclear Information System (INIS)

    Akiyama, Yuto; Mori, Takehiko

    2014-01-01

    Temperature-dependent characteristics of organic transistors are analysed thoroughly using interface approximation. In contrast to amorphous silicon transistors, it is characteristic of organic transistors that the accumulation layer is concentrated on the first monolayer, and it is appropriate to consider interface charge rather than band bending. On the basis of this model, observed characteristics of hexamethylenetetrathiafulvalene (HMTTF) and dibenzotetrathiafulvalene (DBTTF) transistors with various surface treatments are analysed, and the trap distribution is extracted. In turn, starting from a simple exponential distribution, we can reproduce the temperature-dependent transistor characteristics as well as the gate voltage dependence of the activation energy, so we can investigate various aspects of organic transistors self-consistently under the interface approximation. Small deviation from such an ideal transistor operation is discussed assuming the presence of an energetically discrete trap level, which leads to a hump in the transfer characteristics. The contact resistance is estimated by measuring the transfer characteristics up to the linear region

  1. Quantitative X-ray Map Analyser (Q-XRMA): A new GIS-based statistical approach to Mineral Image Analysis

    Science.gov (United States)

    Ortolano, Gaetano; Visalli, Roberto; Godard, Gaston; Cirrincione, Rosolino

    2018-06-01

    We present a new ArcGIS®-based tool developed in the Python programming language for calibrating EDS/WDS X-ray element maps, with the aim of acquiring quantitative information of petrological interest. The calibration procedure is based on a multiple linear regression technique that takes into account interdependence among elements and is constrained by the stoichiometry of minerals. The procedure requires an appropriate number of spot analyses for use as internal standards and provides several test indexes for a rapid check of calibration accuracy. The code is based on an earlier image-processing tool designed primarily for classifying minerals in X-ray element maps; the original Python code has now been enhanced to yield calibrated maps of mineral end-members or the chemical parameters of each classified mineral. The semi-automated procedure can be used to extract a dataset that is automatically stored within queryable tables. As a case study, the software was applied to an amphibolite-facies garnet-bearing micaschist. The calibrated images obtained for both anhydrous (i.e., garnet and plagioclase) and hydrous (i.e., biotite) phases show a good fit with corresponding electron microprobe analyses. This new GIS-based tool package can thus find useful application in petrology and materials science research. Moreover, the huge quantity of data extracted opens new opportunities for the development of a thin-section microchemical database that, using a GIS platform, can be linked with other major global geoscience databases.
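
The calibration idea described above (a multiple linear regression from raw X-ray map counts to quantitative compositions, constrained by spot analyses used as internal standards) can be sketched as follows. This is not the Q-XRMA code; the data, the three-element setup, and the single regression target are all invented for illustration.

```python
import numpy as np

# Synthetic "internal standards": raw map counts at 30 spot locations for
# 3 elements, and the wt% values a microprobe would report at those spots.
rng = np.random.default_rng(0)
n_spots = 30
counts = rng.uniform(100, 1000, size=(n_spots, 3))  # counts per element
true_coef = np.array([0.02, -0.005, 0.001])         # hidden linear model
wt_pct = counts @ true_coef + 1.5                    # + background term

# Multiple linear regression with an intercept, fitted by least squares.
X = np.column_stack([counts, np.ones(n_spots)])
coef, *_ = np.linalg.lstsq(X, wt_pct, rcond=None)

# Apply the calibration to a full map pixel (here one made-up pixel).
pixel_counts = np.array([500.0, 300.0, 800.0])
predicted = np.append(pixel_counts, 1.0) @ coef
print(f"calibrated pixel value: {predicted:.2f} wt%")
```

In the real tool the regression additionally accounts for interdependence among elements and is checked against mineral stoichiometry; the sketch shows only the least-squares core.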

  2. Trajectory data analyses for pedestrian space-time activity study.

    Science.gov (United States)

    Qi, Feng; Du, Fei

    2013-02-25

    It is well recognized that human movement in the spatial and temporal dimensions has direct influence on disease transmission (1-3). An infectious disease typically spreads via contact between infected and susceptible individuals in their overlapped activity spaces. Therefore, daily mobility-activity information can be used as an indicator to measure exposures to risk factors of infection. However, a major difficulty, and thus the reason for the paucity of studies of infectious disease transmission at the micro scale, arises from the lack of detailed individual mobility data. Previously, in transportation and tourism research, detailed space-time activity data often relied on the time-space diary technique, which requires subjects to actively record their activities in time and space. This is highly demanding for the participants, and collaboration from the participants greatly affects the quality of data (4). Modern technologies such as GPS and mobile communications have made possible the automatic collection of trajectory data. The data collected, however, are not ideal for modeling human space-time activities, limited by the accuracies of existing devices. There is also no readily available tool for efficient processing of the data for human behavior study. We present here a suite of methods and an integrated ArcGIS desktop-based visual interface for the pre-processing and spatiotemporal analyses of trajectory data. We provide examples of how such processing may be used to model human space-time activities, especially with error-rich pedestrian trajectory data, that could be useful in public health studies such as infectious disease transmission modeling. The procedure presented includes pre-processing, trajectory segmentation, activity space characterization, density estimation and visualization, and a few other exploratory analysis methods. Pre-processing is the cleaning of noisy raw trajectory data.
We introduce an interactive visual pre-processing interface as well as an
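
One of the listed steps, trajectory segmentation, is often done with a simple speed threshold that splits a track into "stop" and "move" episodes. A minimal sketch, independent of the authors' ArcGIS tool; the track points, units, and threshold are invented:

```python
from math import hypot

# Points are (x_metres, y_metres, t_seconds); all values are illustrative.
track = [(0, 0, 0), (1, 0, 10), (2, 0, 20), (50, 0, 30), (100, 0, 40)]
SPEED_THRESHOLD = 1.0  # m/s; below = "stop", at or above = "move"

segments = []
for (x0, y0, t0), (x1, y1, t1) in zip(track, track[1:]):
    speed = hypot(x1 - x0, y1 - y0) / (t1 - t0)
    label = "move" if speed >= SPEED_THRESHOLD else "stop"
    if segments and segments[-1][0] == label:
        segments[-1][1].append((x1, y1, t1))          # extend current episode
    else:
        segments.append((label, [(x0, y0, t0), (x1, y1, t1)]))
```

Real pedestrian GPS data would first need the cleaning step the abstract mentions, since positional noise alone can push apparent speeds over any fixed threshold.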

  3. A protocol for analysing mathematics teacher educators' practices

    OpenAIRE

    Kuzle , Ana; Biehler , Rolf

    2015-01-01

    Studying practices in a teaching-learning environment, such as professional development programmes, is a complex and multi-faceted endeavour. While several frameworks exist to help researchers analyse teaching practices, none exist to analyse practices of those who organize professional development programmes, namely mathematics teacher educators. In this paper, based on theoretical as well as empirical results, we present a protocol for capturing different aspects of ...

  4. Comparison based on energy and exergy analyses of the potential cogeneration efficiencies for fuel cells and other electricity generation devices

    Energy Technology Data Exchange (ETDEWEB)

    Rosen, M A [Ryerson Polytechnical Inst., Toronto, (CA). Dept. of Mechanical Engineering

    1990-01-01

    Comparisons of the potential cogeneration efficiencies of several electricity generation devices are made based on energy and exergy analyses. The investigation considers several types of fuel cell system (Phosphoric Acid, Alkaline, Solid Polymer Electrolyte, Molten Carbonate and Solid Oxide) and several fossil-fuel and nuclear cogeneration systems based on steam power plants. In the analysis, each system is modelled as a device into which fuel and air enter and from which electrical- and thermal-energy products and material and thermal-energy wastes exit. The results for all systems considered indicate that exergy analyses should be used when analysing the cogeneration potential of systems for electricity generation, because they weigh the usefulness of heat and electricity on an equivalent basis. Energy analyses tend to present overly optimistic views of performance. These findings are particularly significant when large fractions of the heat output from a system are utilized for cogeneration. (author).
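
The optimism gap between the two analyses comes from how product heat is counted: energy analysis takes it at face value, while exergy analysis discounts it by its Carnot factor (1 - T0/T). A minimal sketch with invented operating figures, and with fuel exergy approximated by fuel energy for simplicity:

```python
# All numbers are illustrative, not taken from the study.
T0 = 298.15      # dead-state (ambient) temperature, K
T_heat = 400.0   # temperature at which the product heat is delivered, K

fuel_energy = 100.0   # fuel energy input, MW
electricity = 40.0    # electrical output, MW
heat = 45.0           # thermal output, MW

# Energy analysis: heat and electricity counted at face value.
eta_energy = (electricity + heat) / fuel_energy

# Exergy analysis: heat discounted by its Carnot factor (1 - T0/T_heat).
# Assumption: fuel exergy is approximated here by fuel energy.
exergy_of_heat = heat * (1 - T0 / T_heat)
eta_exergy = (electricity + exergy_of_heat) / fuel_energy

print(f"energy efficiency {eta_energy:.0%}, exergy efficiency {eta_exergy:.0%}")
```

The larger the cogenerated heat fraction, the wider this gap grows, which is why the abstract flags cogeneration-heavy systems as the cases where exergy analysis matters most.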

  5. Reviewing PSA-based analyses to modify technical specifications at nuclear power plants

    International Nuclear Information System (INIS)

    Samanta, P.K.; Martinez-Guridi, G.; Vesely, W.E.

    1995-12-01

    Changes to Technical Specifications (TSs) at nuclear power plants (NPPs) require review and approval by the United States Nuclear Regulatory Commission (USNRC). Currently, many requests for changes to TSs use analyses that are based on a plant's probabilistic safety assessment (PSA). This report presents an approach to reviewing such PSA-based submittals for changes to TSs. We discuss the basic objectives of reviewing a PSA-based submittal to modify NPP TSs; the methodology of reviewing a TS submittal, and the differing roles of a PSA review, a PSA Computer Code review, and a review of a TS submittal. To illustrate this approach, we discuss our review of changes to allowed outage time (AOT) and surveillance test interval (STI) in the TS for the South Texas Project Nuclear Generating Station. Based on this experience gained, a check-list of items is given for future reviewers; it can be used to verify that the submittal contains sufficient information, and also that the review has addressed the relevant issues. Finally, recommended steps in the review process and the expected findings of each step are discussed

  6. Parent-based adolescent sexual health interventions and effect on communication outcomes: a systematic review and meta-analyses.

    Science.gov (United States)

    Santa Maria, Diane; Markham, Christine; Bluethmann, Shirley; Mullen, Patricia Dolan

    2015-03-01

    Parent-based adolescent sexual health interventions aim to reduce sexual risk behaviors by bolstering parental protective behaviors. Few studies of theory use, methods, applications, delivery and outcomes of parent-based interventions have been conducted. A systematic search of databases for the period 1998-2013 identified 28 published trials of U.S. parent-based interventions to examine theory use, setting, reach, delivery mode, dose and effects on parent-child communication. Established coding schemes were used to assess use of theory and describe methods employed to achieve behavioral change; intervention effects were explored in meta-analyses. Most interventions were conducted with minority parents in group sessions or via self-paced activities; interventions averaged seven hours, and most used theory extensively. Meta-analyses found improvements in sexual health communication: Analysis of 11 controlled trials indicated a medium effect on increasing communication (Cohen's d, 0.5), and analysis of nine trials found a large effect on increasing parental comfort with communication (0.7); effects were positive regardless of delivery mode or intervention dose. Intervention participants were 68% more likely than controls to report increased communication and 75% more likely to report increased comfort. These findings point to gaps in the range of programs examined in published trials-for example, interventions for parents of sexual minority youth, programs for custodial grandparents and faith-based services. Yet they provide support for the effectiveness of parent-based interventions in improving communication. Innovative delivery approaches could extend programs' reach, and further research on sexual health outcomes would facilitate the meta-analysis of intervention effectiveness in improving adolescent sexual health behaviors. Copyright © 2015 by the Guttmacher Institute.

  7. Homeopathy: meta-analyses of pooled clinical data.

    Science.gov (United States)

    Hahn, Robert G

    2013-01-01

In the first decade of the evidence-based era, which began in the mid-1990s, meta-analyses were used to scrutinize homeopathy for evidence of beneficial effects in medical conditions. In this review, meta-analyses including pooled data from placebo-controlled clinical trials of homeopathy, and the aftermath in the form of debate articles, were analyzed. In 1997, Klaus Linde and co-workers identified 89 clinical trials that showed an overall odds ratio of 2.45 in favor of homeopathy over placebo. There was a trend toward smaller benefit in studies of the highest quality, but the 10 trials with the highest Jadad score still showed a statistically significant effect of homeopathy. These results challenged academics to perform alternative analyses that, to demonstrate the lack of effect, relied on extensive exclusion of studies, often to the degree that conclusions were based on only 5-10% of the material, or on virtual data. The ultimate argument against homeopathy is the 'funnel plot' published by Aijing Shang's research group in 2005. However, the funnel plot is flawed when applied to a mixture of diseases, because studies with expected strong treatment effects are, for ethical reasons, powered lower than studies with expected weak or unclear treatment effects. To conclude that homeopathy lacks clinical effect, more than 90% of the available clinical trials had to be disregarded; alternatively, flawed statistical methods had to be applied. Future meta-analyses should focus on the use of homeopathy in specific diseases or groups of diseases instead of pooling data from all clinical trials. © 2013 S. Karger GmbH, Freiburg.

  8. Chemometrical characterization of four italian rice varieties based on genetic and chemical analyses.

    Science.gov (United States)

    Brandolini, Vincenzo; Coïsson, Jean Daniel; Tedeschi, Paola; Barile, Daniela; Cereti, Elisabetta; Maietti, Annalisa; Vecchiati, Giorgio; Martelli, Aldo; Arlorio, Marco

    2006-12-27

This paper describes a method for achieving qualitative identification of four rice varieties from two different Italian regions. To estimate the presence of genetic diversity among the four rice varieties, we used polymerase chain reaction-randomly amplified polymorphic DNA (PCR-RAPD) markers, and to elucidate whether a relationship exists between the production area and the specific characteristics of the product, we studied proximate composition, fatty acid composition, mineral content, and total antioxidant capacity. Using principal component analysis on genomic and compositional data, we were able to classify rice samples according to their variety and their district of production. This work also examined the discrimination ability of the different parameters. Genomic data gave the best discrimination among varieties, indicating that RAPD assays could be useful in discriminating among closely related varieties, while compositional analyses do not depend on genetic characters alone but are also related to the production area.
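The principal component analysis used above to separate samples can be sketched for the simplest two-feature case: a hypothetical `pca_2d` helper computes the 2x2 covariance matrix of paired measurements and its eigenvalues in closed form; the fraction of total variance carried by the first principal axis indicates how well a single direction separates the samples. This is a minimal illustration, not the study's actual multivariate procedure.

```python
import math

def pca_2d(samples):
    """Eigenvalues of the 2x2 covariance matrix of (x, y) samples.

    Returns (l1, l2, explained): the largest and smallest eigenvalue and
    the fraction of total variance captured by the first principal axis.
    """
    n = len(samples)
    mx = sum(x for x, _ in samples) / n
    my = sum(y for _, y in samples) / n
    # sample (co)variances
    sxx = sum((x - mx)**2 for x, _ in samples) / (n - 1)
    syy = sum((y - my)**2 for _, y in samples) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in samples) / (n - 1)
    # closed-form eigenvalues of a symmetric 2x2 matrix
    tr, det = sxx + syy, sxx * syy - sxy**2
    root = math.sqrt(max(tr * tr / 4.0 - det, 0.0))
    l1, l2 = tr / 2.0 + root, tr / 2.0 - root
    return l1, l2, l1 / (l1 + l2)
```

For perfectly collinear samples, the first axis explains all of the variance.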

  9. Comparative physical-chemical characterization of encapsulated lipid-based isotretinoin products assessed by particle size distribution and thermal behavior analyses

    Energy Technology Data Exchange (ETDEWEB)

    Guimaraes, Carla Aiolfi, E-mail: carlaaiolfi@usp.br [Department of Pharmacy, Faculty of Pharmaceutical Sciences, University of Sao Paulo, Sao Paulo, SP 05508-000 (Brazil); Menaa, Farid [Department of Dermatology, School of Medicine Wuerzburg, Wuerzburg 97080 (Germany); Fluorotronics, Inc., 1425 Russ Bvld, San Diego Technology Incubator, San Diego, CA 92101 (United States); Menaa, Bouzid, E-mail: bouzid.menaa@gmail.com [Fluorotronics, Inc., 1425 Russ Bvld, San Diego Technology Incubator, San Diego, CA 92101 (United States); Quenca-Guillen, Joyce S. [Department of Pharmacy, Faculty of Pharmaceutical Sciences, University of Sao Paulo, Sao Paulo, SP 05508-000 (Brazil); Matos, Jivaldo do Rosario [Department of Fundamental Chemistry, Institute of Chemistry, University of Sao Paulo, Sao Paulo, SP 05508-000 (Brazil); Mercuri, Lucildes Pita [Department of Exact and Earth Sciences, Federal University of Sao Paulo, Diadema, SP 09972-270 (Brazil); Braz, Andre Borges [Department of Engineering of Mines and Oil, Polytechnical School, University of Sao Paulo, SP 05508-900 (Brazil); Rossetti, Fabia Cristina [Department of Pharmaceutical Sciences, Faculty of Pharmaceutical Sciences of Ribeirao Preto, University of Sao Paulo, Ribeirao Preto, SP 14015-120 (Brazil); Kedor-Hackmann, Erika Rosa Maria; Santoro, Maria Ines Rocha Miritello [Department of Pharmacy, Faculty of Pharmaceutical Sciences, University of Sao Paulo, Sao Paulo, SP 05508-000 (Brazil)

    2010-06-10

Isotretinoin is the drug of choice for the management of severe recalcitrant nodular acne. Nevertheless, some of its physical-chemical properties are still poorly known. Hence, the aim of our study was to comparatively evaluate the particle size distribution (PSD) and to characterize the thermal behavior of three encapsulated isotretinoin products in oil suspension (one reference and two generics) commercialized in Brazil. Here, we show that the PSD, estimated by laser diffraction and by polarized light microscopy, differed between the generics and the reference product. However, the thermal behavior of the three products, determined by thermogravimetry (TGA), differential thermal analysis (DTA) and differential scanning calorimetry (DSC), displayed no significant differences, and all three were more thermostable than the isotretinoin standard used as internal control. Thus, our study suggests that PSD analyses of isotretinoin lipid-based formulations should be routinely performed in order to improve their quality and bioavailability.

  10. A comparison between geostatistical analyses and sedimentological studies at the Hartebeestfontein gold mine

    International Nuclear Information System (INIS)

    Magri, E.J.

    1978-01-01

For life-of-mine planning, as well as for short- and medium-term planning of grades and mine layouts, it is extremely important to have a clear understanding of the patterns followed by the distribution of gold and uranium within the mining area. This study is an attempt to reconcile the geostatistical approach to the determination of ore-shoot directions, via an analysis of the spatial distribution of gold and uranium values, with the sedimentological approach, which is based on the direct measurement of geological features. For the routine geostatistical estimation of ore reserves, the Hartebeestfontein gold mine was divided into 11 sections. In each of these sections, the ore-shoot directions were calculated for gold and uranium from the anisotropies disclosed by geostatistical variogram analyses. This study presents a comparison of these results with those obtained from direct geological measurements of paleo-current directions. The results suggest that geological and geostatistical studies could be of significant mutual benefit.

  11. Trend analyses in the health behaviour in school-aged children study

    DEFF Research Database (Denmark)

    Schnohr, Christina W; Molcho, Michal; Rasmussen, Mette

    2015-01-01

The Health Behaviour in School-aged Children (HBSC) study collects data from adolescents aged 11-15 years on a broad variety of health determinants and health behaviours. RESULTS: A number of methodological challenges have stemmed from the growth of the HBSC study, in particular given that the study has a focus on monitoring trends. When analysing trends, researchers must be able to assess whether a change in prevalence is an expression of an actual change in the observed outcome, whether it is a result of methodological artefacts, or whether it is due to changes in the conceptualization of the outcome by the respondents. CONCLUSION: The article presents recommendations that take a number of these considerations into account. The considerations imply methodological challenges that are core issues in undertaking trend analyses.

  12. Trial sequential analyses of meta-analyses of complications in laparoscopic vs. small-incision cholecystectomy: more randomized patients are needed

    DEFF Research Database (Denmark)

    Keus, Frederik; Wetterslev, Jørn; Gluud, Christian

    2010-01-01

Conclusions based on meta-analyses of randomized trials carry a status of "truth." Methodological components may identify trials with systematic errors ("bias"). Trial sequential analysis (TSA) evaluates random errors in meta-analysis. We analyzed meta-analyses on laparoscopic vs. small-incision cholecystectomy.

  13. Loss of Flow Accident (LOFA) analyses using LabView-based NRR simulator

    Energy Technology Data Exchange (ETDEWEB)

    Arafa, Amany Abdel Aziz; Saleh, Hassan Ibrahim [Atomic Energy Authority, Cairo (Egypt). Radiation Engineering Dept.; Ashoub, Nagieb [Atomic Energy Authority, Cairo (Egypt). Reactor Physics Dept.

    2016-12-15

This paper presents a generic Loss of Flow Accident (LOFA) scenario module which is integrated into the LabView-based simulator to imitate Nuclear Research Reactor (NRR) behavior for different user-defined LOFA scenarios. It also provides analyses of a LOFA in a single fuel channel and its impact on operational transactions and on the behavior of the reactor. The generic LOFA scenario module includes the graphs needed to clarify the effects of the LOFA under study. Furthermore, the percentage of the loss of mass flow rate, the mode of flow reduction, and the start and transient times of the LOFA are user defined, adding flexibility to the LOFA scenarios. The objective of integrating such a generic LOFA module is to be able to deal with such incidents and avoid their significant effects. It is also useful for developing expertise in this area and for reducing operator training and simulation costs. The results of the implemented generic LOFA module agree well with those of the COBRA-IIIC code and the earlier guidebook for this series of transients.
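The user-defined inputs described above (loss fraction, reduction mode, start and transient times) can be illustrated with a minimal sketch of a normalized flow-versus-time law. The function `flow_fraction` and both reduction laws are hypothetical, chosen only to show how such parameters might drive a scenario; they are not the module's actual models.

```python
import math

def flow_fraction(t, start, tau, loss_frac, mode="exponential"):
    """Normalized core mass flow during an illustrative LOFA transient.

    start: transient onset time; tau: ramp length / coastdown time constant;
    loss_frac: fraction of nominal flow ultimately lost; mode: reduction law.
    """
    if t < start:
        return 1.0                      # nominal flow before the transient
    dt = t - start
    if mode == "linear":
        return 1.0 - loss_frac * min(dt / tau, 1.0)   # linear ramp-down
    # exponential coastdown toward the residual flow level
    return (1.0 - loss_frac) + loss_frac * math.exp(-dt / tau)
```

An 80% exponential loss starting at t = 10 s decays toward 20% of nominal flow.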

  14. Applications of high lateral and energy resolution imaging XPS with a double hemispherical analyser based spectromicroscope

    International Nuclear Information System (INIS)

    Escher, M.; Winkler, K.; Renault, O.; Barrett, N.

    2010-01-01

The design and applications of an instrument for imaging X-ray photoelectron spectroscopy (XPS) are reviewed. The instrument is based on a photoelectron microscope and a double hemispherical analyser whose symmetric configuration avoids the spherical aberration (α²-term) inherent in standard analysers. The analyser allows high-transmission imaging without sacrificing the lateral and energy resolution of the instrument. The importance of high transmission, especially for highest-resolution imaging XPS with monochromated laboratory X-ray sources, is outlined, and the close interrelation of energy resolution, lateral resolution and analyser transmission is illustrated. Chemical imaging applications using a monochromatic laboratory Al Kα source are shown, with a lateral resolution of 610 nm. Examples of measurements made using synchrotron and laboratory ultra-violet light show the broad field of applications, from imaging of core-level electrons with chemical-shift identification to high-resolution threshold photoelectron emission microscopy (PEEM), work function imaging and band structure imaging.

  15. Monte Carlo parameter studies and uncertainty analyses with MCNP5

    International Nuclear Information System (INIS)

    Brown, F. B.; Sweezy, J. E.; Hayes, R.

    2004-01-01

A software tool called mcnp_pstudy has been developed to automate the setup, execution, and collection of results from a series of MCNP5 Monte Carlo calculations. This tool provides a convenient means of performing parameter studies, total uncertainty analyses, parallel job execution on clusters, stochastic geometry modeling, and other types of calculations where a series of MCNP5 jobs must be performed with varying problem input specifications. (authors)

  16. Study and realization of a beam analyser of high intensity (10^6-10^10 particles per burst)

    International Nuclear Information System (INIS)

    Perret-Gallix, D.

    1975-01-01

A beam analyser working under high beam intensity in the range of 10^6 to 10^10 particles per burst, and giving the position, profile and intensity of this beam, is studied. The motivation for this study, the principle of measurement, the construction of the hardware and the different tests carried out on the chamber in order to evaluate its main features are described. The analyser is a multi-cellular ionisation chamber, or stripe chamber; each cell, made of a copper stripe (0.25 mm wide) inserted between two high-voltage planes (500 V), forms a small independent ionisation chamber. This system, working under the on-line control of a mini-computer, allows the instantaneous position and profile of the beam to be associated with each event or group of events.

  17. Statistical analyses in the study of solar wind-magnetosphere coupling

    International Nuclear Information System (INIS)

    Baker, D.N.

    1985-01-01

    Statistical analyses provide a valuable method for establishing initially the existence (or lack of existence) of a relationship between diverse data sets. Statistical methods also allow one to make quantitative assessments of the strengths of observed relationships. This paper reviews the essential techniques and underlying statistical bases for the use of correlative methods in solar wind-magnetosphere coupling studies. Techniques of visual correlation and time-lagged linear cross-correlation analysis are emphasized, but methods of multiple regression, superposed epoch analysis, and linear prediction filtering are also described briefly. The long history of correlation analysis in the area of solar wind-magnetosphere coupling is reviewed with the assessments organized according to data averaging time scales (minutes to years). It is concluded that these statistical methods can be very useful first steps, but that case studies and various advanced analysis methods should be employed to understand fully the average response of the magnetosphere to solar wind input. It is clear that many workers have not always recognized underlying assumptions of statistical methods and thus the significance of correlation results can be in doubt. Long-term averages (greater than or equal to 1 hour) can reveal gross relationships, but only when dealing with high-resolution data (1 to 10 min) can one reach conclusions pertinent to magnetospheric response time scales and substorm onset mechanisms
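The time-lagged linear cross-correlation emphasized above can be sketched with a minimal pure-Python example: correlate a driver series (e.g. a solar wind parameter) against a response series (e.g. a magnetospheric index) at a range of lags and pick the lag with the strongest correlation. The helper names `pearson`, `lagged_corr`, and `best_lag` are invented for illustration.

```python
import math

def pearson(a, b):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    sa = math.sqrt(sum((u - ma)**2 for u in a))
    sb = math.sqrt(sum((v - mb)**2 for v in b))
    return cov / (sa * sb)

def lagged_corr(x, y, lag):
    """Correlate driver x[t] with response y[t + lag] (lag >= 0)."""
    if lag > 0:
        return pearson(x[:-lag], y[lag:])
    return pearson(x, y)

def best_lag(x, y, max_lag):
    """Lag in [0, max_lag] that maximizes the cross-correlation."""
    return max(range(max_lag + 1), key=lambda lag: lagged_corr(x, y, lag))
```

A response that simply echoes the driver two samples later is recovered as a best lag of 2 with correlation 1.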

  18. Tracing common origins of Genomic Islands in prokaryotes based on genome signature analyses.

    Science.gov (United States)

    van Passel, Mark Wj

    2011-09-01

    Horizontal gene transfer constitutes a powerful and innovative force in evolution, but often little is known about the actual origins of transferred genes. Sequence alignments are generally of limited use in tracking the original donor, since still only a small fraction of the total genetic diversity is thought to be uncovered. Alternatively, approaches based on similarities in the genome specific relative oligonucleotide frequencies do not require alignments. Even though the exact origins of horizontally transferred genes may still not be established using these compositional analyses, it does suggest that compositionally very similar regions are likely to have had a common origin. These analyses have shown that up to a third of large acquired gene clusters that reside in the same genome are compositionally very similar, indicative of a shared origin. This brings us closer to uncovering the original donors of horizontally transferred genes, and could help in elucidating possible regulatory interactions between previously unlinked sequences.

  19. Structural changes in Parkinson's disease: voxel-based morphometry and diffusion tensor imaging analyses based on 123I-MIBG uptake.

    Science.gov (United States)

    Kikuchi, Kazufumi; Hiwatashi, Akio; Togao, Osamu; Yamashita, Koji; Somehara, Ryo; Kamei, Ryotaro; Baba, Shingo; Yamaguchi, Hiroo; Kira, Jun-Ichi; Honda, Hiroshi

    2017-12-01

    Patients with Parkinson's disease (PD) may exhibit symptoms of sympathetic dysfunction that can be measured using 123 I-metaiodobenzylguanidine (MIBG) myocardial scintigraphy. We investigated the relationship between microstructural brain changes and 123 I-MIBG uptake in patients with PD using voxel-based morphometry (VBM) and diffusion tensor imaging (DTI) analyses. This retrospective study included 24 patients with PD who underwent 3 T magnetic resonance imaging and 123 I-MIBG scintigraphy. They were divided into two groups: 12 MIBG-positive and 12 MIBG-negative cases (10 men and 14 women; age range: 60-81 years, corrected for gender and age). The heart/mediastinum count (H/M) ratio was calculated on anterior planar 123 I-MIBG images obtained 4 h post-injection. VBM and DTI were performed to detect structural differences between these two groups. Patients with low H/M ratio had significantly reduced brain volume at the right inferior frontal gyrus (uncorrected p  90). Patients with low H/M ratios also exhibited significantly lower fractional anisotropy than those with high H/M ratios (p based morphometry can detect grey matter changes in Parkinson's disease. • Diffusion tensor imaging can detect white matter changes in Parkinson's disease.

  20. The CM SAF SSM/I-based total column water vapour climate data record: methods and evaluation against re-analyses and satellite

    Directory of Open Access Journals (Sweden)

    M. Schröder

    2013-03-01

The European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) Satellite Application Facility on Climate Monitoring (CM SAF) aims at the provision and sound validation of well-documented Climate Data Records (CDRs) in sustained and operational environments. In this study, a total column water vapour path (WVPA) climatology from CM SAF is presented and inter-compared to water vapour data records from various data sources. Based on homogenised brightness temperatures from the Special Sensor Microwave Imager (SSM/I), a climatology of WVPA has been generated within the Hamburg Ocean-Atmosphere Fluxes and Parameters from Satellite (HOAPS) framework. Within a research-to-operations transition activity, the HOAPS data and operation capabilities have been successfully transferred to the CM SAF, where the complete HOAPS data and processing schemes are hosted in an operational environment. An objective analysis for interpolation, namely kriging, has been applied to the swath-based WVPA retrievals from the HOAPS data set. The resulting climatology consists of daily and monthly mean fields of WVPA over the global ice-free ocean. The temporal coverage ranges from July 1987 to August 2006. After a comparison to the precursor product, the CM SAF SSM/I-based climatology has been comprehensively compared to different types of meteorological analyses from the European Centre for Medium-Range Weather Forecasts (ECMWF: ERA-40, ERA-Interim and operational analyses) and from the Japan Meteorological Agency (JMA: JRA). This inter-comparison shows an overall good agreement between the climatology and the analyses, with daily absolute biases generally smaller than 2 kg m⁻². The absolute value of the bias relative to JRA and ERA-Interim is typically smaller than 0.5 kg m⁻². For the period 1991-2006, the root mean square error (RMSE) for both reanalyses is approximately 2 kg m⁻². As SSM/I WVPA and radiances are assimilated into the JMA and all ECMWF analyses and
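The kriging step described above can be sketched in its simplest form: simple kriging with a known field mean, an assumed exponential covariance model, and a naive linear solve. Everything here (the `cov` model, its sill and range defaults, the 1-D setting, the helper names) is an illustrative assumption, not the CM SAF/HOAPS processing chain.

```python
import math

def cov(h, sill=1.0, rng=3.0):
    """Exponential covariance model (assumed, for illustration only)."""
    return sill * math.exp(-h / rng)

def solve(A, b):
    """Naive Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def simple_krige(pts, vals, mean, x0):
    """Simple-kriging estimate at x0 from 1-D sample points with known mean."""
    n = len(pts)
    A = [[cov(abs(pts[i] - pts[j])) for j in range(n)] for i in range(n)]
    b = [cov(abs(pts[i] - x0)) for i in range(n)]
    w = solve(A, b)                       # kriging weights
    return mean + sum(wi * (v - mean) for wi, v in zip(w, vals))
```

A useful sanity check: estimating at an observed location reproduces the observed value exactly (kriging is an exact interpolator).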

  1. Comprehensive logic based analyses of Toll-like receptor 4 signal transduction pathway.

    Directory of Open Access Journals (Sweden)

    Mahesh Kumar Padwal

Among the 13 TLRs in vertebrate systems, only TLR4 utilizes both the Myeloid differentiation factor 88 (MyD88) and the Toll/Interleukin-1 receptor (TIR)-domain-containing adapter interferon-β-inducing factor (TRIF) adaptors to transduce signals triggering host-protective immune responses. Earlier studies on the pathway combined various experimental data in the form of one comprehensive map of TLR signaling, but in the absence of adequate kinetic parameters, quantitative mathematical models that reveal emerging systems-level properties and dynamic inter-regulation among the kinases/phosphatases of the TLR4 network are not yet available. So, here we used a reaction stoichiometry-based, parameter-independent logical modeling formalism to build a TLR4 signaling network model that captured the feedback regulations, the interdependencies between signaling kinases and phosphatases, and the outcome of simulated infections. The analyses of the TLR4 signaling network revealed 360 feedback loops, 157 negative and 203 positive, of which 334 loops had the phosphatase PP1 as an essential component. The network elements' interdependency (positive or negative dependencies) in perturbation conditions such as phosphatase-knockout conditions revealed interdependencies between the dual-specificity phosphatases MKP-1 and MKP-3 and the kinases in MAPK modules, and the role of PP2A in the auto-regulation of Calmodulin kinase-II. Our simulations under specific kinase or phosphatase gene-deficiency or inhibition conditions corroborated several previously reported experimental data. The simulations mimicking Yersinia pestis and E. coli infections identified the key perturbations in the network and potential drug targets. Thus, our analyses of TLR4 signaling highlight the role of phosphatases as key regulatory factors in determining the global interdependencies among the network elements, uncover novel signaling connections, and identify potential drug targets.

  2. Seismic fragility analyses of nuclear power plant structures based on the recorded earthquake data in Korea

    International Nuclear Information System (INIS)

    Cho, Sung Gook; Joe, Yang Hee

    2005-01-01

By nature, seismic fragility analysis results are considerably affected by the statistical data of the design information and the site-dependent ground motions. The engineering characteristics of small-magnitude earthquake spectra recorded in the Korean peninsula during the last several years are analyzed in this paper. An improved method of seismic fragility analysis is evaluated through comparative analyses to verify its efficiency for practical application to nuclear power plant structures. The effects of the recorded earthquakes on the seismic fragilities of Korean nuclear power plant structures are also evaluated in the comparative studies. The results show that the proposed method is more efficient for multi-mode structures. The case study results show that seismic fragility analysis based on Newmark's spectra in Korea might over-estimate the seismic capacities of Korean facilities.

  3. Seismic fragility analyses of nuclear power plant structures based on the recorded earthquake data in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Sung Gook [Department of Civil and Environmental System Engineering, University of Incheon, 177 Dohwa-dong, Nam-gu, Incheon 402-749 (Korea, Republic of)]. E-mail: sgcho@incheon.ac.kr; Joe, Yang Hee [Department of Civil and Environmental System Engineering, University of Incheon, 177 Dohwa-dong, Nam-gu, Incheon 402-749 (Korea, Republic of)

    2005-08-01

By nature, seismic fragility analysis results are considerably affected by the statistical data of the design information and the site-dependent ground motions. The engineering characteristics of small-magnitude earthquake spectra recorded in the Korean peninsula during the last several years are analyzed in this paper. An improved method of seismic fragility analysis is evaluated through comparative analyses to verify its efficiency for practical application to nuclear power plant structures. The effects of the recorded earthquakes on the seismic fragilities of Korean nuclear power plant structures are also evaluated in the comparative studies. The results show that the proposed method is more efficient for multi-mode structures. The case study results show that seismic fragility analysis based on Newmark's spectra in Korea might over-estimate the seismic capacities of Korean facilities.

  4. Voxel-based morphometry analyses of in-vivo MRI in the aging mouse lemur primate

    Directory of Open Access Journals (Sweden)

    Stephen John Sawiak

    2014-05-01

Cerebral atrophy is one of the most widespread brain alterations associated with aging, and a clear relationship has been established between age-associated cognitive impairments and cerebral atrophy. The mouse lemur (Microcebus murinus) is a small primate used as a model of age-related neurodegenerative processes, and the first nonhuman primate in which cerebral atrophy has been correlated with cognitive deficits. Previous studies of cerebral atrophy in this model were based on time-consuming manual delineation or measurement of selected brain regions from magnetic resonance images (MRI). These measures could not be used to analyse regions that cannot be easily outlined, such as the nucleus basalis of Meynert or the subiculum. In humans, morphometric assessment of structural changes with age is generally performed with automated procedures such as voxel-based morphometry (VBM). The objective of our work was to perform user-independent assessment of age-related morphological changes in the whole brain of large mouse lemur populations using VBM. The study was based on the SPMMouse toolbox for SPM8 and involved thirty mouse lemurs aged from 1.9 to 11.3 years. The automatic method revealed for the first time atrophy in regions where manual delineation is prohibitive (nucleus basalis of Meynert, subiculum, prepiriform cortex, Brodmann areas 13-16, hypothalamus, putamen, thalamus, corpus callosum). Some of these regions are described as particularly sensitive to age-associated alterations in humans. The method also revealed age-associated atrophy in cortical regions (cingulate, occipital, parietal), the nucleus septalis, and the caudate. Manual measures performed in some of these regions were in good agreement with the results from automatic measures. The templates generated in this study, as well as the toolbox for SPM8, can be downloaded. These tools will be valuable for future evaluation of various treatments that are tested to modulate cerebral aging in lemurs.

  5. Cost and quality effectiveness of objective-based and statistically-based quality control for volatile organic compounds analyses of gases

    International Nuclear Information System (INIS)

    Bennett, J.T.; Crowder, C.A.; Connolly, M.J.

    1994-01-01

Gas samples from drums of radioactive waste at the Department of Energy (DOE) Idaho National Engineering Laboratory are being characterized for 29 volatile organic compounds to determine the feasibility of storing the waste in DOE's Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. Quality requirements for the gas chromatography (GC) and GC/mass spectrometry chemical methods used to analyze the waste are specified in the Quality Assurance Program Plan for the WIPP Experimental Waste Characterization Program. Quality requirements consist of both objective criteria (data quality objectives, DQOs) and statistical criteria (process control). The DQOs apply to routine sample analyses, while the statistical criteria serve to determine and monitor the precision and accuracy (P&A) of the analysis methods and are also used to assign upper confidence limits to measurement results close to action levels. After over two years and more than 1000 sample analyses, there are two general conclusions concerning the two approaches to quality control: (1) objective criteria (e.g., ±25% precision, ±30% accuracy) based on customer needs and the usually prescribed criteria for similar EPA-approved methods are consistently attained during routine analyses; (2) statistical criteria based on short-term method performance are almost an order of magnitude more stringent than the objective criteria and are difficult to satisfy following the same routine laboratory procedures that satisfy the objective criteria. A more cost-effective and representative approach to establishing statistical method performance criteria would be either to use a moving average of P&A from control samples over a period of several months, or to determine within-sample variation by one-way analysis of variance of several months' replicate sample analysis results, or both. Confidence intervals for results near action levels could also be determined by replicate analysis of the sample.
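The moving-average control approach suggested above can be sketched minimally: pool several months of control-sample results, smooth them with a trailing moving average, and derive mean ± k·SD control limits from the pooled window rather than from short-term performance. The helpers `moving_average` and `control_limits` are hypothetical illustrations, not the program's actual QC procedure.

```python
def moving_average(values, window):
    """Trailing moving averages over a fixed window of control results."""
    return [sum(values[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(values))]

def control_limits(values, k=3.0):
    """Mean +/- k*SD control limits from historical control-sample results."""
    n = len(values)
    m = sum(values) / n
    sd = (sum((v - m)**2 for v in values) / (n - 1))**0.5
    return m - k * sd, m + k * sd
```

Limits computed from a months-long window vary less than limits recomputed from each short batch, which is the cost and representativeness argument made above.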

  6. PC based 8K multichannel analyser for nuclear spectroscopy

    International Nuclear Information System (INIS)

    Jain, S.K.; Gupta, J.D.; Suman Kumari, B.

    1989-01-01

An IBM-PC-based 8K multichannel analyser (MCA) has been developed which incorporates all the features of an advanced system: very high throughput for data acquisition in PHA as well as MCS modes, fast real-time display, extensive display manipulation facilities, various preset controls and concurrent data processing. The compact system hardware consists of a two-unit-wide NIM module and a PC add-on card. Because of the external acquisition hardware, the system, after initial programming by the PC, can acquire data independently, allowing the PC to be switched off. To attain very high throughput, the most desirable feature of an MCA, a dual-port memory architecture has been used. The asymmetric dual-port RAM housed in the NIM module offers 24-bit parallel access to the ADC and 8-bit-wide access to the PC, which results in fast real-time histogram display on the monitor. The PC emulation software is menu driven and user friendly. It integrates a comprehensive set of commonly required application routines for concurrent data processing. After the transfer of know-how to the Electronics Corporation of India Ltd. (ECIL), this system is being produced at ECIL. (author). 5 refs., 4 figs

  7. Clinical Research That Matters: Designing Outcome-Based Research for Older Adults to Qualify for Systematic Reviews and Meta-Analyses.

    Science.gov (United States)

    Lee, Jeannie K; Fosnight, Susan M; Estus, Erica L; Evans, Paula J; Pho, Victoria B; Reidt, Shannon; Reist, Jeffrey C; Ruby, Christine M; Sibicky, Stephanie L; Wheeler, Janel B

    2018-01-01

    Though older adults are more sensitive to the effects of medications than their younger counterparts, they are often excluded from manufacturer-based clinical studies. Practice-based research is a practical method to identify medication-related effects in older patients. This research also highlights the role of a pharmacist in improving care in this population. A single study rarely has strong enough evidence to change geriatric practice, unless it is a large-scale, multisite, randomized controlled trial that specifically targets older adults. It is important to design studies that may be used in systematic reviews or meta-analyses that build a stronger evidence base. Recent literature has documented a gap in advanced pharmacist training pertaining to research skills. In this paper, we hope to fill some of the educational gaps related to research in older adults. We define best practices when deciding on the type of study, inclusion and exclusion criteria, design of the intervention, how outcomes are measured, and how results are reported. Well-designed studies increase the pool of available data to further document the important role that pharmacists have in optimizing care of older patients.

  8. The Quality of Cost-Utility Analyses in Orthopedic Trauma.

    Science.gov (United States)

    Nwachukwu, Benedict U; Schairer, William W; O'Dea, Evan; McCormick, Frank; Lane, Joseph M

    2015-08-01

    As health care in the United States transitions toward a value-based model, there is increasing interest in applying cost-effectiveness analysis within orthopedic surgery. Orthopedic trauma care has traditionally underemphasized economic analysis. The goals of this review were to identify US-based cost-utility analyses in orthopedic trauma, to assess the quality of the available evidence, and to identify cost-effective strategies within orthopedic trauma. Based on a review of 971 abstracts, 8 US-based cost-utility analyses evaluating operative strategies in orthopedic trauma were identified. Study findings were recorded, and the Quality of Health Economic Studies (QHES) instrument was used to grade overall quality. Of the 8 studies included in this review, 4 evaluated hip and femur fractures, 3 analyzed upper extremity fractures, and 1 assessed open tibial fracture management. Cost-effective interventions identified in this review include total hip arthroplasty (over hemiarthroplasty) for femoral neck fractures in the active elderly, open reduction and internal fixation (over nonoperative management) for distal radius and scaphoid fractures, limb salvage (over amputation) for complex open tibial fractures, and systems-based interventions to prevent delay in hip fracture surgery. The mean QHES score of the studies was 79.25 (range, 67-89). Overall, there is a paucity of cost-utility analyses in orthopedic trauma; however, the available evidence suggests that certain operative interventions can be cost-effective. The quality of these studies, however, is only fair based on QHES grading. More attention should be paid to evaluating the cost-effectiveness of operative intervention in orthopedic trauma. Copyright 2015, SLACK Incorporated.
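
    Cost-utility analyses of the kind graded here compare strategies by their incremental cost-effectiveness ratio (ICER), the extra cost per quality-adjusted life year (QALY) gained. A minimal sketch; all numbers are invented for illustration, none are drawn from the reviewed studies:

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical comparison: the new strategy costs $3000 more and yields
# 0.3 additional QALYs, i.e. $10,000 per QALY gained.
ratio = icer(cost_new=15000.0, cost_old=12000.0, qaly_new=6.5, qaly_old=6.2)
```

    A strategy is then judged cost-effective if its ICER falls below a chosen willingness-to-pay threshold.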

  9. Flow-Based Systems for Rapid and High-Precision Enzyme Kinetics Studies

    Directory of Open Access Journals (Sweden)

    Supaporn Kradtap Hartwell

    2012-01-01

    Full Text Available Enzyme kinetics studies normally focus on the initial rate of the enzymatic reaction. However, the manual operation of the conventional enzyme kinetics method has some drawbacks: errors can result from imprecise time control and from the time needed to manually move reaction cuvettes into and out of the detector. By using automatic flow-based analytical systems, enzyme kinetics studies can be carried out in real time at the initial rate, avoiding the potential errors inherent in manual operation. Flow-based systems have been developed to provide rapid, low-volume, and high-precision analyses that effectively replace the many tedious and high-volume requirements of conventional wet-chemistry analyses. This article presents various arrangements of flow-based techniques and their potential use in future enzyme kinetics applications.
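
    The "initial rate" these systems target is simply the slope of the absorbance-time curve over the early, approximately linear phase of the reaction. A minimal least-squares sketch with invented readings:

```python
def initial_rate(times, absorbances):
    """Least-squares slope dA/dt over the early linear phase of an assay."""
    n = len(times)
    mean_t = sum(times) / n
    mean_a = sum(absorbances) / n
    num = sum((t - mean_t) * (a - mean_a) for t, a in zip(times, absorbances))
    den = sum((t - mean_t) ** 2 for t in times)
    return num / den

# Absorbance sampled every 5 s during the linear phase (illustrative data):
rate = initial_rate([0, 5, 10, 15, 20], [0.10, 0.15, 0.20, 0.25, 0.30])
```

    Flow systems improve on the manual version of this calculation mainly by making the time base of the readings precise and reproducible.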

  10. Coordinate based random effect size meta-analysis of neuroimaging studies.

    Science.gov (United States)

    Tench, C R; Tanasescu, Radu; Constantinescu, C S; Auer, D P; Cottam, W J

    2017-06-01

    Low power in neuroimaging studies can make them difficult to interpret, and coordinate-based meta-analysis (CBMA) may go some way to mitigating this issue. CBMA has been used in many analyses to detect where published functional MRI or voxel-based morphometry studies testing similar hypotheses report significant summary results (coordinates) consistently. Only the reported coordinates, and possibly t statistics, are analysed, and statistical significance of clusters is determined by coordinate density. Here a method of performing coordinate-based random-effect-size meta-analysis and meta-regression is introduced. The algorithm (ClusterZ) analyses both the coordinates and the reported t statistic or Z score, standardised by the number of subjects. Statistical significance is determined not by coordinate density but by random-effects meta-analyses of the reported effects, performed cluster-wise using standard statistical methods and taking account of the censoring inherent in the published summary results. Type 1 error control is achieved using the false cluster discovery rate (FCDR), which is based on the false discovery rate. This controls both the family-wise error rate, under the null hypothesis that coordinates are randomly drawn from a standard stereotaxic space, and the proportion of significant clusters expected under the null. Such control is necessary to avoid propagating, and even amplifying, the very issues motivating the meta-analysis in the first place. ClusterZ is demonstrated both on numerically simulated data and on real data from reports of grey matter loss in multiple sclerosis (MS) and syndromes suggestive of MS, and of painful stimulus in healthy controls. The software implementation is available to download and use freely. Copyright © 2017 Elsevier Inc. All rights reserved.
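
    The cluster-wise random-effects step can be illustrated with the standard DerSimonian-Laird estimator. This is a generic sketch of that textbook method, not the ClusterZ implementation (which additionally handles censoring of non-significant effects); all numbers are invented:

```python
import math

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooled effect and standard error."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    return pooled, math.sqrt(1.0 / sum(w_re))

# Three standardised effects reported at one cluster, with their variances:
pooled, se = random_effects_pool([0.4, 0.6, 0.5], [0.04, 0.05, 0.06])
```

    In a CBMA setting, such a pooled effect and its standard error would be computed per cluster, and cluster significance judged from them rather than from coordinate density alone.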

  11. What is needed to eliminate new pediatric HIV infections: The contribution of model-based analyses

    Science.gov (United States)

    Doherty, Katie; Ciaranello, Andrea

    2013-01-01

    Purpose of Review: Computer simulation models can identify key clinical, operational, and economic interventions that will be needed to achieve the elimination of new pediatric HIV infections. In this review, we summarize recent findings from model-based analyses of strategies for prevention of mother-to-child HIV transmission (MTCT). Recent Findings: In order to achieve elimination of MTCT (eMTCT), model-based studies suggest that scale-up of services will be needed in several domains: uptake of services and retention in care (the PMTCT "cascade"), interventions to prevent HIV infections in women and reduce unintended pregnancies (the "four-pronged approach"), efforts to support medication adherence through long periods of pregnancy and breastfeeding, and strategies to make breastfeeding safer and/or shorter. Models also project the economic resources that will be needed to achieve these goals, as well as the most efficient ways to allocate limited resources for eMTCT. Results suggest that currently recommended PMTCT regimens (WHO Option A, Option B, and Option B+) will be cost-effective in most settings. Summary: Model-based results can guide future implementation science by highlighting areas in which additional data are needed to make informed decisions and by outlining critical interventions that will be necessary in order to eliminate new pediatric HIV infections. PMID:23743788

  12. Systematics of Plant-Pathogenic and Related Streptomyces Species Based on Phylogenetic Analyses of Multiple Gene Loci

    Science.gov (United States)

    The 10 species of Streptomyces implicated as the etiological agents in scab disease of potatoes or soft rot disease of sweet potatoes are distributed among 7 different phylogenetic clades in analyses based on 16S rRNA gene sequences, but high sequence similarity of this gene among Streptomyces speci...

  13. Biomass feedstock analyses

    Energy Technology Data Exchange (ETDEWEB)

    Wilen, C.; Moilanen, A.; Kurkela, E. [VTT Energy, Espoo (Finland). Energy Production Technologies

    1996-12-31

    The overall objectives of the project 'Feasibility of electricity production from biomass by pressurized gasification systems' within the EC Research Programme JOULE II were to evaluate the potential of advanced power production systems based on biomass gasification and to study the technical and economic feasibility of these new processes with different types of biomass feedstocks. This report was prepared as part of this R and D project. The objective of this task was to perform fuel analyses of potential woody and herbaceous biomasses with specific regard to the gasification properties of the selected feedstocks. The analyses of 15 Scandinavian and European biomass feedstocks included density, proximate and ultimate analyses, trace compounds, ash composition, and fusion behaviour in oxidizing and reducing atmospheres. The wood-derived fuels, such as whole-tree chips, forest residues, bark and, to some extent, willow, can be expected to have good gasification properties. Difficulties caused by ash fusion and sintering in straw combustion and gasification are generally known. The ash and alkali-metal contents of the European biomasses harvested in Italy resembled those of the Nordic straws, and they are expected to behave largely like straw in gasification. No direct relation between ash fusion behaviour (determined according to the standard method) and, for instance, alkali-metal content was found in the laboratory determinations. A more profound characterisation of the fuels would require gasification experiments in a thermobalance and a PDU (process development unit) rig. (orig.) (10 refs.)
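
    From an ultimate analysis of the kind reported here, a fuel's higher heating value can be estimated with classical empirical correlations. A sketch using Dulong's formula; this is an approximation only, and the composition numbers below are invented typical values, not data from the report:

```python
def hhv_dulong(c, h, o, s):
    """Approximate higher heating value (MJ/kg, dry basis) from an ultimate
    analysis in wt%, via Dulong's correlation; the O/8 term credits the
    hydrogen assumed to be already bound as water."""
    return 0.3383 * c + 1.443 * (h - o / 8.0) + 0.0942 * s

# Typical woody-biomass composition (illustrative): C 50%, H 6%, O 42%, S 0.1%
hhv = hhv_dulong(c=50.0, h=6.0, o=42.0, s=0.1)
```

    Such correlations were derived for coals, so for biomass they serve only as a first estimate alongside measured calorific values.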

  14. Altered Brain Activity in Unipolar Depression Revisited: Meta-analyses of Neuroimaging Studies.

    Science.gov (United States)

    Müller, Veronika I; Cieslik, Edna C; Serbanescu, Ilinca; Laird, Angela R; Fox, Peter T; Eickhoff, Simon B

    2017-01-01

    major depressive disorder. For meta-analyses with a minimum of 17 experiments available, separate analyses were performed for increases and decreases. In total, 57 studies with 99 individual neuroimaging experiments comprising in total 1058 patients were included; 34 of them tested cognitive and 65 emotional processing. Overall analyses across cognitive processing experiments (P > .29) and across emotional processing experiments (P > .47) revealed no significant results. Similarly, no convergence was found in analyses investigating positive (all P > .15), negative (all P > .76), or memory (all P > .48) processes. Analyses that restricted inclusion of confounds (eg, medication, comorbidity, age) did not change the results. Inconsistencies exist across individual experiments investigating aberrant brain activity in UD, as do replication problems across previous neuroimaging meta-analyses. For individual experiments, these inconsistencies may relate to the use of uncorrected inference procedures, differences in experimental design and contrasts, or heterogeneous clinical populations; meta-analytically, differences may be attributable to varying inclusion and exclusion criteria or rather liberal statistical inference approaches.

  15. Design and development of microcontroller-based clinical chemistry analyser for measurement of various blood biochemistry parameters.

    Science.gov (United States)

    Taneja, S R; Gupta, R C; Kumar, Jagdish; Thariyan, K K; Verma, Sanjeev

    2005-01-01

    The clinical chemistry analyser is a high-performance microcontroller-based photometric biochemical analyser that measures various blood biochemical parameters such as blood glucose, urea, protein, and bilirubin, and also measures and observes enzyme activity while performing other biochemical tests such as ALT (alanine aminotransferase), amylase, and AST (aspartate aminotransferase). These tests are of great significance in biochemistry and are used for diagnostic purposes and for classifying various disorders and diseases such as diabetes, liver malfunction, and renal diseases. An inexpensive clinical chemistry analyser developed by the authors is described in this paper. It is an open system in which any reagent kit available on the market can be used. The system is based on the principle of absorbance-transmittance photometry. The system design is built around an 80C31 microcontroller with RAM, EPROM, and peripheral interface devices. The developed system incorporates a light source, an optical module, interference filters of various wavelengths, a Peltier device for maintaining the required temperature of the mixture in the flow cell, a peristaltic pump for sample aspiration, a graphic LCD for displaying blood parameters, patient test results, and kinetic test graphs, a 40-column mini thermal printer, and a 32-key keyboard for executing various functions. The lab tests conducted on the instrument assessed the versatility of the analyser, the flexibility of the software, and the treatment of the sample. The prototype was tested and evaluated successfully on over 1000 blood samples for seventeen blood parameters. Evaluation was carried out at the Department of Biochemistry, Government Medical College and Hospital. The test results were found to be comparable with those of other standard instruments.
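
    Absorbance-transmittance photometry rests on the Beer-Lambert law: absorbance A = -log10(I/I0) is proportional to concentration, so a single standard of known concentration calibrates an end-point measurement. A sketch with invented intensities and standard values (not the instrument's firmware):

```python
import math

def absorbance(i_transmitted, i_blank):
    """A = -log10(T), where T is the transmittance relative to the blank."""
    return -math.log10(i_transmitted / i_blank)

def concentration(a_sample, a_standard, conc_standard):
    """Single-point calibration: c_sample = A_sample / A_standard * c_std."""
    return a_sample / a_standard * conc_standard

a = absorbance(50.0, 100.0)                 # half the light transmitted
glucose_mg_dl = concentration(a, a_standard=0.602, conc_standard=200.0)
```

    Kinetic (enzyme) tests extend this by tracking A over time and reporting the rate of change rather than a single end-point value.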

  16. A New Optimization Method for Centrifugal Compressors Based on 1D Calculations and Analyses

    Directory of Open Access Journals (Sweden)

    Pei-Yuan Li

    2015-05-01

    Full Text Available This paper presents an optimization design method for centrifugal compressors based on one-dimensional calculations and analyses. It consists of two parts: (1) centrifugal compressor geometry optimization based on one-dimensional calculations and (2) matching optimization of the vaned diffuser with the impeller based on the required throat area. A low-pressure-stage centrifugal compressor in a MW-level gas turbine is optimized by this method. One-dimensional calculation results show that D3/D2 is too large in the original design, resulting in low efficiency of the entire stage. Based on the one-dimensional optimization results, the geometry of the diffuser has been redesigned: the outlet diameter of the vaneless diffuser has been reduced, and the original single-stage diffuser has been replaced by a tandem vaned diffuser. After optimization, the entire stage pressure ratio is increased by approximately 4%, and the efficiency is increased by approximately 2%.
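
    The flavour of such one-dimensional stage calculations can be illustrated with the Euler work relation: for radial blades with slip factor σ, the specific work is σU2², and the stage pressure ratio follows from the isentropic relation. A sketch only; all numbers and default parameters below are illustrative assumptions, not values from the optimized machine:

```python
def stage_pressure_ratio(u2, slip=0.9, eta=0.80, t01=288.15,
                         cp=1005.0, gamma=1.4):
    """1-D estimate: Euler work w = slip * u2^2 for radial blades, then
    PR = (1 + eta * w / (cp * T01)) ** (gamma / (gamma - 1))."""
    work = slip * u2 ** 2                      # specific work, J/kg
    return (1.0 + eta * work / (cp * t01)) ** (gamma / (gamma - 1.0))

pr = stage_pressure_ratio(u2=350.0)            # 350 m/s impeller tip speed
```

    One-dimensional relations like this let a designer scan geometry ratios such as D3/D2 cheaply before committing to 3-D analysis.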

  17. Individual-based versus aggregate meta-analysis in multi-database studies of pregnancy outcomes

    DEFF Research Database (Denmark)

    Selmer, Randi; Haglund, Bengt; Furu, Kari

    2016-01-01

    Purpose: Compare analyses of a pooled data set on the individual level with aggregate meta-analysis in a multi-database study. Methods: We reanalysed data on 2.3 million births in a Nordic register-based cohort study. We compared estimated odds ratios (OR) for the effect of selective serotonin reuptake inhibitors (SSRI) and venlafaxine use in pregnancy on any cardiovascular birth defect and the rare outcome right ventricular outflow tract obstructions (RVOTO). Common covariates included maternal age, calendar year, birth order, maternal diabetes, and co-medication. Additional covariates were ... covariates in the pooled data set, and 1.53 (1.19–1.96) after country-optimized adjustment. Country-specific adjusted analyses at the substance level were not possible for RVOTO. Conclusion: Results of fixed effects meta-analysis and individual-based analyses of a pooled dataset were similar in this study ...
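
    The aggregate arm of such a comparison pools country-specific odds ratios by inverse-variance weighting on the log scale. A generic fixed-effect sketch; the ORs and standard errors below are invented, not the study's values:

```python
import math

def fixed_effect_or(odds_ratios, log_or_ses):
    """Inverse-variance fixed-effect pooling on the log-OR scale;
    returns the pooled OR and a 95% confidence interval."""
    logs = [math.log(o) for o in odds_ratios]
    w = [1.0 / s ** 2 for s in log_or_ses]
    pooled_log = sum(wi * l for wi, l in zip(w, logs)) / sum(w)
    se = math.sqrt(1.0 / sum(w))
    ci = (math.exp(pooled_log - 1.96 * se), math.exp(pooled_log + 1.96 * se))
    return math.exp(pooled_log), ci

# Three country-level estimates (illustrative):
or_pooled, ci = fixed_effect_or([1.4, 1.7, 1.5], [0.15, 0.20, 0.25])
```

    The individual-level alternative fits one regression to the pooled person-level data instead; the study's point is that, here, the two approaches gave similar answers except where sparse outcomes made country-specific models impossible.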

  18. Phylogenetic study on Shiraia bambusicola by rDNA sequence analyses.

    Science.gov (United States)

    Cheng, Tian-Fan; Jia, Xiao-Ming; Ma, Xiao-Hang; Lin, Hai-Ping; Zhao, Yu-Hua

    2004-01-01

    In this study, the 18S rDNA and ITS-5.8S rDNA regions of four Shiraia bambusicola isolates collected from different species of bamboo were amplified by PCR with the universal primer pairs NS1/NS8 and ITS5/ITS4, respectively, and sequenced. Phylogenetic analyses were conducted on three selected datasets of rDNA sequences. Maximum parsimony, distance, and maximum likelihood criteria were used to infer trees. Morphological characteristics were also observed. The positioning of Shiraia in the order Pleosporales was well supported by bootstrap analysis, in agreement with the placement by Amano (1980) based on morphology. We did not find significant inter-host differences among the four isolates from different species of bamboo. From the analyses and comparison of their rDNA sequences, we conclude that Shiraia should be classified in Pleosporales, as Amano (1980) proposed, and suggest that it might be positioned in the family Phaeosphaeriaceae. Copyright 2004 WILEY-VCH Verlag GmbH & Co.
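
    Distance criteria like those used here start from pairwise differences between aligned sequences. A minimal sketch of the uncorrected p-distance, on toy sequences rather than the study's rDNA data:

```python
def p_distance(seq_a, seq_b):
    """Proportion of differing sites between two aligned sequences."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    diffs = sum(a != b for a, b in zip(seq_a, seq_b))
    return diffs / len(seq_a)

d = p_distance("ACGTACGTACGT", "ACGTACGAACGT")  # one mismatch in 12 sites
```

    Real analyses apply a substitution-model correction (e.g. Jukes-Cantor) to such raw distances before tree building.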

  19. Progress Report on Computational Analyses of Water-Based NSTF

    Energy Technology Data Exchange (ETDEWEB)

    Lv, Q. [Argonne National Lab. (ANL), Argonne, IL (United States); Kraus, A. [Argonne National Lab. (ANL), Argonne, IL (United States); Hu, R. [Argonne National Lab. (ANL), Argonne, IL (United States); Bucknor, M. [Argonne National Lab. (ANL), Argonne, IL (United States); Lisowski, D. [Argonne National Lab. (ANL), Argonne, IL (United States); Nunez, D. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-08-01

    CFD analysis has been focused on important component-level phenomena using STARCCM+ to supplement the system analysis of integral system behavior. A notable area of interest was the cavity region. This area is of particular interest for CFD analysis due to the multi-dimensional flow and complex heat transfer (thermal radiation heat transfer and natural convection), which are not simulated directly by RELAP5. CFD simulations allow for the estimation of the boundary heat flux distribution along the riser tubes, which is needed in the RELAP5 simulations. The CFD results can also provide additional data to help establish what level of modeling detail is necessary in RELAP5. It was found that the flow profiles in the cavity region are simpler for the water-based concept than for the air-cooled concept. The local heat flux noticeably increases axially, and is higher in the fins than in the riser tubes. These results were utilized in RELAP5 simulations as boundary conditions, to provide better temperature predictions in the system level analyses. It was also determined that temperatures were higher in the fins than the riser tubes, but within design limits for thermal stresses. Higher temperature predictions were identified in the edge fins, in part due to additional thermal radiation from the side cavity walls.

  20. Chapter No.4. Safety analyses

    International Nuclear Information System (INIS)

    2002-01-01

    for NPP V-1 Bohunice and on review of the impact of the modelling of selected components on the results of calculation safety analysis (a sensitivity study for NPP Mochovce). In 2001 UJD joined a new European project, Alternative Approaches to the Safety Performance Indicators. The project is aimed at collecting information and determining approaches and recommendations for the implementation of risk-oriented indicators, identification of the impact of the safety-culture level and organisational culture on safety, and application of indicators to the needs of regulators and operators. Within the PHARE project, UJD participated in a task focused on severe accident mitigation for nuclear power plants with VVER-440/V213 units. The main results of the analyses of nuclear power plant responses to severe accidents were summarised, and the state of their analytical base performed in the past was evaluated within the project. Possible severe accident mitigation and preventive measures were proposed, and their applicability to nuclear power plants with VVER-440/V213 units was investigated. The obtained results will be used in the assessment activities and accident management of UJD. UJD has also been involved in the EVITA project, which is part of the 5th EC Framework Programme. The project aims at validation of the European computer code ASTEC, dedicated to severe accident modelling. In 2001 the ASTEC computer code was tested on different platforms. The results of the testing are summarised in the technical report of the EC issued in September 2001. Further activities within this project were focused on performing analyses of selected accident scenarios and comparing the obtained results with analyses carried out using other computer codes. Work on the project will continue in 2002. In 2001 groundwork continued on establishing the Centre for Nuclear Safety in Central and Eastern Europe (CENS), which is to be seated in Bratislava.

  1. TAXONOMY AND GENETIC RELATIONSHIPS OF PANGASIIDAE, ASIAN CATFISHES, BASED ON MORPHOLOGICAL AND MOLECULAR ANALYSES

    Directory of Open Access Journals (Sweden)

    Rudhy Gustiano

    2007-12-01

    Full Text Available Pangasiids are economically important riverine catfishes generally residing in fresh water from the Indian subcontinent to the Indonesian Archipelago. The systematics of this family are still poorly known. Consequently, the lack of such basic information impedes understanding of the biology of the pangasiids and the study of their aquaculture potential, as well as improvement of seed production and growth performance. The objectives of the present study are to clarify the phylogeny of this family based on biometric analysis and molecular evidence using 12S ribosomal mtDNA on a total of 1070 specimens. The study revealed that 28 species are recognised as valid in Pangasiidae. Four genera are also recognised (Helicophagus Bleeker 1858, Pangasianodon Chevey 1930, Pteropangasius Fowler 1937, and Pangasius Valenciennes 1840), instead of two as reported by previous workers. The phylogenetic analysis demonstrated the recognised genera and the genetic relationships among taxa. Overall, trees from the different analyses show similar topologies and confirm the hypothesis derived from geological history, palaeontology, and similar models in other fish taxa from the same area. The oldest genus may already have existed when the Asian mainland was still connected to the islands in the southern part about 20 million years ago.

  2. Sorption analyses in materials science: selected oxides

    International Nuclear Information System (INIS)

    Fuller, E.L. Jr.; Condon, J.B.; Eager, M.H.; Jones, L.L.

    1981-01-01

    Physical adsorption studies have been shown to be extremely valuable in studying the chemistry and structure of dispersed materials. Many processes rely on access to the large amount of surface made available by the high degree of dispersion. Conversely, there are many applications where consolidation of the dispersed solids is required. Several systems (silica gel, alumina catalysts, mineralogic alumino-silicates, and yttrium oxide plasters) have been studied to show the type and amount of chemical and structural information that can be obtained. Some review of current theories is given, and additional concepts are developed based on statistical and thermodynamic arguments. The results are applied to sorption data to show that detailed sorption analyses are extremely useful and can provide valuable information that is difficult to obtain by any other means. Considerable emphasis has been placed on data analyses and interpretation of a nonclassical nature to show the potential of such studies, which is often neither recognized nor utilized.
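
    Physical-adsorption studies of this kind commonly reduce an isotherm to a monolayer capacity via the linearized BET equation, 1/(v((p0/p)-1)) = (c-1)/(v_m c) * (p/p0) + 1/(v_m c). A sketch that fits synthetic data generated from a BET isotherm with known parameters (the v_m and c values are invented for the demonstration):

```python
def bet_fit(p_rel, v_ads):
    """Least-squares fit of the linearized BET equation over the classical
    0.05-0.30 relative-pressure range; returns the monolayer capacity v_m
    and the BET constant c."""
    ys = [1.0 / (v * (1.0 / x - 1.0)) for x, v in zip(p_rel, v_ads)]
    n = len(p_rel)
    mx, my = sum(p_rel) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(p_rel, ys))
             / sum((x - mx) ** 2 for x in p_rel))
    intercept = my - slope * mx
    v_m = 1.0 / (slope + intercept)
    c = slope / intercept + 1.0
    return v_m, c

# Synthetic isotherm with v_m = 10 (arbitrary volume units) and c = 100:
V_M, C = 10.0, 100.0
p_rel = [0.05, 0.10, 0.15, 0.20, 0.25, 0.30]
v_ads = [V_M * C * x / ((1 - x) * (1 + (C - 1) * x)) for x in p_rel]
v_m, c = bet_fit(p_rel, v_ads)
```

    The monolayer capacity then converts to a specific surface area via the adsorbate's molecular cross-section; the nonclassical analyses advocated in the record go beyond this standard BET treatment.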

  3. Molecular genotyping of Colletotrichum species based on arbitrarily primed PCR, A + T-Rich DNA, and nuclear DNA analyses

    Science.gov (United States)

    Freeman, S.; Pham, M.; Rodriguez, R.J.

    1993-01-01

    Molecular genotyping of Colletotrichum species based on arbitrarily primed PCR, A + T-rich DNA, and nuclear DNA analyses. Experimental Mycology 17, 309-322. Isolates of Colletotrichum were grouped into 10 separate species based on arbitrarily primed PCR (ap-PCR), A + T-rich DNA (AT-DNA), and nuclear DNA banding patterns. In general, the grouping of Colletotrichum isolates by these molecular approaches corresponded to that obtained by classical taxonomic identification; however, some exceptions were observed. PCR amplification of genomic DNA using four different primers allowed reliable differentiation between isolates of the 10 species. HaeIII digestion patterns of AT-DNA also distinguished between species of Colletotrichum by generating species-specific band patterns. In addition, hybridization of the repetitive DNA element (GcpR1) to genomic DNA identified a unique set of PstI-digested nuclear DNA fragments in each of the 10 species of Colletotrichum tested. Multiple isolates of C. acutatum, C. coccodes, C. fragariae, C. lindemuthianum, C. magna, C. orbiculare, C. graminicola from maize, and C. graminicola from sorghum showed 86-100% intraspecies similarity based on ap-PCR and AT-DNA analyses. Interspecies similarity determined by ap-PCR and AT-DNA analyses varied between 0 and 33%. Three distinct banding patterns were detected in isolates of C. gloeosporioides from strawberry. Similarly, three different banding patterns were observed among isolates of C. musae from diseased banana.
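
    Band-pattern similarities of the sort quoted (0-33% between species, 86-100% within) are typically computed with a band-sharing coefficient such as Dice's. A sketch on made-up fragment sizes, not data from the paper:

```python
def dice_similarity(bands_a, bands_b):
    """Dice band-sharing coefficient: 2 * shared bands / total band count."""
    a, b = set(bands_a), set(bands_b)
    return 2.0 * len(a & b) / (len(a) + len(b))

# Fragment sizes in bp (illustrative): three of four bands shared.
s = dice_similarity({250, 400, 700, 1200}, {250, 400, 900, 1200})
```

    In practice, bands are matched within a size tolerance rather than by exact equality, since gel migration is not perfectly reproducible.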

  4. The Hanford study: issues in analysing and interpreting data from occupational studies

    International Nuclear Information System (INIS)

    Gilbert, E.S.

    1987-01-01

    Updated analyses of workers at the Hanford Site provided no evidence of a correlation between radiation exposure and mortality from all cancers or from leukemia. Potentially confounding factors were examined and, to the extent possible, taken into account in these analyses. Risk estimates for leukemia and for all cancers except leukemia were calculated and compared with those from other sources. For leukemia, consideration was given to modifying factors such as age at exposure and time since exposure. (author)

  5. Neural Spike-Train Analyses of the Speech-Based Envelope Power Spectrum Model

    Science.gov (United States)

    Rallapalli, Varsha H.

    2016-01-01

    Diagnosing and treating hearing impairment is challenging because people with similar degrees of sensorineural hearing loss (SNHL) often have different speech-recognition abilities. The speech-based envelope power spectrum model (sEPSM) has demonstrated that the signal-to-noise ratio (SNRenv) from a modulation filter bank provides a robust speech-intelligibility measure across a wider range of degraded conditions than many long-standing models. In the sEPSM, noise (N) is assumed to: (a) reduce S + N envelope power by filling in dips within clean speech (S) and (b) introduce an envelope noise floor from intrinsic fluctuations in the noise itself. While the promise of SNRenv has been demonstrated for normal-hearing listeners, it has not been thoroughly extended to hearing-impaired listeners because of limited physiological knowledge of how SNHL affects speech-in-noise envelope coding relative to noise alone. Here, envelope coding to speech-in-noise stimuli was quantified from auditory-nerve model spike trains using shuffled correlograms, which were analyzed in the modulation-frequency domain to compute modulation-band estimates of neural SNRenv. Preliminary spike-train analyses show strong similarities to the sEPSM, demonstrating the feasibility of neural SNRenv computations. Results suggest that individual differences can occur based on differential degrees of outer- and inner-hair-cell dysfunction in listeners currently diagnosed into the single audiological SNHL category. The predicted acoustic-SNR dependence in individual differences suggests that the SNR-dependent rate of susceptibility could be an important metric in diagnosing individual differences. Future measurements of the neural SNRenv in animal studies with various forms of SNHL will provide valuable insight for understanding individual differences in speech-in-noise intelligibility.

  6. MULTI-DIMENSIONAL MASS SPECTROMETRY-BASED SHOTGUN LIPIDOMICS AND NOVEL STRATEGIES FOR LIPIDOMIC ANALYSES

    Science.gov (United States)

    Han, Xianlin; Yang, Kui; Gross, Richard W.

    2011-01-01

    Since our last comprehensive review on multi-dimensional mass spectrometry-based shotgun lipidomics (Mass Spectrom. Rev. 24 (2005), 367), many new developments in the field of lipidomics have occurred. These developments include new strategies and refinements for shotgun lipidomic approaches that use direct infusion, including novel fragmentation strategies, identification of multiple new informative dimensions for mass spectrometric interrogation, and the development of new bioinformatic approaches for enhanced identification and quantitation of the individual molecular constituents that comprise each cell’s lipidome. Concurrently, advances in liquid chromatography-based platforms and novel strategies for quantitative matrix-assisted laser desorption/ionization mass spectrometry for lipidomic analyses have been developed. Through the synergistic use of this repertoire of new mass spectrometric approaches, the power and scope of lipidomics has been greatly expanded to accelerate progress toward the comprehensive understanding of the pleiotropic roles of lipids in biological systems. PMID:21755525

  7. Cooling tower wood sampling and analyses: A case study

    International Nuclear Information System (INIS)

    Haymore, J.L.

    1985-01-01

    Extensive wood sampling and analyses programs were initiated on crossflow and counterflow cooling towers that have been in service since 1951 and 1955, respectively. Wood samples were taken from all areas of the towers and were subjected to biological, chemical and physical tests. The tests and results for the analyses are discussed. The results indicate the degree of wood deterioration, and areas of the towers which experience the most advanced degree of degradation

  8. Analyses of criticality and reactivity for TRACY experiments based on JENDL-3.3 data library

    International Nuclear Information System (INIS)

    Sono, Hiroki; Miyoshi, Yoshinori; Nakajima, Ken

    2003-01-01

    The parameters of criticality and reactivity employed in computational simulations of the TRACY supercritical experiments were analyzed using a recently revised nuclear data library, JENDL-3.3. The parameters based on the JENDL-3.3 library were compared to those based on two previously used libraries, JENDL-3.2 and ENDF/B-VI. The computational codes MVP, MCNP version 4C, and TWOTRAN were used in the analyses. The following conclusions were obtained: (1) The computational biases of the effective neutron multiplication factor attributable to the nuclear data libraries and to the computational codes do not depend on the TRACY experimental conditions, such as fuel conditions. (2) The fractional discrepancies in the kinetic parameters and coefficients of reactivity are within ∼5% between the three libraries. Comparison between calculations and measurements of the parameters suggests that the JENDL-3.3 library gives values closer to the measurements than the JENDL-3.2 and ENDF/B-VI libraries. (3) While the reactivity worth of the transient rods expressed in the $ unit shows ∼5% discrepancy between the three libraries, according to their respective β_eff values, there is little discrepancy when expressed in the Δk/k unit. (author)
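
    Conclusion (3) follows directly from the unit conversion ρ($) = ρ(Δk/k) / β_eff: a library-to-library difference in β_eff shifts the dollar value of a rod worth even when its Δk/k value is identical. A numerical sketch with invented values (not the TRACY data):

```python
def reactivity_in_dollars(rho_dk_over_k, beta_eff):
    """Convert reactivity from Δk/k units to dollars: rho($) = rho / beta_eff."""
    return rho_dk_over_k / beta_eff

rho = 0.005                           # transient-rod worth in Δk/k (invented)
dollars_a = reactivity_in_dollars(rho, beta_eff=0.0072)    # library A
dollars_b = reactivity_in_dollars(rho, beta_eff=0.00684)   # library B, ~5% lower
```

    The same Δk/k worth thus appears about 5% larger in dollars under library B, mirroring the discrepancy pattern reported above.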

  9. Cost-Benefit Analyses of Transportation Investments

    DEFF Research Database (Denmark)

    Næss, Petter

    2006-01-01

    This paper discusses the practice of cost-benefit analyses of transportation infrastructure investment projects from the meta-theoretical perspective of critical realism. Such analyses are based on a number of untenable ontological assumptions about social value, human nature and the natural......-to-pay investigations. Accepting the ontological and epistemological assumptions of cost-benefit analysis involves an implicit acceptance of the ethical and political values favoured by these assumptions. Cost-benefit analyses of transportation investment projects tend to neglect long-term environmental consequences...

  10. Groundwater flow analyses in Japan. 1. Case studies in Hokkaido and Northeast Japan

    International Nuclear Information System (INIS)

    Inaba, Hideo; Maekawa, Keisuke; Koide, Kaoru; Yanagizawa, Koichi

    1995-01-01

    An extensive study program has been carried out to estimate the hydrogeological characteristics of the deep underground in Japan. As part of this program, groundwater flow analyses in Hokkaido and Northeast Japan were conducted. For these analyses, hydrogeological models representing topography, geology, and the distribution of hydraulic conductivity were developed using information available in the open literature. Using these models, steady-state three-dimensional groundwater flow under saturated/unsaturated conditions was calculated by means of the finite element method. The results are as follows: (1) The distribution of piezometric head corresponds with topography in the study area. (2) The piezometric head distribution is hydrostatic below E.L. -1000 m in the study area. (3) The hydraulic gradient in the study area is less than 0.04 below E.L. -500 m. (4) Differences in the boundary conditions on the shore side of these models do not affect the results of the analyses. (author)
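
    The reported hydraulic gradients translate into flow via Darcy's law, q = K * i. A sketch using the study's upper-bound gradient of 0.04; the hydraulic conductivity value is an assumed illustrative figure, not one from the study:

```python
def darcy_flux(k, i):
    """Darcy (specific) flux q = K * i; in m/s when K is given in m/s."""
    return k * i

q = darcy_flux(k=1e-8, i=0.04)       # K = 1e-8 m/s assumed; i = 0.04 from study
seconds_per_year = 365.25 * 24 * 3600
q_m_per_year = q * seconds_per_year  # flux expressed per year
```

    Small gradients at depth, combined with low conductivity, are what make deep groundwater flow in such settings extremely slow.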

  11. Graphic-based musculoskeletal model for biomechanical analyses and animation.

    Science.gov (United States)

    Chao, Edmund Y S

    2003-04-01

    The ability to combine physiology and engineering analyses with computer science has opened the door to the possibility of creating a 'Virtual Human' reality. This paper presents a broad foundation for a full-featured biomechanical simulator for human musculoskeletal system physiology. This simulation technology unites expertise in biomechanical analysis and graphic modeling to investigate joint and connective tissue mechanics at the structural level and to visualize the results in both static and animated forms together with the model. Adaptable anatomical models, including prosthetic implants and fracture fixation devices, and a robust computational infrastructure for static, kinematic, kinetic, and stress analyses under varying boundary and loading conditions are incorporated on a common platform, the VIMS (Virtual Interactive Musculoskeletal System). Within this software system, a manageable database containing long bone dimensions, connective tissue material properties and a library of skeletal joint system functional activities and loading conditions is also available and can easily be modified, updated and expanded. Application software is also available to allow end-users to perform biomechanical analyses interactively. This paper details the design, capabilities, and features of the VIMS development at Johns Hopkins University, an effort possible only through academic and commercial collaborations. Examples using these models and the computational algorithms in a virtual laboratory environment are used to demonstrate the utility of this unique database and simulation technology. This integrated system will impact medical education, basic research, device development and application, and clinical patient care related to musculoskeletal diseases, trauma, and rehabilitation.

  12. Laser Beam Focus Analyser

    DEFF Research Database (Denmark)

    Nielsen, Peter Carøe; Hansen, Hans Nørgaard; Olsen, Flemming Ove

    2007-01-01

    the obtainable features in direct laser machining as well as heat affected zones in welding processes. This paper describes the development of a measuring unit capable of analysing beam shape and diameter of lasers to be used in manufacturing processes. The analyser is based on the principle of a rotating......The quantitative and qualitative description of laser beam characteristics is important for process implementation and optimisation. In particular, a need for quantitative characterisation of beam diameter was identified when using fibre lasers for micro manufacturing. Here the beam diameter limits...... mechanical wire being swept through the laser beam at varying Z-heights. The reflected signal is analysed and the resulting beam profile determined. The development comprised the design of a flexible fixture capable of providing both rotation and Z-axis movement, control software including data capture...

  13. Techniques for Scaling Up Analyses Based on Pre-interpretations

    DEFF Research Database (Denmark)

    Gallagher, John Patrick; Henriksen, Kim Steen; Banda, Gourinath

    2005-01-01

    a variety of analyses, both generic (such as mode analysis) and program-specific (with respect to a type describing some particular property of interest). Previous work demonstrated the approach using pre-interpretations over small domains. In this paper we present techniques that allow the method...

  14. PALEO-CHANNELS OF SINGKAWANG WATERS WEST KALIMANTAN AND ITS RELATION TO THE OCCURRENCES OF SUB-SEABOTTOM GOLD PLACERS BASED ON STRATA BOX SEISMIC RECORD ANALYSES

    Directory of Open Access Journals (Sweden)

    Hananto Kurnio

    2017-07-01

    Full Text Available Strata box seismic records were used to analyze sub-seabottom paleochannels in Singkawang Waters, West Kalimantan. Based on the analyses, the distribution and patterns of the paleochannels can be identified. The paleochannels in the northern part of the study area are interpreted as a continuation of Recent coastal rivers; in the southern part, the pattern radiates around the cone-shaped morphology of the islands, especially Kabung and Lemukutan Islands. The paleochannels of the study area belong to the northwest Sunda Shelf systems that terminate in the South China Sea. A sequence stratigraphy study was carried out to better understand the sedimentary sequences in the paleochannels; this study is also capable of identifying placer deposits within the channels. The criteria for gold placer occurrence are met in the study area: primary gold sources exist in the Sintang Intrusive, intense chemical and physical weathering liberated gold grains from their source rocks, gravity transport involved a water medium, and bed rock and surface conditions were stable, so the offshore area of Singkawang fulfills the requirements for gold placer accumulation. Chemical and physical weathering processes from the Oligocene to the Recent, approximately 36 million years, may have produced accumulations of gold placer on the seafloor. Based on grain size analyses, the study area consisted of sand 43.4%, silt 54.3% and clay 2.3%. Petrographic examination of the sample shows gold grains of about 0.2%.

  15. Fast multichannel analyser

    Energy Technology Data Exchange (ETDEWEB)

    Berry, A; Przybylski, M M; Sumner, I [Science Research Council, Daresbury (UK). Daresbury Lab.

    1982-10-01

    A fast multichannel analyser (MCA) capable of sampling at a rate of 10^7 s^-1 has been developed. The instrument is based on an 8 bit parallel encoding analogue to digital converter (ADC) reading into a fast histogramming random access memory (RAM) system, giving 256 channels of 64 k count capacity. The prototype unit is in CAMAC format.
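
    The histogramming step described here (an 8-bit ADC value indexing a 256-channel counting memory) can be sketched in software. The following Python snippet is an illustrative stand-in for what the instrument does in hardware, not the Daresbury implementation:

    ```python
    import numpy as np

    def histogram_adc(samples, n_channels=256):
        """Bin 8-bit ADC samples into pulse-height channels, as the
        MCA's histogramming RAM does in hardware: each sample value
        is a channel index whose count is incremented."""
        counts = np.bincount(np.asarray(samples, dtype=np.uint8),
                             minlength=n_channels)
        return counts

    # example: three samples land in channel 5, one in channel 250
    counts = histogram_adc([5, 5, 250, 5])
    ```

    The hardware analogue of `np.bincount` is a read-increment-write cycle on the RAM addressed by the ADC output, which is what allows the 10^7 s^-1 sampling rate.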

  16. Performance study of Ke factors in simplified elastic plastic fatigue analyses with emphasis on thermal cyclic loading

    International Nuclear Information System (INIS)

    Lang, Hermann; Rudolph, Juergen; Ziegler, Rainer

    2011-01-01

    As fully elastic plastic, code-conforming fatigue analyses are still time consuming, simplified elastic plastic analysis is often applied. This procedure is known to be overly conservative under some conditions due to the applied plastification (penalty) factor K_e. As a consequence, less conservative fully elastic plastic fatigue analyses based on non-linear finite element analyses (FEA), or simplified elastic plastic analyses based on more realistic K_e factors, have to be used for fatigue design. More realistic K_e factors are thus a requirement of practical fatigue analysis. Different code-based K_e procedures are reviewed in this paper with special regard to performance under thermal cyclic loading conditions. Other approximation formulae such as those by Neuber, Seeger/Beste or Kuehnapfel are not evaluated in this context because they apply to mechanical loading and exclude the thermal cyclic loading conditions typical of power plant operation. Besides the current code-based K_e corrections, the ASME Code Case N-779 (e.g. Adams' proposal) and its modification in ASME Section VIII are considered. Comparison of elastic plastic results and results from the Rules for Nuclear Facility Components and Rules for Pressure Vessels reveals a considerable overestimation of the usage factor in the case of ASME III and KTA 3201.2 for the examined examples. Usage factors according to RCC-M, Adams (ASME Code Case N-779), ASME VIII (alternative) and EN 13445-3 are essentially comparable and less conservative for these examples. The K_v correction as well as the applied yield criterion (Tresca or von Mises) essentially influences the quality of the more advanced plasticity corrections (e.g. ASME Code Case N-779 and RCC-M). Hence, new proposals are based on a refined K_v correction.
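
    As a sketch of the general shape of such a correction (none of the paper's specific procedures is reproduced here), the ASME III NB-3228.5 form of K_e can be written down directly; the material constants m and n below are typical handbook values for austenitic stainless steel, introduced only for illustration:

    ```python
    def ke_asme(sn, sm, m=1.7, n=0.3):
        """Simplified elastic-plastic penalty factor K_e in the ASME III
        NB-3228.5 form: 1.0 up to the 3*Sm shakedown limit, a linear
        ramp between 3*Sm and 3*m*Sm, and 1/n beyond.
        sn: primary-plus-secondary stress intensity range;
        sm: allowable stress intensity; m, n: material constants."""
        if sn <= 3 * sm:
            return 1.0
        if sn < 3 * m * sm:
            return 1.0 + (1 - n) / (n * (m - 1)) * (sn / (3 * sm) - 1)
        return 1.0 / n

    # example: a stress range between 3*Sm and 3*m*Sm gets a penalty
    # factor between 1.0 and 1/n
    ke_mid = ke_asme(200.0, 50.0)
    ```

    The ramp is continuous at both limits (K_e = 1.0 at Sn = 3 Sm and K_e = 1/n at Sn = 3 m Sm), which is easy to verify numerically.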

  17. In situ analyses of Ag speciation in tissues of cucumber and wheat using synchrotron-based X-ray absorption spectroscopy

    Data.gov (United States)

    U.S. Environmental Protection Agency — In situ analyses of Ag speciation in tissues of cucumber and wheat using synchrotron-based X-ray absorption spectroscopy showing spectral fitting and linear...

  18. Age and gender effects on normal regional cerebral blood flow studied using two different voxel-based statistical analyses; Effets de l'age et du genre sur la perfusion cerebrale regionale etudiee par deux methodes d'analyse statistique voxel-par-voxel

    Energy Technology Data Exchange (ETDEWEB)

    Pirson, A.S.; George, J.; Krug, B.; Vander Borght, T. [Universite Catholique de Louvain, Service de Medecine Nucleaire, Cliniques Universitaires de Mont-Godinne, Yvoir (Belgium); Van Laere, K. [Leuven Univ. Hospital, Nuclear Medicine Div. (Belgium); Jamart, J. [Universite Catholique de Louvain, Dept. de Biostatistiques, Cliniques Universitaires de Mont-Godinne, Yvoir (Belgium); D' Asseler, Y. [Ghent Univ., Medical Signal and Image Processing Dept. (MEDISIP), Faculty of applied sciences (Belgium); Minoshima, S. [Washington Univ., Dept. of Radiology, Seattle (United States)

    2009-10-15

    Fully automated analysis programs are applied more and more to aid the reading of regional cerebral blood flow SPECT studies. They are increasingly based on the comparison of the patient study with a normal database. In this study, we evaluate the ability of Three-Dimensional Stereotactic Surface Projection (3D-SSP) to isolate the effects of age and gender in a previously studied normal population. The results were also compared with those obtained using Statistical Parametric Mapping (SPM99). Methods: Eighty-nine 99mTc-ECD SPECT studies performed in carefully screened healthy volunteers (46 females, 43 males; age 20-81 years) were analysed using 3D-SSP. A multivariate analysis based on the general linear model was performed with regions as intra-subject factor, gender as inter-subject factor and age as covariate. Results: Both age and gender had a significant interaction effect with regional tracer uptake. An age-related decline (p < 0.001) was found in the anterior cingulate gyrus, left frontal association cortex and left insula. Bilateral occipital association and left primary visual cortical uptake showed a significant relative increase with age (p < 0.001). Concerning the gender effect, women showed higher uptake (p < 0.01) in the parietal and right sensorimotor cortices. An age by gender interaction (p < 0.01) was only found in the left medial frontal cortex. The results were consistent with those obtained with SPM99. Conclusion: 3D-SSP analysis of normal rCBF variability is consistent with the literature and other automated voxel-based techniques, which highlight the effects of both age and gender. (authors)

  19. Statistical parameters of random heterogeneity estimated by analysing coda waves based on finite difference method

    Science.gov (United States)

    Emoto, K.; Saito, T.; Shiomi, K.

    2017-12-01

    Short-period (2 s) seismograms. We found that the energy of the coda of long-period seismograms shows a spatially flat distribution. This phenomenon is well known in short-period seismograms and results from the scattering by small-scale heterogeneities. We estimate the statistical parameters that characterize the small-scale random heterogeneity by modelling the spatiotemporal energy distribution of long-period seismograms. We analyse three moderate-size earthquakes that occurred in southwest Japan. We calculate the spatial distribution of the energy density recorded by a dense seismograph network in Japan at the period bands of 8-16 s, 4-8 s and 2-4 s and model them by using 3-D finite difference (FD) simulations. Compared to conventional methods based on statistical theories, we can calculate more realistic synthetics by using the FD simulation. It is not necessary to assume a uniform background velocity, body or surface waves and scattering properties considered in general scattering theories. By taking the ratio of the energy of the coda area to that of the entire area, we can separately estimate the scattering and the intrinsic absorption effects. Our result reveals the spectrum of the random inhomogeneity in a wide wavenumber range including the intensity around the corner wavenumber as P(m) = 8πε²a³/(1 + a²m²)², where ε = 0.05 and a = 3.1 km, even though past studies analysing higher-frequency records could not detect the corner. Finally, we estimate the intrinsic attenuation by modelling the decay rate of the energy. The method proposed in this study is suitable for quantifying the statistical properties of long-wavelength subsurface random inhomogeneity, which leads the way to characterizing a wider wavenumber range of spectra, including the corner wavenumber.
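
    The fitted spectrum can be evaluated directly from the reported parameters; a minimal sketch, with ε and a as quoted in the abstract and the wavenumber m in km⁻¹:

    ```python
    import math

    def heterogeneity_spectrum(m, eps=0.05, a=3.1):
        """Power spectrum of the random inhomogeneity as fitted in the
        study: P(m) = 8*pi*eps^2*a^3 / (1 + a^2*m^2)^2, with the
        correlation length a in km and wavenumber m in 1/km."""
        return 8.0 * math.pi * eps**2 * a**3 / (1.0 + (a * m) ** 2) ** 2

    # the spectrum is flat below the corner wavenumber m ~ 1/a and
    # falls off as m**-4 above it; at the corner, P = P(0) / 4
    p_corner = heterogeneity_spectrum(1 / 3.1)
    ```

    Detecting the flattening around m ≈ 1/a ≈ 0.32 km⁻¹ is precisely what the long-period analysis adds over earlier high-frequency studies.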

  20. [Research on fast classification based on LIBS technology and principal component analysis].

    Science.gov (United States)

    Yu, Qi; Ma, Xiao-Hong; Wang, Rui; Zhao, Hua-Feng

    2014-11-01

    Laser-induced breakdown spectroscopy (LIBS) and principal component analysis (PCA) were combined to study aluminum alloy classification in the present article. Classification experiments were done on thirteen different kinds of standard samples of aluminum alloy belonging to 4 different types, and the results suggested that the LIBS-PCA method can be used for fast classification of aluminum alloys. PCA was used to analyze the spectral data from the LIBS experiments: the three principal components that contribute the most were identified, the principal component scores of the spectra were calculated, and the scores were plotted in three-dimensional coordinates. It was found that the spectrum sample points show a clear convergence phenomenon according to the type of aluminum alloy they belong to. This result established the three principal components and a preliminary zoning of aluminum alloy types. In order to verify its accuracy, 20 different aluminum alloy samples were used in the same experiments to test the type zoning. The experimental results showed that the spectrum sample points all fell in the corresponding area of their aluminum alloy type, which proved the correctness of the type zoning established earlier from the standard samples. On this basis, the identification of unknown types of aluminum alloy can be done. All the experimental results showed that the accuracy of the principal component analysis method based on laser-induced breakdown spectroscopy is more than 97.14%, and that it can classify the different types effectively. Compared to commonly used chemical methods, laser-induced breakdown spectroscopy can detect the sample in situ and fast, with little sample preparation; therefore, using the combination of LIBS and PCA in areas such as quality testing and on-line industrial control can save a lot of time and cost, and greatly improve the efficiency of detection.
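
    The scoring step described above, projecting each spectrum onto its leading principal components so that samples of the same alloy type cluster together, can be sketched with plain numpy. The spectra below are synthetic stand-ins (two invented "types" whose line intensities differ at different wavelengths), not LIBS data from the paper:

    ```python
    import numpy as np

    def pca_scores(X, n_components=3):
        """Principal component scores via SVD of the mean-centred
        data matrix (rows are spectra, columns are channels)."""
        Xc = X - X.mean(axis=0)
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:n_components].T

    rng = np.random.default_rng(0)
    # synthetic spectra (500 channels): two alloy "types" differing
    # in emission-line intensity at different wavelengths
    base = rng.random(500)
    type_a = base + rng.normal(0, 0.01, (10, 500))
    type_a[:, 50] += 1.0
    type_b = base + rng.normal(0, 0.01, (10, 500))
    type_b[:, 300] += 1.0

    scores = pca_scores(np.vstack([type_a, type_b]))
    # plotting scores[:, 0..2] in 3-D shows the two types falling
    # into separate clusters, as the paper reports for real spectra
    ```

    The type zoning in the paper amounts to drawing boundaries between such clusters in the 3-D score space and assigning unknown samples by where their scores fall.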

  1. Assessing the validity of road safety evaluation studies by analysing causal chains.

    Science.gov (United States)

    Elvik, Rune

    2003-09-01

    This paper discusses how the validity of road safety evaluation studies can be assessed by analysing causal chains. A causal chain denotes the path through which a road safety measure influences the number of accidents. Two cases are examined. One involves chemical de-icing of roads (salting). The intended causal chain of this measure is: spread of salt --> removal of snow and ice from the road surface --> improved friction --> shorter stopping distance --> fewer accidents. A Norwegian study that evaluated the effects of salting on accident rate provides information that describes this causal chain. This information indicates that the study overestimated the effect of salting on accident rate, and suggests that this estimate is influenced by confounding variables the study did not control for. The other case involves a traffic club for children. The intended causal chain in this study was: join the club --> improve knowledge --> improve behaviour --> reduce accident rate. In this case, results are rather messy, which suggests that the observed difference in accident rate between members and non-members of the traffic club is not primarily attributable to membership in the club. The two cases show that by analysing causal chains, one may uncover confounding factors that were not adequately controlled in a study. Lack of control for confounding factors remains the most serious threat to the validity of road safety evaluation studies.

  2. Quantitative Prediction of Coalbed Gas Content Based on Seismic Multiple-Attribute Analyses

    Directory of Open Access Journals (Sweden)

    Renfang Pan

    2015-09-01

    Full Text Available Accurate prediction of the planar distribution of gas is crucial to the selection and development of new CBM exploration areas. Based on seismic attributes, well logging and testing data, we found that seismic absorption attenuation, after eliminating the effects of burial depth, shows an evident correlation with CBM gas content; positive structure curvature has a negative correlation with gas content; and density has a negative correlation with gas content. It is feasible to use the hydrocarbon index (P*G) and pseudo-Poisson ratio attributes for detection of gas enrichment zones. Based on seismic multiple-attribute analyses, a multiple linear regression equation was established between the seismic attributes and the gas content at the drilling wells. Application of this equation to the seismic attributes at locations other than the drilling wells yielded a quantitative prediction of the planar gas distribution. Prediction calculations were performed for two different models, one using pre-stack inversion and the other disregarding it. A comparison of the results indicates that both models predicted a similar trend for the gas content distribution, except that the model using pre-stack inversion yielded a prediction of considerably higher precision.
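
    The regression step, fitting gas content at the wells as a linear function of several seismic attributes and then applying the fit away from the wells, can be sketched with least squares. The attribute values and gas contents below are invented illustrative numbers, not data from the paper:

    ```python
    import numpy as np

    # hypothetical calibration data at drilling wells: columns are
    # absorption attenuation, structure curvature and density
    X = np.array([[0.8, -0.2, 2.45],
                  [1.1, -0.5, 2.40],
                  [0.4,  0.3, 2.55],
                  [0.9, -0.1, 2.43],
                  [0.3,  0.4, 2.58]])
    y = np.array([14.2, 16.5, 9.8, 14.0, 9.1])  # gas content, m^3/t

    # least-squares fit of gas = b0 + b1*atten + b2*curv + b3*density
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    def predict(attrs):
        """Apply the regression to attributes away from the wells."""
        return coef[0] + np.asarray(attrs) @ coef[1:]
    ```

    Evaluating `predict` on a grid of attribute maps yields the planar gas-content prediction; the fitted signs of the coefficients should reproduce the correlations noted in the abstract (positive for attenuation, negative for curvature and density).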

  3. Sampling strategies for improving tree accuracy and phylogenetic analyses: a case study in ciliate protists, with notes on the genus Paramecium.

    Science.gov (United States)

    Yi, Zhenzhen; Strüder-Kypke, Michaela; Hu, Xiaozhong; Lin, Xiaofeng; Song, Weibo

    2014-02-01

    In order to assess how dataset-selection for multi-gene analyses affects the accuracy of inferred phylogenetic trees in ciliates, we chose five genes and the genus Paramecium, one of the most widely used model protist genera, and compared tree topologies of the single- and multi-gene analyses. Our empirical study shows that: (1) Using multiple genes improves phylogenetic accuracy, even when their one-gene topologies are in conflict with each other. (2) The impact of missing data on phylogenetic accuracy is ambiguous: resolution power and topological similarity, but not number of represented taxa, are the most important criteria of a dataset for inclusion in concatenated analyses. (3) As an example, we tested the three classification models of the genus Paramecium with a multi-gene based approach, and only the monophyly of the subgenus Paramecium is supported. Copyright © 2013 Elsevier Inc. All rights reserved.

  4. A Web-based Tool Combining Different Type Analyses

    DEFF Research Database (Denmark)

    Henriksen, Kim Steen; Gallagher, John Patrick

    2006-01-01

    of both, and they can be goal-dependent or goal-independent. We describe a prototype tool that can be accessed from a web browser, allowing various type analyses to be run. The first goal of the tool is to allow the analysis results to be examined conveniently by clicking on points in the original program...... the minimal "domain model" of the program with respect to the corresponding pre-interpretation, which can give more precise information than the original descriptive type....

  5. Sensitivity Study of Poisson's Ratio Used in Soil Structure Interaction (SSI) Analyses

    International Nuclear Information System (INIS)

    Han, Seung-ju; You, Dong-Hyun; Jang, Jung-bum; Yun, Kwan-hee

    2016-01-01

    The preliminary review for Design Certification (DC) of APR1400 was accepted by the NRC on March 4, 2015. After the acceptance of the application for standard DC of APR1400, KHNP has responded to the Requests for Additional Information (RAI) raised by the NRC to undertake a full design certification review. Design certification is achieved through the NRC's rulemaking process, and is founded on the staff's review of the application, which addresses the various safety issues associated with the proposed nuclear power plant design, independent of a specific site. Among the RAIs issued by the USNRC pertaining to Design Control Document (DCD) Ch.3.7 'Seismic Design' is the observation that DCD Tables 3.7A-1 and 3.7A-2 show Poisson's ratios in the S1 and S2 soil profiles used for SSI analysis as great as 0.47 and 0.48, respectively. Based on staff experience, use of Poisson's ratios approaching these values may result in numerical instability of the SSI analysis results. A sensitivity study was performed using the ACS SASSI NI model of APR1400 with the S1 and S2 soil profiles to demonstrate that the Poisson's ratio values used in the SSI analyses of the S1 and S2 soil profile cases do not produce numerical instabilities in the SSI analysis results. No abrupt changes or spurious peaks, which tend to indicate the existence of numerical sensitivities in the SASSI solutions, appear in the computed transfer functions of the original SSI analyses with maximum dynamic Poisson's ratio values of 0.47 and 0.48, nor in the re-computed transfer functions with maximum dynamic Poisson's ratio values limited to 0.42 and 0.45.

  6. Endorsement of PRISMA statement and quality of systematic reviews and meta-analyses published in nursing journals: a cross-sectional study.

    Science.gov (United States)

    Tam, Wilson W S; Lo, Kenneth K H; Khalechelvam, Parames

    2017-02-07

    Systematic reviews (SRs) often poorly report key information, thereby diminishing their usefulness. Previous studies evaluated published SRs and determined that they failed to meet explicit criteria or characteristics. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement was recommended as a reporting guideline for SR and meta-analysis (MA), but previous studies showed that adherence to the statement was not high for SRs published in different medical fields. Thus, the aims of this study are twofold: (1) to investigate the number of nursing journals that have required or recommended the use of the PRISMA statement for reporting SR, and (2) to examine the adherence of SRs and/or meta-analyses to the PRISMA statement published in nursing journals. A cross-sectional study. Nursing journals listed in the ISI journal citation report were divided into 2 groups based on the recommendation of PRISMA statement in their 'Instruction for Authors'. SRs and meta-analyses published in 2014 were searched in 3 databases. 37 SRs and meta-analyses were randomly selected in each group. The adherence of each item to the PRISMA was examined and summarised using descriptive statistics. The quality of the SRs was assessed by Assessing the Methodological Quality of Systematic Reviews. The differences between the 2 groups were compared using the Mann-Whitney U test. Out of 107 nursing journals, 30 (28.0%) recommended or required authors to follow the PRISMA statement when they submit SRs or meta-analyses. The median rates of adherence to the PRISMA statement for reviews published in journals with and without PRISMA endorsement were 64.9% (IQR: 17.6-92.3%) and 73.0% (IQR: 59.5-94.6%), respectively. No significant difference was observed in any of the items between the 2 groups. The median adherence of SRs and meta-analyses in nursing journals to PRISMA is low at 64.9% and 73.0%, respectively. Nonetheless, the adherence level of nursing journals to the

  7. Endorsement of PRISMA statement and quality of systematic reviews and meta-analyses published in nursing journals: a cross-sectional study

    Science.gov (United States)

    Tam, Wilson W S; Lo, Kenneth K H; Khalechelvam, Parames

    2017-01-01

    Objective Systematic reviews (SRs) often poorly report key information, thereby diminishing their usefulness. Previous studies evaluated published SRs and determined that they failed to meet explicit criteria or characteristics. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement was recommended as a reporting guideline for SR and meta-analysis (MA), but previous studies showed that adherence to the statement was not high for SRs published in different medical fields. Thus, the aims of this study are twofold: (1) to investigate the number of nursing journals that have required or recommended the use of the PRISMA statement for reporting SR, and (2) to examine the adherence of SRs and/or meta-analyses to the PRISMA statement published in nursing journals. Design A cross-sectional study. Methods Nursing journals listed in the ISI journal citation report were divided into 2 groups based on the recommendation of PRISMA statement in their ‘Instruction for Authors’. SRs and meta-analyses published in 2014 were searched in 3 databases. 37 SRs and meta-analyses were randomly selected in each group. The adherence of each item to the PRISMA was examined and summarised using descriptive statistics. The quality of the SRs was assessed by Assessing the Methodological Quality of Systematic Reviews. The differences between the 2 groups were compared using the Mann-Whitney U test. Results Out of 107 nursing journals, 30 (28.0%) recommended or required authors to follow the PRISMA statement when they submit SRs or meta-analyses. The median rates of adherence to the PRISMA statement for reviews published in journals with and without PRISMA endorsement were 64.9% (IQR: 17.6–92.3%) and 73.0% (IQR: 59.5–94.6%), respectively. No significant difference was observed in any of the items between the 2 groups. Conclusions The median adherence of SRs and meta-analyses in nursing journals to PRISMA is low at 64.9% and 73.0%, respectively
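
    The group comparison used in this study (the Mann-Whitney U test) reduces to counting, over all cross-group pairs, how often one group's value exceeds the other's. A minimal sketch of the statistic itself, with invented adherence rates for illustration (the significance lookup is omitted):

    ```python
    def mann_whitney_u(a, b):
        """U statistic for sample a: the number of pairs (x, y) with
        x from a and y from b such that x > y; ties count half."""
        u = 0.0
        for x in a:
            for y in b:
                if x > y:
                    u += 1.0
                elif x == y:
                    u += 0.5
        return u

    # invented per-review adherence rates (%) for journals with and
    # without PRISMA endorsement
    with_prisma = [64.9, 17.6, 92.3, 70.0]
    without_prisma = [73.0, 59.5, 94.6, 60.0]
    u = mann_whitney_u(with_prisma, without_prisma)
    # under no group difference, u would sit near len(a)*len(b)/2 = 8
    ```

    In practice one would use a library routine (e.g. scipy's `mannwhitneyu`) that also returns the p-value; the sketch only shows what the statistic measures.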

  8. Integrated approach for fusion multi-physics coupled analyses based on hybrid CAD and mesh geometries

    Energy Technology Data Exchange (ETDEWEB)

    Qiu, Yuefeng, E-mail: yuefeng.qiu@kit.edu; Lu, Lei; Fischer, Ulrich

    2015-10-15

    Highlights: • Integrated approach for neutronics, thermal and structural analyses was developed. • MCNP5/6 and TRIPOLI-4 were coupled with CFX, Fluent and ANSYS Workbench. • A novel meshing approach has been proposed for describing MC geometry. - Abstract: Coupled multi-physics analyses of fusion reactor devices require high-fidelity neutronic models and flexible, accurate data exchange between the various calculation codes. An integrated coupling approach has been developed to enable the conversion of CAD, mesh, or hybrid geometries for the Monte Carlo (MC) codes MCNP5/6 and TRIPOLI-4, and the translation of nuclear heating data for the CFD codes Fluent and CFX and the structural mechanics software ANSYS Workbench. The coupling approach has been implemented on the SALOME platform, with CAD modeling, mesh generation and data visualization capabilities. A novel meshing approach has been developed for generating meshes suitable for MC geometry descriptions. Verification calculations for several application cases showed the coupling approach to be reliable and efficient.

  9. Environmental risk factors of pregnancy outcomes: a summary of recent meta-analyses of epidemiological studies.

    Science.gov (United States)

    Nieuwenhuijsen, Mark J; Dadvand, Payam; Grellier, James; Martinez, David; Vrijheid, Martine

    2013-01-15

    Various epidemiological studies have suggested associations between environmental exposures and pregnancy outcomes. Some studies have attempted to combine information from various epidemiological studies using meta-analysis. We aimed to describe the methodologies used in these recent meta-analyses of environmental exposures and pregnancy outcomes. Furthermore, we aimed to report their main findings. We conducted a bibliographic search with relevant search terms. We obtained and evaluated 16 recent meta-analyses. The number of studies included in each reported meta-analysis varied greatly, with the largest number of studies available for environmental tobacco smoke. Only a small number of the studies reported having followed meta-analysis guidelines or having used a quality rating system. Generally they tested for heterogeneity and publication bias. Publication bias did not occur frequently. The meta-analyses found statistically significant negative associations between environmental tobacco smoke and stillbirth, birth weight and any congenital anomalies; PM2.5 and preterm birth; outdoor air pollution and some congenital anomalies; indoor air pollution from solid fuel use and stillbirth and birth weight; polychlorinated biphenyl (PCB) exposure and birth weight; disinfection by-products in water and stillbirth, small for gestational age and some congenital anomalies; occupational exposure to pesticides and solvents and some congenital anomalies; and agent orange and some congenital anomalies. The number of meta-analyses of environmental exposures and pregnancy outcomes is small and they vary in methodology. They reported statistically significant associations between environmental exposures such as environmental tobacco smoke, air pollution and chemicals and pregnancy outcomes.

  10. Environmental risk factors of pregnancy outcomes: a summary of recent meta-analyses of epidemiological studies

    Directory of Open Access Journals (Sweden)

    Nieuwenhuijsen Mark J

    2013-01-01

    Full Text Available Abstract Background Various epidemiological studies have suggested associations between environmental exposures and pregnancy outcomes. Some studies have attempted to combine information from various epidemiological studies using meta-analysis. We aimed to describe the methodologies used in these recent meta-analyses of environmental exposures and pregnancy outcomes. Furthermore, we aimed to report their main findings. Methods We conducted a bibliographic search with relevant search terms. We obtained and evaluated 16 recent meta-analyses. Results The number of studies included in each reported meta-analysis varied greatly, with the largest number of studies available for environmental tobacco smoke. Only a small number of the studies reported having followed meta-analysis guidelines or having used a quality rating system. Generally they tested for heterogeneity and publication bias. Publication bias did not occur frequently. The meta-analyses found statistically significant negative associations between environmental tobacco smoke and stillbirth, birth weight and any congenital anomalies; PM2.5 and preterm birth; outdoor air pollution and some congenital anomalies; indoor air pollution from solid fuel use and stillbirth and birth weight; polychlorinated biphenyl (PCB) exposure and birth weight; disinfection by-products in water and stillbirth, small for gestational age and some congenital anomalies; occupational exposure to pesticides and solvents and some congenital anomalies; and agent orange and some congenital anomalies. Conclusions The number of meta-analyses of environmental exposures and pregnancy outcomes is small and they vary in methodology. They reported statistically significant associations between environmental exposures such as environmental tobacco smoke, air pollution and chemicals and pregnancy outcomes.

  11. Molecular systematics of Indian Alysicarpus (Fabaceae) based on analyses of nuclear ribosomal DNA sequences.

    Science.gov (United States)

    Gholami, Akram; Subramaniam, Shweta; Geeta, R; Pandey, Arun K

    2017-06-01

    Alysicarpus Necker ex Desvaux (Fabaceae, Desmodieae) consists of ~30 species that are distributed in tropical and subtropical regions of the world. In India, the genus is represented by ca. 18 species, of which seven are endemic. Sequences of the nuclear internal transcribed spacer (ITS) from 38 accessions representing 16 Indian species were subjected to phylogenetic analyses. The ITS sequence data strongly support the monophyly of the genus Alysicarpus. Analyses revealed four major well-supported clades within Alysicarpus. Ancestral state reconstructions were done for two morphological characters, namely calyx length in relation to pod (macrocalyx and microcalyx) and pod surface ornamentation (transversely rugose and nonrugose). The present study is the first report on the molecular systematics of Indian Alysicarpus.

  12. Systematic review of model-based analyses reporting the cost-effectiveness and cost-utility of cardiovascular disease management programs.

    Science.gov (United States)

    Maru, Shoko; Byrnes, Joshua; Whitty, Jennifer A; Carrington, Melinda J; Stewart, Simon; Scuffham, Paul A

    2015-02-01

    The reported cost effectiveness of cardiovascular disease management programs (CVD-MPs) is highly variable, potentially leading to different funding decisions. This systematic review evaluates published modeled analyses to compare study methods and quality. Articles were included if an incremental cost-effectiveness ratio (ICER) or incremental cost-utility ratio (ICUR) was reported, the intervention was a multi-component program designed to manage or prevent a cardiovascular disease condition, and it addressed all domains specified in the American Heart Association Taxonomy for Disease Management. Nine articles (reporting 10 clinical outcomes) were included. Eight cost-utility and two cost-effectiveness analyses targeted hypertension (n=4), coronary heart disease (n=2), coronary heart disease plus stroke (n=1), heart failure (n=2) and hyperlipidemia (n=1). Study perspectives included the healthcare system (n=5), societal and fund holders (n=1), and a third-party payer (n=3); one was not explicitly stated. All analyses were modeled based on interventions of one to two years' duration. Time horizons were two years (n=1), 10 years (n=1) or lifetime (n=8). Model structures were Markov models (n=8), 'decision analytic models' (n=1), or not explicitly stated (n=1). Considerable variation was observed in clinical and economic assumptions and reporting practices. Of all ICERs/ICURs reported, including those of subgroups (n=16), four were above a US$50,000 acceptability threshold, six were below and six were dominant. The majority of CVD-MPs were reported to have favorable economic outcomes, but 25% came at unacceptably high cost for the outcomes achieved. Use of standardized reporting tools should increase transparency and help identify what drives the cost-effectiveness of CVD-MPs. © The European Society of Cardiology 2014.
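
The decision logic behind comparing an ICER/ICUR with a willingness-to-pay threshold (the US$50,000 figure above) can be sketched in a few lines; the incremental costs and effects below are hypothetical:

```python
def classify(d_cost, d_effect, threshold=50_000):
    """Classify an intervention from its incremental cost and incremental
    effect (e.g. QALYs) against a willingness-to-pay threshold."""
    if d_effect > 0 and d_cost <= 0:
        return "dominant"            # more effective and no more expensive
    if d_effect <= 0 and d_cost >= 0:
        return "dominated"           # no more effective and no cheaper
    icer = d_cost / d_effect         # incremental cost per extra unit of effect
    return "below threshold" if icer <= threshold else "above threshold"

# hypothetical program: +$12,000 per patient, +0.4 QALYs vs usual care
verdict = classify(12_000, 0.4)      # ICER = $30,000/QALY
```

A "dominant" result (cheaper and more effective) needs no ratio at all, which is why such programs are reported separately from those judged against the threshold.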

  13. A multi-criteria evaluation system for marine litter pollution based on statistical analyses of OSPAR beach litter monitoring time series.

    Science.gov (United States)

    Schulz, Marcus; Neumann, Daniel; Fleet, David M; Matthies, Michael

    2013-12-01

    During the last decades, marine pollution with anthropogenic litter has become a worldwide major environmental concern. Standardized monitoring of litter since 2001 on 78 beaches selected within the framework of the Convention for the Protection of the Marine Environment of the North-East Atlantic (OSPAR) has been used to identify temporal trends of marine litter. Based on statistical analyses of this dataset, a two-part multi-criteria evaluation system for beach litter pollution of the North-East Atlantic and the North Sea is proposed. Canonical correlation analyses, linear regression analyses, and non-parametric analyses of variance were used to identify different temporal trends. A classification of beaches was derived from cluster analyses and served to define different states of beach quality according to abundances of 17 input variables. The evaluation system is easily applicable and relies on the above-mentioned classification and on significant temporal trends implied by significant rank correlations. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. The interrelation between hypothyroidism and glaucoma: a critical review and meta-analyses.

    Science.gov (United States)

    Thvilum, Marianne; Brandt, Frans; Brix, Thomas Heiberg; Hegedüs, Laszlo

    2017-12-01

    Data on the association between hypothyroidism and glaucoma are conflicting. We sought to shed light on this by conducting a critical review and meta-analyses. The meta-analyses were conducted in adherence with the widely accepted MOOSE guidelines. Using the Medical Subject Heading (MeSH) terms hypothyroidism, myxoedema and glaucoma or intraocular pressure, case-control studies, cohort studies and cross-sectional studies were identified (PubMed) and reviewed. Using meta-analysis, the relative risk (RR) of coexistence of glaucoma and hypothyroidism was calculated. Based on the literature search, thirteen studies fulfilled the inclusion criteria and could be categorized into two groups based on the exposure. The designs of the studies varied considerably, and there was heterogeneity related to lack of power, weak phenotype classifications and length of follow-up. Eight studies had glaucoma (5757 patients) as exposure and hypothyroidism as outcome. Among these, we found a non-significantly increased risk of hypothyroidism associated with glaucoma (RR 1.65; 95% confidence interval [CI]: 0.97-2.82). Based on five studies (168 006 patients) with hypothyroidism as exposure and glaucoma as outcome, we found the risk of glaucoma to be significantly increased (RR 1.33; 95% CI: 1.13-1.58). These meta-analyses therefore suggest an association between hypothyroidism and subsequent glaucoma, whereas the reverse association, between glaucoma and subsequent hypothyroidism, did not reach statistical significance. However, larger scale studies with better phenotype classification, longer follow-up and consideration of comorbidity and other biases are needed to address a potential causal relationship. © 2017 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
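
The relative risks with 95% confidence intervals quoted above come from standard 2×2-table arithmetic on a log scale; a minimal sketch with invented counts (not the study's data):

```python
import math

def relative_risk(a, n1, b, n2):
    """Relative risk with a 95% CI via the log-normal approximation.

    a/n1: events/total in the exposed group; b/n2: in the unexposed group.
    """
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)   # standard error of log(RR)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# hypothetical: 30/1000 cases among exposed vs 20/1000 among unexposed
rr, lo, hi = relative_risk(30, 1000, 20, 1000)
```

When the interval spans 1.0 the association is non-significant, which is exactly the distinction drawn above between the two exposure directions.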

  15. Analysing a Web-Based E-Commerce Learning Community: A Case Study in Brazil.

    Science.gov (United States)

    Joia, Luiz Antonio

    2002-01-01

    Demonstrates the use of a Web-based participative virtual learning environment for graduate students in Brazil enrolled in an electronic commerce course in a Masters in Business Administration program. Discusses learning communities; computer-supported collaborative work and collaborative learning; influences on student participation; the role of…

  16. Suicidality and aggression during antidepressant treatment: systematic review and meta-analyses based on clinical study reports.

    Science.gov (United States)

    Sharma, Tarang; Guski, Louise Schow; Freund, Nanna; Gøtzsche, Peter C

    2016-01-27

    To study serious harms associated with selective serotonin and serotonin-norepinephrine reuptake inhibitors. Design: systematic review and meta-analyses. Primary outcomes were mortality and suicidality; secondary outcomes were aggressive behaviour and akathisia. Data sources were clinical study reports for duloxetine, fluoxetine, paroxetine, sertraline, and venlafaxine obtained from the European and UK drug regulators, and summary trial reports for duloxetine and fluoxetine from Eli Lilly's website. Eligible studies were double blind placebo controlled trials that contained any patient narratives or individual patient listings of harms. Two researchers extracted data independently; the outcomes were meta-analysed by Peto's exact method (fixed effect model). We included 70 trials (64,381 pages of clinical study reports) with 18,526 patients. These trials had limitations in the study design and discrepancies in reporting, which may have led to serious under-reporting of harms. For example, some outcomes appeared only in individual patient listings in appendices, which we had for only 32 trials, and we did not have case report forms for any of the trials. Differences in mortality (all deaths were in adults, odds ratio 1.28, 95% confidence interval 0.40 to 4.06), suicidality (1.21, 0.84 to 1.74), and akathisia (2.04, 0.93 to 4.48) were not significant, whereas patients taking antidepressants displayed more aggressive behaviour (1.93, 1.26 to 2.95). For adults, the odds ratios were 0.81 (0.51 to 1.28) for suicidality, 1.09 (0.55 to 2.14) for aggression, and 2.00 (0.79 to 5.04) for akathisia. The corresponding values for children and adolescents were 2.39 (1.31 to 4.33), 2.79 (1.62 to 4.81), and 2.15 (0.48 to 9.65). In the summary trial reports on Eli Lilly's website, almost all deaths were noted, but all suicidal ideation events were missing, and the information on the remaining outcomes was incomplete.
Because of the shortcomings identified and having only partial access to appendices with no access to case report forms, the harms
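
The Peto (one-step) fixed-effect method used above pools trials on the odds-ratio scale from observed-minus-expected event counts and hypergeometric variances. A sketch of the standard formula with made-up trial counts (the actual counts live in the clinical study reports):

```python
import math

def peto_pooled_or(trials):
    """Peto one-step fixed-effect pooled odds ratio with a 95% CI.

    trials: iterable of (events_treat, n_treat, events_ctrl, n_ctrl).
    """
    sum_oe, sum_v = 0.0, 0.0
    for et, nt, ec, nc in trials:
        n = nt + nc
        e_tot = et + ec
        expected = nt * e_tot / n                             # expected treatment events
        v = nt * nc * e_tot * (n - e_tot) / (n**2 * (n - 1))  # hypergeometric variance
        sum_oe += et - expected
        sum_v += v
    log_or = sum_oe / sum_v
    se = 1.0 / math.sqrt(sum_v)
    return (math.exp(log_or),
            math.exp(log_or - 1.96 * se),
            math.exp(log_or + 1.96 * se))

# two hypothetical trials with identical event rates in both arms
or_, lo, hi = peto_pooled_or([(5, 100, 5, 100), (3, 50, 3, 50)])
```

The method avoids the zero-cell corrections that plague ordinary odds ratios, which matters for rare outcomes such as deaths and suicidality events.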

  17. Multi-Criteria Analyses of Urban Planning for City Expansion: A Case Study of Zamora, Spain

    Directory of Open Access Journals (Sweden)

    Marco Criado

    2017-10-01

    Full Text Available This study has established a methodology to determine the most environmentally suitable area for the expansion of Zamora (Spain using geographic information system (GIS technology. The objective was to develop a GIS-based methodology for the identification of urban peripheral areas that are suitable for the accommodation of new buildings and services, that are compliant with environmental criteria, and that guarantee an adequate quality of life for the future population such that extra construction costs are avoided. The methodological core is based on two multi-criteria analyses (MCAs: MCA-1 determines areas suitable for building—the most environmentally sustainable areas that do not present risks or discomforts to the population—by analyzing the restrictive factors; MCA-2 takes the sectors that received a favorable evaluation in MCA-1, determines which of those have a lower economic overhead for construction, and analyzes the different conditioning criteria related to their pre-existing infrastructures. Finally, the location of the sectors is determined by a decision factor that satisfies some strategic need of the municipality.
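
The two-stage logic above (restrictive screening in MCA-1, weighted cost criteria in MCA-2 over the survivors) can be sketched on toy raster layers; the layer names, threshold and weights below are invented for illustration, not taken from the Zamora study:

```python
import numpy as np

# Toy 5x5 raster layers standing in for GIS criteria (all values invented)
flood_risk = np.linspace(0.0, 1.0, 25).reshape(5, 5)        # restrictive factor (MCA-1)
slope      = np.linspace(1.0, 0.0, 25).reshape(5, 5)        # conditioning criteria (MCA-2)
dist_roads = np.abs(np.linspace(-1.0, 1.0, 25)).reshape(5, 5)
dist_sewer = np.linspace(0.0, 1.0, 25).reshape(5, 5) ** 2

# MCA-1: exclude cells presenting risks or discomforts to the population
buildable = flood_risk < 0.7

# MCA-2: weighted sum of cost-related criteria (lower = cheaper to urbanise)
weights = {"slope": 0.5, "roads": 0.3, "sewer": 0.2}
cost = (weights["slope"] * slope
        + weights["roads"] * dist_roads
        + weights["sewer"] * dist_sewer)

cost = np.where(buildable, cost, np.inf)     # rank only the MCA-1 survivors
best = np.unravel_index(np.argmin(cost), cost.shape)
```

Masking with infinity keeps the two stages cleanly separated: no amount of cheap infrastructure can rescue a cell excluded on environmental or risk grounds.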

  18. A Review of Sparsity-Based Methods for Analysing Radar Returns from Helicopter Rotor Blades

    Science.gov (United States)

    2016-09-01

    performance study of these algorithms in the particular problem of analysing backscatter signals from rotating blades. The report is organised as follows...provide further insight into the behaviour of the techniques. Here, the algorithms for MP, OMP, CGP, gOMP and ROMP terminate when 10 atoms are

  19. Implication of the cause of differences in 3D structures of proteins with high sequence identity based on analyses of amino acid sequences and 3D structures.

    Science.gov (United States)

    Matsuoka, Masanari; Sugita, Masatake; Kikuchi, Takeshi

    2014-09-18

    Proteins that share a high sequence homology while exhibiting drastically different 3D structures are investigated in this study. Recently, artificial proteins related to the sequences of the GA and IgG binding GB domains of human serum albumin have been designed. These artificial proteins, referred to as GA and GB, share 98% amino acid sequence identity but exhibit different 3D structures, namely, a 3α bundle versus a 4β + α structure. Discriminating between their 3D structures based on their amino acid sequences is a very difficult problem. In the present work, in addition to using bioinformatics techniques, an analysis based on inter-residue average distance statistics is used to address this problem. Ordinary analyses such as BLAST searches and conservation analyses alone could not distinguish which structure a given sequence would adopt. However, combining them with the analysis based on inter-residue average distance statistics and our sequence tendency analysis made it possible to infer which parts play an important role in structural formation. The results suggest possible determinants of the different 3D structures for sequences with high sequence identity. The possibility of discriminating between the 3D structures based on the given sequences is also discussed.

  20. Study of thermal-hydraulic analyses with CIP method

    International Nuclear Information System (INIS)

    Doi, Yoshihiro

    1996-09-01

    A new type of numerical scheme, CIP, has been proposed for solving hyperbolic equations and has attracted attention as a scheme with low numerical diffusion. The C-CUP method with the CIP scheme has been applied to numerical simulations that treat compressible and incompressible fluids, phase-change phenomena and mixture fluids. To evaluate the applicability of the CIP scheme and the C-CUP method to thermal-hydraulic analyses related to Fast Breeder Reactors (FBRs), the scheme and the method were reviewed. The features of the CIP scheme and the procedure of the C-CUP method are presented. The CIP scheme is used to solve linear hyperbolic equations for the advection terms in the basic fluid equations. The key idea of the scheme is that the profile between grid points is described by a cubic polynomial and its spatial derivatives, which allows the scheme to capture steep changes in the solution and to suppress numerical error. In the C-CUP method, the basic fluid equations are split into advection terms and the remaining terms; the advection terms are solved with the CIP scheme and the remaining terms with a difference method. The C-CUP method is robust against numerical instability, but mass conservation can be violated because the fluid equations are solved in nonconservative form. Numerical analyses with the CIP scheme and the C-CUP method have been performed for phase change, mixtures and moving objects. These analyses rely on the scheme and the method being robust to steep density changes and useful for interface tracking. (author)
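
The core of the CIP scheme — advecting both the value and its spatial derivative with an upwind cubic polynomial — can be sketched in 1-D. This is an illustrative reconstruction of the standard scheme for u_t + c u_x = 0, not code from the report; grid size, CFL number and initial profile are all assumptions:

```python
import numpy as np

def cip_step(f, g, c, dx, dt):
    """One CIP update of u_t + c u_x = 0 (c > 0) on a periodic grid.

    f is the field, g its spatial derivative; both are advected using the
    cubic polynomial fixed by (f, g) at node i and its upwind neighbour i-1.
    """
    fup, gup = np.roll(f, 1), np.roll(g, 1)   # upwind neighbour i-1
    D, xi = -dx, -c * dt                      # upwind distance and shift
    a = (g + gup) / D**2 + 2.0 * (f - fup) / D**3
    b = 3.0 * (fup - f) / D**2 - (2.0 * g + gup) / D
    f_new = ((a * xi + b) * xi + g) * xi + f  # cubic evaluated at xi
    g_new = (3.0 * a * xi + 2.0 * b) * xi + g # its derivative at xi
    return f_new, g_new

# advect a Gaussian once around a periodic domain
nx, L, c = 100, 1.0, 1.0
dx = L / nx
x = np.arange(nx) * dx
f0 = np.exp(-((x - 0.5) / 0.1) ** 2)
g = np.gradient(f0, dx)
f = f0.copy()
dt = 0.5 * dx / c                  # CFL = 0.5
for _ in range(2 * nx):            # total travel distance = one full period
    f, g = cip_step(f, g, c, dx, dt)
```

Carrying the derivative as a second advected quantity is what lets the cubic profile stay sharp where an ordinary upwind scheme would smear the pulse.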

  1. Estimation of effective block conductivities based on discrete network analyses using data from the Aespoe site

    International Nuclear Information System (INIS)

    La Pointe, P.R.; Wallmann, P.; Follin, S.

    1995-09-01

    Numerical continuum codes may be used for assessing the role of regional groundwater flow in far-field safety analyses of a nuclear waste repository at depth. The focus of this project is to develop and evaluate one method based on Discrete Fracture Network (DFN) models to estimate block-scale permeability values for continuum codes. Data from the Aespoe HRL and surrounding area are used. 57 refs, 76 figs, 15 tabs

  2. Evaluation of the Pseudostatic Analyses of Earth Dams Using FE Simulation and Observed Earthquake-Induced Deformations: Case Studies of Upper San Fernando and Kitayama Dams

    Directory of Open Access Journals (Sweden)

    Tohid Akhlaghi

    2014-01-01

    Full Text Available Evaluation of the accuracy of the pseudostatic approach is governed by the accuracy with which the simple pseudostatic inertial forces represent the complex dynamic inertial forces that actually exist in an earthquake. In this study, the Upper San Fernando and Kitayama earth dams, which were designed using the pseudostatic approach and damaged during the 1971 San Fernando and 1995 Kobe earthquakes, were investigated and analyzed. The finite element models of the dams were prepared based on the detailed available data and the results of in situ and laboratory material tests. Dynamic analyses were conducted to simulate the earthquake-induced deformations of the dams using the Plaxis finite element code. The pseudostatic seismic coefficients used in the design and analyses of the dams were then compared with the seismic coefficients obtained from dynamic analyses of the simulated models as well as with other available pseudostatic correlations. Based on these comparisons, the accuracy and reliability of the pseudostatic seismic coefficients are evaluated and discussed.

  3. EEG analyses with SOBI.

    Energy Technology Data Exchange (ETDEWEB)

    Glickman, Matthew R.; Tang, Akaysha (University of New Mexico, Albuquerque, NM)

    2009-02-01

    The motivating vision behind Sandia's MENTOR/PAL LDRD project has been that of systems which use real-time psychophysiological data to support and enhance human performance, both individually and of groups. Relevant and significant psychophysiological data being a necessary prerequisite to such systems, this LDRD has focused on identifying and refining such signals. The project has focused in particular on EEG (electroencephalogram) data as a promising candidate signal because it (potentially) provides a broad window on brain activity with relatively low cost and logistical constraints. We report here on two analyses performed on EEG data collected in this project using the SOBI (Second Order Blind Identification) algorithm to identify two independent sources of brain activity: one in the frontal lobe and one in the occipital. The first study looks at directional influences between the two components, while the second study looks at inferring gender based upon the frontal component.

  4. Stress and deflection analyses of floating roofs based on a load-modifying method

    International Nuclear Information System (INIS)

    Sun Xiushan; Liu Yinghua; Wang Jianbin; Cen Zhangzhi

    2008-01-01

    This paper proposes a load-modifying method for the stress and deflection analyses of floating roofs used in cylindrical oil storage tanks. The formulations of loads and deformations are derived according to the equilibrium analysis of floating roofs. Based on these formulations, the load-modifying method is developed to conduct a geometrically nonlinear analysis of floating roofs with the finite element (FE) simulation. In the procedure with the load-modifying method, the analysis is carried out through a series of iterative computations until a convergence is achieved within the error tolerance. Numerical examples are given to demonstrate the validity and reliability of the proposed method, which provides an effective and practical numerical solution to the design and analysis of floating roofs

  5. Analysing spatially extended high-dimensional dynamics by recurrence plots

    Energy Technology Data Exchange (ETDEWEB)

    Marwan, Norbert, E-mail: marwan@pik-potsdam.de [Potsdam Institute for Climate Impact Research, 14412 Potsdam (Germany); Kurths, Jürgen [Potsdam Institute for Climate Impact Research, 14412 Potsdam (Germany); Humboldt Universität zu Berlin, Institut für Physik (Germany); Nizhny Novgorod State University, Department of Control Theory, Nizhny Novgorod (Russian Federation); Foerster, Saskia [GFZ German Research Centre for Geosciences, Section 1.4 Remote Sensing, Telegrafenberg, 14473 Potsdam (Germany)

    2015-05-08

    Recurrence plot based measures of complexity are powerful tools for characterizing complex dynamics. In this letter we show the potential of selected recurrence plot measures for the investigation of even high-dimensional dynamics. We apply this method to spatially extended chaos, such as derived from the Lorenz96 model, and show that the recurrence plot based measures can qualitatively characterize typical dynamical properties such as chaotic or periodic dynamics. Moreover, we demonstrate its power by analysing satellite image time series of vegetation cover with contrasting dynamics as a spatially extended and potentially high-dimensional example from the real world. - Highlights: • We use recurrence plots for analysing spatially extended dynamics. • We investigate the high-dimensional chaos of the Lorenz96 model. • The approach distinguishes different spatio-temporal dynamics. • We use the method for studying vegetation cover time series.
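
A recurrence plot and its simplest quantification measure, the recurrence rate, take only a few lines. This sketch uses a plain 1-D sine series and a fixed threshold for clarity; in practice one works with embedded phase-space vectors and carefully chosen thresholds:

```python
import numpy as np

def recurrence_plot(x, eps):
    """Binary recurrence matrix R[i, j] = 1 where |x_i - x_j| < eps (1-D series)."""
    d = np.abs(x[:, None] - x[None, :])   # pairwise distances
    return (d < eps).astype(int)

t = np.linspace(0, 4 * np.pi, 200)
rp = recurrence_plot(np.sin(t), eps=0.1)
rr = rp.mean()                            # recurrence rate: density of recurrences
```

Periodic dynamics show up as unbroken diagonal lines in the matrix, chaos as short broken diagonals; measures such as determinism quantify exactly that line structure.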

  6. An Apple II -based bidimensional pulse height analyser

    International Nuclear Information System (INIS)

    Bateman, J.E.; Flesher, A.C.; Honeyman, R.N.; Pritchard, T.E.; Price, W.P.R.

    1984-06-01

    The implementation of a pulse height analyser function in an Apple II microcomputer using minimal purpose-built hardware is described. Except for a small interface module, the system consists of two suites of software, one giving a conventional one-dimensional analysis over a span of 1024 channels, and the other a two-dimensional analysis in a 128 x 128 image format. Using the recently introduced ACCELERATOR coprocessor card, the system performs with a dead time per event of less than 50 μs. Full software facilities are provided for display, storage and processing of the data using standard Applesoft BASIC. (author)
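
Functionally, a pulse height analyser is a histogrammer: each digitised pulse amplitude increments one channel counter. A sketch of the one-dimensional 1024-channel mode on simulated pulses; the full-scale voltage and the pulse distribution are invented, not taken from the Apple II system:

```python
import numpy as np

rng = np.random.default_rng(1)
# simulated detector pulse amplitudes in volts (hypothetical peak at 3.2 V)
pulses = rng.normal(loc=3.2, scale=0.4, size=10_000)

n_channels, v_max = 1024, 10.0
# digitise: map 0..v_max onto channels 0..1023, clamping out-of-range pulses
channels = np.clip((pulses / v_max * n_channels).astype(int), 0, n_channels - 1)
spectrum = np.bincount(channels, minlength=n_channels)   # counts per channel
```

The two-dimensional mode is the same idea with a pair of amplitudes indexing a 128 x 128 count matrix.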

  7. DESIGNING EAP MATERIALS BASED ON INTERCULTURAL CORPUS ANALYSES: THE CASE OF LOGICAL MARKERS IN RESEARCH ARTICLES

    Directory of Open Access Journals (Sweden)

    Pilar Mur Dueñas

    2009-10-01

    Full Text Available The ultimate aim of intercultural analyses in English for Academic Purposes is to help non-native scholars function successfully in the international disciplinary community in English. The aim of this paper is to show how corpus-based intercultural analyses can be useful to design EAP materials on a particular metadiscourse category, logical markers, in research article writing. The paper first describes the analysis carried out of additive, contrastive and consecutive logical markers in a corpus of research articles in English and in Spanish in a particular discipline, Business Management. Differences were found in their frequency and also in the use of each of the sub-categories. Then, five activities designed on the basis of these results are presented. They are aimed at raising Spanish Business scholars' awareness of the specific uses and pragmatic function of frequent logical markers in international research articles in English.

  8. CVD diamond Brewster window: feasibility study by FEM analyses

    Directory of Open Access Journals (Sweden)

    Vaccaro A.

    2012-09-01

    Full Text Available Chemical vapor deposition (CVD diamond windows are a crucial component in heating and current drive (H&CD applications. In order to minimize the amount of reflected power from the diamond disc, its thickness must match the desired beam wavelength, thus proper targeting of the plasma requires movable beam reflectors. This is the case, for instance, of the ITER electron cyclotron H&CD system. However, looking at DEMO, the higher heat loads and neutron fluxes could make the use of movable parts close to the plasma difficult. The issue might be solved by using gyrotrons able to tune the beam frequency to the desired resonance, but this concept requires transmission windows that work in a given frequency range, such as the Brewster window. It consists of a CVD diamond disc brazed to two copper cuffs at the Brewster angle. The brazing process is carried out at about 800°C and then the temperature is decreased down to room temperature. Diamond and copper have very different thermal expansion coefficients, therefore high stresses build up during the cool down phase that might lead to failure of the disc. Considering also the complex geometry of the window with the skewed position of the disc, analyses are required in the first place to check its feasibility. The cool down phase was simulated by FEM structural analyses for several geometric and constraint configurations of the window. A study of indirect cooling of the window by water was also performed considering a HE11 mode beam. The results are here reported.

  9. A two-channel wave analyser for sounding rockets and satellites

    International Nuclear Information System (INIS)

    Brondz, E.

    1989-04-01

    Studies of low frequency electromagnetic waves, produced originally by lightning discharges penetrating the ionosphere, provide an important source of valuable information about the earth's surrounding plasma. The use of rockets and satellites, supported by ground-based observations, offers a unique opportunity to measure a number of parameters simultaneously in situ, in order to correlate data from various measurements. However, every rocket experiment has to be designed bearing in mind telemetry limitations and/or short flight duration. Typical flight duration for Norwegian rockets launched from Andoeya Rocket Range is 500 to 600 s. Therefore, the preferred way to use a rocket or satellite is to carry out data analyses on board in real time. Recent achievements in Digital Signal Processing (DSP) technology have made it possible to undertake very complex on-board data manipulation. As part of a rocket instrumentation, a DSP based unit able to carry out on-board analyses of low frequency electromagnetic waves in the ionosphere has been designed. The unit can be seen as a general purpose computer built around a fixed-point 16 bit signal processor, supplied with program code to perform wave analyses on two independent channels simultaneously. The analyser is able to perform 256-point complex fast Fourier transforms, and it produces a spectral power density estimate on both channels every 85 ms. The design and construction of the DSP based unit are described and results from the tests are presented.
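
The on-board computation described — a 256-point FFT yielding a power spectral density estimate per channel — can be sketched with NumPy. The sampling rate and the two test tones are assumptions for illustration; the flight unit works on fixed-point hardware rather than floating point:

```python
import numpy as np

def psd_256(block, fs):
    """PSD estimate from one 256-sample block (rectangular window)."""
    spec = np.fft.rfft(block, n=256)
    psd = (np.abs(spec) ** 2) / (fs * 256)   # periodogram scaling
    psd[1:-1] *= 2                           # fold in negative frequencies
    freqs = np.fft.rfftfreq(256, d=1 / fs)
    return freqs, psd

fs = 3000.0                                  # assumed sampling rate, Hz
t = np.arange(256) / fs
ch1 = np.sin(2 * np.pi * 500 * t)            # two independent channels
ch2 = np.sin(2 * np.pi * 900 * t)
f1, p1 = psd_256(ch1, fs)
f2, p2 = psd_256(ch2, fs)
```

At 3 kHz sampling, one 256-sample block spans about 85 ms, which matches the quoted update rate and gives a frequency resolution of fs/256 ≈ 12 Hz.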

  10. A case study of discordant overlapping meta-analyses: vitamin d supplements and fracture.

    Directory of Open Access Journals (Sweden)

    Mark J Bolland

    Full Text Available BACKGROUND: Overlapping meta-analyses on the same topic are now very common, and discordant results often occur. To explore why discordant results arise, we examined a common topic for overlapping meta-analyses: vitamin D supplements and fracture. METHODS AND FINDINGS: We identified 24 meta-analyses of vitamin D (with or without calcium) and fracture in a PubMed search in October 2013, and analysed a sample of 7 meta-analyses in the highest ranking general medicine journals. We used the AMSTAR tool to assess the quality of the meta-analyses, and compared their methodologies, analytic techniques and results. Applying the AMSTAR tool suggested the meta-analyses were generally of high quality. Despite this, there were important differences in trial selection, data extraction, and analytical methods that were only apparent after detailed assessment. 25 trials were included in at least one meta-analysis. Four meta-analyses included all eligible trials according to the stated inclusion and exclusion criteria, but the other 3 meta-analyses "missed" between 3 and 8 trials, and 2 meta-analyses included apparently ineligible trials. The relative risks used for individual trials differed between meta-analyses for total fracture in 10 of 15 trials, and for hip fracture in 6 of 12 trials, because of different outcome definitions and analytic approaches. The majority of differences (11/16) led to more favourable estimates of vitamin D efficacy compared to estimates derived from unadjusted intention-to-treat analyses using all randomised participants. The conclusions of the meta-analyses were discordant, ranging from strong statements that vitamin D prevents fractures to equally strong statements that vitamin D without calcium does not prevent fractures.
CONCLUSIONS: Substantial differences in trial selection, outcome definition and analytic methods between overlapping meta-analyses led to discordant estimates of the efficacy of vitamin D for fracture prevention

  11. Gene Set Analyses of Genome-Wide Association Studies on 49 Quantitative Traits Measured in a Single Genetic Epidemiology Dataset

    Directory of Open Access Journals (Sweden)

    Jihye Kim

    2013-09-01

    Full Text Available Gene set analysis is a powerful tool for interpreting a genome-wide association study result and is gaining popularity these days. Comparison of the gene sets obtained for a variety of traits measured from a single genetic epidemiology dataset may give insights into the biological mechanisms underlying these traits. Based on the previously published single nucleotide polymorphism (SNP) genotype data on 8,842 individuals enrolled in the Korea Association Resource project, we performed a series of systematic genome-wide association analyses for 49 quantitative traits of basic epidemiological, anthropometric, or blood chemistry parameters. Each analysis result was subjected to subsequent gene set analyses based on Gene Ontology (GO) terms using the gene set analysis software GSA-SNP, identifying a set of GO terms significantly associated with each trait (Pcorr < 0.05). Pairwise comparison of the traits in terms of the semantic similarity of their GO sets revealed surprising cases where phenotypically uncorrelated traits showed high similarity in terms of biological pathways. For example, the pH level was related to 7 other traits that showed low phenotypic correlations with it. A literature survey implies that these traits may be regulated partly by common pathways that involve neuronal or nerve systems.
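
A common significance test in GO-based gene set analysis is hypergeometric enrichment: how surprising is it that k of the n trait-associated genes fall inside a term covering K of the N background genes? A minimal sketch with invented numbers; note that GSA-SNP itself scores gene sets from per-gene SNP p-values rather than with this simple overlap test:

```python
from math import comb

def enrichment_p(k, n, K, N):
    """One-sided hypergeometric tail P(X >= k): k of n trait genes hit a
    GO term that annotates K of the N genes in the background."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(n, K) + 1)) / comb(N, n)

# hypothetical term: 40 of 8000 background genes annotated, 5 hits among 100
p = enrichment_p(5, 100, 40, 8000)
```

With only 0.5 hits expected by chance, five observed hits gives a very small p-value; in practice this is then corrected for the thousands of GO terms tested, as the Pcorr threshold above implies.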

  12. Pseudogenes and DNA-based diet analyses: A cautionary tale from a relatively well sampled predator-prey system

    DEFF Research Database (Denmark)

    Dunshea, G.; Barros, N. B.; Wells, R. S.

    2008-01-01

    Mitochondrial ribosomal DNA is commonly used in DNA-based dietary analyses. In such studies, these sequences are generally assumed to be the only version present in DNA of the organism of interest. However, nuclear pseudogenes that display variable similarity to the mitochondrial versions are common in many taxa. The presence of nuclear pseudogenes that co-amplify with their mitochondrial paralogues can lead to several possible confounding interpretations when applied to estimating animal diet. Here, we investigate the occurrence of nuclear pseudogenes in fecal samples taken from bottlenose dolphins (Tursiops truncatus) that were assayed for prey DNA with a universal primer technique. We found pseudogenes in 13 of 15 samples and 1-5 pseudogene haplotypes per sample representing 5-100% of all amplicons produced. The proportion of amplicons that were pseudogenes and the diversity of prey DNA...

  13. Using meta-analytic path analysis to test theoretical predictions in health behavior: An illustration based on meta-analyses of the theory of planned behavior.

    Science.gov (United States)

    Hagger, Martin S; Chan, Derwin K C; Protogerou, Cleo; Chatzisarantis, Nikos L D

    2016-08-01

    Synthesizing research on social cognitive theories applied to health behavior is an important step in the development of an evidence base of psychological factors as targets for effective behavioral interventions. However, few meta-analyses of research on social cognitive theories in health contexts have conducted simultaneous tests of theoretically-stipulated pattern effects using path analysis. We argue that conducting path analyses of meta-analytic effects among constructs from social cognitive theories is important to test nomological validity, account for mediation effects, and evaluate unique effects of theory constructs independent of past behavior. We illustrate our points by conducting new analyses of two meta-analyses of a popular theory applied to health behaviors, the theory of planned behavior. We conducted meta-analytic path analyses of the theory in two behavioral contexts (alcohol and dietary behaviors) using data from the primary studies included in the original meta-analyses augmented to include intercorrelations among constructs and relations with past behavior missing from the original analysis. Findings supported the nomological validity of the theory and its hypotheses for both behaviors, confirmed important model processes through mediation analysis, demonstrated the attenuating effect of past behavior on theory relations, and provided estimates of the unique effects of theory constructs independent of past behavior. Our analysis illustrates the importance of conducting a simultaneous test of theory-stipulated effects in meta-analyses of social cognitive theories applied to health behavior. We recommend researchers adopt this analytic procedure when synthesizing evidence across primary tests of social cognitive theories in health. Copyright © 2016 Elsevier Inc. All rights reserved.
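
The first stage of a meta-analytic path analysis is pooling each cell of the construct correlation matrix across primary studies, typically with sample-size-weighted Fisher z transforms. One pooled cell might be computed as below; the correlations and sample sizes are invented, not taken from the theory of planned behavior meta-analyses:

```python
import math

def pool_correlations(rs, ns):
    """Fixed-effect pooling of correlations via Fisher's z transform.

    Each study's z = atanh(r) has approximate variance 1/(n - 3),
    so studies are weighted by n - 3 before back-transforming.
    """
    zs = [math.atanh(r) for r in rs]
    ws = [n - 3 for n in ns]
    z = sum(w * zi for w, zi in zip(ws, zs)) / sum(ws)
    return math.tanh(z)

# hypothetical intention-behavior correlations from three primary studies
r_pooled = pool_correlations([0.45, 0.52, 0.38], [120, 250, 90])
```

Repeating this for every construct pair yields the pooled matrix to which the path model (e.g. attitude → intention → behavior, with past behavior as a covariate) is then fitted.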

  14. Bench top and portable mineral analysers, borehole core analysers and in situ borehole logging

    International Nuclear Information System (INIS)

    Howarth, W.J.; Watt, J.S.

    1982-01-01

Bench top and portable mineral analysers are usually based on balanced filter techniques using scintillation detectors or on low resolution proportional detectors. The application of radioisotope x-ray techniques to in situ borehole logging is increasing, and is particularly suited to logging for tin and higher atomic number elements.

  15. Models and analyses for inertial-confinement fusion-reactor studies

    International Nuclear Information System (INIS)

    Bohachevsky, I.O.

    1981-05-01

This report describes models and analyses devised at Los Alamos National Laboratory to determine the technical characteristics of different inertial confinement fusion (ICF) reactor elements required for component integration into a functional unit. We emphasize the generic properties of the different elements rather than specific designs. The topics discussed are general ICF reactor design considerations; reactor cavity phenomena, including the restoration of interpulse ambient conditions; first-wall temperature increases and material losses; reactor neutronics and hydrodynamic blanket response to neutron energy deposition; and analyses of loads and stresses in the reactor vessel walls, including remarks about the generation and propagation of very short wavelength stress waves. A discussion of analytic approaches useful in integrations and optimizations of ICF reactor systems concludes the report.

  16. Analyses of non-fatal accidents in an opencast mine by logistic regression model - a case study.

    Science.gov (United States)

    Onder, Seyhan; Mutlu, Mert

    2017-09-01

Accidents cause major damage to both workers and enterprises in the mining industry. To reduce the number of occupational accidents, these incidents should be properly registered and carefully analysed. This study examines the Aegean Lignite Enterprise (ELI) of Turkish Coal Enterprises (TKI) in Soma between 2006 and 2011, using opencast coal mine occupational accident records for statistical analyses. A total of 231 occupational accidents were analysed for this study. The accident records were categorized into seven groups: area, reason, occupation, part of body, age, shift hour and lost days. The SPSS package program was used for logistic regression analyses, which predicted the probability of accidents resulting in more or fewer than 3 lost workdays for non-fatal injuries. Social facilities (area of surface installations), workshops and opencast mining areas are the areas with the highest probability of accidents with more than 3 lost workdays for non-fatal injuries, while the reasons with the highest probability for these types of accidents are transporting and manual handling. Additionally, the model was tested against accidents reported in 2012 for the ELI in Soma and correctly estimated the probability of accidents with lost workdays 70% of the time.
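For illustration, the kind of model the abstract describes (a logistic regression of whether a non-fatal accident costs more than 3 lost workdays) can be sketched in plain Python. The counts and the manual-handling predictor below are hypothetical, and the gradient-ascent fit merely stands in for the SPSS routine used in the study.

```python
import math

def fit_logistic(xs, ys, lr=0.5, iters=20000):
    """Fit P(y=1) = sigmoid(b0 + b1*x) by gradient ascent on the log-likelihood."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(iters):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p
            g1 += (y - p) * x
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# Hypothetical 2x2 data: x = 1 for manual-handling accidents,
# y = 1 for accidents with more than 3 lost workdays.
xs = [1] * 40 + [0] * 40
ys = [1] * 30 + [0] * 10 + [1] * 10 + [0] * 30
b0, b1 = fit_logistic(xs, ys)
odds_ratio = math.exp(b1)  # for these counts the MLE equals (30*30)/(10*10) = 9
```

With a single binary predictor, the fitted odds ratio reproduces the cross-product ratio of the 2x2 table, which is a handy sanity check on the fit.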

  17. Studying distributed cognition of simulation-based team training with DiCoT.

    Science.gov (United States)

    Rybing, Jonas; Nilsson, Heléne; Jonson, Carl-Oscar; Bang, Magnus

    2016-03-01

Health care organizations employ simulation-based team training (SBTT) to improve skill, communication and coordination in a broad range of critical care contexts. Quantitative approaches, such as team performance measurements, are predominantly used to measure the effectiveness of SBTT. However, a practical evaluation method that examines how this approach supports cognition and teamwork is missing. We have applied Distributed Cognition for Teamwork (DiCoT), a method for analysing cognition and collaboration aspects of work settings, with the purpose of assessing the methodology's usefulness for evaluating SBTTs. In a case study, we observed and analysed four Emergo Train System® simulation exercises where medical professionals trained emergency response routines. The study suggests that DiCoT is an applicable and learnable tool for determining key distributed cognition attributes of SBTTs that are of importance for the simulation validity of training environments. Moreover, we discuss and exemplify how DiCoT supports design of SBTTs with a focus on transfer and validity characteristics. Practitioner Summary: In this study, we have evaluated a method to assess simulation-based team training environments from a cognitive ergonomics perspective. Using a case study, we analysed Distributed Cognition for Teamwork (DiCoT) by applying it to the Emergo Train System®. We conclude that DiCoT is useful for SBTT evaluation and simulator (re)design.

  18. Identifying null meta-analyses that are ripe for updating

    Directory of Open Access Journals (Sweden)

    Fang Manchun

    2003-07-01

    Full Text Available Abstract Background As an increasingly large number of meta-analyses are published, quantitative methods are needed to help clinicians and systematic review teams determine when meta-analyses are not up to date. Methods We propose new methods for determining when non-significant meta-analytic results might be overturned, based on a prediction of the number of participants required in new studies. To guide decision making, we introduce the "new participant ratio", the ratio of the actual number of participants in new studies to the predicted number required to obtain statistical significance. A simulation study was conducted to study the performance of our methods and a real meta-analysis provides further evidence. Results In our three simulation configurations, our diagnostic test for determining whether a meta-analysis is out of date had sensitivity of 55%, 62%, and 49% with corresponding specificity of 85%, 80%, and 90% respectively. Conclusions Simulations suggest that our methods are able to detect out-of-date meta-analyses. These quick and approximate methods show promise for use by systematic review teams to help decide whether to commit the considerable resources required to update a meta-analysis. Further investigation and evaluation of the methods is required before they can be recommended for general use.
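A minimal sketch of the "new participant ratio" idea, under the simplifying assumption (ours, not necessarily the authors' exact prediction method) that the pooled z statistic grows with the square root of the total sample size:

```python
def predicted_new_participants(theta, se, n_total, z_crit=1.96):
    """Participants needed in new studies for the pooled estimate to reach
    |z| >= z_crit, assuming the effect estimate stays at theta and the
    standard error shrinks as 1/sqrt(N)."""
    z = abs(theta) / se
    if z >= z_crit:
        return 0.0
    n_required = n_total * (z_crit / z) ** 2
    return n_required - n_total

def new_participant_ratio(actual_new, theta, se, n_total):
    """Ratio of participants actually accrued in new studies to the
    predicted requirement; values near or above 1 suggest the
    meta-analysis may be ripe for updating."""
    needed = predicted_new_participants(theta, se, n_total)
    return float("inf") if needed == 0 else actual_new / needed
```

For example, a pooled effect of 0.1 with standard error 0.1 (z = 1) from 400 participants would, under this scaling, need roughly 1137 additional participants to reach significance.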

  19. A fast multichannel analyser

    International Nuclear Information System (INIS)

    Berry, A.; Przybylski, M.M.; Sumner, I.

    1982-01-01

A fast multichannel analyser (MCA) capable of sampling at a rate of 10^7 s^-1 has been developed. The instrument is based on an 8-bit parallel encoding analogue to digital converter (ADC) reading into a fast histogramming random access memory (RAM) system, giving 256 channels of 64k count capacity. The prototype unit is in CAMAC format. (orig.)

  20. Application of Probabilistic Multiple-Bias Analyses to a Cohort- and a Case-Control Study on the Association between Pandemrix™ and Narcolepsy.

    Directory of Open Access Journals (Sweden)

    Kaatje Bollaerts

Full Text Available An increase in narcolepsy cases was observed in Finland and Sweden towards the end of the 2009 H1N1 influenza pandemic. Preliminary observational studies suggested a temporal link with the pandemic influenza vaccine Pandemrix™, leading to a number of additional studies across Europe. Given the public health urgency, these studies used readily available retrospective data from various sources. The potential for bias in such settings was generally acknowledged. Although such assessments are generally advocated by key opinion leaders and international health authorities, no systematic quantitative assessment of the potential joint impact of biases was undertaken in any of these studies. We applied bias-level multiple-bias analyses to two of the published narcolepsy studies: a pediatric cohort study from Finland and a case-control study from France. In particular, we developed Monte Carlo simulation models to evaluate a potential cascade of biases, including confounding by age, by indication and by natural H1N1 infection, selection bias, and disease- and exposure misclassification. All bias parameters were evidence-based to the extent possible. Given the assumptions used for confounding, selection bias and misclassification, the Finnish rate ratio of 13.78 (95% CI: 5.72-28.11) reduced to a median value of 6.06 (2.5th-97.5th percentile: 2.49-15.1) and the French odds ratio of 5.43 (95% CI: 2.6-10.08) to 1.85 (2.5th-97.5th percentile: 0.85-4.08). We illustrate multiple-bias analyses using two studies on the Pandemrix™-narcolepsy association and advocate their use to better understand the robustness of study findings. Based on our multiple-bias models, the observed Pandemrix™-narcolepsy association consistently persists in the Finnish study. For the French study, the results of our multiple-bias models were inconclusive.
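The Monte Carlo machinery described above can be sketched as follows; the lognormal bias distributions and their parameters are illustrative placeholders, not the evidence-based values used in the published models.

```python
import math
import random
import statistics

def bias_adjusted_rr(rr_obs, n_sims=20000, seed=42):
    """Probabilistic multiple-bias analysis sketch: divide the observed
    rate ratio by simulated bias factors and summarise the adjusted
    distribution as a median and 2.5th/97.5th percentiles. The two
    lognormal bias distributions are invented for illustration."""
    rng = random.Random(seed)
    adjusted = []
    for _ in range(n_sims):
        confounding = rng.lognormvariate(math.log(1.5), 0.3)  # joint confounding factor
        selection = rng.lognormvariate(math.log(1.2), 0.2)    # selection-bias factor
        adjusted.append(rr_obs / (confounding * selection))
    adjusted.sort()
    median = statistics.median(adjusted)
    lo = adjusted[int(0.025 * n_sims)]
    hi = adjusted[int(0.975 * n_sims) - 1]
    return median, lo, hi

median, lo, hi = bias_adjusted_rr(13.78)
```

A real analysis would chain several evidence-based bias steps (confounding, selection, misclassification) in sequence, but the shape of the computation is the same: sample bias parameters, correct the estimate, and report the resulting percentile interval.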

  1. Performance Analyses of Renewable and Fuel Power Supply Systems for Different Base Station Sites

    Directory of Open Access Journals (Sweden)

    Josip Lorincz

    2014-11-01

Full Text Available Base station sites (BSSs) powered with renewable energy sources have gained the attention of cellular operators during the last few years. This is because such “green” BSSs impose significant reductions in the operational expenditures (OPEX) of telecom operators due to the possibility of on-site renewable energy harvesting. In this paper, the green BSS power supply system parameters detected through remote and centralized real time sensing are presented. An implemented sensing system based on a wireless sensor network enables reliable collection and post-processing analyses of many parameters, such as: total charging/discharging current of the power supply system, battery voltage and temperature, wind speed, etc. As an example, yearly sensing results for three different BSS configurations powered by solar and/or wind energy are discussed in terms of renewable energy supply (RES) system performance. In the case of powering those BSSs with standalone systems based on a fuel generator, fuel consumption models expressing the interdependence between generator load and fuel consumption are proposed. This has allowed an energy-efficiency comparison of the fuel powered and RES systems, which is presented in terms of the OPEX and carbon dioxide (CO2) reductions. Additionally, approaches based on different BSS air-conditioning systems and the on/off regulation of daily fuel generator activity are proposed and validated in terms of energy and capital expenditure (CAPEX) savings.

  2. Building-related symptoms among U.S. office workers and risks factors for moisture and contamination: Preliminary analyses of U.S. EPA BASE Data

    Energy Technology Data Exchange (ETDEWEB)

    Mendell, Mark J.; Cozen, Myrna

    2002-09-01

The authors assessed relationships between health symptoms in office workers and risk factors related to moisture and contamination, using data collected from a representative sample of U.S. office buildings in the U.S. EPA BASE study. Methods: Analyses assessed associations between three types of weekly, work-related symptoms (lower respiratory, mucous membrane, and neurologic) and risk factors for moisture or contamination in these office buildings. Multivariate logistic regression models were used to estimate the strength of associations for these risk factors as odds ratios (ORs) adjusted for personal-level potential confounding variables related to demographics, health, job, and workspace. A number of risk factors were significantly associated (i.e., 95% confidence limits excluded 1.0) with small to moderate increases in one or more symptom outcomes. Significantly elevated ORs for mucous membrane symptoms were associated with the following risk factors: presence of a humidification system in good condition versus none (OR = 1.4); air handler inspection annually versus daily (OR = 1.6); current water damage in the building (OR = 1.2); and less than daily vacuuming in the study space (OR = 1.2). Significantly elevated ORs for lower respiratory symptoms were associated with: air handler inspection annually versus daily (OR = 2.0); air handler inspection less than daily but at least semi-annually (OR = 1.6); less than daily cleaning of offices (OR = 1.7); and less than daily vacuuming of the study space (OR = 1.4). Only two statistically significant risk factors for neurologic symptoms were identified: presence of any humidification system versus none (OR = 1.3); and less than daily vacuuming of the study space (OR = 1.3). Dirty cooling coils, dirty or poorly draining drain pans, and standing water near outdoor air intakes, evaluated by inspection, were not identified as risk factors in these analyses, despite predictions based on previous findings elsewhere, except that very

  3. Exergy and energy analyses of two different types of PCM based thermal management systems for space air conditioning applications

    International Nuclear Information System (INIS)

    Tyagi, V.V.; Pandey, A.K.; Buddhi, D.; Tyagi, S.K.

    2013-01-01

Highlights: ► Calcium chloride hexahydrate (CaCl2·6H2O) as a PCM was used in this study. ► Two different encapsulated systems (HDPE based panels and balls) were designed. ► The results of CaCl2·6H2O are very attractive for space air conditioning. ► Energy and exergy analyses for space cooling applications. - Abstract: This communication presents an experimental study of PCM based thermal management systems for space heating and cooling applications using energy and exergy analysis. Two different types of PCM based thermal management system (TMS-I and TMS-II) using calcium chloride hexahydrate as the heat carrier have been designed, fabricated and studied for space heating and cooling applications in a typical climatic zone in India. In the first experimental arrangement the charging of the PCM was carried out with an air conditioning system while discharging was carried out using an electric heater, for both thermal management systems. In the second arrangement the charging of the PCM was carried out by solar energy and the discharging by circulating the cooler ambient air during the night time. In the first experiment, TMS-I was found to be more effective than TMS-II, while the reverse was found in the second experiment for both the charging and discharging processes, not only for energetic but also for exergetic performance.

  4. Review of Ontario Hydro Pickering 'A' and Bruce 'A' nuclear generating stations' accident analyses

    International Nuclear Information System (INIS)

    Serdula, K.J.

    1988-01-01

Deterministic safety analyses for the Pickering 'A' and Bruce 'A' nuclear generating stations were reviewed. The methodology used in the evaluation and assessment was based on the concept of 'N' critical parameters defining an N-dimensional safety parameter space. The reviewed accident analyses were evaluated and assessed based on their demonstrated safety coverage for credible values and trajectories of the critical parameters within this N-dimensional safety parameter space. The reported assessment did not consider the probability of occurrence of events. The reviewed analyses were extensive for potential occurrence of accidents under normal steady-state operating conditions. These analyses demonstrated an adequate assurance of safety for the analyzed conditions. However, even for these reactor conditions, items have been identified for consideration of review and/or further study, which would provide a greater assurance of safety in the event of an accident. Accident analyses based on a plant in a normal transient operating state, or in an off-normal condition but within the allowable operating envelope, are not as extensive. Improvements in demonstrations and/or justifications of safety upon potential occurrence of accidents would provide further assurance of adequacy of safety under these conditions. Some events under these conditions have not been analyzed because of their judged low probability; however, accident analyses in this area should be considered. Recommendations are presented relating to these items; it is also recommended that further study is needed of the Pickering 'A' special safety systems.

  5. Grid Mapping for Spatial Pattern Analyses of Recurrent Urban Traffic Congestion Based on Taxi GPS Sensing Data

    Directory of Open Access Journals (Sweden)

    Yang Liu

    2017-03-01

    Full Text Available Traffic congestion is one of the most serious problems that impact urban transportation efficiency, especially in big cities. Identifying traffic congestion locations and occurring patterns is a prerequisite for urban transportation managers in order to take proper countermeasures for mitigating traffic congestion. In this study, the historical GPS sensing data of about 12,000 taxi floating cars in Beijing were used for pattern analyses of recurrent traffic congestion based on the grid mapping method. Through the use of ArcGIS software, 2D and 3D maps of the road network congestion were generated for traffic congestion pattern visualization. The study results showed that three types of traffic congestion patterns were identified, namely: point type, stemming from insufficient capacities at the nodes of the road network; line type, caused by high traffic demand or bottleneck issues in the road segments; and region type, resulting from multiple high-demand expressways merging and connecting to each other. The study illustrated that the proposed method would be effective for discovering traffic congestion locations and patterns and helpful for decision makers to take corresponding traffic engineering countermeasures in order to relieve the urban traffic congestion issues.
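The grid mapping step can be sketched as simple spatial binning of GPS fixes; the cell size and congestion threshold below are illustrative assumptions, not the values used in the Beijing study.

```python
from collections import defaultdict

def congested_cells(points, cell_deg=0.01, speed_threshold_kmh=20.0):
    """Bin taxi GPS fixes (lon, lat, speed in km/h) into a regular grid and
    flag cells whose mean speed falls below a congestion threshold.
    Cell size and threshold are hypothetical parameters for illustration."""
    speed_sums = defaultdict(float)
    counts = defaultdict(int)
    for lon, lat, speed in points:
        cell = (int(lon // cell_deg), int(lat // cell_deg))
        speed_sums[cell] += speed
        counts[cell] += 1
    return {cell for cell in speed_sums
            if speed_sums[cell] / counts[cell] < speed_threshold_kmh}
```

Repeating this binning per time slice would yield the kind of recurrent 2D/3D congestion maps the study visualises in ArcGIS, with point, line, and region patterns emerging as single cells, chains of cells, and connected clusters respectively.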

  6. Analyses of the influencing factors of soil microbial functional gene diversity in tropical rainforest based on GeoChip 5.0.

    Science.gov (United States)

    Cong, Jing; Liu, Xueduan; Lu, Hui; Xu, Han; Li, Yide; Deng, Ye; Li, Diqiang; Zhang, Yuguang

    2015-09-01

To examine soil microbial functional gene diversity and its causative factors in tropical rainforests, we profiled soils using a microarray-based metagenomic tool named GeoChip 5.0. We found high microbial functional gene diversity and differing soil microbial metabolic potential for biogeochemical processes in the tropical rainforest. Soil available nitrogen was the factor most strongly associated with soil microbial functional gene structure. Here, we mainly describe in detail the experiment design, the data processing, and the soil biogeochemical analyses attached to the study, which was published in BMC Microbiology in 2015; the raw data have been deposited in NCBI's Gene Expression Omnibus (accession number GSE69171).

  7. Economic evaluation of algae biodiesel based on meta-analyses

    Science.gov (United States)

    Zhang, Yongli; Liu, Xiaowei; White, Mark A.; Colosi, Lisa M.

    2017-08-01

The objective of this study is to elucidate the economic viability of algae-to-energy systems at a large scale, by developing a meta-analysis of five previously published economic evaluations of systems producing algae biodiesel. Data from original studies were harmonised into a standardised framework using financial and technical assumptions. Results suggest that the selling price of algae biodiesel under the base case would be $5.00-10.31/gal, higher than the selected benchmarks: $3.77/gal for petroleum diesel, and $4.21/gal for commercial biodiesel (B100) from conventional vegetable oil or animal fat. However, the projected selling price of algal biodiesel ($2.76-4.92/gal), following anticipated improvements, would be competitive. A scenario-based sensitivity analysis reveals that the price of algae biodiesel is most sensitive to algae biomass productivity, algae oil content, and algae cultivation cost. This indicates that the improvements in the yield, quality, and cost of algae feedstock could be the key factors to make algae-derived biodiesel economically viable.

  8. A Microsoft-Excel-based tool for running and critically appraising network meta-analyses--an overview and application of NetMetaXL.

    Science.gov (United States)

    Brown, Stephen; Hutton, Brian; Clifford, Tammy; Coyle, Doug; Grima, Daniel; Wells, George; Cameron, Chris

    2014-09-29

The use of network meta-analysis has increased dramatically in recent years. WinBUGS, a freely available Bayesian software package, has been the most widely used software package to conduct network meta-analyses. However, the learning curve for WinBUGS can be daunting, especially for new users. Furthermore, critical appraisal of network meta-analyses conducted in WinBUGS can be challenging given its limited data manipulation capabilities and the fact that generation of graphical output from network meta-analyses often relies on different software packages than the analyses themselves. We developed a freely available Microsoft-Excel-based tool called NetMetaXL, programmed in Visual Basic for Applications, which provides an interface for conducting a Bayesian network meta-analysis using WinBUGS from within Microsoft Excel. This tool allows the user to easily prepare and enter data, set model assumptions, and run the network meta-analysis, with results being automatically displayed in an Excel spreadsheet. It also contains macros that use NetMetaXL's interface to generate evidence network diagrams, forest plots, league tables of pairwise comparisons, probability plots (rankograms), and inconsistency plots within Microsoft Excel. All figures generated are publication quality, thereby increasing the efficiency of knowledge transfer and manuscript preparation. We demonstrate the application of NetMetaXL using data from a network meta-analysis published previously which compares combined resynchronization and implantable defibrillator therapy in left ventricular dysfunction. We replicate results from the previous publication while demonstrating result summaries generated by the software. Use of the freely available NetMetaXL successfully demonstrated its ability to make running network meta-analyses more accessible to novice WinBUGS users by allowing analyses to be conducted entirely within Microsoft Excel. NetMetaXL also allows for more efficient and transparent

  9. Fissure sealants in caries prevention:a practice-based study using survival analysis

    OpenAIRE

    Leskinen, K. (Kaja)

    2010-01-01

    Abstract The purpose of this study was to analyse the effectiveness and cost of fissure sealant treatment in preventing dental caries in children in a practice-based research network using survival analysis. The survival times of first permanent molars in children were analysed in three countries: in Finland (age cohorts 1970–1972 and 1980–1982), in Sweden (1980–1982) and in Greece (1980–1982), and additionally at two municipal health centres in Finland (age cohorts 1988–1990 in Kemi...
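The survival analysis referred to above can be illustrated with a minimal product-limit (Kaplan-Meier) estimator; the toy data stand in for sealed-molar follow-up times and are not from the study cohorts.

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate. times: follow-up duration (e.g. years
    until a sealed molar develops caries); events: 1 = caries observed,
    0 = censored. Returns [(time, S(time))] at each event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        removed = sum(1 for tt, e in data if tt == t)
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= removed
        i += removed
    return curve
```

Comparing such curves for sealed versus unsealed molars, with teeth still caries-free at last recall treated as censored, is the essence of the practice-based analysis the thesis describes.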

  10. Performance study of K_e factors in simplified elastic plastic fatigue analyses with emphasis on thermal cyclic loading

    Energy Technology Data Exchange (ETDEWEB)

    Lang, Hermann, E-mail: hermann.lang@areva.com [AREVA NP GmbH, PEEA-G, Henri-Dunant-Strasse 50, 91058 Erlangen (Germany); Rudolph, Juergen; Ziegler, Rainer [AREVA NP GmbH, PEEA-G, Henri-Dunant-Strasse 50, 91058 Erlangen (Germany)

    2011-08-15

As code-based, fully elastic plastic, code-conforming fatigue analyses are still time consuming, simplified elastic plastic analysis is often applied. This procedure is known to be overly conservative for some conditions due to the applied plastification (penalty) factor K_e. As a consequence, less conservative fully elastic plastic fatigue analyses based on non-linear finite element analyses (FEA), or simplified elastic plastic analysis based on more realistic K_e factors, have to be used for fatigue design. The demand for more realistic K_e factors is covered as a requirement of practical fatigue analysis. Different code-based K_e procedures are reviewed in this paper with special regard to performance under thermal cyclic loading conditions. Other approximation formulae such as those by Neuber, Seeger/Beste or Kuehnapfel are not evaluated in this context because of their applicability to mechanical loading, excluding the thermal cyclic loading conditions typical of power plant operation. Besides the current code-based K_e corrections, the ASME Code Case N-779 (e.g. Adams' proposal) and its modification in ASME Section VIII are considered. Comparison of elastic plastic results and results from the Rules for Nuclear Facility Components and Rules for Pressure Vessels reveals a considerable overestimation of the usage factor in the case of ASME III and KTA 3201.2 for the examined examples. Usage factors according to RCC-M, Adams (ASME Code Case N-779), ASME VIII (alternative) and EN 13445-3 are essentially comparable and less conservative for these examples. The K_v correction as well as the applied yield criterion (Tresca or von Mises) essentially influence the quality of the more advanced plasticity corrections (e.g. ASME Code Case N-779 and RCC-M). Hence, new proposals are based on a refined K_v correction.
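For orientation, the simplified elastic plastic correction discussed above is commonly written as a piecewise K_e formula of the ASME III type; the sketch below assumes that form and generic material parameters m and n, and is for illustration only, not for design use.

```python
def ke_factor(sn, sm, m, n):
    """Plastification (penalty) factor K_e in the piecewise ASME III style:
    sn: primary-plus-secondary stress intensity range,
    sm: allowable design stress intensity,
    m, n: material parameters (roughly m = 3.0, n = 0.2 for carbon steel).
    Illustrative sketch; consult the applicable code edition for real work."""
    if sn <= 3.0 * sm:
        return 1.0                       # elastic range: no penalty
    if sn < 3.0 * m * sm:                # transition range: linear ramp
        return 1.0 + (1.0 - n) / (n * (m - 1.0)) * (sn / (3.0 * sm) - 1.0)
    return 1.0 / n                       # fully plastic plateau
```

The ramp is continuous at both breakpoints (it equals 1 at sn = 3·sm and 1/n at sn = 3·m·sm), which is the behaviour whose conservatism for thermal cyclic loading the paper benchmarks against refined K_v-based corrections.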

  11. Reporting the results of meta-analyses: a plea for incorporating clinical relevance referring to an example.

    Science.gov (United States)

    Bartels, Ronald H M A; Donk, Roland D; Verhagen, Wim I M; Hosman, Allard J F; Verbeek, André L M

    2017-11-01

The results of meta-analyses are frequently reported, but understanding and interpreting them is difficult for both clinicians and patients. Statistical significances are presented without referring to values that imply clinical relevance. This study aimed to use the minimal clinically important difference (MCID) to rate the clinical relevance of a meta-analysis. This study is a review of meta-analyses relating to a specific topic: the clinical results of cervical arthroplasty. The outcome measure used in the study was the MCID. We performed an extensive literature search of a series of meta-analyses evaluating a similar subject as an example. We searched in Pubmed and Embase through August 9, 2016, and found articles concerning meta-analyses of the clinical outcome of cervical arthroplasty compared with that of anterior cervical discectomy with fusion in cases of cervical degenerative disease. We evaluated the analyses for statistical significance and their relation to the MCID, which was defined based on results in similar patient groups and a similar disease entity reported in the literature. We identified 21 meta-analyses, only one of which referred to the MCID. However, the researchers used an inappropriate measurement scale and, therefore, an incorrect MCID. The majority of the conclusions were based on statistical results without mentioning clinical relevance. The majority of the articles we reviewed drew conclusions based on statistical differences instead of clinical relevance. We recommend introducing the concept of the MCID while reporting the results of a meta-analysis, as well as mentioning the explicit scale of the analyzed measurement. Copyright © 2017 Elsevier Inc. All rights reserved.
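The authors' plea can be operationalised with a small helper that reports clinical relevance alongside statistical significance; the labels and the simple decision rule are our illustrative assumptions, not a published taxonomy.

```python
def rate_result(effect, ci_low, ci_high, mcid):
    """Classify a pooled treatment effect (on the same scale as the MCID) by
    statistical significance (confidence interval excludes 0) and clinical
    relevance (the point estimate reaches the minimal clinically important
    difference). The decision rule is an illustrative sketch."""
    significant = ci_low > 0.0 or ci_high < 0.0
    relevant = abs(effect) >= mcid
    if significant and relevant:
        return "statistically significant and clinically relevant"
    if significant:
        return "statistically significant but below the MCID"
    return "not statistically significant"
```

Note the helper's insistence that effect and MCID share a measurement scale: using an MCID defined for one outcome instrument against effects measured on another is exactly the error the review flags.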

  12. Subgroup analyses in randomised controlled trials: cohort study on trial protocols and journal publications.

    Science.gov (United States)

    Kasenda, Benjamin; Schandelmaier, Stefan; Sun, Xin; von Elm, Erik; You, John; Blümle, Anette; Tomonaga, Yuki; Saccilotto, Ramon; Amstutz, Alain; Bengough, Theresa; Meerpohl, Joerg J; Stegert, Mihaela; Olu, Kelechi K; Tikkinen, Kari A O; Neumann, Ignacio; Carrasco-Labra, Alonso; Faulhaber, Markus; Mulla, Sohail M; Mertz, Dominik; Akl, Elie A; Bassler, Dirk; Busse, Jason W; Ferreira-González, Ignacio; Lamontagne, Francois; Nordmann, Alain; Gloy, Viktoria; Raatz, Heike; Moja, Lorenzo; Rosenthal, Rachel; Ebrahim, Shanil; Vandvik, Per O; Johnston, Bradley C; Walter, Martin A; Burnand, Bernard; Schwenkglenks, Matthias; Hemkens, Lars G; Bucher, Heiner C; Guyatt, Gordon H; Briel, Matthias

    2014-07-16

To investigate the planning of subgroup analyses in protocols of randomised controlled trials and the agreement with corresponding full journal publications. Cohort of protocols of randomised controlled trials and subsequent full journal publications. Six research ethics committees in Switzerland, Germany, and Canada. 894 protocols of randomised controlled trials involving patients approved by participating research ethics committees between 2000 and 2003 and 515 subsequent full journal publications. Of 894 protocols of randomised controlled trials, 252 (28.2%) included one or more planned subgroup analyses. Of those, 17 (6.7%) provided a clear hypothesis for at least one subgroup analysis, 10 (4.0%) anticipated the direction of a subgroup effect, and 87 (34.5%) planned a statistical test for interaction. Industry sponsored trials more often planned subgroup analyses compared with investigator sponsored trials (195/551 (35.4%) v 57/343 (16.6%), P<0.001). Of 515 identified journal publications, 246 (47.8%) reported at least one subgroup analysis. In 81 (32.9%) of the 246 publications reporting subgroup analyses, authors stated that subgroup analyses were prespecified, but this was not supported by 28 (34.6%) corresponding protocols. In 86 publications, authors claimed a subgroup effect, but only 36 (41.9%) corresponding protocols reported a planned subgroup analysis. Subgroup analyses are insufficiently described in the protocols of randomised controlled trials submitted to research ethics committees, and investigators rarely specify the anticipated direction of subgroup effects. More than one third of statements in publications of randomised controlled trials about subgroup prespecification had no documentation in the corresponding protocols. Definitive judgments regarding credibility of claimed subgroup effects are not possible without access to protocols and analysis plans of randomised controlled trials. © The DISCO study group 2014.

  13. Characteristics of meta-analyses and their component studies in the Cochrane Database of Systematic Reviews: a cross-sectional, descriptive analysis

    Directory of Open Access Journals (Sweden)

    Davey Jonathan

    2011-11-01

Full Text Available Abstract Background Cochrane systematic reviews collate and summarise studies of the effects of healthcare interventions. The characteristics of these reviews and the meta-analyses and individual studies they contain provide insights into the nature of healthcare research and important context for the development of relevant statistical and other methods. Methods We classified every meta-analysis with at least two studies in every review in the January 2008 issue of the Cochrane Database of Systematic Reviews (CDSR) according to the medical specialty, the types of interventions being compared and the type of outcome. We provide descriptive statistics for numbers of meta-analyses, numbers of component studies and sample sizes of component studies, broken down by these categories. Results We included 2321 reviews containing 22,453 meta-analyses, which themselves consist of data from 112,600 individual studies (which may appear in more than one meta-analysis). Meta-analyses in the areas of gynaecology, pregnancy and childbirth (21%), mental health (13%) and respiratory diseases (13%) are well represented in the CDSR. Most meta-analyses address drugs, either with a control or placebo group (37%) or in a comparison with another drug (25%). The median number of meta-analyses per review is six (inter-quartile range 3 to 12). The median number of studies included in the meta-analyses with at least two studies is three (inter-quartile range 2 to 6). Sample sizes of individual studies range from 2 to 1,242,071, with a median of 91 participants. Discussion It is clear that the numbers of studies eligible for meta-analyses are typically very small for all medical areas, outcomes and interventions covered by Cochrane reviews. This highlights the particular importance of suitable methods for the meta-analysis of small data sets. There was little variation in number of studies per meta-analysis across medical areas, across outcome data types or across types of

  14. Peak-flow frequency analyses and results based on data through water year 2011 for selected streamflow-gaging stations in or near Montana: Chapter C in Montana StreamStats

    Science.gov (United States)

    Sando, Steven K.; McCarthy, Peter M.; Dutton, DeAnn M.

    2016-04-05

    Chapter C of this Scientific Investigations Report documents results from a study by the U.S. Geological Survey, in cooperation with the Montana Department of Transportation and the Montana Department of Natural Resources, to provide an update of statewide peak-flow frequency analyses and results for Montana. The purpose of this report chapter is to present peak-flow frequency analyses and results for 725 streamflow-gaging stations in or near Montana based on data through water year 2011. The 725 streamflow-gaging stations included in this study represent nearly all streamflow-gaging stations in Montana (plus some from adjacent states or Canadian Provinces) that have at least 10 years of peak-flow records through water year 2011. For 29 of the 725 streamflow-gaging stations, peak-flow frequency analyses and results are reported for both unregulated and regulated conditions. Thus, peak-flow frequency analyses and results are reported for a total of 754 analyses. Estimates of peak-flow magnitudes for 66.7-, 50-, 42.9-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities are reported. These annual exceedance probabilities correspond to 1.5-, 2-, 2.33-, 5-, 10-, 25-, 50-, 100-, 200-, and 500-year recurrence intervals.
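    The correspondence between annual exceedance probability (AEP) and recurrence interval quoted above is simply T = 1/p, with p expressed as a fraction; a quick check of the listed pairs:

```python
# Recurrence interval T (years) is the reciprocal of the annual
# exceedance probability p: T = 1 / p, i.e. 100 / (p in percent).
aep_percent = [66.7, 50, 42.9, 20, 10, 4, 2, 1, 0.5, 0.2]
recurrence_years = [round(100.0 / p, 2) for p in aep_percent]
print(recurrence_years)
# → [1.5, 2.0, 2.33, 5.0, 10.0, 25.0, 50.0, 100.0, 200.0, 500.0]
```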

  15. Analysing relations between specific and total liking scores

    DEFF Research Database (Denmark)

    Menichelli, Elena; Kraggerud, Hilde; Olsen, Nina Veflen

    2013-01-01

    The objective of this article is to present a new statistical approach for the study of consumer liking. Total liking data are extended by incorporating liking for specific sensory properties. The approach combines different analyses for the purpose of investigating the most important aspects of liking and indicating which products are similarly or differently perceived by which consumers. A method based on the differences between total liking and the specific liking variables is proposed for studying both relative differences among products and individual consumer differences. Segmentation is also tested out in order to distinguish consumers with the strongest differences in their liking values. The approach is illustrated by a case study, based on cheese data. In the consumer test consumers were asked to evaluate their total liking, the liking for texture and the liking for odour/taste.

  16. How often do sensitivity analyses for economic parameters change cost-utility analysis conclusions?

    Science.gov (United States)

    Schackman, Bruce R; Gold, Heather Taffet; Stone, Patricia W; Neumann, Peter J

    2004-01-01

    -utility threshold above the base-case results (n = 25) were of somewhat higher quality, and were more likely to justify their sensitivity analysis parameters, than those that did not (n = 45), but the overall quality rating was only moderate. Sensitivity analyses for economic parameters are widely reported and often identify whether choosing different assumptions leads to a different conclusion regarding cost effectiveness. Changes in HR-QOL and cost parameters should be used to test alternative guideline recommendations when there is uncertainty regarding these parameters. Changes in discount rates less frequently produce results that would change the conclusion about cost effectiveness. Improving the overall quality of published studies and describing the justifications for parameter ranges would allow more meaningful conclusions to be drawn from sensitivity analyses.

  17. [Health risks in different living circumstances of mothers. Analyses based on a population study].

    Science.gov (United States)

    Sperlich, Stefanie

    2014-12-01

    The objective of this study was to determine the living circumstances ('Lebenslagen') of mothers which are associated with elevated health risks. Data were derived from a cross-sectional population-based sample of German women (n = 3129) with underage children. By means of a two-step cluster analysis, ten different maternal living circumstances were identified which proved to be distinct with respect to indicators of socioeconomic position, employment status and family-related factors. Of the ten living circumstances, one could be attributed to higher socioeconomic status (SES), while five were assigned to a middle SES and four to a lower SES. In line with previous findings, mothers with a high SES predominantly showed the best health, while mothers with a low SES tended to be at higher health risk with respect to subjective health, mental health (anxiety and depression), obesity and smoking. However, there were important health differences between the different living circumstances within the middle and lower SES. In addition, varying health risks were found among different living circumstances of single mothers, pointing to the significance of family- and job-related living conditions in establishing health risks. With this exploratory analysis strategy, small-scale living conditions could be detected which were associated with specific health risks. This approach seems particularly suitable to provide a more precise definition of target groups for health promotion. The findings encourage a more extensive application of the concept of living conditions in medical sociology research as well as in health monitoring.

  18. Genome based analyses of six hexacorallian species reject the “naked coral” hypothesis

    KAUST Repository

    Wang, Xin; Drillon, Guénola; Ryu, Taewoo; Voolstra, Christian R.; Aranda, Manuel

    2017-01-01

    Scleractinian corals are the foundation species of the coral-reef ecosystem. Their calcium carbonate skeletons form extensive structures that are home to millions of species, making coral reefs one of the most diverse ecosystems of our planet. However, our understanding of how reef-building corals have evolved the ability to calcify and become the ecosystem builders they are today is hampered by uncertain relationships within their subclass Hexacorallia. Corallimorpharians have been proposed to originate from a complex scleractinian ancestor that lost the ability to calcify in response to increasing ocean acidification, suggesting that corals may lose and regain the ability to calcify over evolutionary time. Here we employed a phylogenomic approach using whole-genome data from six hexacorallian species to resolve the evolutionary relationship between reef-building corals and their non-calcifying relatives. Phylogenetic analysis based on 1,421 single-copy orthologs, as well as gene presence/absence and synteny information, converged on the same topologies, showing strong support for scleractinian monophyly and a corallimorpharian sister clade. Our broad phylogenomic approach using sequence-based and sequence-independent analyses provides unambiguous evidence for the monophyly of scleractinian corals and the rejection of corallimorpharians as descendants of a complex coral ancestor.

  19. Genome based analyses of six hexacorallian species reject the “naked coral” hypothesis

    KAUST Repository

    Wang, Xin

    2017-09-23

    Scleractinian corals are the foundation species of the coral-reef ecosystem. Their calcium carbonate skeletons form extensive structures that are home to millions of species, making coral reefs one of the most diverse ecosystems of our planet. However, our understanding of how reef-building corals have evolved the ability to calcify and become the ecosystem builders they are today is hampered by uncertain relationships within their subclass Hexacorallia. Corallimorpharians have been proposed to originate from a complex scleractinian ancestor that lost the ability to calcify in response to increasing ocean acidification, suggesting that corals may lose and regain the ability to calcify over evolutionary time. Here we employed a phylogenomic approach using whole-genome data from six hexacorallian species to resolve the evolutionary relationship between reef-building corals and their non-calcifying relatives. Phylogenetic analysis based on 1,421 single-copy orthologs, as well as gene presence/absence and synteny information, converged on the same topologies, showing strong support for scleractinian monophyly and a corallimorpharian sister clade. Our broad phylogenomic approach using sequence-based and sequence-independent analyses provides unambiguous evidence for the monophyly of scleractinian corals and the rejection of corallimorpharians as descendants of a complex coral ancestor.

  20. Using a laser-based CO2 carbon isotope analyser to investigate gas transfer in geological media

    International Nuclear Information System (INIS)

    Guillon, S.; Pili, E.; Agrinier, P.

    2012-01-01

    CO2 stable carbon isotopes are very attractive in environmental research to investigate both natural and anthropogenic carbon sources. Laser-based CO2 carbon isotope analysis provides continuous measurement at high temporal resolution and is a promising alternative to isotope ratio mass spectrometry (IRMS). We performed a thorough assessment of a commercially available CO2 Carbon Isotope Analyser (CCIA DLT-100, Los Gatos Research) that allows in situ measurement of 13C in CO2. Using a set of reference gases of known CO2 concentration and carbon isotopic composition, we evaluated the precision, long-term stability, temperature sensitivity and concentration dependence of the analyser. Despite the good precision calculated from the Allan variance (5.0 ppm for CO2 concentration and 0.05 per thousand for δ13C at 60 s averaging), actual performance is degraded by two main sources of error: temperature sensitivity and the dependence of δ13C on CO2 concentration. Data processing is required to correct for these errors. Following application of these corrections, we achieve an accuracy of 8.7 ppm for CO2 concentration and 1.3 per thousand for δ13C, which is worse than mass spectrometry performance but still allows field applications. With this portable analyser we measured the CO2 flux degassed from rock in an underground tunnel. The obtained carbon isotopic composition agrees with IRMS measurements and can be used to identify the carbon source. (authors)
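    The precision figures quoted from the Allan variance come from averaging the raw data stream over windows of increasing length and examining the scatter of successive window means; a minimal sketch of a non-overlapping Allan deviation on synthetic data (the 1 Hz sampling rate and white-noise model are assumptions for illustration, not the instrument's actual noise behaviour):

```python
import numpy as np

def allan_deviation(y, tau_samples):
    """Non-overlapping Allan deviation of an evenly sampled series y
    at an averaging window of tau_samples samples:
    sqrt(0.5 * mean((ybar_{i+1} - ybar_i)^2))."""
    n = len(y) // tau_samples
    means = y[: n * tau_samples].reshape(n, tau_samples).mean(axis=1)
    diffs = np.diff(means)
    return float(np.sqrt(0.5 * np.mean(diffs ** 2)))

rng = np.random.default_rng(0)
white = rng.normal(0.0, 1.0, 36000)  # 10 h of synthetic 1-Hz white noise
# For white noise the Allan deviation falls as 1/sqrt(tau):
print(allan_deviation(white, 1), allan_deviation(white, 60))
```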

  1. Performance analyses of naval ships based on engineering level of simulation at the initial design stage

    Directory of Open Access Journals (Sweden)

    Dong-Hoon Jeong

    2017-07-01

    Full Text Available Naval ships are assigned many and varied missions. Their performance is critical for mission success and depends on the specifications of their components. This is why performance analyses of naval ships are required at the initial design stage. Since the design and construction of naval ships take a very long time and incur a huge cost, Modeling and Simulation (M&S) is an effective method for performance analyses. Thus in this study, a simulation core is proposed to analyze the performance of naval ships considering their specifications. This simulation core can perform engineering-level simulations, considering mathematical models for naval ships such as maneuvering equations and passive sonar equations. Also, the simulation models of the simulation core follow the Discrete EVent system Specification (DEVS) and Discrete Time System Specification (DTSS) formalisms, so that simulations can progress over discrete events and discrete times. In addition, applying the DEVS and DTSS formalisms makes the structure of the simulation models flexible and reusable. To verify the applicability of this simulation core, it was applied to simulations for the performance analyses of a submarine in an Anti-SUrface Warfare (ASUW) mission. These simulations comprised two scenarios. The first scenario, submarine diving, carried out maneuvering performance analysis by analyzing the pitch angle variation and depth variation of the submarine over time. The second scenario, submarine detection, carried out detection performance analysis by analyzing how well the sonar of the submarine resolves adjacent targets. The results of these simulations confirm that the simulation core of this study can be applied to the performance analyses of naval ships considering their specifications.
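    Discrete-event simulators in the DEVS tradition advance a clock over a time-ordered event queue rather than in fixed steps; the toy loop below (not the authors' simulation core, just an illustrative sketch with made-up event labels) shows that discrete-event half of the idea:

```python
import heapq

def run(events, horizon):
    """Process (time, label) events in time order up to `horizon`.
    A "ping" schedules a follow-up "pong" 5 time units later,
    mimicking an internal transition that generates a future event."""
    heapq.heapify(events)
    log = []
    while events and events[0][0] <= horizon:
        t, label = heapq.heappop(events)
        log.append((t, label))
        if label == "ping":
            heapq.heappush(events, (t + 5.0, "pong"))
    return log

print(run([(0.0, "ping"), (2.0, "ping")], horizon=10.0))
# → [(0.0, 'ping'), (2.0, 'ping'), (5.0, 'pong'), (7.0, 'pong')]
```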

  2. A comparative study of cold- and warm-adapted Endonucleases A using sequence analyses and molecular dynamics simulations.

    Directory of Open Access Journals (Sweden)

    Davide Michetti

    Full Text Available The psychrophilic and mesophilic endonucleases A (EndA) from Aliivibrio salmonicida (VsEndA) and Vibrio cholerae (VcEndA) have been studied experimentally in terms of the biophysical properties related to thermal adaptation. Analyses of their static X-ray structures were not sufficient to rationalize the determinants of their adaptive traits at the molecular level. Thus, we used Molecular Dynamics (MD) simulations to compare the two proteins and unveil their structural and dynamical differences. Our simulations did not show a substantial increase in flexibility in the cold-adapted variant on the nanosecond time scale. The only exception is a more rigid C-terminal region in VcEndA, which is ascribable to a cluster of electrostatic interactions and hydrogen bonds, as also supported by MD simulations of the VsEndA mutant variant where the cluster of interactions was introduced. Moreover, we identified three additional amino acid substitutions through multiple sequence alignment and the analyses of MD-based protein structure networks. In particular, T120V occurs in the proximity of the catalytic residue H80 and alters the interaction with residue Y43, which belongs to the second coordination sphere of the Mg2+ ion. This makes T120V an amenable candidate for future experimental mutagenesis.

  3. PC based uranium enrichment analyser

    International Nuclear Information System (INIS)

    Madan, V.K.; Gopalakrishana, K.R.; Bairi, B.R.

    1991-01-01

    It is important to measure the enrichment of unirradiated nuclear fuel elements during production as a quality control measure. An IBM PC based system has recently been tested for enrichment measurements for the Nuclear Fuel Complex (NFC), Hyderabad. As required by NFC, the system is easy to calibrate. It is easy to switch the system from measuring the enrichment of fuel elements to that of pellets, and the system automatically stores the data and the results. The system uses an IBM PC plug-in card to acquire data. The card incorporates programmable interval timers (8253-5), and the counter/timer devices are accessed through I/O-mapped I/O. A novel algorithm has been incorporated to make the system more reliable. The application software has been written in BASIC. (author). 9 refs., 1 fig

  4. Analyses of the soil surface dynamic of South African Kalahari salt pans based on hyperspectral and multitemporal data

    Science.gov (United States)

    Milewski, Robert; Chabrillat, Sabine; Behling, Robert; Mielke, Christian; Schleicher, Anja Maria; Guanter, Luis

    2016-04-01

    The consequences of climate change represent a major threat to sustainable development and growth in Southern Africa. Understanding their impact on the geo- and biosphere is therefore of great importance in this particular region. In this context the Kalahari salt pans (also known as playas or sabkhas) and their peripheral saline and alkaline habitats are an ecosystem of major interest. They are very sensitive to environmental conditions, and thus hydrological, mineralogical and ecological responses to climatic variations can be analysed. Up to now the soil composition of the salt pans in this area has only been assessed mono-temporally and on a coarse regional scale. Furthermore, the dynamic of the salt pans, especially the formation of evaporites, is still uncertain and poorly understood. High spectral resolution remote sensing can estimate the evaporite content and mineralogy of soils based on analyses of the surface reflectance properties within the Visible-Near InfraRed (VNIR, 400-1000 nm) and Short-Wave InfraRed (SWIR, 1000-2500 nm) regions. In these wavelength regions major chemical components of the soil interact with the electromagnetic radiation and produce characteristic absorption features that can be used to derive the properties of interest. Although such techniques are well established at the laboratory and field scale, the potential of current (Hyperion) and upcoming spaceborne sensors such as EnMAP for quantitative mineralogical and salt spectral mapping is still to be demonstrated. Combined with hyperspectral methods, multitemporal remote sensing techniques allow us to derive the recent dynamic of these salt pans and link the mineralogical analysis of the pan surface to major physical processes in these dryland environments. In this study we focus on the analyses of the Namibian Omongwa salt pans based on satellite hyperspectral imagery and multispectral time-series data. First, a change detection analysis is applied using the Iterative

  5. Investigation of publication bias in meta-analyses of diagnostic test accuracy: a meta-epidemiological study

    NARCIS (Netherlands)

    van Enst, W. Annefloor; Ochodo, Eleanor; Scholten, Rob J. P. M.; Hooft, Lotty; Leeflang, Mariska M.

    2014-01-01

    The validity of a meta-analysis can be understood better in light of the possible impact of publication bias. The majority of the methods to investigate publication bias in terms of small study-effects are developed for meta-analyses of intervention studies, leaving authors of diagnostic test

  6. Functional Analysis in Public Schools: A Summary of 90 Functional Analyses

    Science.gov (United States)

    Mueller, Michael M.; Nkosi, Ajamu; Hine, Jeffrey F.

    2011-01-01

    Several review and epidemiological studies have been conducted over recent years to inform behavior analysts of functional analysis outcomes. None to date have closely examined demographic and clinical data for functional analyses conducted exclusively in public school settings. The current paper presents a data-based summary of 90 functional…

  7. A DNA microarray-based methylation-sensitive (MS)-AFLP hybridization method for genetic and epigenetic analyses.

    Science.gov (United States)

    Yamamoto, F; Yamamoto, M

    2004-07-01

    We previously developed a PCR-based DNA fingerprinting technique named the Methylation Sensitive (MS)-AFLP method, which permits comparative genome-wide scanning of methylation status with a manageable number of fingerprinting experiments. The technique uses the methylation sensitive restriction enzyme NotI in the context of the existing Amplified Fragment Length Polymorphism (AFLP) method. Here we report the successful conversion of this gel electrophoresis-based DNA fingerprinting technique into a DNA microarray hybridization technique (DNA Microarray MS-AFLP). By performing a total of 30 (15 x 2 reciprocal labeling) DNA Microarray MS-AFLP hybridization experiments on genomic DNA from two breast and three prostate cancer cell lines in all pairwise combinations, and Southern hybridization experiments using more than 100 different probes, we have demonstrated that the DNA Microarray MS-AFLP is a reliable method for genetic and epigenetic analyses. No statistically significant differences were observed in the number of differences between the breast-prostate hybridization experiments and the breast-breast or prostate-prostate comparisons.

  8. VIPRE modeling of VVER-1000 reactor core for DNB analyses

    Energy Technology Data Exchange (ETDEWEB)

    Sung, Y.; Nguyen, Q. [Westinghouse Electric Corporation, Pittsburgh, PA (United States); Cizek, J. [Nuclear Research Institute, Prague, (Czech Republic)

    1995-09-01

    Based on the one-pass modeling approach, the hot channels and the VVER-1000 reactor core can be modeled in 30 channels for DNB analyses using the VIPRE-01/MOD02 (VIPRE) code (VIPRE is owned by Electric Power Research Institute, Palo Alto, California). The VIPRE one-pass model does not compromise any accuracy in the hot channel local fluid conditions. Extensive qualifications include sensitivity studies of radial noding and crossflow parameters and comparisons with the results from THINC and CALOPEA subchannel codes. The qualifications confirm that the VIPRE code with the Westinghouse modeling method provides good computational performance and accuracy for VVER-1000 DNB analyses.

  9. Material analyses of foam-based SiC FCI after dynamic testing in PbLi in MaPLE loop at UCLA

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, Maria, E-mail: maria.gonzalez@ciemat.es [LNF-CIEMAT, Avda Complutense, 40, 28040 Madrid (Spain); Rapisarda, David; Ibarra, Angel [LNF-CIEMAT, Avda Complutense, 40, 28040 Madrid (Spain); Courtessole, Cyril; Smolentsev, Sergey; Abdou, Mohamed [Fusion Science and Technology Center, UCLA (United States)

    2016-11-01

    Highlights: • Samples from a foam-based SiC FCI were analyzed by looking at their SEM microstructure and elemental composition. • After finishing dynamic experiments in the flowing hot PbLi, liquid metal ingress was confirmed and attributed to infiltration through local defects in the protective inner CVD layer. • No direct evidence of corrosion/erosion was observed; these defects could be related to the manufacturing process. - Abstract: Foam-based SiC flow channel inserts (FCIs) developed and manufactured by Ultramet, USA are currently being tested in the flowing hot lead-lithium (PbLi) alloy in the MaPLE loop at UCLA to address chemical/physical compatibility and to assess the MHD pressure-drop reduction. UCLA has finished the first experimental series, where a single uninterrupted long-term (∼6500 h) test was performed on a 30-cm FCI segment in a magnetic field up to 1.8 T at the temperature of 300 °C and maximum flow velocities of ∼15 cm/s. After finishing the experiments, the FCI sample was extracted from the host stainless steel duct and cut into slices. A few of them have been analyzed at CIEMAT as a part of the joint collaborative effort on the development of the DCLL blanket concept in the EU and the US. The initial inspection of the slices using optical microscopic analysis at UCLA showed significant PbLi ingress into the bulk FCI material that resulted in degradation of the insulating properties of the FCI. Current material analyses at CIEMAT are based on advanced techniques, including characterization of FCI samples by FESEM to study PbLi ingress, imaging of cross sections, composition analysis by EDX and crack inspection. These analyses suggest that the ingress was caused by local defects in the protective inner CVD layer that might have been originally present in the FCI or occurred during testing.

  10. Non-localization and localization ROC analyses using clinically based scoring

    Science.gov (United States)

    Paquerault, Sophie; Samuelson, Frank W.; Myers, Kyle J.; Smith, Robert C.

    2009-02-01

    We are investigating the potential for differences in study conclusions when assessing the estimated impact of a computer-aided detection (CAD) system on readers' performance. The data utilized in this investigation were derived from a multi-reader multi-case observer study involving one hundred mammographic background images to which fixed-size and fixed-intensity Gaussian signals were added, generating a low- and high-intensity signal sets. The study setting allowed CAD assessment in two situations: when CAD sensitivity was 1) superior or 2) lower than the average reader. Seven readers were asked to review each set in the unaided and CAD-aided reading modes, mark and rate their findings. Using this data, we studied the effect on study conclusion of three clinically-based receiver operating characteristic (ROC) scoring definitions. These scoring definitions included both location-specific and non-location-specific rules. The results showed agreement in the estimated impact of CAD on the overall reader performance. In the study setting where CAD sensitivity is superior to the average reader, the mean difference in AUC between the CAD-aided read and unaided read was 0.049 (95%CIs: -0.027; 0.130) for the image scoring definition that is based on non-location-specific rules, and 0.104 (95%CIs: 0.036; 0.174) and 0.090 (95%CIs: 0.031; 0.155) for image scoring definitions that are based on location-specific rules. The increases in AUC were statistically significant for the location-specific scoring definitions. It was further observed that the variance on these estimates was reduced when using the location-specific scoring definitions compared to that using a non-location-specific scoring definition. In the study setting where CAD sensitivity is equivalent or lower than the average reader, the mean differences in AUC are slightly above 0.01 for all image scoring definitions. 
These increases in AUC were not statistically significant for any of the image scoring definitions.
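    The AUC differences reported above can be computed from rating data with the empirical (Mann-Whitney) estimator, which scores the probability that a signal-present rating exceeds a signal-absent rating; a small sketch with invented ratings (not the study's data):

```python
import numpy as np

def auc(scores_pos, scores_neg):
    """Empirical AUC: probability that a signal-present score exceeds a
    signal-absent score, with ties counted as 0.5 (Mann-Whitney statistic)."""
    pos = np.asarray(scores_pos, dtype=float)[:, None]
    neg = np.asarray(scores_neg, dtype=float)[None, :]
    return float(np.mean((pos > neg) + 0.5 * (pos == neg)))

# Hypothetical reader ratings (higher = more confident a signal is present)
unaided_pos, unaided_neg = [4, 3, 5, 2, 4], [1, 2, 3, 1, 2]
aided_pos, aided_neg = [5, 4, 5, 3, 4], [1, 2, 2, 1, 3]
print(round(auc(aided_pos, aided_neg) - auc(unaided_pos, unaided_neg), 3))
# → 0.08
```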

  11. Analyses of the influencing factors of soil microbial functional gene diversity in tropical rainforest based on GeoChip 5.0

    Directory of Open Access Journals (Sweden)

    Jing Cong

    2015-09-01

    Full Text Available To examine soil microbial functional gene diversity and its causative factors in tropical rainforests, we profiled it using a microarray-based metagenomic tool named GeoChip 5.0. We found high microbial functional gene diversity and distinct soil microbial metabolic potential for biogeochemical processes in the tropical rainforest. Soil available nitrogen was the factor most strongly associated with soil microbial functional gene structure. Here, we mainly describe in detail the experimental design, the data processing, and the soil biogeochemical analyses attached to the study, which was published in BMC Microbiology in 2015; the raw data have been deposited in NCBI's Gene Expression Omnibus (accession number GSE69171).

  12. Ongoing Analyses of Rocket Based Combined Cycle Engines by the Applied Fluid Dynamics Analysis Group at Marshall Space Flight Center

    Science.gov (United States)

    Ruf, Joseph H.; Holt, James B.; Canabal, Francisco

    2001-01-01

    This paper presents the status of analyses on three Rocket Based Combined Cycle (RBCC) configurations underway in the Applied Fluid Dynamics Analysis Group (TD64). TD64 is performing computational fluid dynamics (CFD) analysis on a Penn State RBCC test rig, the proposed Draco axisymmetric RBCC engine and the Trailblazer engine. The intent of the analysis on the Penn State test rig is to benchmark the Finite Difference Navier Stokes (FDNS) code for ejector mode fluid dynamics. The Draco analysis was a trade study to determine the ejector mode performance as a function of three engine design variables. The Trailblazer analysis is to evaluate the nozzle performance in scramjet mode. Results to date of each analysis are presented.

  13. Comparison of plasma input and reference tissue models for analysing [(11)C]flumazenil studies

    NARCIS (Netherlands)

    Klumpers, Ursula M. H.; Veltman, Dick J.; Boellaard, Ronald; Comans, Emile F.; Zuketto, Cassandra; Yaqub, Maqsood; Mourik, Jurgen E. M.; Lubberink, Mark; Hoogendijk, Witte J. G.; Lammertsma, Adriaan A.

    2008-01-01

    A single-tissue compartment model with plasma input is the established method for analysing [(11)C]flumazenil ([(11)C]FMZ) studies. However, arterial cannulation and measurement of metabolites are time-consuming. Therefore, a reference tissue approach is appealing, but this approach has not been

  14. The occurrence of Toxocara malaysiensis in cats in China, confirmed by sequence-based analyses of ribosomal DNA.

    Science.gov (United States)

    Li, Ming-Wei; Zhu, Xing-Quan; Gasser, Robin B; Lin, Rui-Qing; Sani, Rehana A; Lun, Zhao-Rong; Jacobs, Dennis E

    2006-10-01

    Non-isotopic polymerase chain reaction (PCR)-based single-strand conformation polymorphism and sequence analyses of the second internal transcribed spacer (ITS-2) of nuclear ribosomal DNA (rDNA) were utilized to genetically characterise ascaridoids from dogs and cats from China by comparison with those from other countries. The study showed that Toxocara canis, Toxocara cati, and Toxascaris leonina from China were genetically the same as those from other geographical origins. Specimens from cats from Guangzhou, China, which were morphologically consistent with Toxocara malaysiensis, were the same genetically as those from Malaysia, with the exception of a polymorphism in the ITS-2 but no unequivocal sequence difference. This is the first report of T. malaysiensis in cats outside of Malaysia (from where it was originally described), supporting the proposal that this species has a broader geographical distribution. The molecular approach employed provides a powerful tool for elucidating the biology, epidemiology, and zoonotic significance of T. malaysiensis.

  15. The Neural Bases of Difficult Speech Comprehension and Speech Production: Two Activation Likelihood Estimation (ALE) Meta-Analyses

    Science.gov (United States)

    Adank, Patti

    2012-01-01

    The role of speech production mechanisms in difficult speech comprehension is the subject of on-going debate in speech science. Two Activation Likelihood Estimation (ALE) analyses were conducted on neuroimaging studies investigating difficult speech comprehension or speech production. Meta-analysis 1 included 10 studies contrasting comprehension…

  16. Understanding ageing in older Australians: The contribution of the Dynamic Analyses to Optimise Ageing (DYNOPTA) project to the evidence base and policy

    Science.gov (United States)

    Anstey, Kaarin J; Bielak, Allison AM; Birrell, Carole L; Browning, Colette J; Burns, Richard A; Byles, Julie; Kiley, Kim M; Nepal, Binod; Ross, Lesley A; Steel, David; Windsor, Timothy D

    2014-01-01

    Aim To describe the Dynamic Analyses to Optimise Ageing (DYNOPTA) project and illustrate its contributions to understanding ageing through innovative methodology, and investigations on outcomes based on the project themes. DYNOPTA provides a platform and technical expertise that may be used to combine other national and international datasets. Method The DYNOPTA project has pooled and harmonized data from nine Australian longitudinal studies to create the largest available longitudinal dataset (N=50652) on ageing in Australia. Results A range of findings have resulted from the study to date, including methodological advances, prevalence rates of disease and disability, and mapping trajectories of ageing with and without increasing morbidity. DYNOPTA also forms the basis of a microsimulation model that will provide projections of future costs of disease and disability for the baby boomer cohort. Conclusion DYNOPTA contributes significantly to the Australian evidence-base on ageing to inform key social and health policy domains. PMID:22032767

  17. Analysing Test-Takers’ Views on a Computer-Based Speaking Test

    Directory of Open Access Journals (Sweden)

    Marian Amengual-Pizarro

    2017-11-01

    Full Text Available This study examines test-takers’ views on a computer-delivered speaking test in order to investigate the aspects they consider most relevant in technology-based oral assessment, and to explore the main advantages and disadvantages computer-based tests may offer as compared to face-to-face speaking tests. A small-scale open questionnaire was administered to 80 test-takers who took the APTIS speaking test at the Universidad de Alcalá in April 2016. Results reveal that examinees believe computer-based tests provide a valid measure of oral competence in English and are considered to be an adequate method for the assessment of speaking. Interestingly, the data suggest that personal characteristics of test-takers seem to play a key role in deciding upon the most suitable and reliable delivery mode.

  18. Relationships between Mathematics Teacher Preparation and Graduates' Analyses of Classroom Teaching

    Science.gov (United States)

    Hiebert, James; Berk, Dawn; Miller, Emily

    2017-01-01

    The purpose of this longitudinal study was to investigate the relationships between mathematics teacher preparation and graduates' analyses of classroom teaching. Fifty-three graduates from an elementary teacher preparation program completed 4 video-based, analysis-of-teaching tasks in the semester before graduation and then in each of the 3…

  19. A Study of Spectral Integration and Normalization in NMR-based Metabonomic Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Webb-Robertson, Bobbie-Jo M.; Lowry, David F.; Jarman, Kristin H.; Harbo, Sam J.; Meng, Quanxin; Fuciarelli, Alfred F.; Pounds, Joel G.; Lee, Monica T.

    2005-09-15

    Metabonomics involves the quantitation of the dynamic multivariate metabolic response of an organism to a pathological event or genetic modification (Nicholson, Lindon and Holmes, 1999). The analysis of these data involves the use of appropriate multivariate statistical methods. Exploratory Data Analysis (EDA) linear projection methods, primarily Principal Component Analysis (PCA), have been documented as a valuable pattern recognition technique for 1H NMR spectral data (Brindle et al., 2002, Potts et al., 2001, Robertson et al., 2000, Robosky et al., 2002). Prior to PCA the raw data are typically processed through four steps: (1) baseline correction, (2) endogenous peak removal, (3) integration over spectral regions to reduce the number of variables, and (4) normalization. The effects of the size of the spectral integration regions and of normalization have not been well studied. We assess the variability structure and classification accuracy of two distinctly different datasets via PCA and a leave-one-out cross-validation approach, under two normalization approaches and an array of spectral integration regions. This study indicates that, independent of the normalization method, the classification accuracy achieved in metabonomic studies is not highly sensitive to the size of the spectral integration region. Additionally, for both datasets, data scaled to zero mean and unit variance (auto-scaled) showed greater variability in classification accuracy across spectral integration window widths than data scaled to the total intensity of the spectrum.
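    The preprocessing pipeline described above (integration over spectral regions, then normalization, then PCA) can be sketched in a few lines; the spectra, bucket width, and class structure below are invented for illustration and are not the study's data:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    # Invented stand-in for baseline-corrected 1H NMR spectra:
    # 20 samples x 1000 spectral points, with a class-specific peak.
    spectra = rng.random((20, 1000))
    spectra[10:, 400:420] += 5.0

    def bin_spectrum(x, width):
        """Integrate over fixed-width spectral regions ("buckets")."""
        n_bins = x.shape[1] // width
        return x[:, :n_bins * width].reshape(x.shape[0], n_bins, width).sum(axis=2)

    def normalize_total(x):
        """Scale each spectrum to unit total intensity."""
        return x / x.sum(axis=1, keepdims=True)

    def autoscale(x):
        """Scale each variable (bucket) to zero mean and unit variance."""
        return (x - x.mean(axis=0)) / x.std(axis=0)

    binned = bin_spectrum(spectra, width=25)          # 1000 points -> 40 buckets
    scores_norm = PCA(n_components=2).fit_transform(normalize_total(binned))
    scores_auto = PCA(n_components=2).fit_transform(autoscale(binned))
    ```

    Repeating the last three lines over a range of bucket widths is essentially the sensitivity experiment the abstract describes.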

  20. Implementing evidence-based medicine in general practice: a focus group based study

    Directory of Open Access Journals (Sweden)

    Aertgeerts Bert

    2005-09-01

    Full Text Available Abstract Background Over the past years, concerns have been rising about the use of Evidence-Based Medicine (EBM) in health care. The calls for an increase in the practice of EBM seem to be obstructed by many barriers preventing the implementation of evidence-based thinking and acting in general practice. This study aims to explore the barriers of Flemish GPs (General Practitioners) to the implementation of EBM in routine clinical work and to identify possible strategies for integrating EBM in daily work. Methods We used a qualitative research strategy to gather and analyse data. We organised focus groups between September 2002 and April 2003. The focus group data were analysed using a combined strategy of 'between-case' analysis and 'grounded theory approach'. Thirty-one general practitioners participated in four focus groups. Purposeful sampling was used to recruit participants. Results A basic classification model documents the influencing factors and actors on a micro-, meso- and macro-level. Patients, colleagues, competences, logistics and time were identified on the micro-level (the GPs' individual practice), commercial and consumer organisations on the meso-level (institutions, organisations and health care policy), and media and specific characteristics of evidence on the macro-level (policy level and international scientific community). Existing barriers and possible strategies to overcome these barriers were described. Conclusion In order to implement EBM in routine general practice, an integrated approach on different levels needs to be developed.

  1. Neutronic analyses and tools development efforts in the European DEMO programme

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, U., E-mail: ulrich.fischer@kit.edu [Association KIT-Euratom, Karlsruhe Institute of Technology (KIT), Karlsruhe (Germany); Bachmann, C. [European Fusion Development Agreement (EFDA), Garching (Germany); Bienkowska, B. [Association IPPLM-Euratom, IPPLM Warsaw/INP Krakow (Poland); Catalan, J.P. [Universidad Nacional de Educación a Distancia (UNED), Madrid (Spain); Drozdowicz, K.; Dworak, D. [Association IPPLM-Euratom, IPPLM Warsaw/INP Krakow (Poland); Leichtle, D. [Association KIT-Euratom, Karlsruhe Institute of Technology (KIT), Karlsruhe (Germany); Fusion for Energy (F4E), Barcelona (Spain); Lengar, I. [MESCS-JSI, Ljubljana (Slovenia); Jaboulay, J.-C. [CEA, DEN, Saclay, DM2S, SERMA, F-91191 Gif-sur-Yvette (France); Lu, L. [Association KIT-Euratom, Karlsruhe Institute of Technology (KIT), Karlsruhe (Germany); Moro, F. [Associazione ENEA-Euratom, ENEA Fusion Division, Frascati (Italy); Mota, F. [Centro de Investigaciones Energéticas, Medioambientales y Tecnológicas (CIEMAT), Madrid (Spain); Sanz, J. [Universidad Nacional de Educación a Distancia (UNED), Madrid (Spain); Szieberth, M. [Budapest University of Technology and Economics (BME), Budapest (Hungary); Palermo, I. [Centro de Investigaciones Energéticas, Medioambientales y Tecnológicas (CIEMAT), Madrid (Spain); Pampin, R. [Fusion for Energy (F4E), Barcelona (Spain); Porton, M. [Euratom/CCFE Fusion Association, Culham Science Centre for Fusion Energy (CCFE), Culham (United Kingdom); Pereslavtsev, P. [Association KIT-Euratom, Karlsruhe Institute of Technology (KIT), Karlsruhe (Germany); Ogando, F. [Universidad Nacional de Educación a Distancia (UNED), Madrid (Spain); Rovni, I. [Budapest University of Technology and Economics (BME), Budapest (Hungary); and others

    2014-10-15

    Highlights: •Evaluation of neutronic tools for application to DEMO nuclear analyses. •Generation of a DEMO model for nuclear analyses based on MC calculations. •Nuclear analyses of the DEMO reactor equipped with a HCLL-type blanket. -- Abstract: The European Fusion Development Agreement (EFDA) recently launched a programme on Power Plant Physics and Technology (PPPT) with the aim to develop a conceptual design of a fusion demonstration reactor (DEMO) addressing key technology and physics issues. A dedicated part of the PPPT programme is devoted to the neutronics which, among others, has to define and verify requirements and boundary conditions for the DEMO systems. The quality of the provided data depends on the capabilities and the reliability of the computational tools. Accordingly, the PPPT activities in the area of neutronics include both DEMO nuclear analyses and development efforts on neutronic tools including their verification and validation. This paper reports on first neutronics studies performed for DEMO, and on the evaluation and further development of neutronic tools.

  2. Neutronic analyses and tools development efforts in the European DEMO programme

    International Nuclear Information System (INIS)

    Fischer, U.; Bachmann, C.; Bienkowska, B.; Catalan, J.P.; Drozdowicz, K.; Dworak, D.; Leichtle, D.; Lengar, I.; Jaboulay, J.-C.; Lu, L.; Moro, F.; Mota, F.; Sanz, J.; Szieberth, M.; Palermo, I.; Pampin, R.; Porton, M.; Pereslavtsev, P.; Ogando, F.; Rovni, I.

    2014-01-01

    Highlights: •Evaluation of neutronic tools for application to DEMO nuclear analyses. •Generation of a DEMO model for nuclear analyses based on MC calculations. •Nuclear analyses of the DEMO reactor equipped with a HCLL-type blanket. -- Abstract: The European Fusion Development Agreement (EFDA) recently launched a programme on Power Plant Physics and Technology (PPPT) with the aim to develop a conceptual design of a fusion demonstration reactor (DEMO) addressing key technology and physics issues. A dedicated part of the PPPT programme is devoted to the neutronics which, among others, has to define and verify requirements and boundary conditions for the DEMO systems. The quality of the provided data depends on the capabilities and the reliability of the computational tools. Accordingly, the PPPT activities in the area of neutronics include both DEMO nuclear analyses and development efforts on neutronic tools including their verification and validation. This paper reports on first neutronics studies performed for DEMO, and on the evaluation and further development of neutronic tools

  3. Validation of a fully autonomous phosphate analyser based on a microfluidic lab-on-a-chip

    DEFF Research Database (Denmark)

    Slater, Conor; Cleary, J.; Lau, K.T.

    2010-01-01

    of long-term operation. This was proven by a bench top calibration of the analyser using standard solutions and also by comparing the analyser's performance to a commercially available phosphate monitor installed at a waste water treatment plant. The output of the microfluidic lab-on-a-chip analyser...

  4. Optimisation of recovery protocols for double-base smokeless powder residues analysed by total vaporisation (TV) SPME/GC-MS.

    Science.gov (United States)

    Sauzier, Georgina; Bors, Dana; Ash, Jordan; Goodpaster, John V; Lewis, Simon W

    2016-09-01

    The investigation of explosive events requires appropriate evidential protocols to recover and preserve residues from the scene. In this study, a central composite design was used to determine statistically validated optimum recovery parameters for double-base smokeless powder residues on steel, analysed using total vaporisation (TV) SPME/GC-MS. It was found that maximum recovery was obtained using isopropanol-wetted swabs stored under refrigerated conditions, then extracted for 15 min into acetone on the same day as sample collection. These parameters were applied to the recovery of post-blast residues deposited on steel witness surfaces following a PVC pipe bomb detonation, resulting in detection of all target components across the majority of samples. Higher overall recoveries were obtained from plates facing the sides of the device, consistent with the point of first failure occurring in the pipe body as observed in previous studies. The methodology employed here may be readily applied to a variety of other explosive compounds, and thus assist in establishing 'best practice' procedures for explosive investigations. Copyright © 2016 Elsevier B.V. All rights reserved.
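    The coded coordinates of a rotatable central composite design like the one used here can be generated with a few lines of NumPy; the factor names in the comment are hypothetical, and this is only a sketch of the design geometry, not the study's actual protocol:

    ```python
    import itertools
    import numpy as np

    def central_composite(k, n_center=4):
        """Points of a rotatable central composite design for k factors
        in coded units: factorial corners at +/-1, axial ("star") points
        at +/-alpha, and replicated centre points at 0."""
        alpha = (2 ** k) ** 0.25           # rotatability criterion
        factorial = np.array(list(itertools.product([-1.0, 1.0], repeat=k)))
        axial = np.vstack([alpha * s * np.eye(k)[i]
                           for i in range(k) for s in (-1.0, 1.0)])
        center = np.zeros((n_center, k))
        return np.vstack([factorial, axial, center])

    # e.g. three factors (say: solvent volume, extraction time, storage temperature)
    # gives 8 factorial + 6 axial + 4 centre = 18 runs.
    design = central_composite(3)
    ```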

  5. Use of results of microbiological analyses for risk-based control of Listeria monocytogenes in marinated broiler legs.

    Science.gov (United States)

    Aarnisalo, Kaarina; Vihavainen, Elina; Rantala, Leila; Maijala, Riitta; Suihko, Maija-Liisa; Hielm, Sebastian; Tuominen, Pirkko; Ranta, Jukka; Raaska, Laura

    2008-02-10

    Microbial risk assessment provides a means of estimating consumer risks associated with food products. The methods can also be applied at the plant level. In this study, results of microbiological analyses were used to develop a robust single-plant-level risk assessment. Furthermore, the prevalence and numbers of Listeria monocytogenes in marinated broiler legs in Finland were estimated. These estimates were based on information on the prevalence, numbers and genotypes of L. monocytogenes in 186 marinated broiler legs from 41 retail stores. The products were from three main Finnish producers, which produce 90% of all marinated broiler legs sold in Finland. The prevalence and numbers of L. monocytogenes were estimated by Monte Carlo simulation using WinBUGS, but the model is applicable to any software featuring standard probability distributions. The estimated mean annual number of L. monocytogenes-positive broiler legs sold in Finland was 7.2×10⁶, with a 95% credible interval (CI) of 6.7×10⁶ to 7.7×10⁶. That would be 34% ± 1% of the marinated broiler legs sold in Finland. The mean number of L. monocytogenes in marinated broiler legs estimated at the sell-by date was 2 CFU/g, with a 95% CI of 0 to 14 CFU/g. Producer-specific L. monocytogenes strains were recovered from the products throughout the year, which emphasizes the importance of characterizing the isolates and identifying strains that may cause problems as part of risk assessment studies. As the levels of L. monocytogenes were low, the risk of acquiring listeriosis from these products proved to be insignificant. Consequently there was no need for a thorough national-level risk assessment. However, an approach using worst-case and average point estimates was applied to produce an example of a single-producer-level risk assessment based on limited data. This assessment also indicated that the risk from these products was low. The risk-based approach presented in this work can provide estimation of public health risk
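    The WinBUGS model itself is not given in the abstract, but the same kind of Monte Carlo prevalence estimate with a credible interval can be sketched with a conjugate Beta-Binomial model; the counts and sales figure below are illustrative stand-ins, not the study's data:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    # Illustrative inputs (not the study's actual raw counts):
    n_sampled, n_positive = 186, 63        # legs tested / L. monocytogenes-positive
    annual_sales = 21_000_000              # hypothetical annual units sold

    # Posterior for prevalence under a uniform Beta(1, 1) prior.
    prevalence = rng.beta(n_positive + 1, n_sampled - n_positive + 1, size=100_000)
    positive_units = prevalence * annual_sales

    mean_est = positive_units.mean()
    ci_low, ci_high = np.percentile(positive_units, [2.5, 97.5])
    ```

    Any sampler offering standard probability distributions reproduces this, which is the abstract's point about the model being software-independent.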

  6. Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.

    Science.gov (United States)

    Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg

    2009-11-01

    G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
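    For one of the simpler cases above, the power of a two-sided test of a single Pearson correlation can be approximated with Fisher's z transform; this is a textbook approximation, not G*Power's exact routine:

    ```python
    from math import atanh, sqrt
    from scipy.stats import norm

    def correlation_power(rho, n, alpha=0.05):
        """Approximate two-sided power for H0: rho = 0, using the Fisher z
        transform: atanh(r) is roughly normal with sd 1/sqrt(n - 3)."""
        z_crit = norm.ppf(1 - alpha / 2)
        ncp = atanh(rho) * sqrt(n - 3)     # noncentrality under H1
        return norm.cdf(ncp - z_crit) + norm.cdf(-ncp - z_crit)

    # Power grows with sample size for a fixed effect size:
    print(round(correlation_power(0.3, 50), 3))
    print(round(correlation_power(0.3, 100), 3))
    ```

    At rho = 0 the formula collapses to the nominal alpha, a quick sanity check on the approximation.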

  7. How to Make Nothing Out of Something: Analyses of the Impact of Study Sampling and Statistical Interpretation in Misleading Meta-Analytic Conclusions

    Directory of Open Access Journals (Sweden)

    Michael Robert Cunningham

    2016-10-01

    Full Text Available The limited resource model states that self-control is governed by a relatively finite set of inner resources on which people draw when exerting willpower. Once self-control resources have been used up or depleted, they are less available for other self-control tasks, leading to a decrement in subsequent self-control success. The depletion effect has been studied for over 20 years, tested or extended in more than 600 studies, and supported in an independent meta-analysis (Hagger, Wood, Stiff, and Chatzisarantis, 2010. Meta-analyses are supposed to reduce bias in literature reviews. Carter, Kofler, Forster, and McCullough’s (2015 meta-analysis, by contrast, included a series of questionable decisions involving sampling, methods, and data analysis. We provide quantitative analyses of key sampling issues: exclusion of many of the best depletion studies based on idiosyncratic criteria and the emphasis on mini meta-analyses with low statistical power as opposed to the overall depletion effect. We discuss two key methodological issues: failure to code for research quality, and the quantitative impact of weak studies by novice researchers. We discuss two key data analysis issues: questionable interpretation of the results of trim and fill and funnel plot asymmetry test procedures, and the use and misinterpretation of the untested Precision Effect Test [PET] and Precision Effect Estimate with Standard Error (PEESE procedures. Despite these serious problems, the Carter et al. meta-analysis results actually indicate that there is a real depletion effect – contrary to their title.

  8. Hydrometeorological and statistical analyses of heavy rainfall in Midwestern USA

    Science.gov (United States)

    Thorndahl, S.; Smith, J. A.; Krajewski, W. F.

    2012-04-01

    During the last two decades, the midwestern states of the United States have been repeatedly afflicted by heavy, flood-producing rainfall. Several of these storms seem to have similar hydrometeorological properties in terms of pattern, track, evolution, life cycle, clustering, etc., which raises the question of whether it is possible to derive general characteristics of the space-time structures of these heavy storms. This is important in order to understand hydrometeorological features, e.g. how storms evolve and with what frequency we can expect extreme storms to occur. In the literature, most studies of extreme rainfall are based on point measurements (rain gauges). However, with high-resolution, high-quality radar observation periods now exceeding two decades, it is possible to do long-term spatio-temporal statistical analyses of extremes. This makes it possible to link return periods to distributed rainfall estimates and to study the precipitation structures which cause floods. However, statistical frequency analyses of rainfall based on radar observations introduce challenges in converting radar reflectivity observations to "true" rainfall that do not arise in traditional analyses of rain gauge data. It is, for example, difficult to distinguish reflectivity from high-intensity rain from reflectivity from other hydrometeors such as hail, especially using single-polarization radars, which are used in this study. Furthermore, reflectivity from the bright band (melting layer) should be discarded and anomalous propagation should be corrected in order to produce valid statistics of extreme radar rainfall. Other challenges include combining observations from several radars into one mosaic, bias correction against rain gauges, range correction, Z-R relationships, etc. The present study analyzes radar rainfall observations from 1996 to 2011 based on the American NEXRAD network of radars over an area covering parts of Iowa, Wisconsin, Illinois, and
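    One of the challenges listed, the Z-R relationship, is a simple power law in code. The coefficients below are the common NEXRAD convective defaults (Z = 300 R^1.4), assumed here purely for illustration, and the 53 dBZ hail cap is a conventional choice rather than a value taken from this study:

    ```python
    import numpy as np

    def rain_rate(dbz, a=300.0, b=1.4):
        """Convert radar reflectivity (dBZ) to rain rate R (mm/h) via the
        power law Z = a * R**b. Marshall-Palmer stratiform coefficients
        would be a=200, b=1.6 instead."""
        z = 10.0 ** (dbz / 10.0)           # reflectivity factor, mm^6 / m^3
        return (z / a) ** (1.0 / b)

    # Capping reflectivity keeps hail contamination from inflating rainfall:
    dbz = np.array([20.0, 40.0, 55.0])
    rates = rain_rate(np.minimum(dbz, 53.0))
    ```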

  9. A Derivation of Source-based Kinetics Equation with Time Dependent Fission Kernel for Reactor Transient Analyses

    International Nuclear Information System (INIS)

    Kim, Song Hyun; Woo, Myeong Hyun; Shin, Chang Ho; Pyeon, Cheol Ho

    2015-01-01

    In this study, a new source-based balance equation is proposed to overcome the problems generated by the previous methods, and a simple problem is analyzed with the proposed method. A source-based balance equation with a time-dependent fission kernel was derived to simplify the kinetics equation. To analyze partial variations of reactor characteristics, two representative methods were introduced in previous studies: (1) the quasi-statics method and (2) the multipoint technique. The main idea of the quasi-statics method is to use a low-order approximation for large integration times. To realize the quasi-statics method, the time-dependent flux is first separated into shape and amplitude functions, and the shape function is calculated. The method has good accuracy; however, it can be computationally expensive because the shape function must be fully recalculated to obtain accurate results. To improve calculation efficiency, the multipoint method was proposed. The multipoint method is based on the classic kinetics equation, using Green's function to analyze the flight probability from region r' to r. These methods have been used for reactor kinetics analysis; however, they have some limitations. First, three group variables (r_g, E_g, t_g) must be considered to solve the time-dependent balance equation, which strongly limits application to large-system problems with good accuracy. Second, energy-group neutrons must be used to analyze reactor kinetics problems; in a time-dependent problem, the neutron energy distribution can change over time, which changes the group cross sections and can therefore degrade accuracy. Third, the neutrons in a space-time region continually affect other space-time regions, which is not properly considered in the previous methods. Using birth history of the neutron sources

  10. Optimization of a Centrifugal Boiler Circulating Pump's Casing Based on CFD and FEM Analyses

    Directory of Open Access Journals (Sweden)

    Zhigang Zuo

    2014-04-01

    Full Text Available It is important to evaluate the economic efficiency of boiler circulating pumps in the manufacturing process from the manufacturers' point of view. The possibility of optimizing the pump casing with respect to structural pressure integrity and hydraulic performance was discussed. CFD analyses of pump models with different pump casing sizes were first carried out for the hydraulic performance evaluation. The effects of the working temperature and the sealing ring on the hydraulic efficiency were discussed. A model with a casing diameter of 0.875D40 was selected for further analyses. FEM analyses were then carried out on different combinations of casing sizes, casing wall thicknesses, and materials to evaluate safety with respect to pressure integrity, in terms of both static and fatigue strength analyses. Two models, with forged and cast materials, were selected as final results.

  11. Identification among morphologically similar Argyreia (Convolvulaceae) based on leaf anatomy and phenetic analyses.

    Science.gov (United States)

    Traiperm, Paweena; Chow, Janene; Nopun, Possathorn; Staples, G; Swangpol, Sasivimon C

    2017-12-01

    The genus Argyreia Lour. is one of the species-rich Asian genera in the family Convolvulaceae. Several species complexes were recognized in which taxon delimitation was imprecise, especially when examining herbarium materials without fully developed open flowers. The main goal of this study is to investigate and describe leaf anatomy for some morphologically similar Argyreia using epidermal peeling, leaf and petiole transverse sections, and scanning electron microscopy. Phenetic analyses including cluster analysis and principal component analysis were used to investigate the similarity of these morpho-types. Anatomical differences observed between the morpho-types include epidermal cell walls and the trichome types on the leaf epidermis. Additional differences in the leaf and petiole transverse sections include the epidermal cell shape of the adaxial leaf blade, the leaf margins, and the petiole transverse sectional outline. The phenogram from cluster analysis using the UPGMA method represented four groups with an R value of 0.87. Moreover, the important quantitative and qualitative leaf anatomical traits of the four groups were confirmed by the principal component analysis of the first two components. The results from phenetic analyses confirmed the anatomical differentiation between the morpho-types. Leaf anatomical features regarded as particularly informative for morpho-type differentiation can be used to supplement macro morphological identification.
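    The UPGMA clustering and the reported cophenetic "R value" can be reproduced on synthetic data with SciPy; the trait matrix below is an invented stand-in for the coded leaf-anatomy characters, not the study's measurements:

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import cophenet, fcluster, linkage
    from scipy.spatial.distance import pdist

    rng = np.random.default_rng(1)
    # Hypothetical stand-in for quantitative leaf-anatomy traits:
    # 16 specimens x 8 traits, drawn around four morpho-type centroids.
    centroids = rng.normal(0, 5, size=(4, 8))
    traits = np.vstack([c + rng.normal(0, 1, size=(4, 8)) for c in centroids])

    d = pdist(traits)                      # pairwise specimen distances
    tree = linkage(d, method="average")    # UPGMA = average linkage
    r, _ = cophenet(tree, d)               # cophenetic correlation ("R value")
    groups = fcluster(tree, t=4, criterion="maxclust")
    ```

    A cophenetic correlation near the study's 0.87 indicates the dendrogram preserves the original pairwise distances well.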

  12. Kernel based eigenvalue-decomposition methods for analysing ham

    DEFF Research Database (Denmark)

    Christiansen, Asger Nyman; Nielsen, Allan Aasbjerg; Møller, Flemming

    2010-01-01

    methods, such as PCA, MAF or MNF. We therefore investigated the applicability of kernel based versions of these transformation. This meant implementing the kernel based methods and developing new theory, since kernel based MAF and MNF is not described in the literature yet. The traditional methods only...... have two factors that are useful for segmentation and none of them can be used to segment the two types of meat. The kernel based methods have a lot of useful factors and they are able to capture the subtle differences in the images. This is illustrated in Figure 1. You can see a comparison of the most...... useful factor of PCA and kernel based PCA respectively in Figure 2. The factor of the kernel based PCA turned out to be able to segment the two types of meat and in general that factor is much more distinct, compared to the traditional factor. After the orthogonal transformation a simple thresholding...
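    The advantage of kernel PCA over linear PCA on classes that differ only subtly and nonlinearly can be illustrated on a toy two-ring dataset; this sketch uses scikit-learn rather than the authors' implementation, and the data are synthetic:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA, KernelPCA

    rng = np.random.default_rng(0)
    # Two concentric rings: linearly inseparable classes, a stand-in for
    # two meat types with subtly different multispectral signatures.
    theta = rng.uniform(0, 2 * np.pi, 200)
    radius = np.repeat([1.0, 3.0], 100)    # first 100 inner, last 100 outer
    X = np.c_[radius * np.cos(theta), radius * np.sin(theta)]
    X += rng.normal(0, 0.1, X.shape)

    linear_scores = PCA(n_components=2).fit_transform(X)
    kernel_scores = KernelPCA(n_components=2, kernel="rbf",
                              gamma=0.5).fit_transform(X)

    # The leading kernel factor separates the rings, so simple thresholding
    # segments the two classes -- the workflow described in the abstract.
    labels = kernel_scores[:, 0] > kernel_scores[:, 0].mean()
    ```

    No linear factor of `linear_scores` can separate the rings, which mirrors the abstract's finding that only the kernel-based factors segmented the two meat types.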

  13. A Corpus-based Study of EFL Learners’ Errors in IELTS Essay Writing

    OpenAIRE

    Hoda Divsar; Robab Heydari

    2017-01-01

    The present study analyzed different types of errors in EFL learners' IELTS essays. In order to determine the major types of errors, a corpus of 70 IELTS examinees' writings was collected, and their errors were extracted and categorized qualitatively. Errors were categorized based on a researcher-developed error-coding scheme into 13 aspects. Based on the descriptive statistical analyses, the frequency of each error type was calculated and the commonest errors committed by the EFL learne...
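    Tallying the frequency of each coded error type, as described above, is straightforward with a Counter; the error codes below are hypothetical examples, not the study's actual 13-category scheme:

    ```python
    from collections import Counter

    # Hypothetical codes assigned to extracted learner errors under a
    # researcher-developed coding scheme.
    coded_errors = [
        "article", "verb_tense", "article", "preposition", "spelling",
        "article", "verb_tense", "word_order", "preposition", "article",
    ]

    freq = Counter(coded_errors)
    total = sum(freq.values())
    for code, count in freq.most_common():
        print(f"{code}: {count} ({100 * count / total:.1f}%)")
    ```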

  14. Application of insights from the IREP analyses to the IREP procedures guide

    International Nuclear Information System (INIS)

    Carlson, D.D.; Murphy, J.A.; Young, J.

    1982-01-01

    One of the objectives of the Interim Reliability Evaluation Program (IREP) was to prepare a set of procedures, based on experience gained in the study, for use in future IREP-type analyses. The current analyses used a set of procedures and, over the course of the program, a concerted effort was made to develop insights which could improve these procedures. Insights have been gained into the organization and content of the procedures guide, into the performance and management of an IREP analysis, and into the methods to be used in the analysis.

  15. A systematic review of the quality and impact of anxiety disorder meta-analyses.

    Science.gov (United States)

    Ipser, Jonathan C; Stein, Dan J

    2009-08-01

    Meta-analyses are seen as representing the pinnacle of a hierarchy of evidence used to inform clinical practice. Therefore, the potential importance of differences in the rigor with which they are conducted and reported warrants consideration. In this review, we use standardized instruments to describe the scientific and reporting quality of meta-analyses of randomized controlled trials of the treatment of anxiety disorders. We also use traditional and novel metrics of article impact to assess the influence of meta-analyses across a range of research fields in the anxiety disorders. Overall, although the meta-analyses that we examined had some flaws, their quality of reporting was generally acceptable. Neither the scientific nor reporting quality of the meta-analyses was predicted by any of the impact metrics. The finding that treatment meta-analyses were cited less frequently than quantitative reviews of studies in current "hot spots" of research (ie, genetics, imaging) points to the multifactorial nature of citation patterns. A list of the meta-analyses included in this review is available on an evidence-based website of anxiety and trauma-related disorders.

  16. Chemical analyses of rocks, minerals, and detritus, Yucca Mountain--Preliminary report, special report No. 11

    International Nuclear Information System (INIS)

    Hill, C.A.; Livingston, D.E.

    1993-09-01

    This chemical analysis study is part of the research program of the Yucca Mountain Project intended to provide the State of Nevada with a detailed assessment of the geology and geochemistry of Yucca Mountain and adjacent regions. This report is preliminary in the sense that more chemical analyses may be needed in the future and also in the sense that these chemical analyses should be considered as a small part of a much larger geological data base. The interpretations discussed herein may be modified as that larger data base is examined and established. All of the chemical analyses performed to date are shown in Table 1. There are three parts to this table: (1) trace element analyses on rocks (limestone and tuff) and minerals (calcite/opal), (2) rare earth analyses on rocks (tuff) and minerals (calcite/opal), and (3) major element analyses + CO2 on rocks (tuff) and detritus sand. In this report, for each of the three parts of the table, the data and its possible significance will be discussed first, then some overall conclusions will be made, and finally some recommendations for future work will be offered.

  17. Comparison study of inelastic analyses for high temperature structure subjected to cyclic creep loading

    International Nuclear Information System (INIS)

    Kim, J. B.; Lee, H. Y.; Lee, J. H.

    2002-01-01

    It is necessary to develop a reliable numerical analysis method to simulate the plasticity and creep behavior of LMR high temperature structures. Since general-purpose finite element analysis codes such as ABAQUS and ANSYS provide various models for plastic hardening and creep equations of Norton's power-law type, it is possible to perform separate viscoplasticity analyses. In this study, a high temperature structural analysis program (NONSTA-VP), implementing Chaboche's unified viscoplasticity equation into ABAQUS, has been developed, and the viscoplastic response of a 316 SS plate with a circular hole subjected to cyclic creep loading has been analyzed. The results of the separate viscoplasticity analyses and of the unified viscoplasticity analysis using NONSTA-VP have been compared, and the NONSTA-VP results show pronounced stress relaxation and creep behavior during hold time compared to those from the separate viscoplasticity analyses. The unified approach is also anticipated to reduce the conservatism arising from using an elastic approach for creep-fatigue damage analysis, since the stress range and the strain range from the unified viscoplasticity analysis are greatly reduced compared to those from the separate viscoplasticity analyses and the elastic analysis.

  18. Analysing Scientific Collaborations of New Zealand Institutions using Scopus Bibliometric Data

    OpenAIRE

    Aref, Samin; Friggens, David; Hendy, Shaun

    2017-01-01

    Scientific collaborations are among the main enablers of development in small national science systems. Although analysing scientific collaborations is a well-established subject in scientometrics, evaluations of scientific collaborations within a country remain speculative with studies based on a limited number of fields or using data too inadequate to be representative of collaborations at a national level. This study represents a unique view on the collaborative aspect of scientific activi...

  19. STUDIES ON SOIL LIQUEFACTION AND SETTLEMENT IN THE URAYASU DISTRICT USING EFFECTIVE STRESS ANALYSES FOR THE 2011 EAST JAPAN GREAT EARTHQUAKE

    Science.gov (United States)

    Fukutake, Kiyoshi; Jang, Jiho

    The 2011 East Japan Great Earthquake caused soil liquefaction over a wide area. In particular, severe soil liquefaction was reported in the northern parts of the reclaimed lands around Tokyo Bay, even though the seismic intensity in this area was only about 5 on the Japanese scale, with low acceleration. The authors surveyed the residual settlement in the Urayasu district and then conducted effective stress analyses of areas affected and not affected by liquefaction. The analyses were compared with the acceleration waves recorded at K-NET Urayasu and with the ground settlements surveyed, and were based on the acceleration observed on the engineering seismic bedrock in districts adjacent to Urayasu. Much of the settlement was due to the long duration of the earthquake, with further settlement resulting from the aftershock. The study shows that the effects of aftershocks need to be monitored, and that simplified liquefaction prediction methods using the factor of safety, FL, need improvement.

  20. Process of Integrating Screening and Detailed Risk-based Modeling Analyses to Ensure Consistent and Scientifically Defensible Results

    International Nuclear Information System (INIS)

    Buck, John W.; McDonald, John P.; Taira, Randal Y.

    2002-01-01

    To support cleanup and closure of these tanks, modeling is performed to understand and predict potential impacts to human health and the environment. Pacific Northwest National Laboratory developed a screening tool for the United States Department of Energy, Office of River Protection that estimates the long-term human health risk, from a strategic planning perspective, posed by potential tank releases to the environment. This tool is being conditioned to more detailed model analyses to ensure consistency between studies and to provide scientific defensibility. Once the conditioning is complete, the system will be used to screen alternative cleanup and closure strategies. The integration of screening and detailed models provides consistent analyses, efficiencies in resources, and positive feedback between the various modeling groups. This approach of conditioning a screening methodology to more detailed analyses provides decision-makers with timely and defensible information and increases confidence in the results on the part of clients, regulators, and stakeholders.

  1. Studies and analyses of the space shuttle main engine. Failure information propagation model data base and software

    Science.gov (United States)

    Tischer, A. E.

    1987-01-01

    The failure information propagation model (FIPM) data base was developed to store and manipulate the large amount of information anticipated for the various Space Shuttle Main Engine (SSME) FIPMs. The organization and structure of the FIPM data base is described, including a summary of the data fields and key attributes associated with each FIPM data file. The menu-driven software developed to facilitate and control the entry, modification, and listing of data base records is also discussed. The transfer of the FIPM data base and software to the NASA Marshall Space Flight Center is described. Complete listings of all of the data base definition commands and software procedures are included in the appendixes.

  2. Long-term uranium supply-demand analyses

    International Nuclear Information System (INIS)

    1986-12-01

    It is the intention of this study to investigate the long-term uranium supply-demand situation using a number of supply- and demand-related assumptions. For supply, the assumptions used in the Resources and Production Projection (RAPP) model include country economic development status and the consequent lead times for exploration and development, uranium development status, country infrastructure, and uranium resources, including the Reasonably Assured (RAR), Estimated Additional Categories I and II (EAR-I and EAR-II), and Speculative Resource categories. The demand assumptions were based on the "pure" reactor strategies developed by the NEA Working Party on Nuclear Fuel Cycle Requirements for the 1986 OECD (NEA)/IAEA report "Nuclear Energy and its Fuel Cycle: Prospects to 2025". In addition, for this study a mixed-strategy case was computed using the averages of the plutonium (Pu) burning LWR high and the improved LWR low cases. Such a long-term analysis cannot present hard facts, but it can show which variables may influence the long-term supply-demand situation. It is hoped that the results of this study will provide valuable information for planners in the uranium supply and demand fields. Periodic re-analyses with updated databases will be needed from time to time.

  3. UAV-based detection and spatial analyses of periglacial landforms on Demay Point (King George Island, South Shetland Islands, Antarctica)

    Science.gov (United States)

    Dąbski, Maciej; Zmarz, Anna; Pabjanek, Piotr; Korczak-Abshire, Małgorzata; Karsznia, Izabela; Chwedorzewska, Katarzyna J.

    2017-08-01

    High-resolution aerial images allow detailed analyses of periglacial landforms, which is of particular importance in light of climate change and the resulting changes in active layer thickness. The aim of this study is to show the possibilities of using UAV-based photography to perform spatial analysis of periglacial landforms on the Demay Point peninsula, King George Island, and hence to supplement previous geomorphological studies of the South Shetland Islands. Photogrammetric flights were performed using a PW-ZOOM fixed-wing unmanned aerial vehicle. Digital elevation models (DEMs) and maps of slope and contour lines were prepared in ESRI ArcGIS 10.3 with the Spatial Analyst extension, and three-dimensional visualizations in ESRI ArcScene 10.3 software. Careful interpretation of the orthophoto and DEM allowed us to vectorize polygons of landforms such as (i) solifluction landforms (solifluction sheets, tongues, and lobes); (ii) scarps, taluses, and a protalus rampart; (iii) patterned ground (hummocks, sorted circles, stripes, nets and labyrinths, and nonsorted nets and stripes); (iv) coastal landforms (cliffs and beaches); (v) landslides and mud flows; and (vi) stone fields and bedrock outcrops. We conclude that geomorphological studies based on commonly accessible aerial and satellite images can underestimate the spatial extent of periglacial landforms and result in incomplete inventories. The PW-ZOOM UAV is well suited to gathering detailed geomorphological data and can be used in spatial analysis of periglacial landforms in the Western Antarctic Peninsula region.
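    The slope maps mentioned above are derived from the DEM. As a hedged illustration of the underlying computation (a minimal central-difference slope calculation, not the authors' ArcGIS Spatial Analyst workflow; grid and values are hypothetical):

    ```python
    import math

    def slope_deg(dem, cell):
        """Slope in degrees for each interior cell of a DEM (list of rows),
        using central differences with grid spacing `cell` (metres)."""
        rows, cols = len(dem), len(dem[0])
        out = [[None] * cols for _ in range(rows)]  # edges left undefined
        for r in range(1, rows - 1):
            for c in range(1, cols - 1):
                dzdx = (dem[r][c + 1] - dem[r][c - 1]) / (2 * cell)
                dzdy = (dem[r + 1][c] - dem[r - 1][c]) / (2 * cell)
                out[r][c] = math.degrees(math.atan(math.hypot(dzdx, dzdy)))
        return out

    # Tilted plane rising 1 m per metre in x: slope is 45 degrees everywhere
    dem = [[float(c) for c in range(5)] for _ in range(5)]
    print(round(slope_deg(dem, cell=1.0)[2][2], 1))  # → 45.0
    ```

    Production tools such as the Spatial Analyst Slope tool use a 3x3 neighbourhood (Horn's method) rather than this two-point stencil, but the principle is the same.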

  4. Modelling and analysing oriented fibrous structures

    International Nuclear Information System (INIS)

    Rantala, M; Lassas, M; Siltanen, S; Sampo, J; Takalo, J; Timonen, J

    2014-01-01

    A mathematical model for fibrous structures using a direction-dependent scaling law is presented. The orientation of fibrous nets (e.g. paper) is analysed with a method based on the curvelet transform. The curvelet-based orientation analysis has been tested successfully on real data from paper samples: the major directions of fibre-fibre orientation can apparently be recovered. Similar results are achieved in tests on data simulated by the new model, allowing a comparison with ground truth.
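    The paper's curvelet-based method is not reproduced here; as a much simpler, hedged sketch of the same idea (estimating a dominant fibre orientation from image gradients via a structure tensor, with a hypothetical toy image):

    ```python
    import math

    def dominant_orientation(img):
        """Estimate the dominant stripe direction (degrees from the x-axis)
        from the summed 2x2 structure tensor of central-difference gradients."""
        jxx = jyy = jxy = 0.0
        for r in range(1, len(img) - 1):
            for c in range(1, len(img[0]) - 1):
                gx = (img[r][c + 1] - img[r][c - 1]) / 2.0
                gy = (img[r + 1][c] - img[r - 1][c]) / 2.0
                jxx += gx * gx
                jyy += gy * gy
                jxy += gx * gy
        theta = 0.5 * math.atan2(2.0 * jxy, jxx - jyy)    # dominant gradient direction
        return math.degrees(theta + math.pi / 2) % 180.0  # stripes run perpendicular to it

    # Intensity ramp along x: iso-intensity "fibres" run vertically (about 90 degrees)
    img = [[float(c) for c in range(6)] for _ in range(6)]
    print(dominant_orientation(img))
    ```

    The curvelet approach of the paper additionally resolves orientation per scale and location, which this global estimate cannot do.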

  5. Exergy, exergoeconomic and environmental analyses and evolutionary algorithm based multi-objective optimization of combined cycle power plants

    International Nuclear Information System (INIS)

    Ahmadi, Pouria; Dincer, Ibrahim; Rosen, Marc A.

    2011-01-01

    A comprehensive exergy, exergoeconomic and environmental impact analysis and optimization is reported for several combined cycle power plants (CCPPs). In the first part, thermodynamic analyses based on energy and exergy of the CCPPs are performed, and the effect of supplementary firing on the natural gas-fired CCPP is investigated. The latter step includes the effect of supplementary firing on the performance of the bottoming cycle and on CO2 emissions, and utilizes the first and second laws of thermodynamics. In the second part, a multi-objective optimization is performed to determine the 'best' design parameters, accounting for exergetic, economic and environmental factors. The optimization considers three objective functions: CCPP exergy efficiency, total cost rate of the system products and CO2 emissions of the overall plant. The environmental impact in terms of CO2 emissions is integrated with the exergoeconomic objective function as a new objective function. The results of both exergy and exergoeconomic analyses show that the largest exergy destructions occur in the CCPP combustion chamber, and that increasing the gas turbine inlet temperature decreases the CCPP cost of exergy destruction. The optimization results demonstrate that CO2 emissions are reduced by selecting the best components and using a low fuel injection rate into the combustion chamber. -- Highlights: → Comprehensive thermodynamic modeling of a combined cycle power plant. → Exergy, economic and environmental analyses of the system. → Investigation of the role of multi-objective exergoenvironmental optimization as a tool for more environmentally benign design.
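    A multi-objective optimization of this kind ultimately keeps only non-dominated (Pareto-optimal) designs. As a hedged sketch with hypothetical numbers (a plain Pareto filter, not the paper's evolutionary algorithm), where exergy efficiency is maximized and cost rate and CO2 emissions are minimized:

    ```python
    def pareto_front(designs):
        """Keep designs not dominated by any other design.
        Each design is a tuple (exergy_eff to maximize, cost_rate to
        minimize, co2 to minimize)."""
        def dominates(a, b):
            no_worse = a[0] >= b[0] and a[1] <= b[1] and a[2] <= b[2]
            strictly = a[0] > b[0] or a[1] < b[1] or a[2] < b[2]
            return no_worse and strictly
        return [d for d in designs
                if not any(dominates(o, d) for o in designs if o is not d)]

    designs = [
        (0.52, 100.0, 400.0),  # efficient but costly
        (0.48, 80.0, 380.0),   # cheaper and cleaner
        (0.45, 95.0, 420.0),   # dominated by the second design
    ]
    print(pareto_front(designs))  # → [(0.52, 100.0, 400.0), (0.48, 80.0, 380.0)]
    ```

    An evolutionary algorithm such as the one used in the study searches the design space and applies exactly this dominance test when ranking candidate solutions.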

  6. Anthocyanin analyses of Vaccinium fruit dietary supplements

    Science.gov (United States)

    Vaccinium fruit ingredients within dietary supplements were identified by comparisons with anthocyanin analyses of known Vaccinium profiles (demonstration of anthocyanin fingerprinting). Available Vaccinium supplements were purchased and analyzed; their anthocyanin profiles (based on HPLC separation...

  7. Fuzzy-based HAZOP study for process industry

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Junkeon; Chang, Daejun, E-mail: djchang@kaist.edu

    2016-11-05

    Highlights: • HAZOP is an important technique for evaluating system safety and risk during process operation. • Fuzzy theory can handle the inherent uncertainties of process systems in a HAZOP. • A fuzzy-based HAZOP considers both aleatory and epistemic uncertainties and provides the risk level with less uncertainty. • Risk acceptance criteria should be considered regarding the transition region for each risk. - Abstract: This study proposed a fuzzy-based HAZOP for analyzing process hazards. Fuzzy theory was used to express uncertain states and proved a useful approach to overcoming the inherent uncertainty in HAZOP analyses. Fuzzy logic contrasts sharply with classical logic and provides diverse risk values according to membership degree. Appropriate process parameters and guidewords were selected to describe the frequency and consequence of an accident. Fuzzy modeling calculated risks based on the relationship between the variables of an accident. The modeling was based on the mean expected value, trapezoidal fuzzy numbers, IF-THEN rules, and the center-of-gravity method. A cryogenic LNG (liquefied natural gas) testing facility was the target process for both the fuzzy-based and conventional HAZOPs. Frequency is the most significant index in determining risk. The comparison showed that the fuzzy-based HAZOP provides more refined risk estimates than the conventional HAZOP. The fuzzy risk matrix presents the significance of risks, negligible risks, and the necessity of risk reduction.
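    The trapezoidal fuzzy numbers and centre-of-gravity defuzzification mentioned above admit a compact closed form. A hedged sketch on a hypothetical 0-10 risk scale (these membership-function corners are illustrative, not the paper's actual parameters):

    ```python
    def trapezoid_centroid(a, b, c, d):
        """Defuzzify a trapezoidal fuzzy number (a <= b <= c <= d, plateau
        between b and c) by its centre of gravity."""
        if (c + d) == (a + b):  # degenerate, zero-width case
            return (a + d) / 2.0
        return (((d * d + c * c + d * c) - (a * a + b * b + a * b))
                / (3.0 * ((c + d) - (a + b))))

    # Hypothetical "high frequency" membership on a 0-10 scale
    print(trapezoid_centroid(4, 6, 8, 10))  # → 7.0
    ```

    In a fuzzy HAZOP, IF-THEN rules aggregate frequency and consequence memberships into an output fuzzy set, and a defuzzification step like this one turns it into a single crisp risk score.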

  8. Neutronics-processing interface analyses for the Accelerator Transmutation of Waste (ATW) aqueous-based blanket system

    International Nuclear Information System (INIS)

    Davidson, J.W.; Battat, M.E.

    1993-01-01

    Neutronics-processing interface parameters have large impacts on the neutron economy and transmutation performance of an aqueous-based Accelerator Transmutation of Waste (ATW) system. A detailed assessment of the interdependence of these blanket neutronic and chemical processing parameters has been performed. Neutronic performance analyses require that neutron transport calculations for the ATW blanket systems be fully coupled with the blanket processing and include all neutron absorptions in candidate waste nuclides as well as in fission and transmutation products. The effects of processing rates, flux levels, flux spectra, and external-to-blanket inventories on blanket neutronic performance were determined. In addition, the inventories and isotopics in the various subsystems were also calculated for various actinide and long-lived fission product transmutation strategies

  9. Studies of base pair sequence effects on DNA solvation based on all

    Indian Academy of Sciences (India)

    Detailed analyses of the sequence-dependent solvation and ion atmosphere of DNA are presented based on molecular dynamics (MD) simulations on all the 136 unique tetranucleotide steps obtained by the ABC consortium using the AMBER suite of programs. Significant sequence effects on solvation and ion localization ...

  10. Design factors analyses of second-loop PRHRS

    Directory of Open Access Journals (Sweden)

    ZHANG Hongyan

    2017-05-01

    Full Text Available In order to study the operating characteristics of a second-loop Passive Residual Heat Removal System (PRHRS), the transient thermal analysis code RELAP5 is used to build simulation models of the main coolant system and the second-loop PRHRS. Transient calculations and comparative analyses under station blackout accident and one-side feed water line break accident conditions are conducted for three critical design factors of the second-loop PRHRS: design capacity, emergency makeup tank, and isolation valve opening speed. The impacts of these design factors on the operating characteristics of the second-loop PRHRS are summarized based on the calculations and analyses. The results indicate that both system safety and cooling rate should be considered when sizing the PRHRS capacity, that water injection from the emergency makeup tank to the steam generator aids system cooling in the event of an accident, and that system startup performance can be improved by reducing the opening speed of the isolation valve. The results can serve as a reference for the design of the second-loop PRHRS in nuclear power plants.

  11. Comparison of Numerical Analyses with a Static Load Test of a Continuous Flight Auger Pile

    Directory of Open Access Journals (Sweden)

    Hoľko Michal

    2014-12-01

    Full Text Available The article deals with numerical analyses of a Continuous Flight Auger (CFA) pile. The analyses include a comparison of calculated and measured load-settlement curves as well as a comparison of the load distribution over the pile's length. The numerical analyses were executed using two types of software, Ansys and Plaxis, both based on FEM calculations. The two programs differ in the way they create numerical models, model the interface between the pile and the soil, and in the constitutive material models they use. The analyses have been prepared in the form of a parametric study in which the method of modelling the interface and the material models of the soil are compared and analysed.

  12. Novel citation-based search method for scientific literature: application to meta-analyses

    NARCIS (Netherlands)

    Janssens, A.C.J.W.; Gwinn, M.

    2015-01-01

    Background: Finding eligible studies for meta-analysis and systematic reviews relies on keyword-based searching as the gold standard, despite its inefficiency. Searching based on direct citations is not sufficiently comprehensive. We propose a novel strategy that ranks articles on their degree of
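    One simple way a citation-based strategy can rank candidates is by counting references shared with a seed set of known-eligible studies. A hedged, simplified sketch (hypothetical identifiers; not necessarily the authors' exact scoring scheme):

    ```python
    def rank_by_shared_refs(seed_refs, candidates):
        """Rank candidate articles by how many references they share with
        a set of known-eligible ('seed') articles. `seed_refs` is the union
        of the seeds' reference lists; `candidates` maps article id -> set
        of its references. Ties are broken arbitrarily."""
        scored = [(len(refs & seed_refs), art_id)
                  for art_id, refs in candidates.items()]
        return [art_id for score, art_id in sorted(scored, reverse=True)]

    seed_refs = {"r1", "r2", "r3", "r4"}
    candidates = {
        "a": {"r1", "r2", "r9"},  # shares 2
        "b": {"r8"},              # shares 0
        "c": {"r1", "r3", "r4"},  # shares 3
    }
    print(rank_by_shared_refs(seed_refs, candidates))  # → ['c', 'a', 'b']
    ```

    Real citation-based retrieval also exploits direct citations and co-citations in both directions, but the ranking principle is similar.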

  13. Economic Analyses in Anterior Cruciate Ligament Reconstruction: A Qualitative and Systematic Review.

    Science.gov (United States)

    Saltzman, Bryan M; Cvetanovich, Gregory L; Nwachukwu, Benedict U; Mall, Nathan A; Bush-Joseph, Charles A; Bach, Bernard R

    2016-05-01

    As the health care system in the United States (US) transitions toward value-based care, there is an increased emphasis on understanding the cost drivers and high-value procedures within orthopaedics. To date, there has been no systematic review of the economic literature on anterior cruciate ligament reconstruction (ACLR). To evaluate the overall evidence base for economic studies published on ACLR in the orthopaedic literature. Data available on the economics of ACLR are summarized and cost drivers associated with the procedure are identified. Systematic review. All economic studies (including US-based and non-US-based) published between inception of the MEDLINE database and October 3, 2014, were identified. Given the heterogeneity of the existing evidence base, a qualitative, descriptive approach was used to assess the collective results from the economic studies on ACLR. When applicable, comparisons were made for the following cost-related variables associated with the procedure for economic implications: outpatient versus inpatient surgery (or outpatient vs overnight hospital stay vs >1-night stay); bone-patellar tendon-bone (BPTB) graft versus hamstring (HS) graft source; autograft versus allograft source; staged unilateral ACLR versus bilateral ACLR in a single setting; single- versus double-bundle technique; ACLR versus nonoperative treatment; and other unique comparisons reported in single studies, including computer-assisted navigation surgery (CANS) versus traditional surgery, early versus delayed ACLR, single- versus double-incision technique, and finally the costs of ACLR without comparison of variables. A total of 24 studies were identified and included; of these, 17 included studies were cost identification studies. The remaining 7 studies were cost utility analyses that used economic models to investigate the effect of variables such as the cost of allograft tissue, fixation devices, and physical therapy, the percentage and timing of revision

  14. Study of oxidation behaviour of Zr-based bulk amorphous alloy Zr 65 ...

    Indian Academy of Sciences (India)

    The oxidation behaviour of Zr-based bulk amorphous alloy Zr65Cu17.5Ni10Al7.5 has been studied in air environment at various temperatures in the temperature range 591–684 K using a thermogravimetric analyser (TGA). The oxidation kinetics of the alloy in the amorphous phase obeys the parabolic rate law for oxidation ...

  15. Linkage and related analyses of Barrett's esophagus and its associated adenocarcinomas.

    Science.gov (United States)

    Sun, Xiangqing; Elston, Robert; Falk, Gary W; Grady, William M; Faulx, Ashley; Mittal, Sumeet K; Canto, Marcia I; Shaheen, Nicholas J; Wang, Jean S; Iyer, Prasad G; Abrams, Julian A; Willis, Joseph E; Guda, Kishore; Markowitz, Sanford; Barnholtz-Sloan, Jill S; Chandar, Apoorva; Brock, Wendy; Chak, Amitabh

    2016-07-01

    Familial aggregation and segregation analysis studies have provided evidence of a genetic basis for esophageal adenocarcinoma (EAC) and its premalignant precursor, Barrett's esophagus (BE). We aim to demonstrate the utility of linkage analysis to identify the genomic regions that might contain the genetic variants that predispose individuals to this complex trait (BE and EAC). We genotyped 144 individuals in 42 multiplex pedigrees chosen from 1000 singly ascertained BE/EAC pedigrees, and performed both model-based and model-free linkage analyses, using S.A.G.E. and other software. Segregation models were fitted, from the data on both the 42 pedigrees and the 1000 pedigrees, to determine parameters for performing model-based linkage analysis. Model-based and model-free linkage analyses were conducted in two sets of pedigrees: the 42 pedigrees and a subset of 18 pedigrees with female affected members that are expected to be more genetically homogeneous. Genome-wide associations were also tested in these families. Linkage analyses on the 42 pedigrees identified several regions consistently suggestive of linkage by different linkage analysis methods on chromosomes 2q31, 12q23, and 4p14. A linkage on 15q26 is the only consistent linkage region identified in the 18 female-affected pedigrees, in which the linkage signal is higher than in the 42 pedigrees. Other tentative linkage signals are also reported. Our linkage study of BE/EAC pedigrees identified linkage regions on chromosomes 2, 4, 12, and 15, with some reported associations located within our linkage peaks. Our linkage results can help prioritize association tests to delineate the genetic determinants underlying susceptibility to BE and EAC.
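    Model-based linkage analyses of this kind report evidence as LOD scores. A minimal illustration of the definition only (not the S.A.G.E. implementation; the likelihood values are hypothetical):

    ```python
    import math

    def lod(likelihood_theta, likelihood_null):
        """LOD score: log10 of the ratio between the pedigree likelihood at
        the tested recombination fraction theta and the likelihood under no
        linkage (theta = 0.5)."""
        return math.log10(likelihood_theta / likelihood_null)

    # A 1000:1 likelihood ratio in favour of linkage gives the classical
    # significance threshold of 3
    print(round(lod(1000.0, 1.0), 2))  # → 3.0
    ```

    Regions such as 2q31 or 15q26 in the abstract are where such scores (or their model-free analogues) peak across the genome scan.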

  16. Correcting for catchment area nonresidency in studies based on tumor-registry data

    International Nuclear Information System (INIS)

    Sposto, R.; Preston, D.L.

    1993-05-01

    We discuss the effect of catchment area nonresidency on estimates of cancer incidence from a tumor-registry-based cohort study and demonstrate that a relatively simple correction is possible in the context of Poisson regression analysis if individual residency histories or the probabilities of residency are known. A comparison of a complete data maximum likelihood analysis with several Poisson regression analyses demonstrates the adequacy of the simple correction in a large simulated data set. We compare analyses of stomach-cancer incidence from the Radiation Effects Research Foundation tumor registry with and without the correction. We also discuss some implications of including cases identified only on the basis of death certificates. (author)
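    The essence of the correction can be sketched, under simplifying assumptions, as weighting each subject's person-years by their probability of catchment-area residency before forming incidence rates. This is a crude-rate illustration with hypothetical numbers, not the Poisson regression analysis itself:

    ```python
    def corrected_rate(cases, person_years, p_resident):
        """Crude incidence per 100,000 person-years, with each subject's
        person-years down-weighted by the probability that the subject
        resided inside the registry's catchment area."""
        effective_py = sum(py * p for py, p in zip(person_years, p_resident))
        return 1e5 * cases / effective_py

    # Three hypothetical subjects; the third spent half the period outside
    py = [10.0, 10.0, 10.0]
    p = [1.0, 1.0, 0.5]
    print(round(corrected_rate(2, py, p), 1))  # → 8000.0
    ```

    In the Poisson regression setting the same weighted person-years enter the model as the offset, which is why the simple correction fits naturally into that framework.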

  17. Using meta-analytic path analysis to test theoretical predictions in health behavior: An illustration based on meta-analyses of the theory of planned behavior

    OpenAIRE

    Hagger, Martin; Chan, Dervin K. C.; Protogerou, Cleo; Chatzisarantis, Nikos L. D.

    2016-01-01

    Objective Synthesizing research on social cognitive theories applied to health behavior is an important step in the development of an evidence base of psychological factors as targets for effective behavioral interventions. However, few meta-analyses of research on social cognitive theories in health contexts have conducted simultaneous tests of theoretically-stipulated pattern effects using path analysis. We argue that conducting path analyses of meta-analytic effects among constructs fr...

  18. XRF analyses for the study of painting technique and degradation on frescoes by Beato Angelico: first results

    International Nuclear Information System (INIS)

    Mazzinghi, A.

    2014-01-01

    Beato Angelico is one of the most important Italian painters of the Renaissance period; in particular, he was a master of the so-called 'Buon fresco' technique for mural paintings. A wide diagnostic campaign with X-Ray Fluorescence (XRF) analyses has been carried out on three masterworks painted by Beato Angelico in the San Marco monastery in Florence: the 'Crocifissione con Santi', the 'Annunciazione' and the 'Madonna delle Ombre'. The latter is painted by mixing fresco and secco techniques, which makes it of particular interest for the study of two different painting techniques of the same artist. The study therefore focused on characterizing the painting palette, and hence the painting techniques, used by Beato Angelico. Moreover, the conservators were interested in the degradation processes and old restoration treatments. Our analyses have been carried out by means of the XRF spectrometer developed at the LABEC laboratory of the Istituto Nazionale di Fisica Nucleare in Florence (Italy). XRF is especially suited to this kind of study, allowing multi-elemental, non-destructive, non-invasive analyses in a short time with portable instruments. In this paper the first results of the XRF analysis are presented.

  19. Comprehensive review of genetic association studies and meta-analyses on miRNA polymorphisms and cancer risk.

    Directory of Open Access Journals (Sweden)

    Kshitij Srivastava

    Full Text Available MicroRNAs (miRNAs) are small RNA molecules that regulate the expression of corresponding messenger RNAs (mRNAs). Variations in the level of expression of distinct miRNAs have been observed in the genesis, progression and prognosis of multiple human malignancies. The present study aimed to investigate the association between four highly studied miRNA polymorphisms (mir-146a rs2910164, mir-196a2 rs11614913, mir-149 rs2292832 and mir-499 rs3746444) and cancer risk using a two-sided meta-analytic approach. An updated meta-analysis based on 53 independent case-control studies consisting of 27,573 cancer cases and 34,791 controls was performed. The odds ratio (OR) and 95% confidence interval (95% CI) were used to investigate the strength of the association. Overall, the pooled analysis showed that mir-196a2 rs11614913 was associated with a decreased cancer risk (OR = 0.846, P = 0.004, TT vs. CC), while the other miRNA SNPs showed no association with overall cancer risk. Subgroup analyses based on type of cancer and ethnicity were also performed, and the results indicated a strong association between miR-146a rs2910164 and overall cancer risk in the Caucasian population under a recessive model (OR = 1.274, 95% CI = 1.096-1.481, P = 0.002). Stratified analysis by cancer type also associated mir-196a2 rs11614913 with lung and colorectal cancer at the allelic and genotypic level. The present meta-analysis suggests an important role of the mir-196a2 rs11614913 polymorphism in overall cancer risk, especially in the Asian population. Further studies with large sample sizes are needed to evaluate and confirm this association.
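    Pooled ORs like those reported above are typically obtained by inverse-variance weighting on the log-odds scale. A hedged sketch of the fixed-effect version, with hypothetical study values and each standard error reconstructed from the study's reported 95% CI (real meta-analyses would also consider a random-effects model and heterogeneity):

    ```python
    import math

    def pooled_or(ors, ci_los, ci_his):
        """Fixed-effect (inverse-variance) pooled odds ratio with 95% CI.
        Each study's SE of log(OR) is recovered from its 95% CI bounds."""
        num = den = 0.0
        for o, lo, hi in zip(ors, ci_los, ci_his):
            se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
            w = 1.0 / se ** 2
            num += w * math.log(o)
            den += w
        log_or, se_pooled = num / den, math.sqrt(1.0 / den)
        return (math.exp(log_or),
                math.exp(log_or - 1.96 * se_pooled),
                math.exp(log_or + 1.96 * se_pooled))

    # Two hypothetical case-control studies
    result = pooled_or([0.85, 0.80], [0.70, 0.60], [1.03, 1.07])
    print(tuple(round(x, 2) for x in result))
    ```

    The pooled estimate always lies between the individual ORs, pulled toward the more precise (narrower-CI) study.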

  20. Analysing performance through value creation

    Directory of Open Access Journals (Sweden)

    Adrian TRIFAN

    2015-12-01

    Full Text Available This paper draws a parallel between two ways of measuring financial performance: the first uses data offered by accounting and emphasizes maximizing profit, while the second aims at creating value. The traditional approach to performance is based on indicators drawn from accounting data: ROI, ROE, EPS. Traditional management based on analysing accounting data has shown its limits, and a new approach based on value creation is needed. The evaluation of value-based performance tries to avoid the errors due to accounting data by using other specific indicators: EVA, MVA, TSR, CVA. The main objective shifts from maximizing income to maximizing the value created for shareholders. The theoretical part is accompanied by a practical analysis regarding the creation of value and an analysis of the main indicators which evaluate this concept.
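    Of the value-based indicators listed, EVA has the simplest arithmetic: operating profit after tax minus a charge for the capital employed to earn it. A hedged illustration with hypothetical figures:

    ```python
    def eva(nopat, invested_capital, wacc):
        """Economic Value Added: net operating profit after tax (NOPAT)
        minus the capital charge (WACC times invested capital)."""
        return nopat - wacc * invested_capital

    # Hypothetical firm: 120 NOPAT on 1000 of capital at a 9% cost of capital
    print(round(eva(120.0, 1000.0, 0.09), 2))  # → 30.0
    ```

    A positive EVA means the firm earned more than its cost of capital, i.e. it created value for shareholders; an accounting profit alone cannot distinguish this case from value destruction.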

  1. LWR safety studies. Analyses and further assessments relating to the German Risk Assessment Study on Nuclear Power Plants. Vol. 3

    International Nuclear Information System (INIS)

    1983-01-01

    Critical review of the analyses of the German Risk Assessment Study on Nuclear Power Plants (DRS) concerning the reliability of the containment under accident conditions and the conditions of fission product release (transport and distribution in the environment). Main point of interest in this context is an explosion in the steam section and its impact on the containment. Critical comments are given on the models used in the DRS for determining the accident consequences. The analyses made deal with the mathematical models and database for propagation calculations, the methods of dose computation and assessment of health hazards, and the modelling of protective and safety measures. Social impacts of reactor accidents are also considered. (RF) [de

  2. Phylogenetic analyses of Vitis (Vitaceae) based on complete chloroplast genome sequences: effects of taxon sampling and phylogenetic methods on resolving relationships among rosids.

    Science.gov (United States)

    Jansen, Robert K; Kaittanis, Charalambos; Saski, Christopher; Lee, Seung-Bum; Tomkins, Jeffrey; Alverson, Andrew J; Daniell, Henry

    2006-04-09

    The Vitaceae (grape) is an economically important family of angiosperms whose phylogenetic placement is currently unresolved. Recent phylogenetic analyses based on one to several genes have suggested several alternative placements of this family, including sister to Caryophyllales, asterids, Saxifragales, Dilleniaceae or to rest of rosids, though support for these different results has been weak. There has been a recent interest in using complete chloroplast genome sequences for resolving phylogenetic relationships among angiosperms. These studies have clarified relationships among several major lineages but they have also emphasized the importance of taxon sampling and the effects of different phylogenetic methods for obtaining accurate phylogenies. We sequenced the complete chloroplast genome of Vitis vinifera and used these data to assess relationships among 27 angiosperms, including nine taxa of rosids. The Vitis vinifera chloroplast genome is 160,928 bp in length, including a pair of inverted repeats of 26,358 bp that are separated by small and large single copy regions of 19,065 bp and 89,147 bp, respectively. The gene content and order of Vitis is identical to many other unrearranged angiosperm chloroplast genomes, including tobacco. Phylogenetic analyses using maximum parsimony and maximum likelihood were performed on DNA sequences of 61 protein-coding genes for two datasets with 28 or 29 taxa, including eight or nine taxa from four of the seven currently recognized major clades of rosids. Parsimony and likelihood phylogenies of both data sets provide strong support for the placement of Vitaceae as sister to the remaining rosids. However, the position of the Myrtales and support for the monophyly of the eurosid I clade differs between the two data sets and the two methods of analysis. In parsimony analyses, the inclusion of Gossypium is necessary to obtain trees that support the monophyly of the eurosid I clade. However, maximum likelihood analyses place

  3. Phylogenetic analyses of Vitis (Vitaceae) based on complete chloroplast genome sequences: effects of taxon sampling and phylogenetic methods on resolving relationships among rosids

    Directory of Open Access Journals (Sweden)

    Alverson Andrew J

    2006-04-01

    Full Text Available Abstract Background The Vitaceae (grape) is an economically important family of angiosperms whose phylogenetic placement is currently unresolved. Recent phylogenetic analyses based on one to several genes have suggested several alternative placements of this family, including sister to Caryophyllales, asterids, Saxifragales, Dilleniaceae or to rest of rosids, though support for these different results has been weak. There has been a recent interest in using complete chloroplast genome sequences for resolving phylogenetic relationships among angiosperms. These studies have clarified relationships among several major lineages but they have also emphasized the importance of taxon sampling and the effects of different phylogenetic methods for obtaining accurate phylogenies. We sequenced the complete chloroplast genome of Vitis vinifera and used these data to assess relationships among 27 angiosperms, including nine taxa of rosids. Results The Vitis vinifera chloroplast genome is 160,928 bp in length, including a pair of inverted repeats of 26,358 bp that are separated by small and large single copy regions of 19,065 bp and 89,147 bp, respectively. The gene content and order of Vitis is identical to many other unrearranged angiosperm chloroplast genomes, including tobacco. Phylogenetic analyses using maximum parsimony and maximum likelihood were performed on DNA sequences of 61 protein-coding genes for two datasets with 28 or 29 taxa, including eight or nine taxa from four of the seven currently recognized major clades of rosids. Parsimony and likelihood phylogenies of both data sets provide strong support for the placement of Vitaceae as sister to the remaining rosids. However, the position of the Myrtales and support for the monophyly of the eurosid I clade differs between the two data sets and the two methods of analysis. In parsimony analyses, the inclusion of Gossypium is necessary to obtain trees that support the monophyly of the eurosid I clade

  4. Longitudinal Data Analyses Using Linear Mixed Models in SPSS: Concepts, Procedures and Illustrations

    Directory of Open Access Journals (Sweden)

    Daniel T. L. Shek

    2011-01-01

    Full Text Available Although different methods are available for the analyses of longitudinal data, analyses based on generalized linear models (GLM) are criticized as violating the assumption of independence of observations. Alternatively, linear mixed models (LMM) are commonly used to understand changes in human behavior over time. In this paper, the basic concepts surrounding LMM (or hierarchical linear models) are outlined. Although SPSS is a statistical analyses package commonly used by researchers, documentation on LMM procedures in SPSS is not thorough or user friendly. With reference to this limitation, the related procedures for performing analyses based on LMM in SPSS are described. To demonstrate the application of LMM analyses in SPSS, findings based on six waves of data collected in the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) in Hong Kong are presented.

  5. Longitudinal data analyses using linear mixed models in SPSS: concepts, procedures and illustrations.

    Science.gov (United States)

    Shek, Daniel T L; Ma, Cecilia M S

    2011-01-05

    Although different methods are available for the analyses of longitudinal data, analyses based on generalized linear models (GLM) are criticized as violating the assumption of independence of observations. Alternatively, linear mixed models (LMM) are commonly used to understand changes in human behavior over time. In this paper, the basic concepts surrounding LMM (or hierarchical linear models) are outlined. Although SPSS is a statistical analysis package commonly used by researchers, documentation on LMM procedures in SPSS is not thorough or user friendly. With reference to this limitation, the related procedures for performing analyses based on LMM in SPSS are described. To demonstrate the application of LMM analyses in SPSS, findings based on six waves of data collected in the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) in Hong Kong are presented.
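
    The random-intercept structure at the heart of LMM can be illustrated outside SPSS. The sketch below is a minimal Python illustration with entirely simulated data (the six waves mirror the Project P.A.T.H.S. design, but all numbers are invented); it estimates the intraclass correlation, the quantity whose non-zero value violates the GLM independence assumption:

```python
import numpy as np

rng = np.random.default_rng(42)
n_subj, n_wave = 100, 6   # six waves of data, as in the design described above

# Simulate longitudinal data: each subject gets a random intercept (sd = 2)
# plus independent within-subject noise (sd = 1). The shared intercept makes
# repeated observations on the same subject correlated.
u = rng.normal(0.0, 2.0, n_subj)
y = u[:, None] + rng.normal(0.0, 1.0, (n_subj, n_wave))

# One-way variance decomposition, the simplest mixed-model estimate:
within = np.var(y, axis=1, ddof=1).mean()                    # sigma_e^2
between = np.var(y.mean(axis=1), ddof=1) - within / n_wave   # sigma_u^2
icc = between / (between + within)   # population value here is 4/(4+1) = 0.8
```

A non-trivial ICC like this is exactly the situation in which ordinary GLM standard errors are too small and an LMM (random intercepts, or richer structures) is warranted.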

  6. Studies of CTNNBL1 and FDFT1 variants and measures of obesity: analyses of quantitative traits and case-control studies in 18,014 Danes

    DEFF Research Database (Denmark)

    Andreasen, Camilla Helene; Mogensen, Mette Sloth; Borch-Johnsen, Knut

    2009-01-01

    of obesity-related quantitative traits, and case-control studies in large study samples of Danes. METHODS: The FDFT1 rs7001819, CTNNBL1 rs6013029 and rs6020846 were genotyped, using TaqMan allelic discrimination, in a combined study sample comprising 18,014 participants ascertained from; the population...... and a previous study. FDFT1 rs7001819 showed no association with obesity, neither when analysing quantitative traits nor when performing case-control studies of obesity.......). The most significantly associating variants within CTNNBL1 including rs6013029 and rs6020846 were additionally confirmed to associate with morbid obesity in a French Caucasian case-control sample. The aim of this study was to investigate the impact of these three variants on obesity, through analyses...

  7. Combining Conversation Analysis and Nexus Analysis to analyse sociomaterial and affective practices

    DEFF Research Database (Denmark)

    Raudaskoski, Pirkko Liisa

    2016-01-01

    of resemiotization (Iedema 2000). Within organization and design studies, materiality has become a focus in the increasingly popular sociomaterial approach to everyday practices (e.g. Orlikowski 2007). Some sociomaterial scholars (e.g. Sørensen 2013) analyse ethnographic data either as evidence for the sociomaterial....... The analytical effort is to get to the senses and sensations which are regarded as opposite of sense-making. In my presentation, I go through some of my own analyses from various institutional interactions to show how CA-based multimodal analyses of local interactional (or intra-actional) trajectories combined...... configuration, also as an interdisciplinary offer for an analytic package that might help sociomaterial researchers of practices come even closer to the situation at hand as an assemblage out of which materials, humans and experiences emerge....

  8. Using uncertainty and sensitivity analyses in socioecological agent-based models to improve their analytical performance and policy relevance.

    Science.gov (United States)

    Ligmann-Zielinska, Arika; Kramer, Daniel B; Spence Cheruvelil, Kendra; Soranno, Patricia A

    2014-01-01

    Agent-based models (ABMs) have been widely used to study socioecological systems. They are useful for studying such systems because of their ability to incorporate micro-level behaviors among interacting agents, and to understand emergent phenomena due to these interactions. However, ABMs are inherently stochastic and require proper handling of uncertainty. We propose a simulation framework based on quantitative uncertainty and sensitivity analyses to build parsimonious ABMs that serve two purposes: exploration of the outcome space to simulate low-probability but high-consequence events that may have significant policy implications, and explanation of model behavior to describe the system with higher accuracy. The proposed framework is applied to the problem of modeling farmland conservation resulting in land use change. We employ output variance decomposition based on quasi-random sampling of the input space and perform three computational experiments. First, we perform uncertainty analysis to improve model legitimacy, where the distribution of results informs us about the expected value that can be validated against independent data, and provides information on the variance around this mean as well as the extreme results. In our last two computational experiments, we employ sensitivity analysis to produce two simpler versions of the ABM. First, input space is reduced only to inputs that produced the variance of the initial ABM, resulting in a model with output distribution similar to the initial model. Second, we refine the value of the most influential input, producing a model that maintains the mean of the output of initial ABM but with less spread. These simplifications can be used to 1) efficiently explore model outcomes, including outliers that may be important considerations in the design of robust policies, and 2) conduct explanatory analysis that exposes the smallest number of inputs influencing the steady state of the modeled system.

  9. Using uncertainty and sensitivity analyses in socioecological agent-based models to improve their analytical performance and policy relevance.

    Directory of Open Access Journals (Sweden)

    Arika Ligmann-Zielinska

    Full Text Available Agent-based models (ABMs) have been widely used to study socioecological systems. They are useful for studying such systems because of their ability to incorporate micro-level behaviors among interacting agents, and to understand emergent phenomena due to these interactions. However, ABMs are inherently stochastic and require proper handling of uncertainty. We propose a simulation framework based on quantitative uncertainty and sensitivity analyses to build parsimonious ABMs that serve two purposes: exploration of the outcome space to simulate low-probability but high-consequence events that may have significant policy implications, and explanation of model behavior to describe the system with higher accuracy. The proposed framework is applied to the problem of modeling farmland conservation resulting in land use change. We employ output variance decomposition based on quasi-random sampling of the input space and perform three computational experiments. First, we perform uncertainty analysis to improve model legitimacy, where the distribution of results informs us about the expected value that can be validated against independent data, and provides information on the variance around this mean as well as the extreme results. In our last two computational experiments, we employ sensitivity analysis to produce two simpler versions of the ABM. First, input space is reduced only to inputs that produced the variance of the initial ABM, resulting in a model with output distribution similar to the initial model. Second, we refine the value of the most influential input, producing a model that maintains the mean of the output of initial ABM but with less spread. These simplifications can be used to 1) efficiently explore model outcomes, including outliers that may be important considerations in the design of robust policies, and 2) conduct explanatory analysis that exposes the smallest number of inputs influencing the steady state of the modeled system.
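
    The output variance decomposition described above can be sketched with the Saltelli first-order estimator. The model function below is a hypothetical stand-in for an ABM run (a cheap analytic function, not the farmland-conservation model), chosen so the true sensitivity indices are known:

```python
import numpy as np

def model(x):
    # Stand-in for an ABM evaluation; variance shares are 16:4:1 (total 21).
    return 4 * x[:, 0] + 2 * x[:, 1] + x[:, 2]

rng = np.random.default_rng(0)
n, k = 20000, 3
A = rng.uniform(-1, 1, (n, k))   # two independent input sample matrices
B = rng.uniform(-1, 1, (n, k))
yA, yB = model(A), model(B)
var_y = np.var(np.concatenate([yA, yB]))

S1 = []
for i in range(k):
    ABi = A.copy()
    ABi[:, i] = B[:, i]   # resample only input i
    # First-order index S_i, Saltelli (2010) estimator:
    S1.append(np.mean(yB * (model(ABi) - yA)) / var_y)
```

For this linear model the first-order indices converge to 16/21, 4/21 and 1/21, so input screening would keep x1 and drop x3 first: the same reduction step the framework applies to the ABM.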

  10. Preparation, Characterization and Antibacterial Studies of Chelates of Schiff Base Derived from 4-Aminoantipyrine, Furfural and o-phenylenediamine

    Directory of Open Access Journals (Sweden)

    M. S. Suresh

    2011-01-01

    Full Text Available A new series of transition metal complexes of Mn(II), Co(II), Ni(II), Cu(II) and Zn(II) were synthesized from the Schiff base ligand derived from 4-aminoantipyrine, furfural and o-phenylenediamine. The structural features were derived from their elemental analyses, infrared, UV-visible spectroscopy, NMR spectroscopy, thermal gravimetric analyses, ESR spectral analyses and conductivity measurements. The data of the complexes suggested square planar geometry for the metals with primary valency two. Antimicrobial screening tests were performed against bacteria. The comparative study of the MIC values of the Schiff base and its metal complexes indicates that the metal complexes exhibit greater antibacterial activity than the free ligand.

  11. Precursor analyses - The use of deterministic and PSA based methods in the event investigation process at nuclear power plants

    International Nuclear Information System (INIS)

    2004-09-01

    The efficient feedback of operating experience (OE) is a valuable source of information for improving the safety and reliability of nuclear power plants (NPPs). It is therefore essential to collect information on abnormal events from both internal and external sources. Internal operating experience is analysed to obtain a complete understanding of an event and of its safety implications. Corrective or improvement measures may then be developed, prioritized and implemented in the plant if considered appropriate. Information from external events may also be analysed in order to learn lessons from others' experience and prevent similar occurrences at our own plant. The traditional ways of investigating operational events have been predominantly qualitative. In recent years, a PSA-based method called probabilistic precursor event analysis has been developed, used and applied on a significant scale in many places for a number of plants. The method enables a quantitative estimation of the safety significance of operational events to be incorporated. The purpose of this report is to outline a synergistic process that makes more effective use of operating experience event information by combining the insights and knowledge gained from both approaches, traditional deterministic event investigation and PSA-based event analysis. The PSA-based view on operational events and PSA-based event analysis can support the process of operational event analysis at the following stages of the operational event investigation: (1) Initial screening stage. (It introduces an element of quantitative analysis into the selection process. Quantitative analysis of the safety significance of nuclear plant events can be a very useful measure when it comes to selecting internal and external operating experience information for its relevance.) (2) In-depth analysis. (PSA based event evaluation provides a quantitative measure for judging the significance of operational events, contributors to

  12. A Derivation of Source-based Kinetics Equation with Time Dependent Fission Kernel for Reactor Transient Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Song Hyun; Woo, Myeong Hyun; Shin, Chang Ho [Hanyang University, Seoul (Korea, Republic of); Pyeon, Cheol Ho [Kyoto University, Osaka (Japan)

    2015-10-15

    In this study, a new balance equation is proposed to overcome the problems of previous methods: a source-based balance equation with a time-dependent fission kernel, derived to simplify the kinetics equation. A simple problem is then analyzed with the proposed method. To analyze partial variations of reactor characteristics, two representative methods were introduced in previous studies: (1) the quasi-statics method and (2) the multipoint technique. The main idea of the quasi-statics method is to use a low-order approximation for large integration times. To realize the quasi-statics method, the time-dependent flux is first separated into shape and amplitude functions, and the shape function is calculated. The method has good accuracy; however, it can be computationally expensive, because the shape function must be fully recalculated to obtain accurate results. To improve calculation efficiency, the multipoint method was proposed. The multipoint method is based on the classic kinetics equation, using Green's function to analyze the flight probability from region r' to r. These previous methods have been used for reactor kinetics analysis; however, they have some limitations. First, three group variables (r_g, E_g, t_g) must be considered to solve the time-dependent balance equation, which severely limits application to large-system problems with good accuracy. Second, energy-group neutrons must be used to analyze reactor kinetics problems; in a time-dependent problem, the neutron energy distribution can change over time, which changes the group cross sections and can therefore degrade accuracy. Third, the neutrons in one space-time region continually affect other space-time regions, which is not properly considered in the previous methods. Using birth history of the

  13. Network class superposition analyses.

    Directory of Open Access Journals (Sweden)

    Carl A B Pearson

    Full Text Available Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈ 10^30 for the yeast cell cycle process), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses.
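
    The mechanics of the superposition matrix T can be shown on a toy system. The two deterministic update maps below are invented (a 2-node boolean system, far smaller than the yeast-cell-cycle network class in the record); the point is only how per-network transition matrices average into T and how entropy and point-attractor information are read off it:

```python
import numpy as np

n_states = 4   # the four states of a 2-node boolean system: 00, 01, 10, 11

# Two hypothetical deterministic networks, each given as a state-transition map
# (maps[m][s] is the successor of state s under network m).
maps = [
    [0, 0, 3, 3],   # network 1: everything collapses onto states 00 or 11
    [0, 1, 3, 2],   # network 2: 00 and 01 are fixed; 10 and 11 form a 2-cycle
]

# Superpose the class: T[s, t] is the fraction of member networks taking s to t.
T = np.zeros((n_states, n_states))
for f in maps:
    for s, t in enumerate(f):
        T[s, t] += 1.0 / len(maps)

# Shannon entropy of each row: 0 means the whole class agrees on that transition.
row_entropy = [-sum(p * np.log2(p) for p in row if p > 0) for row in T]

# Diagonal of T: probability that a state is a point attractor across the class.
fixed_point_prob = np.diag(T)
```

Here row 0 has entropy 0 (every member network fixes state 00), while rows 1 and 3 carry 1 bit each, reflecting disagreement within the class; the diagonal recovers the point-attractor distribution discussed in the abstract.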

  14. Theorising and Analysing Academic Labour

    Directory of Open Access Journals (Sweden)

    Thomas Allmer

    2018-01-01

    Full Text Available The aim of this article is to contextualise universities historically within capitalism and to analyse academic labour and the deployment of digital media theoretically and critically. It argues that the post-war expansion of the university can be considered as medium and outcome of informational capitalism and as a dialectical development of social achievement and advanced commodification. The article strives to identify the class position of academic workers, introduces the distinction between academic work and labour, discusses the connection between academic, information and cultural work, and suggests a broad definition of university labour. It presents a theoretical model of working conditions that helps to systematically analyse the academic labour process and to provide an overview of working conditions at universities. The paper furthermore argues for the need to consider the development of education technologies as a dialectics of continuity and discontinuity, discusses the changing nature of the forces and relations of production, and the impact on the working conditions of academics in the digital university. Based on Erik Olin Wright’s inclusive approach of social transformation, the article concludes with the need to bring together anarchist, social democratic and revolutionary strategies for establishing a socialist university in a commons-based information society.

  15. Energy, exergy and sustainability analyses of hybrid renewable energy based hydrogen and electricity production and storage systems: Modeling and case study

    International Nuclear Information System (INIS)

    Caliskan, Hakan; Dincer, Ibrahim; Hepbasli, Arif

    2013-01-01

    In this study, hybrid renewable energy based hydrogen and electricity production and storage systems are conceptually modeled and analyzed in detail through energy, exergy and sustainability approaches. Several subsystems, namely hybrid geothermal energy-wind turbine-solar photovoltaic (PV) panel, inverter, electrolyzer, hydrogen storage system, Proton Exchange Membrane Fuel Cell (PEMFC), battery and loading system are considered. Also, a case study, based on a hybrid wind–solar renewable energy system, is conducted and its results are presented. In addition, the dead state temperatures are considered as 0 °C, 10 °C, 20 °C and 30 °C, while the environment temperature is 30 °C. The maximum efficiencies of the wind turbine, solar PV panel, electrolyzer, and PEMFC are calculated as 26.15%, 9.06%, 53.55%, and 33.06% through energy analysis, and 71.70%, 9.74%, 53.60%, and 33.02% through exergy analysis, respectively. Also, the overall exergy efficiency, ranging from 5.838% to 5.865%, is directly proportional to the dead state temperature and becomes higher than the corresponding energy efficiency of 3.44% for the entire system. -- Highlights: ► Developing a three-hybrid renewable energy (geothermal–wind–solar)-based system. ► Undertaking a parametric study at various dead state temperatures. ► Investigating the effect of dead state temperatures on exergy efficiency
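
    The link between an energetic efficiency figure and the corresponding exergy efficiency reported above is the Carnot factor of the energy stream in question. A minimal sketch for the heat-delivery case, with hypothetical operating values (not the paper's actual state points or dead-state choices):

```python
def heat_pump_cop(q_heating_kw: float, w_input_kw: float) -> float:
    """Energetic COP: useful heat delivered per unit of work input."""
    return q_heating_kw / w_input_kw

def exergy_efficiency(cop: float, t_supply_k: float, t_dead_state_k: float) -> float:
    """Exergetic efficiency: the COP weighted by the Carnot factor of the heat,
    since heat Q delivered at temperature T carries exergy Q * (1 - T0/T)."""
    return cop * (1.0 - t_dead_state_k / t_supply_k)

# Hypothetical example: 10 kW of process heat from 3 kW of compressor work,
# delivered at 80 degC with a 30 degC dead state (the abstract's warmest case).
cop = heat_pump_cop(10.0, 3.0)
eta_ex = exergy_efficiency(cop, t_supply_k=353.15, t_dead_state_k=303.15)
```

This mirrors the pattern in the abstract: an energetic figure well above the exergetic one, because heat delivered near the dead state carries little work potential.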

  16. Evaluating the Accuracy of Results for Teacher Implemented Trial-Based Functional Analyses.

    Science.gov (United States)

    Rispoli, Mandy; Ninci, Jennifer; Burke, Mack D; Zaini, Samar; Hatton, Heather; Sanchez, Lisa

    2015-09-01

    Trial-based functional analysis (TBFA) allows for the systematic and experimental assessment of challenging behavior in applied settings. The purpose of this study was to evaluate a professional development package focused on training three Head Start teachers to conduct TBFAs with fidelity during ongoing classroom routines. To assess the accuracy of the TBFA results, the effects of a function-based intervention derived from the TBFA were compared with the effects of a non-function-based intervention. Data were collected on child challenging behavior and appropriate communication. An A-B-A-C-D design was utilized in which A represented baseline, B and C consisted of either function-based or non-function-based interventions counterbalanced across participants, and D represented teacher implementation of the most effective intervention. Results showed that the function-based intervention produced greater decreases in challenging behavior and greater increases in appropriate communication than the non-function-based intervention for all three children. © The Author(s) 2015.

  17. Ultrastructure of spermatozoa of spider crabs, family Mithracidae (Crustacea, Decapoda, Brachyura): Integrative analyses based on morphological and molecular data.

    Science.gov (United States)

    Assugeni, Camila de O; Magalhães, Tatiana; Bolaños, Juan A; Tudge, Christopher C; Mantelatto, Fernando L; Zara, Fernando J

    2017-12-01

    Recent studies based on morphological and molecular data provide a new perspective concerning taxonomic aspects of the brachyuran family Mithracidae. These studies proposed a series of nominal changes and indicated that the family is actually represented by a different number and representatives of genera than previously thought. Here, we provide a comparative description of the ultrastructure of spermatozoa and spermatophores of some species of Mithracidae in a phylogenetic context. The ultrastructure of the spermatozoa and spermatophore was observed by scanning and transmission electron microscopy. The most informative morphological characters analysed were thickness of the operculum, shape of the perforatorial chamber and shape and thickness of the inner acrosomal zone. As a framework, we used a topology based on a phylogenetic analysis using mitochondrial data obtained here and from previous studies. Our results indicate that closely related species share a series of morphological characteristics of the spermatozoa. A thick operculum, for example, is a feature observed in species of the genera Amphithrax, Teleophrys, and Omalacantha in contrast to the slender operculum observed in Mithraculus and Mithrax. Amphithrax and Teleophrys have a rhomboid perforatorial chamber, while Mithraculus, Mithrax, and Omalacantha show a wider, deltoid morphology. Furthermore, our results are in agreement with recently proposed taxonomic changes including the separation of the genera Mithrax (previously Damithrax), Amphithrax (previously Mithrax) and Mithraculus, and the synonymy of Mithrax caribbaeus with Mithrax hispidus. Overall, the spermiotaxonomy of these species of Mithracidae represent a novel set of data that corroborates the most recent taxonomic revision of the family and can be used in future taxonomic and phylogenetic studies within this family. © 2017 Wiley Periodicals, Inc.

  18. Study on the atmospheric component with the scope of analyses on the environmental impact

    International Nuclear Information System (INIS)

    Ferrara, V.; La Camera, F.

    1989-03-01

    This work was carried out following a specific request from the Italian National Department for the Environment, and presents the technical approaches and methodologies of analysis and forecasting developed for environmental impact studies of the 'atmospheric environment'. The work is presented according to the general items and objectives fixed by the same Department within the wider operative system for applying environmental impact procedures in Italy. (author)

  19. SieveSifter: a web-based tool for visualizing the sieve analyses of HIV-1 vaccine efficacy trials.

    Science.gov (United States)

    Fiore-Gartland, Andrew; Kullman, Nicholas; deCamp, Allan C; Clenaghan, Graham; Yang, Wayne; Magaret, Craig A; Edlefsen, Paul T; Gilbert, Peter B

    2017-08-01

    Analysis of HIV-1 virions from participants infected in a randomized controlled preventive HIV-1 vaccine efficacy trial can help elucidate mechanisms of partial protection. By comparing the genetic sequence of viruses from vaccine and placebo recipients to the sequence of the vaccine itself, a technique called 'sieve analysis', one can identify functional specificities of vaccine-induced immune responses. We have created an interactive web-based visualization and data access tool for exploring the results of sieve analyses performed on four major preventive HIV-1 vaccine efficacy trials: (i) the HIV Vaccine Trial Network (HVTN) 502/Step trial, (ii) the RV144/Thai trial, (iii) the HVTN 503/Phambili trial and (iv) the HVTN 505 trial. The tool acts simultaneously as a platform for rapid reinterpretation of sieve effects and as a portal for organizing and sharing the viral sequence data. Access to these valuable datasets also enables the development of novel methodology for future sieve analyses. Visualization: http://sieve.fredhutch.org/viz . Source code: https://github.com/nkullman/SIEVE . Data API: http://sieve.fredhutch.org/data . agartlan@fredhutch.org. © The Author(s) 2017. Published by Oxford University Press.

  20. Failure probability analyses for PWSCC in Ni-based alloy welds

    International Nuclear Information System (INIS)

    Udagawa, Makoto; Katsuyama, Jinya; Onizawa, Kunio; Li, Yinsheng

    2015-01-01

    A number of cracks due to primary water stress corrosion cracking (PWSCC) in pressurized water reactors and Ni-based alloy stress corrosion cracking (NiSCC) in boiling water reactors have been detected around Ni-based alloy welds. The causes of crack initiation and growth due to stress corrosion cracking include weld residual stress, operating stress, the materials, and the environment. We have developed the analysis code PASCAL-NP for calculating the failure probability and assessment of the structural integrity of cracked components on the basis of probabilistic fracture mechanics (PFM) considering PWSCC and NiSCC. This PFM analysis code has functions for calculating the incubation time of PWSCC and NiSCC crack initiation, evaluation of crack growth behavior considering certain crack location and orientation patterns, and evaluation of failure behavior near Ni-based alloy welds due to PWSCC and NiSCC in a probabilistic manner. Herein, actual plants affected by PWSCC have been analyzed using PASCAL-NP. Failure probabilities calculated by PASCAL-NP are in reasonable agreement with the detection data. Furthermore, useful knowledge related to leakage due to PWSCC was obtained through parametric studies using this code.

  1. Energy and exergy analyses of an integrated solar heat pump system

    International Nuclear Information System (INIS)

    Suleman, F.; Dincer, I.; Agelin-Chaab, M.

    2014-01-01

    An integrated solar and heat pump based system for industrial heating is developed in this study. The system comprises a heat pump cycle for process water heating and solar energy for another industrial heating process. Comprehensive energy and exergy analyses are performed on the system. These analyses generated some compelling results, as expected given the use of green and environmentally friendly energy sources. The results show that the energy efficiency of the process is 58% while the exergy efficiency is 75%. The energetic COP of the heat pump cycle is 3.54 whereas its exergy efficiency is 42.5%. Moreover, the energetic COP of the system is 2.97 and the exergy efficiency of the system is 35.7%. In the parametric study, varying conditions such as the condenser temperature and pressure also shows positive results. - Highlights: • An integrated system using a renewable energy source, applicable in the textile industry, is analysed. • Energy losses and exergy destructions are calculated at all major components. • Energy and exergy efficiencies of all subunits, subsystems and the overall system are determined. • A parametric study shows the effect of environment and operating conditions on efficiencies. • Solar energy for heating in the textile industry is efficient and environmentally friendly

  2. Increased migraine risk in osteoporosis patients: a nationwide population-based study

    OpenAIRE

    Wu, Chieh-Hsin; Zhang, Zi-Hao; Wu, Ming-Kung; Wang, Chiu-Huan; Lu, Ying-Yi; Lin, Chih-Lung

    2016-01-01

    Background Osteoporosis and migraine are both important public health problems and may have overlapping pathophysiological mechanisms. The aim of this study was to use a Taiwanese population-based dataset to assess migraine risk in osteoporosis patients. Methods The Taiwan National Health Insurance Research Database was used to analyse data for 40,672 patients aged ≥20 years who had been diagnosed with osteoporosis during 1996–2010. An additional 40,672 age-matched patients without osteoporos...

  3. Evaluation of an optoacoustic based gas analysing device

    Science.gov (United States)

    Markmann, Janine; Lange, Birgit; Theisen-Kunde, Dirk; Danicke, Veit; Mayorov, Fedor; Eckert, Sebastian; Kettmann, Pascal; Brinkmann, Ralf

    2017-07-01

    The relative occurrence of volatile organic compounds in human respiratory gas is disease-specific (ppb range). A prototype of a gas analysing device using two tuneable laser systems, an OPO-laser (2.5 to 10 μm) and a CO2-laser (9 to 11 μm), and an optoacoustic measurement cell was developed to detect concentrations in the ppb range. The sensitivity and resolution of the system were determined by test gas measurements, measuring ethylene and sulfur hexafluoride with the CO2-laser and butane with the OPO-laser. System sensitivity was found to be 13 ppb for sulfur hexafluoride and 17 ppb for ethylene. Respiratory gas samples of 8 healthy volunteers were investigated by irradiation with 17 laser lines of the CO2-laser. Several of those lines overlap with strong absorption bands of ammonia. As ammonia concentration is known to increase with age, a separation of volunteers below and above 35 years of age was striven for. To evaluate the data, the first seven gas samples were used to train a discriminant analysis algorithm. The eighth subject, aged 49 years, was then correctly assigned to the group >35 years.
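
    The classification step described above can be sketched as a nearest-centroid rule, the simplest reduction of discriminant analysis. The absorption vectors below are invented three-line toy spectra, not the study's 17-line CO2-laser data:

```python
import numpy as np

# Hypothetical training spectra (rows = subjects, columns = laser lines),
# with stronger ammonia-overlapping lines in the older group.
train = np.array([[0.9, 0.2, 0.1],
                  [1.0, 0.3, 0.2],
                  [0.2, 0.8, 0.9],
                  [0.1, 0.9, 1.0]])
labels = np.array([1, 1, 0, 0])   # 1 = older than 35 years, 0 = younger

def nearest_centroid(train, labels, sample):
    """Assign a sample to the class whose mean spectrum is closest."""
    centroids = {c: train[labels == c].mean(axis=0) for c in np.unique(labels)}
    return min(centroids, key=lambda c: np.linalg.norm(sample - centroids[c]))

# An unseen "eighth subject" whose spectrum resembles the older group.
pred = nearest_centroid(train, labels, np.array([0.95, 0.25, 0.15]))
```

A full linear discriminant analysis additionally weights directions by the pooled within-class covariance; with only seven training samples, as in the study, a centroid rule is about as much structure as the data can support.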

  4. Stress analyses of ITER toroidal field coils under fault conditions

    International Nuclear Information System (INIS)

    Jong, C.T.J.

    1990-02-01

    The International Thermonuclear Experimental Reactor (ITER) is intended as an experimental thermonuclear tokamak reactor for testing the basic physics, performance and technologies essential to future fusion reactors. The ITER design will be based on extensive new design work, supported by new physical and technological results, and on the great body of experience built up over several years from previous national and international reactor studies. Conversely, the ITER design process should provide the fusion community with valuable insights into what key areas need further development or clarification as we move forward towards practical fusion power. As part of the design process of the ITER toroidal field coils, the mechanical behaviour of the magnetic system under fault conditions has to be analysed in more detail. This paper describes the work carried out to create a detailed finite element model of two toroidal field coils, as well as some results of linear elastic analyses under fault conditions. The analyses have been performed with the finite element code ANSYS. (author). 5 refs.; 8 figs.; 2 tabs

  5. Surrogacy of progression free survival for overall survival in metastatic breast cancer studies: Meta-analyses of published studies.

    Science.gov (United States)

    Kundu, Madan G; Acharyya, Suddhasatta

    2017-02-01

    PFS is often used as a surrogate endpoint for OS in metastatic breast cancer studies. We have evaluated the association of treatment effect on PFS with significant HR_OS (and how this association is affected by other factors) in published prospective metastatic breast cancer studies. A systematic literature search in PubMed identified prospective metastatic breast cancer studies. Treatment effects on PFS were determined using the hazard ratio (HR_PFS), the increase in median PFS (ΔMED_PFS) and the % increase in median PFS (%ΔMED_PFS). Diagnostic accuracy of the PFS measures (HR_PFS, ΔMED_PFS and %ΔMED_PFS) in predicting significant HR_OS was assessed using receiver operating characteristic (ROC) curves and the classification tree approach (CART). Seventy-four cases (i.e., treatment to control comparisons) from 65 individual publications were identified for the analyses. Of these, 16 cases reported a significant treatment effect on HR_OS at the 5% level of significance. The median number of deaths reported in these cases was 153. The area under the ROC curve (AUC) for HR_PFS, ΔMED_PFS and %ΔMED_PFS as diagnostic measures was 0.69, 0.70 and 0.75, respectively. Classification tree results identified %ΔMED_PFS and number of deaths as diagnostic measures for significant HR_OS. Only 7.9% (3/39) of cases with %ΔMED_PFS shorter than 48.27% reported significant HR_OS. There were 7 cases with %ΔMED_PFS of 48.27% or more and a reported number of deaths of 227 or more; of these, 5 cases reported significant HR_OS. %ΔMED_PFS was found to be a better diagnostic measure for predicting significant HR_OS. Our analysis results also suggest that consideration of the total number of deaths may further improve its diagnostic performance. Based on our study results, studies with a 50% improvement in median PFS are more likely to produce significant HR_OS if the total number of OS events at the time of analysis is 227 or more. Copyright © 2016 Elsevier Inc. All rights reserved.
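
    The AUC values quoted above follow mechanically from case-level data via the rank (Mann-Whitney) identity. The sketch below computes an AUC on invented numbers; the %ΔMED_PFS values and HR_OS outcome labels are hypothetical, not the 74 cases of the meta-analysis:

```python
import numpy as np

# Hypothetical cases: % increase in median PFS, and whether HR_OS was significant.
pct_delta_med_pfs = np.array([10.0, 25.0, 30.0, 48.0, 55.0, 60.0, 75.0, 90.0])
sig_hr_os = np.array([0, 0, 0, 0, 1, 0, 1, 1])

def auc(scores, labels):
    """ROC AUC as P(score_pos > score_neg) + 0.5 * P(tie)."""
    pos, neg = scores[labels == 1], scores[labels == 0]
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

a = auc(pct_delta_med_pfs, sig_hr_os)   # 14 of 15 positive/negative pairs ordered correctly
```

The CART-style rule in the abstract is then just a threshold on this score plus a second split on the number of deaths; sweeping the threshold over all case values traces out the ROC curve whose area this function computes.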

  6. A deformation-based morphometry study of patients with early-stage Parkinson's disease

    DEFF Research Database (Denmark)

    Borghammer, P; Østergaard, Karen; Cumming, P

    2010-01-01

    BACKGROUND AND PURPOSE: Previous volumetric magnetic resonance imaging (MRI) studies of Parkinson's disease (PD) utilized primarily voxel-based morphometry (VBM), and investigated mostly patients with moderate- to late-stage disease. We now use deformation-based morphometry (DBM), a method...... purported to be more sensitive than VBM, to test for atrophy in patients with early-stage PD. METHODS: T1-weighted MRI images from 24 early-stage PD patients and 26 age-matched normal control subjects were compared using DBM. Two separate studies were conducted, where two minimally-biased nonlinear...... intensity-average were created; one for all subjects and another for just the PD patients. The DBM technique creates an average population-based MRI-average in an iterative hierarchical fashion. The nonlinear transformations estimated to match each subject to the MRI-average were then analysed. RESULTS...

  7. Improved phylogenetic analyses corroborate a plausible position of Martialis heureka in the ant tree of life.

    Directory of Open Access Journals (Sweden)

    Patrick Kück

Full Text Available Martialinae are pale, eyeless and probably hypogaeic predatory ants. Morphological character sets suggest a close relationship to the ant subfamily Leptanillinae. Recent analyses based on molecular sequence data suggest that Martialinae are the sister group to all extant ants. However, comparisons across molecular studies and different reconstruction methods leave the position of Martialinae ambiguous: while this sister-group relationship was well supported by Bayesian partitioned analyses, Maximum Likelihood approaches could not unequivocally resolve the position of Martialinae. By re-analysing a previously published molecular data set, we show that the Maximum Likelihood approach is highly appropriate for resolving deep ant relationships, especially those between Leptanillinae, Martialinae and the remaining ant subfamilies. Based on improved alignments, alignment masking, and tree reconstructions with a sufficient number of bootstrap replicates, our results strongly reject a placement of Martialinae at the first split within the ant tree of life. Instead, we suggest that Leptanillinae are the sister group to all other extant ant subfamilies, whereas Martialinae branch off as a second lineage. This assumption is backed by approximately unbiased (AU) tests, additional Bayesian analyses and split networks. Our results demonstrate clear effects of improved alignment approaches, alignment masking and data partitioning. We hope that our study illustrates the importance of thorough, comprehensible phylogenetic analyses using the example of ant relationships.

  8. Barriers to guideline-compliant psoriasis care: analyses and concepts.

    Science.gov (United States)

    Eissing, L; Radtke, M A; Zander, N; Augustin, M

    2016-04-01

Despite the availability of effective therapeutics and evidence-based treatment guidelines, a substantial proportion of patients with moderate-to-severe psoriasis does not receive appropriate care. This under-provision of health care may cause further worsening of health, marked limitations of patients' quality of life, and indirect costs for the health care system. In order to provide guideline-compliant care for every psoriasis patient, it is important to identify the barriers obstructing optimal care. Studies have identified various barriers on the physician's and on the patient's side; however, those studies each addressed only single barriers, and not all of them in the context of psoriasis. Other publications that describe barriers systematically did not focus on psoriasis either. The objective of this literature review was to identify barriers and facilitators, based on studies analysing quality of care and single barriers, resulting in a comprehensive model of causal factors. Our analyses revealed three categories of barriers - patient-related, physician-related and external factors. On the patient side, we found non-adherence to therapies to be an important barrier, often in close association with psychiatric factors. Barriers on the physician's side are predominantly incomplete knowledge of the guidelines as well as the complexity of psoriasis comorbidity. In some countries, payment for patients with complex disease status is poor, and inconsistent reimbursement regulations potentially interfere with optimal care. The current analysis indicates that most barriers are interdependent; thus, measures approaching related barriers simultaneously are required. To improve care for psoriasis patients, further studies systematically addressing all potentially relevant barriers in concert are needed. © 2015 European Academy of Dermatology and Venereology.

  9. Analysing Trust Transitivity and The Effects of Unknown Dependence

    Directory of Open Access Journals (Sweden)

    Touhid Bhuiyan

    2010-03-01

Full Text Available Trust can be used to improve online automated recommendation within a given domain, and trust transitivity is what makes this work. Trust transitivity, however, has different interpretations. Trust and trust transitivity are both human mental phenomena, and for this reason there is no such thing as objective transitivity. Trust transitivity and trust fusion are both important elements in computational trust. This paper analyses the parameter-dependence problem in trust transitivity and proposes definitions that account for the effects of the base rate. In addition, it proposes belief functions based on subjective logic to analyse trust transitivity in three specified cases with sensitive and insensitive base rates. It then presents a quantitative analysis of the effects of the unknown-dependence problem in an interconnected network environment such as the Internet.
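To make the transitivity notion concrete, here is a minimal sketch of a subjective-logic discounting operator of the kind used in Jøsang-style frameworks. The formulas follow the common uncertainty-favouring definition, and the opinion tuples and numbers are illustrative assumptions, not values from the paper.

```python
# Illustrative sketch (assumed formulas): trust discounting for transitivity
# in subjective logic. An opinion is (belief, disbelief, uncertainty,
# base_rate), with belief + disbelief + uncertainty == 1.

def discount(ab, bx):
    """A's derived opinion about X through B (uncertainty-favouring form)."""
    b_ab, d_ab, u_ab, _ = ab
    b_bx, d_bx, u_bx, a_bx = bx
    return (b_ab * b_bx,                 # belief survives only through A's belief in B
            b_ab * d_bx,
            d_ab + u_ab + b_ab * u_bx,   # distrust and doubt become uncertainty
            a_bx)                        # base rate propagates from B's opinion of X

alice_bob = (0.8, 0.1, 0.1, 0.5)   # Alice's trust in Bob (synthetic)
bob_carol = (0.9, 0.0, 0.1, 0.5)   # Bob's trust in Carol (synthetic)
b, d, u, a = discount(alice_bob, bob_carol)
print(round(b, 2), round(d, 2), round(u, 2), a)
```

Note how the derived belief (0.72) is weaker than either link, while uncertainty grows: the usual intuition that trust dilutes along a chain.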

  10. Mitogenomic analyses from ancient DNA

    DEFF Research Database (Denmark)

    Paijmans, Johanna L. A.; Gilbert, Tom; Hofreiter, Michael

    2013-01-01

    The analysis of ancient DNA is playing an increasingly important role in conservation genetic, phylogenetic and population genetic analyses, as it allows incorporating extinct species into DNA sequence trees and adds time depth to population genetics studies. For many years, these types of DNA...... analyses (whether using modern or ancient DNA) were largely restricted to the analysis of short fragments of the mitochondrial genome. However, due to many technological advances during the past decade, a growing number of studies have explored the power of complete mitochondrial genome sequences...... yielded major progress with regard to both the phylogenetic positions of extinct species, as well as resolving population genetics questions in both extinct and extant species....

  11. Structural changes in Parkinson's disease. Voxel-based morphometry and diffusion tensor imaging analyses based on 123I-MIBG uptake

    International Nuclear Information System (INIS)

    Kikuchi, Kazufumi; Hiwatashi, Akio; Togao, Osamu; Yamashita, Koji; Somehara, Ryo; Kamei, Ryotaro; Baba, Shingo; Honda, Hiroshi; Yamaguchi, Hiroo; Kira, Jun-ichi

    2017-01-01

Patients with Parkinson's disease (PD) may exhibit symptoms of sympathetic dysfunction that can be measured using 123I-metaiodobenzylguanidine (MIBG) myocardial scintigraphy. We investigated the relationship between microstructural brain changes and 123I-MIBG uptake in patients with PD using voxel-based morphometry (VBM) and diffusion tensor imaging (DTI) analyses. This retrospective study included 24 patients with PD who underwent 3 T magnetic resonance imaging and 123I-MIBG scintigraphy. They were divided into two groups: 12 MIBG-positive and 12 MIBG-negative cases (10 men and 14 women; age range: 60-81 years, corrected for gender and age). The heart/mediastinum count (H/M) ratio was calculated on anterior planar 123I-MIBG images obtained 4 h post-injection. VBM and DTI were performed to detect structural differences between these two groups. Patients with low H/M ratio had significantly reduced brain volume at the right inferior frontal gyrus (uncorrected p < 0.0001, K > 90). Patients with low H/M ratios also exhibited significantly lower fractional anisotropy than those with high H/M ratios (p < 0.05) at the left anterior thalamic radiation, the left inferior fronto-occipital fasciculus, the left superior longitudinal fasciculus, and the left uncinate fasciculus. VBM and DTI may reveal microstructural changes related to the degree of 123I-MIBG uptake in patients with PD. (orig.)

  12. Hedysarum L. (Fabaceae: Hedysareae) Is Not Monophyletic - Evidence from Phylogenetic Analyses Based on Five Nuclear and Five Plastid Sequences.

    Science.gov (United States)

    Liu, Pei-Liang; Wen, Jun; Duan, Lei; Arslan, Emine; Ertuğrul, Kuddisi; Chang, Zhao-Yang

    2017-01-01

    The legume family (Fabaceae) exhibits a high level of species diversity and evolutionary success worldwide. Previous phylogenetic studies of the genus Hedysarum L. (Fabaceae: Hedysareae) showed that the nuclear and the plastid topologies might be incongruent, and the systematic position of the Hedysarum sect. Stracheya clade was uncertain. In this study, phylogenetic relationships of Hedysarum were investigated based on the nuclear ITS, ETS, PGDH, SQD1, TRPT and the plastid psbA-trnH, trnC-petN, trnL-trnF, trnS-trnG, petN-psbM sequences. Both nuclear and plastid data support two major lineages in Hedysarum: the Hedysarum s.s. clade and the Sartoria clade. In the nuclear tree, Hedysarum is biphyletic with the Hedysarum s.s. clade sister to the Corethrodendron + Eversmannia + Greuteria + Onobrychis clade (the CEGO clade), whereas the Sartoria clade is sister to the genus Taverniera DC. In the plastid tree, Hedysarum is monophyletic and sister to Taverniera. The incongruent position of the Hedysarum s.s. clade between the nuclear and plastid trees may be best explained by a chloroplast capture hypothesis via introgression. The Hedysarum sect. Stracheya clade is resolved as sister to the H. sect. Hedysarum clade in both nuclear and plastid trees, and our analyses support merging Stracheya into Hedysarum. Based on our new evidence from multiple sequences, Hedysarum is not monophyletic, and its generic delimitation needs to be reconsidered.

  13. Dietary supplement use and colorectal cancer risk: A systematic review and meta-analyses of prospective cohort studies

    NARCIS (Netherlands)

    Heine-Bröring, R.C.; Winkels, R.M.; Renkema, J.M.S.; Kragt, L.; Orten-Luiten, van A.C.B.; Tigchelaar, E.F.; Chan, D.S.M.; Norat, T.; Kampman, E.

    2015-01-01

    Use of dietary supplements is rising in countries where colorectal cancer is prevalent. We conducted a systematic literature review and meta-analyses of prospective cohort studies on dietary supplement use and colorectal cancer risk. We identified relevant studies in Medline, Embase and Cochrane up

  14. Candelariella placodizans (Candelariaceae) reported new to mainland China and Taiwan based on morphological, chemical and molecular phylogenetic analyses

    Directory of Open Access Journals (Sweden)

    Lidia Yakovchenko

    2016-06-01

Full Text Available Candelariella placodizans is newly reported from China. It was collected on exposed rocks with mosses in alpine areas of Taiwan and Yunnan Province, China, at elevations between 3200 and 4400 m. Molecular phylogenetic analyses based on ITS rDNA sequences were performed to confirm the monophyly of the Chinese populations with respect to already existing sequences of the species, and further to examine their relationships to other members of the genus. An identification key to all 14 known taxa of Candelariella in China is provided.

  15. Sensitivity and uncertainty analyses in aging risk-based prioritizations

    International Nuclear Information System (INIS)

    Hassan, M.; Uryas'ev, S.; Vesely, W.E.

    1993-01-01

Aging risk evaluations of nuclear power plants using Probabilistic Risk Analyses (PRAs) involve assessments of the impact of aging structures, systems, and components (SSCs) on plant core damage frequency (CDF). These assessments can be used to prioritize the contributors to aging risk, reflecting the relative risk potential of the SSCs. Aging prioritizations are important for identifying the SSCs contributing most to plant risk and can provide a systematic basis on which aging risk control and management strategies for a plant can be developed. However, these prioritizations are subject to variabilities arising from uncertainties in data and/or from various modeling assumptions. The objective of this paper is to present an evaluation of the sensitivity of aging prioritizations of active components to uncertainties in aging risk quantifications. Approaches for robust prioritization of SSCs that are less susceptible to these uncertainties are also presented.

  16. Kidney function changes with aging in adults: comparison between cross-sectional and longitudinal data analyses in renal function assessment.

    Science.gov (United States)

    Chung, Sang M; Lee, David J; Hand, Austin; Young, Philip; Vaidyanathan, Jayabharathi; Sahajwalla, Chandrahas

    2015-12-01

The study evaluated whether the renal function decline rate per year with age in adults varies between two primary statistical analyses: cross-sectional (CS), using one observation per subject, and longitudinal (LT), using multiple observations per subject over time. A total of 16628 records (3946 subjects; age range 30-92 years) of creatinine clearance and relevant demographic data were used. On average, four samples per subject were collected over up to 2364 days (mean: 793 days). A simple linear regression model and a random coefficient model were selected for the CS and LT analyses, respectively. The renal function decline rates per year were 1.33 and 0.95 ml/min/year for the CS and LT analyses, respectively; the rate was slower when the repeated individual measurements were considered. The study confirms that the rates differ depending on the statistical analysis, and that a statistically robust longitudinal model with a proper sampling design provides reliable individual as well as population estimates of the renal function decline rate per year with age in adults. In conclusion, our findings indicate that one should be cautious in interpreting the renal function decline rate with aging because its estimate is highly dependent on the statistical analysis. Based on our analyses, a population longitudinal analysis (e.g. a random coefficient model) is recommended if individualization is critical, such as a dose adjustment based on renal function during chronic therapy. Copyright © 2015 John Wiley & Sons, Ltd.
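The CS-versus-LT contrast can be sketched with synthetic data: a cross-sectional fit pools one point per subject across subjects, while a longitudinal estimate averages each subject's own slope over repeated visits. The numbers below are invented for illustration, and the per-subject slope average is a simplified stand-in for the paper's random coefficient model.

```python
# Sketch of the cross-sectional vs longitudinal contrast (synthetic data).

def slope(xs, ys):
    """Ordinary least-squares slope of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

# subject -> (age, creatinine clearance) visits; each subject declines 0.5/yr,
# but older subjects start lower, steepening the between-subject trend
subjects = {
    "s1": [(40, 100), (42, 99), (44, 98)],
    "s2": [(60, 85), (62, 84), (64, 83)],
    "s3": [(70, 60), (72, 59), (74, 58)],
}

# longitudinal: mean of per-subject slopes
lt = sum(slope(*zip(*visits)) for visits in subjects.values()) / len(subjects)
# cross-sectional: one (first) visit per subject, slope across subjects
cs = slope(*zip(*[visits[0] for visits in subjects.values()]))
print(round(cs, 2), round(lt, 2))  # cross-sectional decline is steeper
```

The toy data reproduce the qualitative finding: the cross-sectional slope (-1.25/yr) overstates the within-subject decline (-0.5/yr) because between-subject differences leak into the age trend.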

  17. Effects of Computer-Based Training on Procedural Modifications to Standard Functional Analyses

    Science.gov (United States)

    Schnell, Lauren K.; Sidener, Tina M.; DeBar, Ruth M.; Vladescu, Jason C.; Kahng, SungWoo

    2018-01-01

    Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to…

  18. Chemical and geotechnical analyses of soil samples from Olkiluoto for studies on sorption in soils

    International Nuclear Information System (INIS)

    Lusa, M.; Aemmaelae, K.; Hakanen, M.; Lehto, J.; Lahdenperae, A.-M.

    2009-05-01

    The safety assessment of disposal of spent nuclear fuel will include an estimate on the behavior of nuclear waste nuclides in the biosphere. As a part of this estimate also the transfer of nuclear waste nuclides in the soil and sediments is to be considered. In this study soil samples were collected from three excavator pits in Olkiluoto and the geotechnical and chemical characteristics of the samples were determined. In later stage these results will be used in sorption tests. Aim of these tests is to determine the Kd-values for Cs, Tc and I and later for Mo, Nb and Cl. Results of these sorption tests will be reported later. The geotechnical characteristics studied included dry weight and organic matter content as well as grain size distribution and mineralogy analyses. Selective extractions were carried out to study the sorption of cations into different mineral types. The extractions included five steps in which the cations bound to exchangeable, carbonate, oxides of Fe and Mn, organic matter and residual fractions were determined. For all fractions ICPMS analyses were carried out. In these analyses Li, Na, Mg, K, Ca, Cr, Mn, Fe, Co, Ni, Cu, Zn, As, Se, Sr, Mo, Cd, Cs and Pb were determined. In addition six profiles were taken from the surroundings of two excavator pits for the 137 Cs determination. Besides the samples taken for the characterization of soil, supplement samples were taken from the same layers for the separation of soil water. From the soil water pH, DOC, anions (F, Cl, NO 3 , SO 4 ) and cations (Na, Mg, K, Ca, Al, Cr, Mn, Fe, Ni, Cu, Zn, As, S, Cd, Cs, Pb, U) were determined. (orig.)

  19. Human cell-based micro electrode array platform for studying neurotoxicity

    Directory of Open Access Journals (Sweden)

    Laura eYlä-Outinen

    2010-09-01

Full Text Available At present, most neurotoxicological analyses are based on in vitro and in vivo models utilizing animal cells or animal models. In addition, the in vitro models used are mostly based on molecular biological end-point analyses. Thus, for neurotoxicological screening, human cell-based analysis platforms in which the functional responses of neuronal networks to various neurotoxicants can also be detected in real time are highly needed. The microelectrode array (MEA) is a method that enables the measurement of the functional activity of neuronal cell networks in vitro for long periods of time. Here, we utilize MEA to study the neurotoxicity of methyl mercury chloride (MeHgCl, concentrations 0.5-500 nM) on human embryonic stem cell (hESC)-derived neuronal cell networks exhibiting spontaneous electrical activity. The neuronal cell cultures were matured on MEAs into networks expressing spontaneous spike-train-like activity before exposing the cells to MeHgCl for 72 hours. MEA measurements were performed acutely and 24, 48, and 72 hours after the onset of the exposure. Finally, exposed cells were analyzed with traditional molecular biological methods for cell proliferation, cell survival, and gene and protein expression. Our results show that 500 nM MeHgCl decreases the electrical signaling and alters the pharmacologic response of hESC-derived neuronal networks in a delayed manner, whereas these effects cannot be detected with qRT-PCR, immunostainings, or proliferation measurements. Thus, we conclude that the human cell-based MEA platform is a sensitive online method for neurotoxicological screening.

  20. Improving correlations between MODIS aerosol optical thickness and ground-based PM2.5 observations through 3D spatial analyses

    Science.gov (United States)

    Hutchison, Keith D.; Faruqui, Shazia J.; Smith, Solar

The Center for Space Research (CSR) continues to focus on developing methods to improve correlations between satellite-based aerosol optical thickness (AOT) values and ground-based air pollution observations made at continuous ambient monitoring sites (CAMS) operated by the Texas Commission on Environmental Quality (TCEQ). Strong correlations and improved understanding of the relationships between satellite and ground observations are needed to formulate reliable real-time predictions of air quality using data accessed from the moderate resolution imaging spectroradiometer (MODIS) at the CSR direct-broadcast ground station. In this paper, improvements in these correlations are demonstrated first as a result of the evolution in the MODIS retrieval algorithms. Further improvement is then shown using procedures that compensate for differences in horizontal spatial scales between the nominal 10-km MODIS AOT products and CAMS point measurements. Finally, airborne light detection and ranging (lidar) observations, collected during the Texas Air Quality Study of 2000, are used to examine aerosol profile concentrations, which may vary greatly between aerosol classes as a result of the sources, chemical composition, and meteorological conditions that govern transport processes. Further improvement in correlations is demonstrated with this limited dataset using insights into aerosol profile information inferred from the vertical motion vectors in a trajectory-based forecast model. Analyses are ongoing to verify these procedures on a variety of aerosol classes using data collected by the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite (CALIPSO) lidar.
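At its core, the satellite-to-ground comparison in studies like this reduces to a correlation over matched AOT/PM2.5 pairs. A minimal sketch with synthetic values (not MODIS or CAMS data):

```python
# Sketch: Pearson correlation between satellite AOT retrievals and ground
# PM2.5 at matched sites. Values are synthetic, for illustration only.

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

aot  = [0.10, 0.25, 0.40, 0.55, 0.30]   # MODIS-like AOT (synthetic)
pm25 = [8.0, 14.0, 22.0, 30.0, 16.0]    # ground PM2.5, µg/m³ (synthetic)
print(round(pearson_r(aot, pm25), 3))
```

The improvements the paper describes (spatial averaging, profile corrections) amount to pre-processing the two series so that pairs like these are physically comparable before this correlation is computed.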

  1. Design and Execution of make-like, distributed Analyses based on Spotify’s Pipelining Package Luigi

    Science.gov (United States)

    Erdmann, M.; Fischer, B.; Fischer, R.; Rieger, M.

    2017-10-01

In high-energy particle physics, workflow management systems are primarily used as tailored solutions in dedicated areas such as Monte Carlo production. However, physicists performing data analyses are usually required to steer their individual workflows manually, which is time-consuming and often leads to undocumented relations between particular workloads. We present a generic analysis design pattern that copes with the sophisticated demands of end-to-end HEP analyses and provides a make-like execution system. It is based on the open-source pipelining package Luigi, which was developed at Spotify and enables the definition of arbitrary workloads, so-called Tasks, and the dependencies between them in a lightweight and scalable structure. Further features are multi-user support, automated dependency resolution and error handling, central scheduling, and status visualization in the web. In addition to already built-in features for remote jobs and file systems like Hadoop and HDFS, we added support for WLCG infrastructure such as LSF and CREAM job submission, as well as remote file access through the Grid File Access Library. Furthermore, we implemented automated resubmission functionality, software sandboxing, and a command line interface with auto-completion for a convenient working environment. For the implementation of a tt̄H cross section measurement, we created a generic Python interface that provides programmatic access to all external information such as datasets, physics processes, statistical models, and additional files and values. In summary, the setup enables the execution of the entire analysis in a parallelized and distributed fashion with a single command.
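The make-like Task/requires/run pattern the authors build on can be sketched in plain Python. This is a simplified stand-in for illustration, not the actual `luigi` API, which additionally provides targets, parameters, and a central scheduler.

```python
# Minimal make-like sketch of the Task / requires / run pattern that Luigi
# popularized. A task runs only after its requirements, and completed
# tasks are never re-run.

class Task:
    done = set()                      # shared "target already built" registry
    def requires(self):
        return []                     # no dependencies by default
    def run(self, log):
        log.append(type(self).__name__)

def build(task, log):
    """Depth-first dependency resolution, skipping completed tasks."""
    name = type(task).__name__
    if name in Task.done:
        return
    for dep in task.requires():
        build(dep, log)
    task.run(log)
    Task.done.add(name)

# A toy three-stage analysis chain
class Fetch(Task): pass
class Select(Task):
    def requires(self): return [Fetch()]
class Fit(Task):
    def requires(self): return [Select()]

log = []
build(Fit(), log)
print(log)  # ['Fetch', 'Select', 'Fit']
```

Asking for the final task is enough to run the whole chain in order, and re-running `build` is a no-op: the "single command" execution property the abstract describes.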

  2. Can We Study Autonomous Driving Comfort in Moving-Base Driving Simulators? A Validation Study.

    Science.gov (United States)

    Bellem, Hanna; Klüver, Malte; Schrauf, Michael; Schöner, Hans-Peter; Hecht, Heiko; Krems, Josef F

    2017-05-01

To lay the basis of studying autonomous driving comfort using driving simulators, we assessed the behavioral validity of two moving-base simulator configurations by contrasting them with a test-track setting. With increasing level of automation, driving comfort becomes increasingly important. Simulators provide a safe environment to study perceived comfort in autonomous driving. To date, however, no studies had been conducted on comfort in autonomous driving to determine the extent to which results from simulator studies can be transferred to on-road driving conditions. Participants (N = 72) experienced six differently parameterized lane-change and deceleration maneuvers and subsequently rated the comfort of each scenario. One group of participants experienced the maneuvers in a test-track setting, whereas two other groups experienced them in one of two moving-base simulator configurations. We could demonstrate relative and absolute validity for one of the two simulator configurations. Subsequent analyses revealed that the validity of the simulator highly depends on the parameterization of the motion system. Moving-base simulation can be a useful research tool to study driving comfort in autonomous vehicles. However, our results point to a preference for subunity scaling factors for both lateral and longitudinal motion cues, which might be explained by an underestimation of speed in virtual environments. In line with previous studies, we recommend lateral- and longitudinal-motion scaling factors of approximately 50% to 60% in order to obtain valid results for both active and passive driving tasks.

  3. Analysing Leontiev Tube Capabilities in the Space-based Plants

    Directory of Open Access Journals (Sweden)

    N. L. Shchegolev

    2017-01-01

Full Text Available The paper reviews publications on the gas-dynamic temperature stratification device (the Leontiev tube) and identifies the main factors affecting its efficiency. It describes an experimental installation used to obtain data on the magnitude of energy separation in air and to demonstrate that the device is operable. The assumption that there is an optimal relationship between the flow velocities in the subsonic and supersonic channels of the gas-dynamic temperature stratification device is experimentally confirmed. The paper analyses possible ways to raise the efficiency of power plants of various basing (including space-based) and shows that the current mainstream approach to increasing their operating efficiency is to adopt more complex design solutions. A scheme is proposed for a closed gas-turbine space-based plant operating on a mixture of inert gases (a helium-xenon mixture). It differs from the simplest variants in that it lacks a cooler-radiator and integrates a gas-dynamic temperature stratification device and a heat compressor. Based on the equations of one-dimensional gas dynamics, it is shown that the operating capability of this scheme is determined by whether the total pressure can be restored when removing heat in the thermal compressor. An exploratory study of creating a heat compressor is performed, showing that when operating on gases with a Prandtl number close to 1 the total pressure does not increase. The heat compressor is operable when working with gases of low Prandtl number (a helium-xenon mixture) at high supersonic velocities and with a longitudinal pressure gradient present. It is shown that there is a region of low Prandtl numbers (Pr < 0.3) for which, with a longitudinal pressure gradient present in the supersonic flow of a viscous gas, the total pressure can be restored.

  4. Status of science and technology with respect of preparation and evaluation of accident analyses and the use of analysis simulators

    International Nuclear Information System (INIS)

    Pointner, Winfried; Cuesta Morales, Alejandra; Draeger, Peer; Hartung, Juergen; Jakubowski, Zygmunt; Meyer, Gerhard; Palazzo, Simone; Moner, Guim Pallas; Perin, Yann; Pasichnyk, Ihor

    2014-07-01

The scope of the work was to elaborate the prerequisites for short-term accident analyses, including recommendations for the application of new methodologies and computational procedures and technical aspects of safety evaluation. The following work packages were performed: knowledge base for best-estimate accident analyses; analytical studies of PWR plant behavior in case of multiple safety system failures; extension and maintenance of the database for plant-specific analysis simulators.

  5. Model-based performance and energy analyses of reverse osmosis to reuse wastewater in a PVC production site.

    Science.gov (United States)

    Hu, Kang; Fiedler, Thorsten; Blanco, Laura; Geissen, Sven-Uwe; Zander, Simon; Prieto, David; Blanco, Angeles; Negro, Carlos; Swinnen, Nathalie

    2017-11-10

A pilot-scale reverse osmosis (RO) unit downstream of a membrane bioreactor (MBR) was developed for desalination to reuse wastewater at a PVC production site. The solution-diffusion-film model (SDFM), based on the solution-diffusion model (SDM) and film theory, was proposed to describe rejections of electrolyte mixtures in the MBR effluent, which consists of dominant ions (Na+ and Cl-) and several trace ions (Ca2+, Mg2+, K+ and SO42-). A universal global optimisation method was used to estimate the ion permeability coefficients (B) and mass transfer coefficients (K) in the SDFM. The membrane performance was then evaluated based on the estimated parameters, which demonstrated that the theoretical simulations were in line with the experimental results for the dominant ions. Moreover, an energy analysis model accounting for the limitation imposed by the thermodynamic restriction was proposed to analyse the specific energy consumption of the pilot-scale RO system in various scenarios.
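A rough sketch of how a solution-diffusion-film calculation fits together: the intrinsic (real) rejection follows from the water flux and an ion permeability coefficient B, and film theory (mass-transfer coefficient K) converts it to the observed rejection. The relation is the standard textbook form, and the parameter values are assumptions for illustration, not the paper's fitted coefficients.

```python
# Hedged sketch of the solution-diffusion + film-theory rejection relation.
# Parameter values below are illustrative assumptions only.
import math

def observed_rejection(jw, b, k):
    """Observed rejection from real rejection plus concentration polarization."""
    r_real = jw / (jw + b)            # intrinsic rejection from solution-diffusion
    beta = math.exp(jw / k)           # film-theory polarization factor exp(Jw/K)
    return r_real / (r_real + (1 - r_real) * beta)

jw = 1.0e-5   # water flux, m/s (assumed)
b  = 1.0e-7   # ion permeability coefficient B, m/s (assumed)
k  = 2.0e-5   # mass transfer coefficient K, m/s (assumed)
print(round(observed_rejection(jw, b, k), 3))
```

Fitting B and K per ion to measured rejections, as the paper does via global optimisation, then lets the same relation predict rejection of each dominant and trace ion.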

  6. Importance of frequency dependent magnetoresistance measurements in analysing the intrinsicality of magnetodielectric effect: A case study

    Science.gov (United States)

    Rai, Hari Mohan; Saxena, Shailendra K.; Mishra, Vikash; Kumar, Rajesh; Sagdeo, P. R.

    2017-08-01

    Magnetodielectric (MD) materials have attracted considerable attention due to their intriguing physics and potential future applications. However, the intrinsicality of the MD effect is always a major concern in such materials as the MD effect may arise also due to the MR (magnetoresistance) effect. In the present case study, we report an experimental approach to analyse and separate the intrinsic and MR dominated contributions of the MD phenomenon. For this purpose, polycrystalline samples of LaGa1-xAxO3 (A = Mn/Fe) have been prepared by solid state reaction method. The purity of their structural phase (orthorhombic) has been validated by refining the X-ray diffraction data. The RTMD (room temperature MD) response has been recorded over a frequency range of 20 Hz to 10 MHz. In order to analyse the intrinsicality of the MD effect, FDMR (frequency dependent MR) by means of IS (impedance spectroscopy) and dc MR measurements in four probe geometry have been carried out at RT. A significant RTMD effect has been observed in selected Mn/Fe doped LaGaO3 (LGO) compositions. The mechanism of MR free/intrinsic MD effect, observed in Mn/Fe doped LGO, has been understood speculatively in terms of modified cell volume associated with the reorientation/retransformation of spin-coupled Mn/Fe orbitals due to the application of magnetic field. The present analysis suggests that in order to justify the intrinsic/resistive origin of the MD phenomenon, FDMR measurements are more useful than measuring only dc MR or analysing the trends of magnetic field dependent change in the dielectric constant and tanδ. On the basis of the present case study, we propose that IS (FDMR) alone can be used as an effective experimental tool to detect and analyse the resistive and intrinsic parts contributing to the MD phenomenon.

  7. C4P cross-section libraries for safety analyses with SIMMER and related studies

    International Nuclear Information System (INIS)

    Rineiski, A.; Sinitsa, V.; Gabrielli, F.; Maschek, W.

    2011-01-01

A code and data system, C4P, is under development at KIT. It includes fine-group master libraries and tools for generating problem-oriented cross-section libraries, primarily for safety studies with the SIMMER code and related analyses. In the paper, the 560-group master library and the problem-oriented 40-group and 72-group cross-section libraries, for thermal and fast systems, respectively, are described and their performance is investigated. (author)

  8. Deconstructing tolerance with clobazam: Post hoc analyses from an open-label extension study.

    Science.gov (United States)

    Gidal, Barry E; Wechsler, Robert T; Sankar, Raman; Montouris, Georgia D; White, H Steve; Cloyd, James C; Kane, Mary Clare; Peng, Guangbin; Tworek, David M; Shen, Vivienne; Isojarvi, Jouko

    2016-10-25

To evaluate potential development of tolerance to adjunctive clobazam in patients with Lennox-Gastaut syndrome. Eligible patients enrolled in open-label extension study OV-1004, which continued until clobazam was commercially available in the United States or for a maximum of 2 years outside the United States. Enrolled patients started at 0.5 mg·kg⁻¹·d⁻¹ clobazam, not to exceed 40 mg/d. After 48 hours, dosages could be adjusted up to 2.0 mg·kg⁻¹·d⁻¹ (maximum 80 mg/d) on the basis of efficacy and tolerability. Post hoc analyses evaluated mean dosages and drop-seizure rates for the first 2 years of the open-label extension based on responder categories and baseline seizure quartiles in OV-1012. Individual patient listings were reviewed for dosage increases ≥40% and increasing seizure rates. Data from 200 patients were included. For patients free of drop seizures, there was no notable change in dosage over 24 months. For responder groups still exhibiting drop seizures, dosages were increased. Weekly drop-seizure rates for 100% and ≥75% responders demonstrated a consistent response over time. Few patients had a dosage increase ≥40% associated with an increase in seizure rates. Two-year findings suggest that the majority of patients do not develop tolerance to the antiseizure actions of clobazam. Observed dosage increases may reflect best efforts to achieve seizure freedom. It is possible that the clinical development of tolerance to clobazam has been overstated. NCT00518713 and NCT01160770. This study provides Class III evidence that the majority of patients do not develop tolerance to clobazam over 2 years of treatment. © 2016 American Academy of Neurology.
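
The weight-based dosing caps stated in the abstract reduce to simple arithmetic. The sketch below only illustrates those caps as described (0.5 mg·kg⁻¹·d⁻¹ capped at 40 mg/d at start; up to 2.0 mg·kg⁻¹·d⁻¹ capped at 80 mg/d after titration); it is an illustration of the study design, not dosing guidance.

```python
def clobazam_dose_mg_per_day(weight_kg, titrated=False):
    """Dosing caps as described in the study (illustration only):
    start at 0.5 mg/kg/day, not to exceed 40 mg/day; after titration,
    up to 2.0 mg/kg/day, not to exceed 80 mg/day."""
    if titrated:
        return min(2.0 * weight_kg, 80.0)
    return min(0.5 * weight_kg, 40.0)

print(clobazam_dose_mg_per_day(30))                 # starting dose, 30 kg child
print(clobazam_dose_mg_per_day(50, titrated=True))  # cap binds above 40 kg
```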

  9. Lagrangian Coherent Structure Analysis of Terminal Winds: Three-Dimensionality, Intramodel Variations, and Flight Analyses

    Directory of Open Access Journals (Sweden)

    Brent Knutson

    2015-01-01

We present a study of three-dimensional Lagrangian coherent structures (LCS) near the Hong Kong International Airport and relate it to previous developments of two-dimensional (2D) LCS analyses. The LCS are contrasted among three independent models and against 2D coherent Doppler light detection and ranging (LIDAR) data. Addition of the velocity information perpendicular to the LIDAR scanning cone helps solidify flow structures inferred from previous studies; contrast among models reveals the intramodel variability; and comparison with flight data evaluates the performance among models in terms of Lagrangian analyses. We find that, while the three models and the LIDAR do recover similar features of the windshear experienced by a landing aircraft (along the landing trajectory), their Lagrangian signatures over the entire domain are quite different: a portion of each numerical model captures certain features resembling those LCS extracted from independent 2D LIDAR analyses based on observations.

  10. Antibacterial Efficiency of Benzalkonium Chloride Base Disinfectant According To European Standard 13727, Chemical Analysis and Validation Studies

    OpenAIRE

    Yıldırım, Çinel; Çelenk, Veysel

    2018-01-01

This study aimed to present the principles of the chemical analyses, antibacterial efficiency testing, and validation procedures for benzalkonium chloride (BAC)-based disinfectant, one of the most commonly used biocides. A disinfectant comprising 20% BAC was used as a prototype product, and the active substance was verified by chemical analysis...

  11. Theoretical analyses of superconductivity in iron based ...

    African Journals Online (AJOL)

This paper focuses on the theoretical analysis of superconductivity in the iron-based superconductor Ba1−xKxFe2As2. After reviewing the current findings on this system, we suggest that a combined phonon-exciton mechanism gives the right order of magnitude for the superconducting transition temperature (TC) of Ba1−xKxFe2As2. By developing ...

  12. Methods in pharmacoepidemiology: a review of statistical analyses and data reporting in pediatric drug utilization studies.

    Science.gov (United States)

    Sequi, Marco; Campi, Rita; Clavenna, Antonio; Bonati, Maurizio

    2013-03-01

To evaluate the quality of data reporting and statistical methods performed in drug utilization studies in the pediatric population. Drug utilization studies evaluating all drug prescriptions to children and adolescents published between January 1994 and December 2011 were retrieved and analyzed. For each study, information on measures of exposure/consumption, the covariates considered, descriptive and inferential analyses, statistical tests, and methods of data reporting was extracted. An overall quality score was created for each study using a 12-item checklist that took into account the presence of outcome measures, covariates of measures, descriptive measures, statistical tests, and graphical representation. A total of 22 studies were reviewed and analyzed. Of these, 20 studies reported at least one descriptive measure. The mean was the most commonly used measure (18 studies), but only five of these also reported the standard deviation. Statistical analyses were performed in 12 studies, with the chi-square test being the most commonly performed test. Graphs were presented in 14 papers. Sixteen papers reported the number of drug prescriptions and/or packages, and ten reported the prevalence of the drug prescription. The mean quality score was 8 (median 9). Only seven of the 22 studies received a score of ≥10, while four studies received a low score; only a minority performed statistical analyses and reported data in a satisfactory manner. We therefore conclude that the methodology of drug utilization studies needs to be improved.

  13. Brain areas associated with numbers and calculations in children: Meta-analyses of fMRI studies

    Directory of Open Access Journals (Sweden)

    Marie Arsalidou

    2018-04-01

Children use numbers every day and typically receive formal mathematical training from an early age, as it is a main subject in school curricula. Despite an increase in pediatric neuroimaging studies, a comprehensive neuropsychological model of mathematical functions in children is lacking. Using quantitative meta-analyses of functional magnetic resonance imaging (fMRI) studies, we identify concordant brain areas across articles that adhere to a set of selection criteria (e.g., whole-brain analysis, coordinate reports) and report brain activity to tasks that involve processing symbolic and non-symbolic numbers with and without formal mathematical operations, which we call number tasks and calculation tasks, respectively. We present data on children 14 years and younger, who solved these tasks. Results show activity in parietal (e.g., inferior parietal lobule and precuneus) and frontal (e.g., superior and medial frontal gyri) cortices, core areas related to mental arithmetic, as well as brain regions such as the insula and claustrum, which are not typically discussed as part of mathematical problem-solving models. We propose a topographical atlas of mathematical processes in children, discuss findings within a developmental constructivist theoretical model, and suggest practical methodological considerations for future studies. Keywords: Mathematical cognition, Meta-analyses, fMRI, Children, Development, Insula

  14. The development of an on-line gold analyser

    International Nuclear Information System (INIS)

    Robert, R.V.D.; Ormrod, G.T.W.

    1982-01-01

An on-line analyser to monitor the gold in solutions from the carbon-in-pulp process is described. The automatic system is based on the delivery of filtered samples of the solutions to a distribution valve for measurement by flameless atomic-absorption spectrophotometry. The samples are introduced by the aerosol-deposition method. Operation of the analyser on a pilot plant and on a full-scale carbon-in-pulp plant has shown that the system is economically feasible and capable of providing a continuous indication of the efficiency of the extraction process.

  15. Analyses in Support of Z-IFE LLNL Progress Report for FY-05

    International Nuclear Information System (INIS)

    Moir, R W; Abbott, R P; Callahan, D A; Latkowski, J F; Meier, W R; Reyes, S

    2005-01-01

The FY04 LLNL study of Z-IFE [1] proposed and evaluated a design that deviated from SNL's previous baseline design. The FY04 study included analyses of shock mitigation, stress in the first wall, neutronics, and systems studies. In FY05, the subject of this report, we build on last year's work and theme. Our emphasis continues to be on alternatives that hold promise of considerable improvements in design and economics compared to the baseline design. Our key results are summarized here.

  16. Analyses of liquid-gas two-phase flow in fermentation tanks

    International Nuclear Information System (INIS)

    Toi, Takashi; Serizawa, Akimi; Takahashi, Osamu; Kawara, Zensaku; Gofuku, Akio; Kataoka, Isao.

    1993-01-01

The understanding of two-phase flow is one of the important problems for both design and safety analyses of various engineering systems. For example, the flow conditions in beer fermentation tanks influence both product quality and tank productivity. In this study, a two-dimensional numerical calculation code based on the one-pressure two-fluid model is developed to understand the circulation structure of low-quality liquid-gas two-phase flows induced by a bubble plume in a tank. (author)

  17. Systematic literature reviews and meta-analyses: part 6 of a series on evaluation of scientific publications.

    Science.gov (United States)

    Ressing, Meike; Blettner, Maria; Klug, Stefanie J

    2009-07-01

    Because of the rising number of scientific publications, it is important to have a means of jointly summarizing and assessing different studies on a single topic. Systematic literature reviews, meta-analyses of published data, and meta-analyses of individual data (pooled reanalyses) are now being published with increasing frequency. We here describe the essential features of these methods and discuss their strengths and weaknesses. This article is based on a selective literature search. The different types of review and meta-analysis are described, the methods used in each are outlined so that they can be evaluated, and a checklist is given for the assessment of reviews and meta-analyses of scientific articles. Systematic literature reviews provide an overview of the state of research on a given topic and enable an assessment of the quality of individual studies. They also allow the results of different studies to be evaluated together when these are inconsistent. Meta-analyses additionally allow calculation of pooled estimates of an effect. The different types of review and meta-analysis are discussed with examples from the literature on one particular topic. Systematic literature reviews and meta-analyses enable the research findings and treatment effects obtained in different individual studies to be summed up and evaluated.
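
The "pooled estimate of an effect" mentioned above is, in its simplest fixed-effect form, an inverse-variance weighted mean of the individual study estimates. The sketch below shows that standard calculation with three hypothetical study results; it is a generic illustration, not an analysis from the cited article.

```python
import math

def pooled_fixed_effect(estimates, std_errors):
    # Fixed-effect meta-analysis: weight each study by 1/se^2
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Three hypothetical effect estimates (e.g. log odds ratios) with their SEs
est, se = pooled_fixed_effect([0.4, 0.2, 0.3], [0.1, 0.2, 0.15])
print(f"pooled estimate {est:.3f} (SE {se:.3f})")
```

Note that precise studies (small standard errors) dominate the pooled value, which is the point of the weighting; random-effects models extend this by adding a between-study variance component.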

  18. Methodology for processing pressure traces used as inputs for combustion analyses in diesel engines

    International Nuclear Information System (INIS)

    Rašić, Davor; Vihar, Rok; Baškovič, Urban Žvar; Katrašnik, Tomaž

    2017-01-01

This study proposes a novel methodology for designing an optimum equiripple finite impulse response (FIR) filter for processing in-cylinder pressure traces of a diesel internal combustion engine, which serve as inputs for high-precision combustion analyses. The proposed automated workflow is based on an innovative approach to determining the transition-band frequencies and the optimum filter order. The methodology is based on discrete Fourier transform analysis, which is the first step to estimate the location of the pass-band and stop-band frequencies. The second step uses short-time Fourier transform analysis to refine these estimated frequencies. The pass-band and stop-band frequencies are then used to determine the most appropriate FIR filter order. The most widely used existing methods for estimating the FIR filter order are not effective in suppressing the oscillations in the rate-of-heat-release (ROHR) trace, thus hindering the accuracy of combustion analyses. To address this problem, an innovative method for determining the order of an FIR filter is proposed in this study. This method is based on the minimization of the integral of normalized signal-to-noise differences between the stop-band frequency and the Nyquist frequency. Developed filters were validated using spectral analysis and calculation of the ROHR. The validation results showed that the filters designed using the proposed method were superior to those designed using existing methods for all analyzed cases. Highlights:
• Pressure traces of a diesel engine were processed by finite impulse response (FIR) filters with different orders
• Transition-band frequencies were determined with an innovative method based on discrete Fourier transform and short-time Fourier transform
• Spectral analyses showed deficiencies of existing methods in determining the FIR filter order
• A new method of determining the FIR filter order for processing pressure traces was
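
The paper's equiripple design and order-selection method are specific to that work; as a generic illustration of what low-pass FIR filtering of a noisy pressure-like trace looks like, here is a simple windowed-sinc sketch. The sampling rate, cutoff, tap count, and synthetic signal are all assumed values, and the windowed-sinc design is a textbook alternative, not the authors' method.

```python
import numpy as np

def lowpass_fir(numtaps, cutoff_hz, fs_hz):
    # Windowed-sinc low-pass FIR design (Hamming window)
    n = np.arange(numtaps) - (numtaps - 1) / 2.0
    fc = cutoff_hz / fs_hz
    h = 2 * fc * np.sinc(2 * fc * n)   # ideal low-pass impulse response
    h *= np.hamming(numtaps)            # taper to control ripple
    return h / h.sum()                  # unity gain at DC

fs = 90_000.0  # hypothetical sampling rate of the pressure acquisition
taps = lowpass_fir(101, 3_000.0, fs)

# Synthetic "pressure trace": slow combustion signal + fast oscillation
t = np.arange(0, 0.02, 1 / fs)
signal = np.sin(2 * np.pi * 100 * t) + 0.3 * np.sin(2 * np.pi * 20_000 * t)
filtered = np.convolve(signal, taps, mode="same")
```

The filtered trace retains the slow component while strongly attenuating the 20 kHz oscillation; the point of the paper is that choosing the cutoff and order badly leaves exactly such oscillations in the ROHR calculation.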

  19. Differentiation of Toxocara canis and Toxocara cati based on PCR-RFLP analyses of rDNA-ITS and mitochondrial cox1 and nad1 regions.

    Science.gov (United States)

    Mikaeili, Fattaneh; Mathis, Alexander; Deplazes, Peter; Mirhendi, Hossein; Barazesh, Afshin; Ebrahimi, Sepideh; Kia, Eshrat Beigom

    2017-09-26

The definitive genetic identification of Toxocara species is currently based on PCR/sequencing. The objectives of the present study were to design and evaluate an in silico polymerase chain reaction-restriction fragment length polymorphism (PCR-RFLP) method for identification of Toxocara species. In silico analyses using the DNASIS and NEBcutter software were performed with rDNA internal transcribed spacer (ITS), and mitochondrial cox1 and nad1, sequences obtained in our previous studies, along with relevant sequences deposited in GenBank. RFLP profiles were then designed, and all isolates of T. canis and T. cati collected from dogs and cats in different geographical areas of Iran were investigated with the RFLP method using some of the identified suitable enzymes. The in silico analyses predicted that, on the cox1 gene, only the MboII enzyme is appropriate for PCR-RFLP to reliably distinguish the two species. No suitable enzyme for PCR-RFLP on the nad1 gene was identified that yields the same pattern for all isolates of a species. DNASIS software showed that there are 241 suitable restriction enzymes for the differentiation of T. canis from T. cati based on ITS sequences. The RsaI, MvaI and SalI enzymes were selected to evaluate the reliability of the in silico PCR-RFLP. The sizes of restriction fragments obtained by PCR-RFLP of all samples consistently matched the expected RFLP patterns. The ITS sequences are usually conserved, and the PCR-RFLP approach targeting the ITS sequence is recommended for the molecular differentiation of Toxocara species; it can provide a reliable tool for identification purposes, particularly at the larval and egg stages.
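
At its core, in silico RFLP analysis amounts to locating enzyme recognition sites in a sequence and computing the resulting fragment lengths. The sketch below does this for the RsaI site (GT^AC, cutting after position 2); the amplicon is a hypothetical toy sequence, not a real Toxocara ITS sequence.

```python
def rflp_fragments(seq, site, cut_offset):
    """Simulate a restriction digest: return fragment lengths after
    cutting `seq` at every occurrence of `site`, `cut_offset` bases
    into the recognition site (RsaI: GT^AC -> site='GTAC', offset=2)."""
    cuts, start = [], 0
    while True:
        idx = seq.find(site, start)
        if idx == -1:
            break
        cuts.append(idx + cut_offset)
        start = idx + 1
    cuts = [0] + cuts + [len(seq)]
    return [b - a for a, b in zip(cuts, cuts[1:])]

# Hypothetical 40-bp amplicon containing two RsaI sites
amplicon = "A" * 5 + "GTAC" + "A" * 10 + "GTAC" + "A" * 17
print(rflp_fragments(amplicon, "GTAC", 2))
```

Species identification then reduces to comparing the predicted fragment-length pattern of each candidate species against the observed gel pattern.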

  20. Development of ITER 3D neutronics model and nuclear analyses

    International Nuclear Information System (INIS)

    Zeng, Q.; Zheng, S.; Lu, L.; Li, Y.; Ding, A.; Hu, H.; Wu, Y.

    2007-01-01

ITER nuclear analyses rely on calculations with three-dimensional (3D) Monte Carlo codes, e.g. the widely used MCNP. However, continuous changes in the design of the components require that the 3D neutronics model for nuclear analyses be updated, and modeling a complex geometry with MCNP by hand is a very time-consuming task. An efficient alternative is to develop CAD-based interface codes for automatic conversion from CAD models to MCNP input files. Based on the latest CAD model and the available interface codes, two approaches to updating the 3D neutronics model have been discussed by the ITER IT (International Team): the first is to start with the existing MCNP model 'Brand' and update it through a combination of direct modification of the MCNP input file and generation of models for some components directly from the CAD data; the second is to start from the full CAD model, make the necessary simplifications, and generate the MCNP model with one of the interface codes. MCAM, an advanced CAD-based MCNP interface code developed by the FDS Team in China, has been successfully applied to update the ITER 3D neutronics model following both approaches. The Brand model has been updated by generating portions of the geometry from the newest CAD model with MCAM. MCAM has also successfully converted to an MCNP neutronics model a full ITER CAD model which was simplified and issued by ITER IT to benchmark the above interface codes. Based on the two updated 3D neutronics models, the related nuclear analyses have been performed. This paper presents the status of ITER 3D modeling using MCAM and its nuclear analyses, as well as a brief introduction to an advanced version of MCAM. (authors)

  1. Holistic stakeholder-oriented and case study-based risk analysis

    Science.gov (United States)

    Heisterkamp, Tobias

    2013-04-01

Case studies of storm events in the Berlin conurbation demonstrate the potential of a holistic approach and its possible data sources. Data sets on population, but also data provided by insurance and transport companies, and operating data provided by fire brigades, are used. Various indicators for risk analysis are constructed to identify hot spots. These hot spots can be shortcomings or critical aspects in structure, communication, the warning chain, or even in the structure of potentially affected stakeholders or in the civil protection system itself. Due to the increasing complexity of interactions and interdependencies within and between societies and nature, it is important to choose a holistic approach. For risk analyses like those of the storms in Berlin, it captures many important factors with their effects. For risk analyses, it is also important to take potential users into consideration: the analysis gains its importance through its later use. In addition to a theoretical background, a focus on application should be set from the beginning. To obtain usable results, it is helpful to complement the theoretical meta-level with a stakeholder-oriented level. An iterative investigation and combination of different layers for the risk analysis explores important influencing factors and allows a tailoring of results to different stakeholder groups. Layers are indicators gained from data sets, such as losses from insurance data. Tailoring is important because of differing requirements, e.g. of technical or medical assistance. Stakeholders' feedback in the iterative investigation also shows structural limitations for later applications, like special laws the fire brigades have to deal with. Additionally, using actors' perspectives offers the chance to convince practitioners to take part in the analysis. Their participation is an essential component in applied science. They are important data suppliers, whose goodwill is needed to ensure good results.
Based on their experience, they can also help

  2. Seismic fragility analyses

    International Nuclear Information System (INIS)

    Kostov, Marin

    2000-01-01

In the last two decades an increasing number of probabilistic seismic risk assessments have been performed. The basic ideas of the procedure for performing a Probabilistic Safety Analysis (PSA) of critical structures (NUREG/CR-2300, 1983) can also be used for normal industrial and residential buildings, dams, and other structures. The general formulation of the risk assessment procedure applied in this investigation is presented in Franzini et al., 1984. The probability of failure of a structure for an expected lifetime (for example 50 years) can be obtained from the annual frequency of failure, β_E, determined by the relation: β_E = ∫ [dβ(x)/dx]·P(f|x) dx. Here β(x) is the annual frequency of exceedance of load level x (for example, the variable x may be peak ground acceleration), and P(f|x) is the conditional probability of structure failure at a given seismic load level x. The problem thus leads to the assessment of the seismic hazard β(x) and the fragility P(f|x). The seismic hazard curves are obtained by probabilistic seismic hazard analysis. The fragility curves are obtained after the response of the structure is defined probabilistically and its capacity and the associated uncertainties are assessed. Finally, the fragility curves are combined with the seismic loading to estimate the frequency of failure for each critical scenario. The frequency of failure due to a seismic event is represented by the scenario with the highest frequency. The tools usually applied for probabilistic safety analyses of critical structures can relatively easily be adapted to ordinary structures. The key problems are the seismic hazard definition and the fragility analyses. The fragility can be derived either on the basis of scaling procedures or by direct generation. Both approaches are presented in the paper. After the seismic risk (in terms of failure probability) is assessed, there are several approaches to risk reduction. Generally the methods could be classified in two groups. The
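
The failure-frequency relation β_E = ∫ [dβ(x)/dx]·P(f|x) dx can be evaluated numerically once hazard and fragility curves are specified. Both curves below (an exponential hazard and a lognormal fragility) are purely illustrative assumptions, not data from the paper; the sign flip in the integrand accounts for the hazard curve decreasing with x.

```python
import numpy as np
from math import erf

# Hypothetical hazard curve: annual frequency beta(x) of exceeding PGA x
x = np.linspace(0.05, 1.0, 400)      # peak ground acceleration (g)
beta = 1e-2 * np.exp(-4.0 * x)       # decreasing with load level

# Hypothetical lognormal fragility curve P(f|x)
median, beta_c = 0.5, 0.4
p_fail = np.array([0.5 * (1 + erf(np.log(xi / median) / (beta_c * 2 ** 0.5)))
                   for xi in x])

# beta_E = integral of [-d beta/dx] * P(f|x) dx  (trapezoidal rule)
integrand = -np.gradient(beta, x) * p_fail
beta_E = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(x))
print(f"annual failure frequency ~ {beta_E:.2e} per year")
```

Multiplying β_E by the exposure time (e.g. 50 years) then gives the approximate lifetime failure probability referred to in the abstract, provided β_E is small.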

  3. Spent fuel shipping costs for transportation logistics analyses

    International Nuclear Information System (INIS)

    Cole, B.M.; Cross, R.E.; Cashwell, J.W.

    1983-05-01

Logistics analyses supplied to the nuclear waste management programs of the U.S. Department of Energy through the Transportation Technology Center (TTC) at Sandia National Laboratories are used to predict nuclear waste material logistics, transportation packaging demands, shipping and receiving rates, and transportation-related costs for alternative strategies. This study is an in-depth analysis of the problems and contingencies associated with the costs of shipping irradiated reactor fuel. These costs are extremely variable, however, and have changed frequently (sometimes monthly) during the past few years due to changes in capital, fuel, and labor costs. All costs and charges reported in this study are based on January 1982 data using existing transport cask systems and should be used as relative indices only. Actual shipping costs would be negotiable for each origin-destination combination.

  4. Structural changes in Parkinson's disease. Voxel-based morphometry and diffusion tensor imaging analyses based on {sup 123}I-MIBG uptake

    Energy Technology Data Exchange (ETDEWEB)

    Kikuchi, Kazufumi; Hiwatashi, Akio; Togao, Osamu; Yamashita, Koji; Somehara, Ryo; Kamei, Ryotaro; Baba, Shingo; Honda, Hiroshi [Kyushu University, Department of Clinical Radiology, Graduate School of Medical Sciences, Fukuoka (Japan); Yamaguchi, Hiroo; Kira, Jun-ichi [Kyushu University, Department of Neurology, Graduate School of Medical Sciences, Fukuoka (Japan)

    2017-12-15

    Patients with Parkinson's disease (PD) may exhibit symptoms of sympathetic dysfunction that can be measured using {sup 123}I-metaiodobenzylguanidine (MIBG) myocardial scintigraphy. We investigated the relationship between microstructural brain changes and {sup 123}I-MIBG uptake in patients with PD using voxel-based morphometry (VBM) and diffusion tensor imaging (DTI) analyses. This retrospective study included 24 patients with PD who underwent 3 T magnetic resonance imaging and {sup 123}I-MIBG scintigraphy. They were divided into two groups: 12 MIBG-positive and 12 MIBG-negative cases (10 men and 14 women; age range: 60-81 years, corrected for gender and age). The heart/mediastinum count (H/M) ratio was calculated on anterior planar {sup 123}I-MIBG images obtained 4 h post-injection. VBM and DTI were performed to detect structural differences between these two groups. Patients with low H/M ratio had significantly reduced brain volume at the right inferior frontal gyrus (uncorrected p < 0.0001, K > 90). Patients with low H/M ratios also exhibited significantly lower fractional anisotropy than those with high H/M ratios (p < 0.05) at the left anterior thalamic radiation, the left inferior fronto-occipital fasciculus, the left superior longitudinal fasciculus, and the left uncinate fasciculus. VBM and DTI may reveal microstructural changes related to the degree of {sup 123}I-MIBG uptake in patients with PD. (orig.)

  5. Probabilistic and Nonprobabilistic Sensitivity Analyses of Uncertain Parameters

    Directory of Open Access Journals (Sweden)

    Sheng-En Fang

    2014-01-01

Parameter sensitivity analyses have been widely applied to industrial problems for evaluating parameter significance, effects on responses, uncertainty influence, and so forth. In the interest of simple implementation and computational efficiency, this study has developed two sensitivity analysis methods corresponding to situations with or without sufficient probability information. The probabilistic method is established with the aid of the stochastic response surface, and the mathematical derivation proves that the coefficients of the first-order terms embody the parameter main effects on the response. Simultaneously, a nonprobabilistic interval-analysis-based method is put forward for circumstances in which the parameter probability distributions are unknown. The two methods have been verified against a numerical beam example, with their accuracy compared to that of a traditional variance-based method. The analysis results demonstrate the reliability and accuracy of the developed methods. Their suitability for different situations is also discussed.
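
The probabilistic method's key observation, that the first-order coefficients of a fitted response surface embody the parameter main effects, can be sketched with an ordinary least-squares fit. The model below is a made-up two-parameter example (not the paper's beam or its stochastic response surface machinery): the dominant parameter shows up directly as the larger first-order coefficient.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical response: strong dependence on x1, weak on x2,
# plus a small interaction the first-order surface ignores.
def response(x1, x2):
    return 5.0 * x1 + 0.5 * x2 + 0.1 * x1 * x2

n = 500
X = rng.normal(size=(n, 2))          # sampled parameter values
y = response(X[:, 0], X[:, 1])

# Least-squares fit of a first-order response surface y ~ b0 + b1*x1 + b2*x2
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
b0, b1, b2 = coef
print(f"main effects: b1 = {b1:.2f}, b2 = {b2:.2f}")
```

Reading |b1| >> |b2| flags x1 as the significant parameter, which is the screening use case the abstract describes.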

  6. Hedysarum L. (Fabaceae: Hedysareae) Is Not Monophyletic – Evidence from Phylogenetic Analyses Based on Five Nuclear and Five Plastid Sequences

    Science.gov (United States)

    Liu, Pei-Liang; Wen, Jun; Duan, Lei; Arslan, Emine; Ertuğrul, Kuddisi; Chang, Zhao-Yang

    2017-01-01

    The legume family (Fabaceae) exhibits a high level of species diversity and evolutionary success worldwide. Previous phylogenetic studies of the genus Hedysarum L. (Fabaceae: Hedysareae) showed that the nuclear and the plastid topologies might be incongruent, and the systematic position of the Hedysarum sect. Stracheya clade was uncertain. In this study, phylogenetic relationships of Hedysarum were investigated based on the nuclear ITS, ETS, PGDH, SQD1, TRPT and the plastid psbA-trnH, trnC-petN, trnL-trnF, trnS-trnG, petN-psbM sequences. Both nuclear and plastid data support two major lineages in Hedysarum: the Hedysarum s.s. clade and the Sartoria clade. In the nuclear tree, Hedysarum is biphyletic with the Hedysarum s.s. clade sister to the Corethrodendron + Eversmannia + Greuteria + Onobrychis clade (the CEGO clade), whereas the Sartoria clade is sister to the genus Taverniera DC. In the plastid tree, Hedysarum is monophyletic and sister to Taverniera. The incongruent position of the Hedysarum s.s. clade between the nuclear and plastid trees may be best explained by a chloroplast capture hypothesis via introgression. The Hedysarum sect. Stracheya clade is resolved as sister to the H. sect. Hedysarum clade in both nuclear and plastid trees, and our analyses support merging Stracheya into Hedysarum. Based on our new evidence from multiple sequences, Hedysarum is not monophyletic, and its generic delimitation needs to be reconsidered. PMID:28122062

  7. Distribution of Prochlorococcus Ecotypes in the Red Sea Basin Based on Analyses of rpoC1 Sequences

    KAUST Repository

    Shibl, Ahmed A.; Haroon, Mohamed; Ngugi, David; Thompson, Luke R.; Stingl, Ulrich

    2016-01-01

    The marine picocyanobacteria Prochlorococcus represent a significant fraction of the global pelagic bacterioplankton community. Specifically, in the surface waters of the Red Sea, they account for around 91% of the phylum Cyanobacteria. Previous work suggested a widespread presence of high-light (HL)-adapted ecotypes in the Red Sea with the occurrence of low-light (LL)-adapted ecotypes at intermediate depths in the water column. To obtain a more comprehensive dataset over a wider biogeographical scope, we used a 454-pyrosequencing approach to analyze the diversity of the Prochlorococcus rpoC1 gene from a total of 113 samples at various depths (up to 500 m) from 45 stations spanning the Red Sea basin from north to south. In addition, we analyzed 45 metagenomes from eight stations using hidden Markov models based on a set of reference Prochlorococcus genomes to (1) estimate the relative abundance of Prochlorococcus based on 16S rRNA gene sequences, and (2) identify and classify rpoC1 sequences as an assessment of the community structure of Prochlorococcus in the northern, central and southern regions of the basin without amplification bias. Analyses of metagenomic data indicated that Prochlorococcus occurs at a relative abundance of around 9% in samples from surface waters (25, 50, 75 m), 3% in intermediate waters (100 m) and around 0.5% in deep-water samples (200–500 m). Results based on rpoC1 sequences using both methods showed that HL II cells dominate surface waters and were also present in deep-water samples. Prochlorococcus communities in intermediate waters (100 m) showed a higher diversity and co-occurrence of low-light and high-light ecotypes. Prochlorococcus communities at each depth range (surface, intermediate, deep sea) did not change significantly over the sampled transects spanning most of the Saudi waters in the Red Sea. Statistical analyses of rpoC1 sequences from metagenomes indicated that the vertical distribution of Prochlorococcus in the water

  9. Study on the tritium behaviors in the VHTR system. Part 2: Analyses on the tritium behaviors in the VHTR/HTSE system

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Eung S. [Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3885 (United States); Oh, Chang H., E-mail: Chang.Oh@inl.go [Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3885 (United States); Patterson, Mike [Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3885 (United States)

    2010-07-15

    Tritium behaviors in the very high temperature gas reactor (VHTR)/high temperature steam electrolysis (HTSE) system have been analyzed with TPAC, a code developed by Idaho National Laboratory (INL). The reference system design and conditions were based on the indirect parallel configuration between a VHTR and an HTSE plant. The analyses were based on the Sobol method, a modern uncertainty and sensitivity analysis method combining variance decomposition with Monte Carlo sampling. A total of 14 parameters were taken into account, associated with tritium sources, heat exchangers, purification systems, and temperatures. Two sensitivity indices (the first-order index and the total index) were considered, and a total of 15,360 samples were used for solution convergence. As a result, the important parameters that affect tritium concentration in the hydrogen product have been identified, quantified, and ranked. Several guidelines and recommendations for reducing modeling uncertainties are also provided throughout the discussion, along with some useful ideas for mitigating tritium contamination of the hydrogen product.
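
    The first-order and total sensitivity indices mentioned above can be illustrated with a generic Saltelli-style Monte Carlo estimator. This is a minimal sketch on a toy additive model, not the TPAC tritium model; the function name, sample size, and test model are illustrative assumptions.

```python
import numpy as np

def sobol_indices(model, n_params, n_samples=8192, seed=0):
    """Estimate first-order (S_i) and total (S_Ti) Sobol indices
    with the Saltelli/Jansen Monte Carlo estimators."""
    rng = np.random.default_rng(seed)
    A = rng.random((n_samples, n_params))   # two independent sample matrices
    B = rng.random((n_samples, n_params))
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]), ddof=1)
    S, ST = [], []
    for i in range(n_params):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                 # swap column i with samples from B
        fABi = model(ABi)
        S.append(np.mean(fB * (fABi - fA)) / var)          # first-order index
        ST.append(0.5 * np.mean((fA - fABi) ** 2) / var)   # total index (Jansen)
    return np.array(S), np.array(ST)

# Illustrative additive model: y = x0 + 0.5*x1 (x2 inactive)
S, ST = sobol_indices(lambda X: X[:, 0] + 0.5 * X[:, 1], 3)
```

    For this toy model with inputs uniform on [0, 1], the analytic first-order indices are 0.8, 0.2 and 0, which the estimator reproduces to within Monte Carlo error.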

  10. Applicability study of deuterium excess in bottled water life cycle analyses

    Directory of Open Access Journals (Sweden)

    Mihael Brenčič

    2014-12-01

    The paper explores the possible use of d-excess in the investigation of bottled water. Based on the data set from Brenčič and Vreča (2006), "Identification of sources and production processes of bottled waters by stable hydrogen and oxygen isotope ratios", d-excess values were statistically analysed and compared among different bottled water groups and different bottlers. The bottled water life cycle in relation to d-excess values was also described theoretically. Descriptive statistics and one-way ANOVA showed no significant differences among the groups, although differences were detected in the shape of the empirical distributions: still and flavoured waters have similar shapes, while sparkling waters differ from the others. Two distinctive groups of bottlers could be discerned. The first is represented by bottlers with a wide range of d-excess values (from 7.7 ‰ to 18.6 ‰, with an average of 12.0 ‰), exploiting waters that originate from aquifers rich in highly mineralised groundwater and with relatively high concentrations of CO2 gas. The second is represented by bottlers using groundwater from relatively shallow aquifers; their d-excess values are similar to those of the local precipitation (from 7.8 ‰ to 14.3 ‰, with an average of 10.3 ‰). More frequent sampling and better knowledge of the production phases are needed to improve the use of the isotope fingerprint for authentication of bottled waters.
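
    Deuterium excess itself is a simple linear combination of the two stable-isotope ratios (Dansgaard's definition, d = δ2H − 8·δ18O, in per mil). A minimal sketch with hypothetical sample values, not data from the study:

```python
def d_excess(delta_2h, delta_18o):
    """Deuterium excess (Dansgaard, 1964): d = delta2H - 8 * delta18O (per mil)."""
    return delta_2h - 8.0 * delta_18o

# Hypothetical bottled-water samples: (delta2H, delta18O) in per mil
samples = [(-62.0, -9.2), (-58.5, -8.8), (-70.1, -10.4)]
d_values = [d_excess(dh, do) for dh, do in samples]
```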

  11. Sediment Characteristics of Mergui Basin, Andaman Sea based on Multi-proxy Analyses

    Directory of Open Access Journals (Sweden)

    Rina Zuraida

    2018-02-01

    Full Text Available This paper presents the characteristics of sediment from core BS-36 (6°55.85’ S and 96°7.48’ E, 1147.1 m water depth that was acquired in the Mergui Basin, Andaman Sea. The analyses involved megascopic description, core scanning by multi-sensor core logger, and carbonate content measurement. The purpose of this study is to determine the physical and chemical characteristics of sediment to infer the depositional environment. The results show that this core can be divided into 5 lithologic units that represent various environmental conditions. The sedimentation of the bottom part, Units V and IV were inferred to be deposited in suboxic to anoxic bottom condition combined with high productivity and low precipitation. Unit III was deposited during high precipitation and oxic condition due to ocean ventilation. In the upper part, Units II and I occurred during higher precipitation, higher carbonate production and suboxic to anoxic condition. Keywords: sediment characteristics, Mergui Basin, Andaman Sea, suboxic, anoxic, oxic, carbonate content

  12. Sensitivity and uncertainty analyses for performance assessment modeling

    International Nuclear Information System (INIS)

    Doctor, P.G.

    1988-08-01

    Sensitivity and uncertainty analyses methods for computer models are being applied in performance assessment modeling in the geologic high level radioactive waste repository program. The models used in performance assessment tend to be complex physical/chemical models with large numbers of input variables. There are two basic approaches to sensitivity and uncertainty analyses: deterministic and statistical. The deterministic approach to sensitivity analysis involves numerical calculation or employs the adjoint form of a partial differential equation to compute partial derivatives; the uncertainty analysis is based on Taylor series expansions of the input variables propagated through the model to compute means and variances of the output variable. The statistical approach to sensitivity analysis involves a response surface approximation to the model with the sensitivity coefficients calculated from the response surface parameters; the uncertainty analysis is based on simulation. The methods each have strengths and weaknesses. 44 refs
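
    The deterministic approach described above, Taylor-series propagation through partial derivatives, can be sketched generically: estimate the gradient by finite differences and propagate input variances. This is an illustrative sketch for independent inputs, not code from the repository program.

```python
import numpy as np

def fosm_propagate(f, x0, sigmas, h=1e-6):
    """First-order Taylor-series propagation for independent inputs:
    mean(Y) ~ f(x0), Var(Y) ~ sum_i (df/dx_i)^2 * sigma_i^2.
    Partial derivatives are approximated by central differences."""
    x0 = np.asarray(x0, dtype=float)
    grads = np.empty_like(x0)
    for i in range(x0.size):
        xp, xm = x0.copy(), x0.copy()
        xp[i] += h
        xm[i] -= h
        grads[i] = (f(xp) - f(xm)) / (2 * h)
    mean = f(x0)
    var = np.sum((grads * np.asarray(sigmas)) ** 2)
    return mean, var

# Illustrative model: y = 3*x0 + x1**2 around x0 = (1, 2)
mean, var = fosm_propagate(lambda x: 3 * x[0] + x[1] ** 2, [1.0, 2.0], [0.1, 0.05])
```

    For this model the gradient at (1, 2) is (3, 4), so the propagated variance is 9(0.1)^2 + 16(0.05)^2 = 0.13.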

  13. Assessment of the Turkish utility sector through energy and exergy analyses

    International Nuclear Information System (INIS)

    Utlu, Zafer; Hepbasli, Arif

    2007-01-01

    The present study evaluates the utility sector from energetic and exergetic perspectives. In this regard, energy and exergy utilization efficiencies in the Turkish utility sector are assessed over the period from 1990 to 2004. Energy and exergy analyses are performed for eight power plant modes, based on actual data over the period studied. Sectoral energy and exergy analyses are conducted to study the variation of energy and exergy efficiencies for each power plant throughout the years, and overall energy and exergy efficiencies are compared across these power plants. The energy utilization efficiencies for the overall Turkish utility sector range from 32.64% to 45.69%, while the exergy utilization efficiencies vary from 32.20% to 46.81% over the analyzed years. The exergetic improvement potential for this sector is determined to be 332 PJ in 2004. It may be concluded that the methodology used in this study is practical and useful for analyzing sectoral and subsectoral energy and exergy utilization to determine how efficiently energy and exergy are used in the sector studied. It is also expected that the results of this study will be helpful in developing highly applicable and productive planning for energy policies.
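
    The efficiency and improvement-potential figures quoted above follow from two standard definitions: exergy efficiency as useful exergy output over exergy input, and van Gool's exergetic improvement potential, IP = (1 − ε)(Ex_in − Ex_out). A sketch with hypothetical sector-level numbers, not the study's underlying data:

```python
def exergy_efficiency(ex_out, ex_in):
    """Exergy (second-law) efficiency: useful exergy output over exergy input."""
    return ex_out / ex_in

def improvement_potential(ex_in, ex_out):
    """Van Gool's exergetic improvement potential:
    IP = (1 - eps) * (Ex_in - Ex_out)."""
    eps = exergy_efficiency(ex_out, ex_in)
    return (1.0 - eps) * (ex_in - ex_out)

# Hypothetical sector: 1000 PJ exergy input, 420 PJ useful exergy output
eps = exergy_efficiency(420.0, 1000.0)
ip = improvement_potential(1000.0, 420.0)   # (1 - 0.42) * 580 PJ
```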

  14. Systems reliability analyses and risk analyses for the licencing procedure under atomic law

    International Nuclear Information System (INIS)

    Berning, A.; Spindler, H.

    1983-01-01

    For the licencing procedure under atomic law in accordance with Article 7 AtG, the nuclear power plant needs to be assessed as a whole, and the reliability of systems and plant components that are essential to safety must be determined with probabilistic methods. This requirement follows from the safety criteria for nuclear power plants issued by the Home Department (BMI). Systems reliability studies and risk analyses used in licencing procedures under atomic law are identified. The emphasis is on licencing decisions, mainly for PWR-type reactors. Reactor Safety Commission (RSK) guidelines, examples of reasoning in legal proceedings, and arguments put forth by objectors are also dealt with. Correlations between reliability analyses made by experts and licencing decisions are shown by means of examples. (orig./HP)

  15. ENERGY AND ENTROPY ANALYSES OF AN EXPERIMENTAL TURBOJET ENGINE FOR TARGET DRONE APPLICATION

    Directory of Open Access Journals (Sweden)

    Onder Turan

    2016-12-01

    This study presents energy and entropy analyses of an experimental turbojet engine built in the Anadolu University Faculty of Aeronautics and Astronautics Test-Cell Laboratory. The laws of motion and the Brayton thermodynamic cycle model are used for this purpose. The processes (that is, compression, combustion, and expansion) are simulated in P-v, T-s and h-s diagrams. Furthermore, the second law of thermodynamics is applied to the cycle model to perform the entropy analysis. The distribution of wasted and thrust power, the overall (energy-based) first-law efficiency, and the specific fuel consumption and specific thrust of the engine were also calculated during the analyses. The results of the study further show the change in entropy in the engine components due to irreversibilities and inefficiencies. In conclusion, this study is expected to be useful for future design and research work on similar aircraft turbojets, auxiliary power units, and target drone power systems.
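
    The air-standard Brayton cycle underlying the analysis has a closed-form thermal efficiency, η = 1 − r_p^(−(γ−1)/γ), depending only on the compressor pressure ratio and the specific-heat ratio. A sketch; the pressure ratio of 8 is an illustrative assumption, not the test engine's value:

```python
def brayton_efficiency(pressure_ratio, gamma=1.4):
    """Ideal (air-standard) Brayton cycle thermal efficiency:
    eta = 1 - r_p ** (-(gamma - 1) / gamma)."""
    return 1.0 - pressure_ratio ** (-(gamma - 1.0) / gamma)

eta = brayton_efficiency(8.0)   # illustrative compressor pressure ratio
```

    Efficiency rises monotonically with pressure ratio in the ideal cycle; real engines fall short because of component irreversibilities, which is what the entropy analysis above quantifies.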

  16. Averaging Gone Wrong: Using Time-Aware Analyses to Better Understand Behavior

    OpenAIRE

    Barbosa, Samuel; Cosley, Dan; Sharma, Amit; Cesar-Jr, Roberto M.

    2016-01-01

    Online communities provide a fertile ground for analyzing people's behavior and improving our understanding of social processes. Because both people and communities change over time, we argue that analyses of these communities that take time into account will lead to deeper and more accurate results. Using Reddit as an example, we study the evolution of users based on comment and submission data from 2007 to 2014. Even using one of the simplest temporal differences between users---yearly coho...

  17. Fetal growth and risk of stillbirth: a population-based case-control study.

    Science.gov (United States)

    Bukowski, Radek; Hansen, Nellie I; Willinger, Marian; Reddy, Uma M; Parker, Corette B; Pinar, Halit; Silver, Robert M; Dudley, Donald J; Stoll, Barbara J; Saade, George R; Koch, Matthew A; Rowland Hogue, Carol J; Varner, Michael W; Conway, Deborah L; Coustan, Donald; Goldenberg, Robert L

    2014-04-01

    Stillbirth is strongly related to impaired fetal growth. However, the relationship between fetal growth and stillbirth is difficult to determine because of uncertainty in the timing of death and confounding characteristics affecting normal fetal growth. We conducted a population-based case-control study of all stillbirths and a representative sample of live births in 59 hospitals in five geographic areas in the US. Fetal growth abnormalities were categorized as small for gestational age (SGA; <10th percentile) or large for gestational age (LGA; >90th percentile) at death (stillbirth) or delivery (live birth) using population, ultrasound, and individualized norms. Gestational age at death was determined using an algorithm that considered the time-of-death interval, postmortem examination, and reliability of the gestational age estimate. Data were weighted to account for the sampling design and differential participation rates in various subgroups. Among 527 singleton stillbirths and 1,821 singleton live births studied, stillbirth was associated with SGA based on population, ultrasound, and individualized norms (odds ratio [OR] [95% CI]: 3.0 [2.2 to 4.0]; 4.7 [3.7 to 5.9]; 4.6 [3.6 to 5.9], respectively). LGA was also associated with increased risk of stillbirth using ultrasound and individualized norms (OR [95% CI]: 3.5 [2.4 to 5.0]; 2.3 [1.7 to 3.1], respectively), but not population norms (OR [95% CI]: 0.6 [0.4 to 1.0]). The associations were stronger with more severe SGA and LGA (<5th and >95th percentiles). Analyses adjusted for stillbirth risk factors, subset analyses excluding potential confounders, and analyses in preterm and term pregnancies showed similar patterns of association. In this study 70% of cases and 63% of controls agreed to participate. Analysis weights accounted for differences between consenting and non-consenting women. Some of the characteristics used for individualized fetal growth estimates were missing and were replaced with reference values. 
However, a sensitivity analysis using individualized norms
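
    The odds ratios and confidence intervals reported above come from standard 2×2-table case-control arithmetic (Woolf's logit method for the interval). The sketch below uses hypothetical counts chosen for illustration, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and approximate 95% CI from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls (Woolf's method)."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: SGA among stillbirths (cases) vs live births (controls)
or_, lo, hi = odds_ratio_ci(120, 407, 180, 1641)
```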

  18. Ancient DNA analyses of museum specimens from selected Presbytis (primate: Colobinae) based on partial Cyt b sequences

    Science.gov (United States)

    Aifat, N. R.; Yaakop, S.; Md-Zain, B. M.

    2016-11-01

    The IUCN Red List of Threatened Species categorizes Malaysian primates from data deficient to critically endangered. Ancient DNA analyses therefore hold great potential for understanding the phylogeny, phylogeography and population history of extinct and extant species. Museum specimens are an important alternative source of biological material for a large proportion of ancient DNA studies. In this study, a total of six museum skin samples from the species Presbytis hosei (4 samples) and Presbytis frontata (2 samples), aged between 43 and 124 years, were processed for DNA extraction. Extraction was performed using the QIAGEN QIAamp DNA Investigator Kit, and the kit's ability to handle museum skin samples was tested by amplification of a partial Cyt b sequence using species-specific primers. Two primer pairs were designed, one each for P. hosei and P. frontata. These primer pairs proved efficient in amplifying 200 bp of the targeted species under the optimized PCR conditions. The sequences were then used to determine genetic distances within the genus Presbytis in Malaysia. From the analyses, P. hosei is closely related to P. chrysomelas and P. frontata, with distance values of 0.095 and 0.106, respectively. Cyt b gave clear data for determining relationships among the Bornean species. Thus, under the optimized conditions, museum specimens can be used for molecular systematic studies of the Malaysian primates.
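
    Genetic distances like the 0.095 and 0.106 values reported above are, in their simplest (uncorrected) form, the proportion of differing sites between aligned sequences. A sketch with toy sequences, not real Cyt b data:

```python
def p_distance(seq1, seq2):
    """Uncorrected pairwise genetic distance: the proportion of
    differing sites between two aligned sequences of equal length."""
    if len(seq1) != len(seq2):
        raise ValueError("sequences must be aligned to equal length")
    diffs = sum(1 for a, b in zip(seq1, seq2) if a != b)
    return diffs / len(seq1)

# Toy aligned 20-bp fragments (hypothetical, differing at 2 sites)
d = p_distance("ATGCCTTACGGATTAGCCAT",
               "ATGCCTGACGGATTAACCAT")
```

    Model-based corrections (e.g. Kimura 2-parameter) adjust this raw proportion for multiple substitutions at the same site.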

  19. Localisation of nursery areas based on comparative analyses of the horizontal and vertical distribution patterns of juvenile Baltic cod (Gadus morhua)

    DEFF Research Database (Denmark)

    Nielsen, J. Rasmus; Lundgren, Bo; Kristensen, Kasper

    2013-01-01

    Baltic cod are determined, and their nursery areas are localised according to the environmental factors affecting them. Comparative statistical analyses of biological, hydrographic and hydroacoustic data are carried out based on standard ICES demersal trawl surveys and special integrated trawl...... and acoustic research surveys. Horizontal distribution maps for the 2001–2010 cohorts of juvenile cod are further generated by applying a statistical log-Gaussian Cox process model to the standard trawl survey data. The analyses indicate size-dependent horizontal and distinct vertical and diurnal distribution...... in deep sea localities down to a 100 m depth and at oxygen concentrations between 2–4 ml O2.l−1. The vertical, diurnally stratified and repeated trawling and hydroacoustic target strength-depth distributions obtained from the special surveys show juvenile cod concentrations in frontal zone water layers...

  20. Using FOSM-Based Data Worth Analyses to Design Geophysical Surveys to Reduce Uncertainty in a Regional Groundwater Model Update

    Science.gov (United States)

    Smith, B. D.; White, J.; Kress, W. H.; Clark, B. R.; Barlow, J.

    2016-12-01

    Hydrogeophysical surveys have become an integral part of understanding the hydrogeological frameworks used in groundwater models. Regional models cover large areas where water-well data are, at best, scattered and irregular. Since budgets are finite, priorities must be assigned to select optimal areas for geophysical surveys. For airborne electromagnetic (AEM) geophysical surveys, optimization of mapping depth and line spacing needs to take into account the objectives of the groundwater model. The approach discussed here uses first-order, second-moment (FOSM) uncertainty analyses, which assume an approximately linear relation between model parameters and observations. This assumption allows FOSM analyses to estimate the value of increased parameter knowledge in reducing forecast uncertainty, and hence to optimize yet-to-be-completed geophysical surveying. The main objective of the geophysical surveying is assumed to be estimating values and spatial variation of hydrologic parameters (i.e., hydraulic conductivity) as well as mapping lower-permeability layers that influence the spatial distribution of recharge flux. The proposed data worth analysis was applied to the Mississippi Embayment Regional Aquifer Study (MERAS), which is being updated. The objective of MERAS is to assess the groundwater availability (status and trends) of the Mississippi embayment aquifer system. The study area covers portions of eight states including Alabama, Arkansas, Illinois, Kentucky, Louisiana, Mississippi, Missouri, and Tennessee. The active model grid covers approximately 70,000 square miles, and incorporates some 6,000 miles of major rivers and over 100,000 water wells. In the FOSM analysis, a dense network of pilot points was used to capture uncertainty in hydraulic conductivity and recharge. 
To simulate the effect of AEM flight lines, the prior uncertainty for hydraulic conductivity and recharge pilots along potential flight lines was
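
    The FOSM data worth calculation described above rests on linear-Bayes algebra: under linearity a forecast's variance is y^T C_p y, and adding observations shrinks the parameter covariance C_p by the usual Schur-complement update. A toy sketch (three parameters, one hypothetical AEM-informed observation; not the MERAS model or its PEST-style tooling):

```python
import numpy as np

def posterior_cov(Cp, X, Ce):
    """Linear-Bayes (FOSM) posterior parameter covariance:
    Cp' = Cp - Cp X^T (X Cp X^T + Ce)^-1 X Cp."""
    G = X @ Cp @ X.T + Ce
    return Cp - Cp @ X.T @ np.linalg.inv(G) @ X @ Cp

def forecast_variance(Cp, y):
    """Forecast variance under the linearity assumption: y^T Cp y."""
    return float(y @ Cp @ y)

# Toy problem: 3 parameters with unit prior variance,
# forecast sensitive to parameters 0 and 1
Cp = np.eye(3)
y = np.array([1.0, 1.0, 0.0])
prior_var = forecast_variance(Cp, y)

# Data worth of one hypothetical observation informing parameter 0
X = np.array([[1.0, 0.0, 0.0]])   # observation sensitivity (Jacobian row)
Ce = np.array([[0.1]])            # observation noise variance
post_var = forecast_variance(posterior_cov(Cp, X, Ce), y)
worth = prior_var - post_var      # forecast-variance reduction
```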

  1. The Evaluation of Bivariate Mixed Models in Meta-analyses of Diagnostic Accuracy Studies with SAS, Stata and R.

    Science.gov (United States)

    Vogelgesang, Felicitas; Schlattmann, Peter; Dewey, Marc

    2018-05-01

    Meta-analyses require a thoroughly planned procedure to obtain unbiased overall estimates. From a statistical point of view, not only model selection but also the model's implementation in software affects the results. The present simulation study investigates the accuracy of different implementations of general and generalized bivariate mixed models in SAS (using proc mixed, proc glimmix and proc nlmixed), Stata (using gllamm, xtmelogit and midas) and R (using reitsma from package mada and glmer from package lme4). Both models incorporate the relationship between sensitivity and specificity (the two outcomes of interest in meta-analyses of diagnostic accuracy studies) utilizing random effects. Model performance is compared in nine meta-analytic scenarios reflecting the combination of three meta-analysis sizes (89, 30 and 10 studies) with three pairs of sensitivity/specificity values (97%/87%; 85%/75%; 90%/93%). The evaluation of accuracy in terms of bias, standard error and mean squared error reveals that all implementations of the generalized bivariate model calculate sensitivity and specificity estimates with deviations of less than two percentage points. proc mixed, which together with reitsma implements the general bivariate mixed model proposed by Reitsma et al., rather shows convergence problems. The random-effect parameters are in general underestimated. This study shows that flexibility and simplicity of model specification, together with convergence robustness, should influence implementation recommendations, as the accuracy in terms of bias was acceptable in all implementations using the generalized approach.
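
    For intuition about what all these implementations estimate, sensitivities (or specificities) can be pooled on the logit scale. The sketch below is a deliberately simplified fixed-effect stand-in for the bivariate random-effects models compared in the study, and the per-study values are hypothetical:

```python
import math

def logit(p):
    return math.log(p / (1 - p))

def inv_logit(x):
    return 1 / (1 + math.exp(-x))

def pool_logit(props, ns):
    """Fixed-effect inverse-variance pooling on the logit scale.
    A simplified stand-in: the study's models are bivariate and
    include random effects, which this sketch omits."""
    num = den = 0.0
    for p, n in zip(props, ns):
        var = 1.0 / (n * p * (1 - p))   # delta-method variance of logit(p)
        num += logit(p) / var
        den += 1.0 / var
    return inv_logit(num / den)

# Hypothetical per-study sensitivities and sample sizes
pooled_sens = pool_logit([0.95, 0.90, 0.88], [100, 80, 120])
```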

  2. University Students' Knowledge Structures and Informal Reasoning on the Use of Genetically Modified Foods: Multidimensional Analyses

    Science.gov (United States)

    Wu, Ying-Tien

    2013-10-01

    This study aims to provide insights into the role of learners' knowledge structures about a socio-scientific issue (SSI) in their informal reasoning on the issue. The knowledge structures and informal reasoning of 42 non-science-major university students were assessed with multidimensional analyses. Using both qualitative and quantitative analyses, this study revealed that students with more extended and better-organized knowledge structures, as well as those who more frequently used higher-order information processing modes, were more oriented towards achieving higher-level informal reasoning quality. The regression analyses further showed that the "richness" of the students' knowledge structures explained 25% of the variation in their rebuttal construction, an important indicator of reasoning quality, indicating the significant role of a sophisticated knowledge structure in SSI reasoning. This study also provides initial evidence for the significant role of the "core" concept within one's knowledge structure in one's SSI reasoning. The findings suggest that, in SSI-based instruction, science instructors should try to identify students' core concepts within their prior knowledge regarding the SSI, and then guide students to construct and structure relevant concepts or ideas based on those core concepts. Students could thus develop extended and well-organized knowledge structures, which would help them achieve better learning transfer in dealing with SSIs.

  3. Formalisation des bases méthodologiques et conceptuelles d'une analyse spatiale des accidents de la route

    Directory of Open Access Journals (Sweden)

    Florence Huguenin Richard

    1999-06-01

    This article lays out the methodological and conceptual foundations of a spatial analysis of road-accident risk. The study of this phenomenon requires a large volume of data describing the different dimensions of an accident, which can be managed in a geographic information system. It also calls for methodological reflection on risk mapping, scales of observation, the aggregation of qualitative and quantitative data, the use of statistical methods suited to road risk, and the integration of space as a factor of insecurity.

  4. Pathways from fertility history to later life health: Results from analyses of the English Longitudinal Study of Ageing

    Directory of Open Access Journals (Sweden)

    Emily Grundy

    2015-01-01

    Background: Previous research shows associations between fertility histories and later-life health. The childless, those with large families, and those with a young age at entry to parenthood generally have higher mortality and worse health than parents of two or three children. These associations are hypothesised to reflect a range of biosocial influences, but the underlying mechanisms are poorly understood. Objective: To identify pathways from fertility histories to later-life health by examining mediation through health-related behaviours, social support and strain, and wealth; additionally, to examine mediation through allostatic load, an indicator of multisystem physical dysregulation hypothesised to be an outcome of chronic stress. Methods: Associations between fertility histories, mediators, and outcomes were analysed using path models. Data were drawn from the English Longitudinal Study of Ageing. The outcomes studied were a measure of allostatic load based on 9 biomarkers and self-reported long-term illness which limited activities. Results: Early parenthood (…) Conclusions: In England, early parenthood and larger family size are associated with less wealth and poorer health behaviours, and this accounts for much of the association with health. At least part of this association operates through stress-related physiological dysfunction (allostatic load).

  5. Mechanical analyses on the digital behaviour of the Tokay gecko (Gekko gecko) based on a multi-level directional adhesion model

    OpenAIRE

    Wu, Xuan; Wang, Xiaojie; Mei, Tao; Sun, Shaoming

    2015-01-01

    This paper proposes a multi-level hierarchical model for the Tokay gecko (Gekko gecko) adhesive system and analyses the digital behaviour of G. gecko at the macro/meso scale. The model describes the structures of G. gecko's adhesive system from the nano-level spatulae to the sub-millimetre-level lamella. The G. gecko seta is modelled as an inextensible fibril based on Euler's elastica theorem. Considering the side contact of the spatular pads of the seta on a flat and rigid subst...

  6. Use of flow models to analyse loss of coolant accidents

    International Nuclear Information System (INIS)

    Pinet, Bernard

    1978-01-01

    This article summarises current work on developing the use of flow models to analyse loss-of-coolant accidents in pressurized-water plants. This work is being done jointly, in the context of the LOCA Technical Committee, by the CEA, EDF and FRAMATOME. The construction of the flow model is closely based on theoretical studies of the two-fluid model. The laws of transfer at the interface and at the wall are tested experimentally. The representativity of the model then has to be checked in experiments involving several elementary physical phenomena.

  7. Ecology of Subglacial Lake Vostok (Antarctica, Based on Metagenomic/Metatranscriptomic Analyses of Accretion Ice

    Directory of Open Access Journals (Sweden)

    Tom D'Elia

    2013-03-01

    Lake Vostok is the largest of the nearly 400 subglacial Antarctic lakes and has been continuously buried by glacial ice for 15 million years. Extreme cold, heat (from possible hydrothermal activity), pressure (from the overriding glacier) and dissolved oxygen (delivered by melting meteoric ice), in addition to limited nutrients and complete darkness, combine to produce one of the most extreme environments on Earth. Metagenomic/metatranscriptomic analyses of ice that accreted over a shallow embayment and over the southern main lake basin indicate the presence of thousands of species of organisms (94% Bacteria, 6% Eukarya, and two Archaea). The predominant bacterial sequences were closest to those from species of Firmicutes, Proteobacteria and Actinobacteria, while the predominant eukaryotic sequences were most similar to those from species of ascomycetous and basidiomycetous Fungi. Based on the sequence data, the lake appears to contain a mixture of autotrophs and heterotrophs capable of performing nitrogen fixation, nitrogen cycling, carbon fixation and nutrient recycling. Sequences closest to those of psychrophiles and thermophiles indicate a cold lake with possible hydrothermal activity. Sequences most similar to those from marine and aquatic species suggest the presence of marine and freshwater regions.

  8. A new internet-based tool for reporting and analysing patient-reported outcomes and the feasibility of repeated data collection from patients with myeloproliferative neoplasms

    DEFF Research Database (Denmark)

    Brochmann, Nana; Zwisler, Ann-Dorthe; Kjerholt, Mette

    2016-01-01

    PURPOSE: An Internet-based tool for reporting and analysing patient-reported outcomes (PROs) has been developed. The tool enables merging PROs with blood test results and allows for computation of treatment responses. Data may be visualized by graphical analysis and may be exported for downstream...

  9. Development of generic soil profiles and soil data development for SSI analyses

    Energy Technology Data Exchange (ETDEWEB)

    Parker, Josh, E-mail: jparker@nuscalepower.com [NuScale Power, 1000 NE Circle Boulevard, Suite 10310, Corvallis, OR 97330 (United States); Khan, Mohsin; Rajagopal, Raj [ARES Corporation, 1990N California Boulevard, Suite 500, Walnut Creek, CA 94596 (United States); Groome, John [NuScale Power, 1000 NE Circle Boulevard, Suite 10310, Corvallis, OR 97330 (United States)

    2014-04-01

    This paper presents the approach used to develop generic soil profiles for the design of the reactor building of the small modular reactor (SMR) nuclear power plant developed by NuScale Power. The reactor building is a deeply embedded structure. In order to perform soil-structure interaction (SSI) analyses, generic soil profiles must be defined for standardized nuclear power plant (NPP) designs submitted to the United States Nuclear Regulatory Commission (NRC) in a design control document (DCD). The development of the generic soil profiles draws on information from the new standardized nuclear power plant designs already submitted to the NRC for license certification. Eleven generic soil profiles are recommended, covering a wide range of parameters such as soil depth, shear wave velocity, unit weight, Poisson's ratio, water table, and depth to rock strata. The soil profiles are developed for a range of shear wave velocities between bounds of 1000 fps and 8000 fps, as inferred from NRC Standard Review Plan (NUREG-0800) Sections 3.7.1 and 3.7.2. To account for soil degradation during seismic events, the strain-compatible soil properties are based on the EPRI generic soil degradation curves. In addition, one-dimensional soil dynamic response analyses were performed to study the soil-layer input motions for the SSI analyses.

  10. Theoretical study for a digital transfer function analyser; Etude theorique pour un transferometre digital

    Energy Technology Data Exchange (ETDEWEB)

    Freycenon, J [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1964-07-01

    This study deals with the harmonic analysis of the instantaneous counting rate of a pulse train. The problem arises when using a fission chamber for reactivity-to-power transfer function measurements by oscillation methods in reactors. The systematic errors due to the sampling process are computed. The integration carried out when sampling the signal modifies the formulae of the Nyquist theorem on spectrum folding. The statistical errors due to noise are analysed: it is shown that the bandwidth of the spectral window applied to the noise frequency spectrum is equal to the inverse of the duration of the experiment. A dead time of 25 per cent of the sampling time does not appreciably increase the bandwidth. A new method is then proposed, yielding very close approximations to the Fourier analysis during the experiment. The systematic errors arising from this measuring process are determined, and it is shown that the bandwidth of the corresponding spectral window is still given by the inverse of the duration of the experiment. (author)
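
    The key result, that the spectral-window bandwidth equals the inverse of the measurement duration, is visible directly in a discrete Fourier analysis of a sampled counting rate: the frequency resolution is 1/(n·dt) = 1/T. A sketch with a synthetic modulated count rate (all numbers illustrative):

```python
import numpy as np

def spectrum(counts, dt):
    """Discrete Fourier analysis of a sampled counting rate.
    The frequency resolution (spectral-window bandwidth) is 1/T,
    where T = n * dt is the total measurement duration."""
    n = len(counts)
    freqs = np.fft.rfftfreq(n, d=dt)
    amps = np.abs(np.fft.rfft(counts)) / n
    return freqs, amps

# 0.5 Hz modulation sampled at 10 Hz for 20 s -> resolution 1/20 = 0.05 Hz
dt, T = 0.1, 20.0
t = np.arange(0, T, dt)
rate = 1000 + 50 * np.sin(2 * np.pi * 0.5 * t)   # counts per second
freqs, amps = spectrum(rate, dt)
peak = freqs[1:][np.argmax(amps[1:])]            # skip the DC term
```

    Doubling the measurement duration halves the spacing between frequency bins, which is exactly the bandwidth result derived in the study.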

  11. Feasibility study on AFR-100 fuel conversion from uranium-based fuel to thorium-based fuel

    Energy Technology Data Exchange (ETDEWEB)

    Heidet, F.; Kim, T.; Grandy, C. (Nuclear Engineering Division)

    2012-07-30

    Although thorium has long been considered as an alternative to uranium-based fuels, most reactors built to date have been fueled with uranium-based fuel, with the exception of a few reactors. The decision to use uranium-based fuels was initially made based on their technological maturity compared to thorium-based fuels. As a result of this experience, a large body of knowledge and data has been accumulated for uranium-based fuels, making them the predominant fuel type for existing nuclear power. However, following recent concerns about the extent and availability of uranium resources, thorium-based fuels have regained significant interest worldwide. Thorium is more abundant than uranium and can be readily exploited in many countries, and is thus now seen as a possible alternative. As thorium-based fuel technologies mature, fuel conversion from uranium to thorium is expected to become of major interest in both thermal and fast reactors. In this study the feasibility of fuel conversion in a fast reactor is assessed and several possible approaches are proposed. The analyses are performed using the Advanced Fast Reactor (AFR-100) design, a fast reactor core concept recently developed by ANL. The AFR-100 is a small 100 MWe reactor developed under the US-DOE program, relying on innovative fast reactor technologies and advanced structural and cladding materials. It was designed to be inherently safe and offers sufficient margins with respect to the fuel melting temperature and the fuel-cladding eutectic temperature when using U-10Zr binary metal fuel. Thorium-based metal fuel was preferred to other thorium fuel forms because of its higher heavy-metal density and because it does not need to be alloyed with zirconium to reduce radiation swelling. The approaches explored cover the use of pure thorium fuel as well as thorium mixed with transuranics (TRU). 
Sensitivity studies were performed for the different scenarios envisioned in order to determine the

  12. The impact of clinical trial design on cost-effectiveness analyses: illustration from a published study of the OneTouch UltraSmart blood glucose meter for insulin-using diabetes patients.

    Science.gov (United States)

    Tunis, Sandra L; Minshall, Michael E

    2008-06-01

    One source of variation in cost-effectiveness analyses stems from the characteristics of the study upon which each is based. This report provides cost-effectiveness analyses using data from a recently published randomized clinical trial (RCT) comparing an integrated glucose meter/electronic logbook to a conventional glucose meter/paper logbook in helping to control hemoglobin A1c in type 1 or type 2 diabetes. RCT participants and health care professionals (HCPs) were "blinded" to results of meter downloads until week 16, when participants chose systems. They returned to "usual care" and could obtain meter results and share them with their HCPs. Those eligible returned 26-65 weeks later for an observational visit. The CORE Diabetes Model was used to estimate the 60-year cost-effectiveness of the electronic (vs. conventional) meter. With no price premium, the newer technology represented a dominant strategy (greater effectiveness/lower costs) based on the RCT alone or on the RCT + observational visit. With a $100.00/year premium, the incremental cost-effectiveness ratio was $28,053 based on the RCT, but the electronic monitor was dominant when simulations included observational visit results. One plausible reason for the greater benefits of the electronic monitor with the observational period included was the ability of patients and HCPs to make better clinical and lifestyle modifications based on fully available, formatted data. Because the advantages of the electronic meter are based on timely access to accurate feedback, the importance of naturalistic, unblinded studies for technology assessments can be appreciated. Addressing the methodological issues discussed here can help integrate clinical and economic outcomes for diabetes care innovations.

  13. A non-equilibrium neutral model for analysing cultural change.

    Science.gov (United States)

    Kandler, Anne; Shennan, Stephen

    2013-08-07

    Neutral evolution is a frequently used model to analyse changes in frequencies of cultural variants over time. Variants are chosen to be copied according to their relative frequency and new variants are introduced by a process of random mutation. Here we present a non-equilibrium neutral model which accounts for temporally varying population sizes and mutation rates and makes it possible to analyse the cultural system under consideration at any point in time. This framework gives an indication whether observed changes in the frequency distributions of a set of cultural variants between two time points are consistent with the random copying hypothesis. We find that the likelihood of the existence of the observed assemblage at the end of the considered time period (expressed by the probability of the observed number of cultural variants present in the population during the whole period under neutral evolution) is a powerful indicator of departures from neutrality. Further, we study the effects of frequency-dependent selection on the evolutionary trajectories and present a case study of change in the decoration of pottery in early Neolithic Central Europe. Based on the framework developed we show that neutral evolution is not an adequate description of the observed changes in frequency. Copyright © 2013 Elsevier Ltd. All rights reserved.
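The copying process described here can be sketched as a toy Wright-Fisher simulation with innovation (an illustrative model; the population size, innovation rates and run length below are arbitrary assumptions, not the authors' non-equilibrium implementation):

```python
import random
from collections import Counter

def neutral_step(population, mu, next_label):
    """One generation: each individual copies a variant chosen in
    proportion to its current frequency, or innovates with prob. mu."""
    new_pop = []
    for _ in range(len(population)):
        if random.random() < mu:
            new_pop.append(next_label[0])          # brand-new variant
            next_label[0] += 1
        else:
            new_pop.append(random.choice(population))  # frequency-proportional copying
    return new_pop

def simulate(n=200, mu=0.01, generations=300, seed=1):
    random.seed(seed)
    pop = [0] * n            # start from a single variant
    next_label = [1]
    richness = []            # number of distinct variants per generation
    for _ in range(generations):
        pop = neutral_step(pop, mu, next_label)
        richness.append(len(Counter(pop)))
    return richness

low = simulate(mu=0.01)
high = simulate(mu=0.10)
print(sum(low[100:]) / 200, sum(high[100:]) / 200)
```

Averaged over the later generations, the higher innovation rate maintains a richer set of variants; summary statistics of this kind are what neutrality tests compare against observed assemblages.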

  14. Factors Associated with Remission of Eczema in Children: a Population-based Follow-up Study.

    OpenAIRE

    von Kobyletzki, Laura; Bornehag, Carl-Gustaf; Breeze, Elizabeth; Larsson, Malin; Boman Lindström, Cecilia; Svensson, Åke

    2014-01-01

    The aim of this study was to analyse factors associated with remission of atopic dermatitis (AD) in childhood. A population-based AD cohort of 894 children aged 1-3 years from a cross-sectional baseline study in 2000 was followed up in 2005. The association between remission, background, health, lifestyle, and environmental variables was estimated with crude and multivariable logistic regression. At follow-up, 52% of the children had remission. Independent factors at baseline predicting remis...

  15. Prediction of Seismic Slope Displacements by Dynamic Stick-Slip Analyses

    International Nuclear Information System (INIS)

    Ausilio, Ernesto; Costanzo, Antonio; Silvestri, Francesco; Tropeano, Giuseppe

    2008-01-01

    A good working balance between simplicity and reliability in assessing seismic slope stability is represented by displacement-based methods, in which the effects of deformability and ductility can be either decoupled or coupled in the dynamic analyses. In this paper, a 1D lumped-mass "stick-slip" model is developed, accounting for soil heterogeneity and non-linear behaviour, with a base sliding mechanism at a potential rupture surface. The results of the preliminary calibration show good agreement with frequency-domain site response analysis in no-slip conditions. The comparison with rigid sliding-block analyses and with the decoupled approach proves that the stick-slip procedure can become increasingly unconservative for soft soils and deep sliding depths.
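For context, the rigid sliding-block benchmark against which such stick-slip models are compared can be sketched in a few lines (a minimal one-directional Newmark integration; the harmonic input record and yield accelerations are arbitrary illustrative choices):

```python
import math

def newmark_displacement(acc, dt, a_yield):
    """Rigid-block (Newmark) sliding: the block slides while ground
    acceleration exceeds the yield acceleration, then decelerates at
    a_yield until the relative velocity returns to zero."""
    v = 0.0  # relative velocity (m/s)
    d = 0.0  # accumulated sliding displacement (m)
    for a in acc:
        if v > 0.0 or a > a_yield:
            v += (a - a_yield) * dt
            v = max(v, 0.0)   # sliding stops, block re-sticks
            d += v * dt
    return d

# Synthetic 1 Hz, 3 m/s^2 harmonic record, 5 s long (illustrative only)
dt = 0.005
acc = [3.0 * math.sin(2 * math.pi * 1.0 * i * dt) for i in range(int(5.0 / dt))]

d_weak = newmark_displacement(acc, dt, a_yield=1.0)
d_strong = newmark_displacement(acc, dt, a_yield=2.0)
print(d_weak, d_strong)  # larger yield acceleration -> smaller displacement
```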

  16. High-resolution monitoring of marine protists based on an observation strategy integrating automated on-board filtration and molecular analyses

    Science.gov (United States)

    Metfies, Katja; Schroeder, Friedhelm; Hessel, Johanna; Wollschläger, Jochen; Micheller, Sebastian; Wolf, Christian; Kilias, Estelle; Sprong, Pim; Neuhaus, Stefan; Frickenhaus, Stephan; Petersen, Wilhelm

    2016-11-01

    Information on recent biomass distribution and biogeography of photosynthetic marine protists with adequate temporal and spatial resolution is urgently needed to better understand the consequences of environmental change for marine ecosystems. Here we introduce and review a molecular-based observation strategy for high-resolution assessment of these protists in space and time. It is the result of extensive technology development, adaptation and evaluation documented in a number of publications, together with the recently completed field testing introduced in this paper. The observation strategy is organized at four levels. At level 1, samples are collected at high spatiotemporal resolution using the remotely controlled automated filtration system AUTOFIM. The resulting samples can either be preserved for later laboratory analyses, or directly subjected to molecular surveillance of key species aboard the ship via an automated biosensor system or quantitative polymerase chain reaction (level 2). Preserved samples are analyzed at the next observational levels in the laboratory (levels 3 and 4). At level 3 this involves molecular fingerprinting methods for a quick and reliable overview of differences in protist community composition. Finally, selected samples can be used to generate a detailed analysis of taxonomic protist composition via the latest next-generation sequencing (NGS) technology at level 4. An integrated dataset of the results of the different analyses provides comprehensive information on the diversity and biogeography of protists, including all related size classes, while the cost of the observation is optimized with respect to analysis effort and time.

  17. Chemical and magnetic analyses on tree bark as an effective tool for biomonitoring: A case study in Lisbon (Portugal).

    Science.gov (United States)

    Brignole, Daniele; Drava, Giuliana; Minganti, Vincenzo; Giordani, Paolo; Samson, Roeland; Vieira, Joana; Pinho, Pedro; Branquinho, Cristina

    2018-03-01

    Tree bark has proven to be a reliable tool for biomonitoring deposition of metals from the atmosphere. The aim of the present study was to test if bark magnetic properties can be used as a proxy of the overall metal loads of a tree bark, meaning that this approach can be used to discriminate different effects of pollution on different types of urban site. In this study, the concentrations of As, Cd, Co, Cu, Fe, Mn, Ni, P, Pb, V and Zn were measured by ICP-OES in bark samples of Jacaranda mimosifolia, collected along roads and in urban green spaces in the city of Lisbon (Portugal). Magnetic analyses were also performed on the same bark samples, measuring Isothermal Remanent Magnetization (IRM), Saturation Isothermal Remanent Magnetization (SIRM) and Magnetic Susceptibility (χ). The results confirmed that magnetic analyses can be used as a proxy of the overall load of trace elements in tree bark, and could be used to distinguish different types of urban sites regarding atmospheric pollution. Together with trace element analyses, magnetic analyses could thus be used as a tool to provide high-resolution data on urban air quality and to follow up the success of mitigation actions aiming at decreasing the pollutant load in urban environments. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Shinguards effective in preventing lower leg injuries in football: Population-based trend analyses over 25 years.

    Science.gov (United States)

    Vriend, Ingrid; Valkenberg, Huib; Schoots, Wim; Goudswaard, Gert Jan; van der Meulen, Wout J; Backx, Frank J G

    2015-09-01

    The majority of football injuries are caused by trauma to the lower extremities. Shinguards are considered an important measure in preventing lower leg impact abrasions, contusions and fractures. Given these benefits, the Fédération Internationale de Football Association introduced the shinguard law in 1990, which made wearing shinguards during matches mandatory. This study evaluated the effect of the introduction of the shinguard law for amateur players in the Netherlands in the 1999/2000 football season on the incidence of lower leg injuries. Time trend analyses were performed on injury data covering 25 years of continuous registration (1986-2010). Data were retrieved from a system that records all emergency department treatments in a random, representative sample of Dutch hospitals. All injuries sustained in football by patients aged 6-65 years were included, except for injuries of the Achilles tendon and Weber fractures. Time trends were analysed with multiple regression analyses; a model was fitted consisting of multiple straight lines, each representing a 5-year period. Patients were predominantly males (92%) and treated for fractures (48%) or abrasions/contusions (52%) of the lower leg. The incidence of lower leg football injuries decreased significantly following the introduction of the shinguard law (1996-2000: -20%; 2001-2005: -25%), whereas the incidence of all other football injuries did not. This effect was more prominent at weekends/match days. No gender differences were found. The results show a significant preventive effect of the shinguard law, underlining the relevance of rule changes as a preventive measure and of wearing shinguards during both matches and training sessions. Copyright © 2014 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  19. Experimental technique of stress analyses by neutron diffraction

    International Nuclear Information System (INIS)

    Sun, Guangai; Chen, Bo; Huang, Chaoqiang

    2009-09-01

    The structure and main components of the neutron diffraction stress analysis spectrometer SALSA, as well as the functions and parameters of each component, are presented. The technical characteristics and structural parameters of SALSA are described. On this basis, the choice of gauge volume, the method of positioning the sample, the determination of the diffraction plane and the measurement of the stress-free lattice spacing d0 are discussed. Combined with practical experiments, the basic experimental measurements and the related settings are introduced, including the adjustment of components, pattern scattering, data recording and checking, etc. The above can serve as a guide for stress analysis experiments by neutron diffraction and for the construction of neutron stress spectrometers. (authors)

  20. Cycle O(CY1991) NLS trade studies and analyses report. Book 2, part 2: Propulsion

    Science.gov (United States)

    Cronin, R.; Werner, M.; Bonson, S.; Spring, R.; Houston, R.

    1992-01-01

    This report documents the propulsion system tasks performed in support of the National Launch System (NLS) Cycle O preliminary design activities. The report includes trades and analyses covering the following subjects: (1) Maximum Tank Stretch Study; (2) No LOX Bleed Performance Analysis; (3) LOX Bleed Trade Study; (4) LO2 Tank Pressure Limits; (5) LOX Tank Pressurization System Using Helium; (6) Space Transportation Main Engine (STME) Heat Exchanger Performance; (7) LH2 Passive Recirculation Performance Analysis; (8) LH2 Bleed/Recirculation Study; (9) LH2 Tank Pressure Limits; and (10) LH2 Pressurization System. For each trade study an executive summary and a detailed trade study are provided. For the convenience of the reader, a separate section containing a compilation of only the executive summaries is also provided.

  1. The shaping of environmental concern in product chains: analysing Danish case studies on environmental aspects in product chain relations

    DEFF Research Database (Denmark)

    Forman, Marianne; Hansen, Anne Grethe; Jørgensen, Michael Søgaard

    ... the systems of production, consumption, knowledge and regulation are discussed. The role of boundary objects is discussed with eco-labelling as a case. The role of and the impact on the product chain relations are analysed as part of these mechanisms. From the case studies, green innovations in the product chain, which the case company represents, are identified. Direct customer and regulatory demands, as well as indirect societal and regulatory demands, are mapped, and their role in product chain greening analysed. The case studies point to the importance of customer demand, regulation and potentially indirect demand for greening activities. The analysis shows the co-construction of environmental concerns and demands, companies’ environmental practices and technological developments, and their stabilisation in the supply chain. The case studies also point to how the greening of frontrunners might make...

  2. Ain't necessarily so: review and critique of recent meta-analyses of behavioral medicine interventions in health psychology.

    Science.gov (United States)

    Coyne, James C; Thombs, Brett D; Hagedoorn, Mariet

    2010-03-01

    We examined four meta-analyses of behavioral interventions for adults (Dixon, Keefe, Scipio, Perri, & Abernethy, 2007; Hoffman, Papas, Chatkoff, & Kerns, 2007; Irwin, Cole, & Nicassio, 2006; and Jacobsen, Donovan, Vadaparampil, & Small, 2007) that have appeared in the Evidence Based Treatment Reviews section of Health Psychology. Narrative review. We applied the following criteria to each meta-analysis: (1) whether the meta-analysis was described accurately, adequately, and transparently in the article; (2) whether there was an adequate attempt to deal with the methodological quality of the original trials; (3) the extent to which the meta-analysis depended on small, underpowered studies; and (4) the extent to which the meta-analysis provided valid and useful evidence-based recommendations. Across the four meta-analyses, we identified substantial problems with the transparency and completeness with which they were reported, as well as a dependence on small, underpowered trials of generally poor quality. The results of our exercise raise questions about the clinical validity and utility of the conclusions of these meta-analyses, and should serve as a wake-up call to prospective authors, reviewers, and end-users of meta-analyses now appearing in the literature. Copyright 2010 APA, all rights reserved.

  3. Safety analyses for reprocessing and waste processing

    International Nuclear Information System (INIS)

    1983-03-01

    Presentation of an incident analysis of the process steps of the reprocessing plant (RP), simplified safety considerations, and safety analyses of the storage and solidification facilities of the RP. A release tree method is developed and tested. In particular, an incident analysis of the process steps, an evaluation of the SRL study, and safety analyses of the storage and solidification facilities of the RP are performed. (DG)

  4. Study on frequency characteristics of wireless power transmission system based on magnetic coupling resonance

    Science.gov (United States)

    Liang, L. H.; Liu, Z. Z.; Hou, Y. J.; Zeng, H.; Yue, Z. K.; Cui, S.

    2017-11-01

    To study the frequency characteristics of a wireless power transmission system based on magnetic coupling resonance, a circuit model of the system is established. The influence of the load on the frequency characteristics of the system is analysed, and circuit coupling theory is used to derive the minimum load required to suppress frequency splitting. Simulation and experimental results verify that when the load falls below a certain value the system exhibits frequency splitting, and that increasing the load effectively suppresses this phenomenon. A power regulation scheme for a wireless charging system based on magnetic coupling resonance is also given. This study provides a theoretical basis for load selection and power regulation in wireless power transmission systems.
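The splitting behaviour can be reproduced with a standard series-series two-coil circuit model (a sketch under assumed component values, coupling and loads, not the paper's parameters):

```python
import numpy as np

# Series-series magnetically coupled resonators (illustrative values)
L = 100e-6                                  # coil inductance (H), both coils
C = 1.0 / ((2 * np.pi * 1e6) ** 2 * L)      # tune both coils to f0 = 1 MHz
R_coil, R_src = 0.5, 0.5                    # parasitic / source resistance (ohm)
k = 0.02                                    # coupling coefficient
M = k * L                                   # mutual inductance

def load_power(f, R_load):
    """Power delivered to R_load at frequency f for a 1 V source."""
    w = 2 * np.pi * f
    Z1 = R_src + R_coil + 1j * w * L + 1.0 / (1j * w * C)
    Z2 = R_load + R_coil + 1j * w * L + 1.0 / (1j * w * C)
    I1 = 1.0 / (Z1 + (w * M) ** 2 / Z2)     # secondary reflected into primary
    I2 = 1j * w * M * I1 / Z2
    return (np.abs(I2) ** 2) * R_load

def count_peaks(R_load):
    f = np.linspace(0.9e6, 1.1e6, 2001)
    p = load_power(f, R_load)
    return int(np.sum((p[1:-1] > p[:-2]) & (p[1:-1] > p[2:])))

print(count_peaks(5.0), count_peaks(300.0))  # light load splits, heavy load does not
```

With these values the lightly loaded secondary is over-coupled (k·Q > 1) and the transfer curve shows two peaks, while the heavier load lowers the secondary Q enough to merge them into one, mirroring the suppression effect reported in the abstract.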

  5. Not all risks are created equal: A twin study and meta-analyses of risk taking across seven domains.

    Science.gov (United States)

    Wang, X T Xiao-Tian; Zheng, Rui; Xuan, Yan-Hua; Chen, Jie; Li, Shu

    2016-11-01

    Humans routinely deal with both traditional and novel risks. Different kinds of risks have been a driving force for both evolutionary adaptations and personal development. This study explored the genetic and environmental influences on human risk taking in different task domains. Our approach was threefold. First, we integrated several scales of domain-specific risk-taking propensity and developed a synthetic scale, including both evolutionarily typical and modern risks in the following 7 domains: cooperation/competition, safety, reproduction, natural/physical risk, moral risk, financial risk, and gambling. Second, we conducted a twin study using the scale to estimate the contributions of genes and environment to risk taking in each of these 7 domains. Third, we conducted a series of meta-analyses of extant twin studies across the 7 risk domains. The results showed that individual differences in risk-taking propensity and its consistency across domains were mainly regulated by additive genetic influences and individually unique environmental experiences. The heritability estimates from the meta-analyses ranged from 29% in financial risk taking to 55% in safety. Supporting the notion of risk-domain specificity, both the behavioral and genetic correlations among the 7 domains were generally low. Among the relatively few correlations between pairs of risk domains, our analysis revealed a common genetic factor that regulates moral, financial, and natural/physical risk taking. This is the first effort to separate genetic and environmental influences on risk taking across multiple domains in a single study and integrate the findings of extant twin studies via a series of meta-analyses conducted in different task domains. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
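As a back-of-the-envelope illustration of how twin correlations map onto the variance components discussed above, Falconer's classical ACE formulas can be applied directly (the correlations below are made-up numbers, not the study's estimates):

```python
def falconer_ace(r_mz, r_dz):
    """Falconer's formulas: additive genetic (A), shared-environment (C)
    and unique-environment (E) variance components from MZ/DZ twin
    correlations.  Assumes the classical ACE model holds."""
    a2 = 2.0 * (r_mz - r_dz)   # heritability estimate
    c2 = 2.0 * r_dz - r_mz     # shared-environment estimate
    e2 = 1.0 - r_mz            # unique environment + measurement error
    return a2, c2, e2

# Hypothetical risk domain: MZ correlation 0.50, DZ correlation 0.23
a2, c2, e2 = falconer_ace(0.50, 0.23)
print(a2, c2, e2)  # A ~ 0.54, C ~ -0.04, E = 0.50 (negative C is set to 0 in practice)
```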

  6. Comparative biochemical analyses of venous blood and peritoneal fluid from horses with colic using a portable analyser and an in-house analyser.

    Science.gov (United States)

    Saulez, M N; Cebra, C K; Dailey, M

    2005-08-20

    Fifty-six horses with colic were examined over a period of three months. The concentrations of glucose, lactate, sodium, potassium and chloride, and the pH of samples of blood and peritoneal fluid, were determined with a portable clinical analyser and with an in-house analyser and the results were compared. Compared with the in-house analyser, the portable analyser gave higher pH values for blood and peritoneal fluid with greater variability in the alkaline range, and lower pH values in the acidic range, lower concentrations of glucose in the range below 8.3 mmol/l, and lower concentrations of lactate in venous blood in the range below 5 mmol/l and in peritoneal fluid in the range below 2 mmol/l, with less variability. On average, the portable analyser underestimated the concentrations of lactate and glucose in peritoneal fluid in comparison with the in-house analyser. Its measurements of the concentrations of sodium and chloride in peritoneal fluid had a higher bias and were more variable than the measurements in venous blood, and its measurements of potassium in venous blood and peritoneal fluid had a smaller bias and less variability than the measurements made with the in-house analyser.
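Statements such as "higher bias and more variable" are conventionally quantified with a Bland-Altman analysis; a minimal sketch on hypothetical paired lactate readings (not the study's data) follows:

```python
import statistics

def bland_altman(method_a, method_b):
    """Bias (mean difference) and 95% limits of agreement between two
    measurement methods applied to the same samples."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical lactate readings (mmol/l): portable vs in-house analyser
portable = [1.1, 2.0, 3.4, 4.1, 0.9, 2.6]
in_house = [1.3, 2.2, 3.9, 4.6, 1.0, 2.9]
bias, (lo, hi) = bland_altman(portable, in_house)
print(bias, lo, hi)  # negative bias: the portable analyser reads lower
```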

  7. Survival rate of breast cancer patients in Malaysia: a population-based study.

    Science.gov (United States)

    Abdullah, Nor Aini; Wan Mahiyuddin, Wan Rozita; Muhammad, Nor Asiah; Ali, Zainudin Mohamad; Ibrahim, Lailanor; Ibrahim Tamim, Nor Saleha; Mustafa, Amal Nasir; Kamaluddin, Muhammad Amir

    2013-01-01

    Breast cancer is the most common cancer among Malaysian women. Other than hospital-based results, there are no documented population-based survival rates for breast cancer in Malaysian women. This population-based retrospective cohort study was therefore conducted. Data were obtained from the Health Informatics Centre, Ministry of Health Malaysia, the National Cancer Registry and the National Registration Department for the period from 1st Jan 2000 to 31st December 2005. Cases were captured by ICD-10 and linked to death certificates to ascertain vital status. Only complete data were analysed. Survival time was calculated from the estimated date of diagnosis to the date of death or date of loss to follow-up. Observed survival rates were estimated by the Kaplan-Meier method using SPSS Statistical Software version 17. A total of 10,230 complete data sets were analysed. The mean age at diagnosis was 50.6 years. The overall 5-year survival rate was 49%, with a median survival time of 68.1 months. Indian women had a higher survival rate (54%) than Chinese women (49%) and Malays (45%). The overall 5-year survival rate of breast cancer patients among Malaysian women for the 2000-2005 cohort was still low compared with survival rates in developed nations. It is therefore necessary to enhance strategies for early detection and intervention.
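The Kaplan-Meier product-limit estimate used in the study can be computed by hand for a toy cohort (illustrative follow-up data, not the registry's):

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.  events[i] is 1 for a death at
    times[i], 0 for censoring.  Returns {event_time: S(t)}."""
    surv, s = {}, 1.0
    for t in sorted(set(t for t, e in zip(times, events) if e == 1)):
        at_risk = sum(1 for u in times if u >= t)
        deaths = sum(1 for u, e in zip(times, events) if u == t and e == 1)
        s *= 1.0 - deaths / at_risk          # survival drops at each event time
        surv[t] = s
    return surv

# Six hypothetical patients: follow-up months, event indicator
times  = [3, 5, 5, 8, 10, 12]
events = [1, 1, 0, 1, 0, 1]
print(kaplan_meier(times, events))  # {3: 0.833..., 5: 0.666..., 8: 0.444..., 12: 0.0}
```

Censored patients (events of 0) leave the risk set without forcing the curve down, which is exactly why loss to follow-up is handled this way in the registry analysis.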

  8. Measurements and simulations analysing the noise behaviour of grating-based X-ray phase-contrast imaging

    Energy Technology Data Exchange (ETDEWEB)

    Weber, T., E-mail: thomas.weber@physik.uni-erlangen.de [University of Erlangen-Nuremberg, ECAP - Erlangen Center for Astroparticle Physics, Erwin-Rommel-Str. 1, 91058 Erlangen (Germany); Bartl, P.; Durst, J. [University of Erlangen-Nuremberg, ECAP - Erlangen Center for Astroparticle Physics, Erwin-Rommel-Str. 1, 91058 Erlangen (Germany); Haas, W. [University of Erlangen-Nuremberg, ECAP - Erlangen Center for Astroparticle Physics, Erwin-Rommel-Str. 1, 91058 Erlangen (Germany); University of Erlangen-Nuremberg, Pattern Recognition Lab, Martensstr. 3, 91058 Erlangen (Germany); Michel, T.; Ritter, A.; Anton, G. [University of Erlangen-Nuremberg, ECAP - Erlangen Center for Astroparticle Physics, Erwin-Rommel-Str. 1, 91058 Erlangen (Germany)

    2011-08-21

    In the last decade, phase-contrast imaging using a Talbot-Lau grating interferometer has become possible even with a low-brilliance X-ray source. With its potential to increase soft-tissue contrast, this method is on its way into medical imaging. For this purpose, knowledge of the underlying physics of the technique is necessary. With this paper, we would like to contribute to the understanding of grating-based phase-contrast imaging by presenting results of measurements and simulations regarding the noise behaviour of the differential phases. The measurements were done using a microfocus X-ray tube with a hybrid, photon-counting, semiconductor Medipix2 detector. The additional simulations were performed with our in-house developed phase-contrast simulation tool 'SPHINX', combining both wave and particle contributions of the simulated photons. The results obtained by both methods show the same behaviour. Increasing the number of photons leads to a linear decrease of the standard deviation of the phase, while the number of phase steps used has no influence on the standard deviation if the total number of photons is held constant. Furthermore, the probability density function (pdf) of the reconstructed differential phases was analysed. It turned out that the so-called von Mises distribution is the physically correct pdf, which was also confirmed by measurements. This information advances the understanding of grating-based phase-contrast imaging and can be used to improve image quality.
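The phase-stepping reconstruction and its counting-statistics behaviour can be mimicked with a small Monte Carlo sketch (a plain Poisson counting model with assumed visibility and step count, far simpler than the SPHINX simulations; in this model the phase noise falls as the square root of the photon number):

```python
import numpy as np

def phase_std(n_photons, visibility=0.5, n_steps=8, trials=4000,
              true_phase=0.7, seed=0):
    """Std of the differential phase retrieved from a Poisson-noisy
    phase-stepping scan: I_j ~ Poisson(N/M * (1 + V cos(theta_j + phi)))."""
    rng = np.random.default_rng(seed)
    theta = 2 * np.pi * np.arange(n_steps) / n_steps
    mean = (n_photons / n_steps) * (1 + visibility * np.cos(theta + true_phase))
    counts = rng.poisson(mean, size=(trials, n_steps))
    # The first Fourier coefficient of the stepping curve carries the phase
    c = (counts * np.exp(-1j * theta)).sum(axis=1)
    return np.angle(c).std()

s1 = phase_std(8000)
s2 = phase_std(4 * 8000)
print(s1, s1 / s2)  # quadrupling the photons roughly halves the retrieved-phase noise
```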

  9. Analysing co-articulation using frame-based feature trajectories

    CSIR Research Space (South Africa)

    Badenhorst, J

    2010-11-01

    Full Text Available The authors investigate several approaches aimed at a more detailed understanding of co-articulation in spoken utterances. They find that the Euclidean difference between instantaneous frame-based feature values and the mean values of these features...

  10. The Role of the Amygdala in Facial Trustworthiness Processing: A Systematic Review and Meta-Analyses of fMRI Studies

    Science.gov (United States)

    Oliveiros, Bárbara

    2016-01-01

    evidence that the (right) amygdala responds preferentially to untrustworthy faces. Moreover, two ALE analyses performed with 6 articles (7 studies) identified the amygdala, insula and medial dorsal nuclei of the thalamus as structures with negative correlation with trustworthiness. Six articles/studies showed that the posterior cingulate and medial frontal gyrus present positive correlations with increasing facial trustworthiness levels. Significant effects in subgroup analyses based on methodological criteria were found for experiments using spatial smoothing, categorization of trustworthiness into 2 or 3 categories, and paradigms involving both explicit and implicit tasks.

    Limitations: Significant heterogeneity between studies was found in the meta-analysis, which might have arisen from the inclusion of studies with smaller sample sizes and differences in methodological options. Studies using ROI analysis / small-volume correction methods were more often devoted specifically to the amygdala region, with some results reporting uncorrected p-values based mainly on a priori clinical evidence of amygdala involvement in these processes. Nevertheless, we did not find significant evidence of publication bias.

    Conclusions and implications of key findings: Our results support the role of the amygdala in facial trustworthiness judgment, emphasizing its predominant role during the processing of negative social signals in (untrustworthy) faces. This systematic review suggests that little consistency exists among studies’ methodology, and that larger sample sizes should be preferred. PMID:27898705

  11. Tolerance analyses of a quadrupole magnet for advanced photon source upgrade

    Energy Technology Data Exchange (ETDEWEB)

    Liu, J., E-mail: Jieliu@aps.anl.gov; Jaski, M., E-mail: jaski@aps.anl.gov; Borland, M., E-mail: borland@aps.anl.gov [Advanced Photon Source, Argonne National Laboratory, 9700 S. Cass Avenue, Lemont, IL60439 (United States); Jain, A., E-mail: jain@bnl.gov [Superconducting Magnet Division, Brookhaven National Laboratory, P.O. Box 5000. Upton, NY 11973-5000 (United States)

    2016-07-27

    Given physics requirements, the mechanical fabrication and assembly tolerances for storage ring magnets can be calculated using analytical methods [1, 2]. However, this method is not easy for complicated magnet designs [1]. In this paper, a novel method is proposed to determine fabrication and assembly tolerances consistent with physics requirements, through a combination of magnetic and mechanical tolerance analyses. In this study, finite element analysis using OPERA is conducted to estimate the effect of fabrication and assembly errors on the magnetic field of a quadrupole magnet and to determine the allowable tolerances to achieve the specified magnetic performances. Based on the study, allowable fabrication and assembly tolerances for the quadrupole assembly are specified for the mechanical design of the quadrupole magnet. Next, to achieve the required assembly level tolerances, mechanical tolerance stackup analyses using a 3D tolerance analysis package are carried out to determine the part and subassembly level fabrication tolerances. This method can be used to determine the tolerances for design of other individual magnets and of magnet strings.
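The stackup step can be illustrated with the standard worst-case / root-sum-square / Monte Carlo comparison on a toy one-dimensional chain of tolerances (the paper itself uses a dedicated 3D tolerance analysis package):

```python
import random
import statistics

# Hypothetical 1D stack: four part dimensions, each with a +/- tolerance (mm)
tols = [0.05, 0.03, 0.02, 0.04]

worst_case = sum(tols)                         # all extremes align
rss = sum(t ** 2 for t in tols) ** 0.5         # root-sum-square combination

# Monte Carlo: each dimension uniform within its tolerance band
random.seed(42)
gaps = [sum(random.uniform(-t, t) for t in tols) for _ in range(100_000)]
mc_3sigma = 3 * statistics.stdev(gaps)

print(worst_case, rss, mc_3sigma)
```

With uniform part distributions the Monte Carlo 3-sigma band falls between the RSS and worst-case limits, which is why stackup packages report distribution-based yields rather than pure worst-case sums.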

  12. Tolerance analyses of a quadrupole magnet for advanced photon source upgrade

    International Nuclear Information System (INIS)

    Liu, J.; Jaski, M.; Borland, M.; Jain, A.

    2016-01-01

    Given physics requirements, the mechanical fabrication and assembly tolerances for storage ring magnets can be calculated using analytical methods [1, 2]. However, this method is not easy for complicated magnet designs [1]. In this paper, a novel method is proposed to determine fabrication and assembly tolerances consistent with physics requirements, through a combination of magnetic and mechanical tolerance analyses. In this study, finite element analysis using OPERA is conducted to estimate the effect of fabrication and assembly errors on the magnetic field of a quadrupole magnet and to determine the allowable tolerances to achieve the specified magnetic performances. Based on the study, allowable fabrication and assembly tolerances for the quadrupole assembly are specified for the mechanical design of the quadrupole magnet. Next, to achieve the required assembly level tolerances, mechanical tolerance stackup analyses using a 3D tolerance analysis package are carried out to determine the part and subassembly level fabrication tolerances. This method can be used to determine the tolerances for design of other individual magnets and of magnet strings.

  13. Do regional methods really help reduce uncertainties in flood frequency analyses?

    Science.gov (United States)

    Cong Nguyen, Chi; Payrastre, Olivier; Gaume, Eric

    2013-04-01

    Flood frequency analyses are often based on continuous measured series at gauged sites. However, the length of the available data sets is usually too short to provide reliable estimates of extreme design floods. To reduce the estimation uncertainties, the analysed data sets have to be extended either in time, making use of historical and paleoflood data, or in space, merging data sets considered statistically homogeneous to build large regional data samples. Nevertheless, the advantage of regional analyses, the important increase in the size of the studied data sets, may be counterbalanced by possible heterogeneities of the merged sets. The application and comparison of four different flood frequency analysis methods to two regions affected by flash floods in the south of France (Ardèche and Var) illustrates how this balance between the number of records and possible heterogeneities plays out in real-world applications. The four tested methods are: (1) a local statistical analysis based on the existing series of measured discharges, (2) a local analysis valuating the existing information on historical floods, (3) a standard regional flood frequency analysis based on existing measured series at gauged sites and (4) a modified regional analysis including estimated extreme peak discharges at ungauged sites. Monte Carlo simulations are conducted to simulate a large number of discharge series with characteristics similar to the observed ones (type of statistical distributions, number of sites and records) to evaluate to what extent the results obtained in these case studies can be generalized. The two case studies indicate that even small statistical heterogeneities, which are not detected by the standard homogeneity tests implemented in regional flood frequency studies, may drastically limit the usefulness of such approaches. On the other hand, these results show that the valuation of information on extreme events, either historical flood events at gauged
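The trade-off between record length and pooling can be probed with a Monte Carlo experiment of the kind the authors describe; a minimal homogeneous-case sketch (Gumbel parent with arbitrary parameters, site count and record length) is:

```python
import numpy as np

def q100_moments(sample):
    """100-year Gumbel quantile via a method-of-moments fit."""
    beta = np.std(sample, ddof=1) * np.sqrt(6) / np.pi
    mu = np.mean(sample) - 0.5772 * beta
    return mu - beta * np.log(-np.log(1 - 1.0 / 100))

rng = np.random.default_rng(7)
mu_true, beta_true = 100.0, 30.0     # assumed Gumbel parent (m^3/s)
n_years, n_sites, trials = 30, 10, 2000

local, regional = [], []
for _ in range(trials):
    # Local analysis: a single 30-year gauged record
    local.append(q100_moments(rng.gumbel(mu_true, beta_true, n_years)))
    # Regional analysis: 10 perfectly homogeneous sites pooled together
    regional.append(q100_moments(rng.gumbel(mu_true, beta_true, n_years * n_sites)))

print(np.std(local), np.std(regional))  # pooling shrinks sampling uncertainty
```

In this idealized homogeneous setting pooling clearly pays off; the case studies above show how quickly that gain erodes once undetected heterogeneity enters the merged sample.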

  14. Socioeconomic issues and analyses for radioactive waste disposal facilities

    International Nuclear Information System (INIS)

    Ulland, L.

    1988-01-01

    Radioactive Waste facility siting and development can raise major social and economic issues in the host area. Initial site screening and analyses have been conducted for both potential high-level and low-level radioactive waste facilities; more detailed characterization and analyses are being planned. Results of these assessments are key to developing community plans that identify and implement measures to mitigate adverse socioeconomic impacts. Preliminary impact analyses conducted at high-level sites in Texas and Nevada, and site screening activities for low-level facilities in Illinois and California have identified a number of common socioeconomic issues and characteristics as well as issues and characteristics that differ between the sites and the type of facilities. Based on these comparisons, implications for selection of an appropriate methodology for impact assessment and elements of impact mitigation are identified

  15. Multi-person and multi-attribute design evaluations using evidential reasoning based on subjective safety and cost analyses

    International Nuclear Information System (INIS)

    Wang, J.; Yang, J.B.; Sen, P.

    1996-01-01

    This paper presents an approach for ranking proposed design options based on subjective safety and cost analyses. Hierarchical system safety analysis is carried out using fuzzy sets and evidential reasoning. This involves safety modelling by fuzzy sets at the bottom level of a hierarchy and safety synthesis by evidential reasoning at higher levels. Fuzzy sets are also used to model the cost incurred for each design option. An evidential reasoning approach is then employed to synthesise the estimates of safety and cost, which are made by multiple designers. The developed approach is capable of dealing with problems of multiple designers, multiple attributes and multiple design options to select the best design. Finally, a practical engineering example is presented to demonstrate the proposed multi-person and multi-attribute design selection approach
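
    A minimal sketch of multi-designer evidence combination in this spirit, using Dempster's rule over singleton safety grades plus an ignorance mass ("THETA"). The paper's own evidential reasoning algorithm is more elaborate, and the designers' mass assignments below are invented for illustration.

    ```python
    def combine_dempster(m1, m2):
        """Combine two basic belief assignments over singleton grades
        (plus 'THETA' for ignorance) with Dempster's rule of combination."""
        combined = {}
        conflict = 0.0
        for a, w1 in m1.items():
            for b, w2 in m2.items():
                if a == "THETA":
                    key = b
                elif b == "THETA" or a == b:
                    key = a
                else:                      # incompatible singletons
                    conflict += w1 * w2
                    continue
                combined[key] = combined.get(key, 0.0) + w1 * w2
        norm = 1.0 - conflict              # renormalise away the conflict mass
        return {k: v / norm for k, v in combined.items()}

    # two designers' subjective safety ratings for one design option
    designer1 = {"good": 0.6, "average": 0.2, "THETA": 0.2}
    designer2 = {"good": 0.5, "poor": 0.3, "THETA": 0.2}
    fused = combine_dempster(designer1, designer2)
    ```

    Agreement on "good" is reinforced after fusion, while conflicting mass is renormalised away.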

  16. Analyses of the essential oil from Bunium persicum fruit and its antioxidant constituents.

    Science.gov (United States)

    Nickavar, Bahman; Adeli, Abrisham; Nickavar, Azar

    2014-01-01

    This study aimed to analyze and identify the antioxidant constituents of the essential oil of Bunium persicum (Apiaceae) fruit. The essential oil was obtained by hydrodistillation and analyzed by GC-FID and GC-MS. The essential oil was tested for antioxidant capacity in DPPH radical scavenging and linoleic acid/β-carotene assays. The TLC-bioautography method based on the DPPH radical assay and GC analyses were carried out to characterize the major antioxidant compounds in the essential oil. GC analyses showed the presence of sixteen compounds, with p-cymene (31.1%), cuminaldehyde (22.2%), and γ-terpinene (11.4%) as the main components of the essential oil. The oil exhibited good radical scavenging [IC50 (DPPH·) = 4.47 (3.96 - 5.05) mg/mL] and antilipid peroxidation [IC50 (β-carotene bleaching) = 0.22 (0.16 - 0.31) mg/mL] activities. The TLC tests resulted in the identification of cuminaldehyde, p-cymene-7-ol, and cuminyl acetate as the main constituents of the active oil fraction.

  17. The Network for Analysing Longitudinal Population-based HIV/AIDS data on Africa (ALPHA): Data on mortality, by HIV status and stage on the HIV care continuum, among the general population in seven longitudinal studies between 1989 and 2014.

    Science.gov (United States)

    Slaymaker, Emma; McLean, Estelle; Wringe, Alison; Calvert, Clara; Marston, Milly; Reniers, Georges; Kabudula, Chodziwadziwa Whiteson; Crampin, Amelia; Price, Alison; Michael, Denna; Urassa, Mark; Kwaro, Daniel; Sewe, Maquins; Eaton, Jeffrey W; Rhead, Rebecca; Nakiyingi-Miiro, Jessica; Lutalo, Tom; Nabukalu, Dorean; Herbst, Kobus; Hosegood, Victoria; Zaba, Basia

    2017-11-06

    Timely progression of people living with HIV (PLHIV) from the point of infection through the pathway from diagnosis to treatment is important in ensuring effective care and treatment of HIV and in preventing HIV-related deaths and onward transmission of infection. Reliable, population-based estimates of new infections are difficult to obtain for the generalised epidemics in sub-Saharan Africa. Mortality data indicate disease burden and, if disaggregated along the continuum from diagnosis to treatment, can also reflect the coverage and quality of different HIV services. Neither routine statistics nor observational clinical studies can estimate mortality prior to linkage to care or following disengagement from care. For this, population-based data are required. The Network for Analysing Longitudinal Population-based HIV/AIDS data on Africa brings together studies in Kenya, Malawi, South Africa, Tanzania, Uganda, and Zimbabwe. Eight studies have the necessary data to estimate mortality by HIV status, and seven can estimate mortality at different stages of the HIV care continuum. This data note describes a harmonised dataset containing anonymised individual-level information on survival by HIV status for adults aged 15 and above. Among PLHIV, the dataset provides information on survival during different periods: prior to diagnosis of infection; following diagnosis but before linkage to care; in pre-antiretroviral treatment (ART) care; in the first six months after ART initiation; among people continuously on ART for 6+ months; and among people who have ever interrupted ART.

  18. The Effects of Discourses in Regional Contexts on the Development of Curriculum-Based Literacy Standards for Adolescents in Schooling: A Comparative Study of South Australia and Ontario

    Science.gov (United States)

    Fenwick, Lisl

    2017-01-01

    This study analyses how discourses in regional contexts affect the development of curriculum-based literacy standards for adolescents in schooling. A comparative case-study research design enabled the influences of discourses at the regional level to be analysed. The case studies include the development of curricula to define a minimum literacy…

  19. Acquisition, Analyses and Interpretation of fMRI Data: A Study on the Effective Connectivity in Human Primary Auditory Cortices

    International Nuclear Information System (INIS)

    Ahmad Nazlim Yusoff; Mazlyfarina Mohamad; Khairiah Abdul Hamid

    2011-01-01

    A study of the effective connectivity characteristics in auditory cortices was conducted on five healthy Malay male subjects aged 20 to 40 years using functional magnetic resonance imaging (fMRI), statistical parametric mapping (SPM5) and dynamic causal modelling (DCM). A silent imaging paradigm was used to reduce scanner sound artefacts on functional images. The subjects were instructed to pay attention to a white noise stimulus presented binaurally at an intensity level 70 dB above the normal hearing level. Functional specialisation was studied using the Matlab-based SPM5 software by means of fixed-effects (FFX), random-effects (RFX) and conjunction analyses. Individual analyses of all subjects indicate asymmetrical bilateral activation between the left and right auditory cortices in Brodmann areas (BA) 22, 41 and 42, involving the primary and secondary auditory cortices. The three auditory areas in the right and left auditory cortices were selected for the determination of effective connectivity by constructing 9 network models. The effective connectivity was determined in four out of five subjects, with the exception of one subject whose BA22 coordinates were located too far from the BA22 coordinates obtained from the group analysis. DCM results showed the existence of effective connectivity between the three selected auditory areas in both auditory cortices. In the right auditory cortex, BA42 was identified as the input centre, with unidirectional parallel effective connectivities BA42→BA41 and BA42→BA22. For the left auditory cortex, the input was BA41, with unidirectional parallel effective connectivities BA41→BA42 and BA41→BA22. The connectivity between the activated auditory areas suggests the existence of a signal pathway in the auditory cortices even when the subject is listening to noise. (author)

  20. Activity Based Learning in a Freshman Global Business Course: Analyses of Preferences and Demographic Differences

    Science.gov (United States)

    Levine, Mark F.; Guy, Paul W.

    2007-01-01

    The present study investigates pre-business students' reaction to Activity Based Learning in a lower division core required course entitled Introduction to Global Business in the business curriculum at California State University Chico. The study investigates students' preference for Activity Based Learning in comparison to a more traditional…

  1. Financial relationships in economic analyses of targeted therapies in oncology.

    Science.gov (United States)

    Valachis, Antonis; Polyzos, Nikolaos P; Nearchou, Andreas; Lind, Pehr; Mauri, Davide

    2012-04-20

    A potential financial relationship between investigators and pharmaceutical manufacturers has been associated with an increased likelihood of reporting favorable conclusions about a sponsor's proprietary agent in pharmacoeconomic studies. The purpose of this study is to investigate whether there is an association between financial relationships and outcome in economic analyses of new targeted therapies in oncology. We searched PubMed (last update June 2011) for economic analyses of targeted therapies (including monoclonal antibodies, tyrosine-kinase inhibitors, and mammalian target of rapamycin inhibitors) in oncology. The trials were qualitatively rated regarding the cost assessment as favorable, neutral, or unfavorable on the basis of prespecified criteria. Overall, 81 eligible studies were identified. Economic analyses that were funded by pharmaceutical companies were more likely to report favorable qualitative cost estimates (28 [82%] of 34 v 21 [45%] of 47; P = .003). The presence of an author affiliated with manufacturer was not associated with study outcome. Furthermore, if only studies including a conflict of interest statement were included (66 of 81), studies that reported any financial relationship with manufacturers (author affiliation and/or funding and/or other financial relationship) were more likely to report favorable results of targeted therapies compared with studies without financial relationship (32 [71%] of 45 v nine [43%] of 21; P = .025). Our study reveals a potential threat for industry-related bias in economic analyses of targeted therapies in oncology in favor of analyses with financial relationships between authors and manufacturers. A more balanced funding of economic analyses from other sources may allow greater confidence in the interpretation of their results.
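
    The funded-versus-non-funded comparison can be checked approximately from the counts in the abstract with a hand-rolled Pearson chi-square on the 2x2 table. No continuity correction is applied here, so the p-value implied by this statistic will differ slightly from the reported P = .003.

    ```python
    def chi2_2x2(a, b, c, d):
        """Pearson chi-square statistic (1 df, no continuity correction)
        for the 2x2 contingency table [[a, b], [c, d]]."""
        n = a + b + c + d
        num = n * (a * d - b * c) ** 2
        den = (a + b) * (c + d) * (a + c) * (b + d)
        return num / den

    # favourable vs unfavourable/neutral conclusions, by industry funding
    funded_fav, funded_not = 28, 6      # 28 of 34 funded studies favourable
    other_fav, other_not = 21, 26       # 21 of 47 non-funded studies favourable
    stat = chi2_2x2(funded_fav, funded_not, other_fav, other_not)
    ```

    The statistic comfortably exceeds the 1-df critical value of 3.84 at the 0.05 level, consistent with the reported association.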

  2. Population-based cost-offset analyses for disorder-specific treatment of anorexia nervosa and bulimia nervosa in Germany.

    Science.gov (United States)

    Bode, Katharina; Götz von Olenhusen, Nina Maria; Wunsch, Eva-Maria; Kliem, Sören; Kröger, Christoph

    2017-03-01

    Previous research has shown that anorexia nervosa (AN) and bulimia nervosa (BN) are expensive illnesses to treat. To reduce their economic burden, adequate interventions need to be established. Our objective was to conduct cost-offset analyses for evidence-based treatment of eating disorders using outcome data from a psychotherapy trial involving cognitive behavioral therapy (CBT) and focal psychodynamic therapy (FPT) for AN and a trial involving CBT for BN. Assuming a currently running, ideal healthcare system using a 12-month, prevalence-based approach and varying the willingness to participate in treatment, we investigated whether the potential financial benefits of AN- and BN-related treatment outweigh the therapy costs at the population level. We elaborated on a formula that allows calculating cost-benefit relationships whereby the calculation of the parameters is based on estimates from data of health institutions within the German healthcare system. Additional intangible benefits were calculated with the aid of Quality-Adjusted Life Years. The annual costs of an untreated eating disorder were 2.38 billion EUR for AN and 617.69 million EUR for BN. Independent of the willingness to participate in treatment, the cost-benefit relationships for the treatment remained constant at 2.51 (CBT) and 2.33 (FPT) for AN and 4.05 (CBT) for BN. This consistency implies that for each EUR invested in the treatment, between 2.33 and 4.05 EUR could be saved each year. Our findings suggest that the implementation of evidence-based psychotherapy treatments for AN and BN may achieve substantial cost savings at the population level. © 2017 Wiley Periodicals, Inc.
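
    The reported constancy of the cost-benefit relationship across willingness-to-participate levels follows directly from the arithmetic: the participation factor multiplies both total benefit and total cost and therefore cancels. A toy calculation with invented per-case figures:

    ```python
    def cost_benefit_ratio(per_case_benefit, per_case_cost, n_cases, willingness):
        """Population-level benefit/cost ratio when n_cases * willingness
        patients are treated; the willingness factor cancels out."""
        treated = n_cases * willingness
        return (per_case_benefit * treated) / (per_case_cost * treated)

    # hypothetical per-case benefit and therapy cost in EUR; only the ratio matters
    ratios = [cost_benefit_ratio(25_000, 10_000, 40_000, w) for w in (0.2, 0.5, 0.9)]
    ```

    The ratio is identical for every willingness value, mirroring the constant 2.33-4.05 relationships reported in the study.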

  3. Smokers' increased risk for disability pension: social confounding or health-mediated effects? Gender-specific analyses of the Hordaland Health Study cohort.

    Science.gov (United States)

    Haukenes, Inger; Riise, Trond; Haug, Kjell; Farbu, Erlend; Maeland, John Gunnar

    2013-09-01

    Studies indicate that cigarette smokers have an increased risk for disability pension, presumably mediated by adverse health effects. However, smoking is also related to socioeconomic status. The current study examined the association between smoking and subsequent disability pension, and whether the association is explained by social confounding and/or health-related mediation. A subsample of 7934 men and 8488 women, aged 40-46, from the Hordaland Health Study, Norway (1997-1999), provided baseline information on smoking status, self-reported health measures and socioeconomic status. Outcome was register-based disability pension from 12 months after baseline to end of 2004. Gender stratified Cox regression analyses were used adjusted for socioeconomic status, physical activity, self-reported health and musculoskeletal pain sites. A total of 155 (2%) men and 333 (3.9%) women were granted disability pension during follow-up. The unadjusted disability risk associated with heavy smoking versus non-smoking was 1.88 (95% CI 1.23 to 2.89) among men and 3.06 (95% CI 2.23 to 4.20) among women. In multivariate analyses, adjusting for socioeconomic status, HRs were 1.33 (95% CI 0.84 to 2.11) among men and 2.22 (95% CI 1.58 to 3.13) among women. Final adjustment for physical activity, self-reported health and musculoskeletal pain further reduced the effect of heavy smoking in women (HR=1.53, 95% CI 1.09 to 2.16). Socioeconomic status confounded the smoking-related risk for disability pension; for female heavy smokers, however, a significant increased risk persisted after adjustment. Women may be particularly vulnerable to heavy smoking and to its sociomedical consequences, such as disability pension.

  4. Voxel-based analyses of gray/white matter volume and diffusion tensor data in major depression. Presidential award proceedings

    International Nuclear Information System (INIS)

    Abe, Osamu; Yamasue, Hidenori; Kasai, Kiyoto

    2008-01-01

    Previous neuroimaging studies have revealed that frontolimbic dysfunction may contribute to the pathophysiology of major depressive disorder. We used voxel-based analysis to simultaneously elucidate regional changes in gray/white matter volume, mean diffusivity (MD), and fractional anisotropy (FA) in the central nervous system of patients with unipolar major depression. We studied 21 right-handed patients and 42 age- and gender-matched right-handed normal subjects without central nervous system disorders. All image processing and statistical analyses were performed using SPM5 software. Local areas showing significant gray matter volume reduction in depressive patients compared with normal controls were observed in the right parahippocampal gyrus, hippocampus, bilateral middle frontal gyri, bilateral anterior cingulate cortices, left parietal and occipital lobes, and right superior temporal gyrus. Local areas showing increased mean diffusivity in depressive patients were observed in the bilateral parahippocampal gyri, hippocampus, pons, cerebellum, left frontal and temporal lobes, and right frontal lobe. There was no significant difference between the 2 groups for fractional anisotropy and white matter volume in the entire brain. Although there was no local area in which FA and MD were significantly correlated with disease severity, FA tended to correlate negatively with depression days (total accumulated days in depressive state) in the right anterior cingulate and the left frontal white matter (FDR-corrected P=0.055 for both areas). These results suggest that the frontolimbic neural circuit may play an important role in the neuropathology of patients with major depression. (author)

  5. Measuring social capital through multivariate analyses for the IQ-SC.

    Science.gov (United States)

    Campos, Ana Cristina Viana; Borges, Carolina Marques; Vargas, Andréa Maria Duarte; Gomes, Viviane Elisangela; Lucas, Simone Dutra; Ferreira e Ferreira, Efigênia

    2015-01-20

    Social capital can be viewed as a societal process that works toward the common good as well as toward the good of the collective based on trust, reciprocity, and solidarity. Our study aimed to present two multivariate statistical analyses to examine the formation of latent classes of social capital using the IQ-SC and to identify the most important factors in building an indicator of individual social capital. A cross-sectional study was conducted in 2009 among working adolescents supported by a Brazilian NGO. The sample consisted of 363 individuals, and data were collected using the World Bank Questionnaire for measuring social capital. First, the participants were grouped by a segmentation analysis using the Two Step Cluster method based on the Euclidian distance and the centroid criteria as the criteria for aggregate answers. Using specific weights for each item, discriminant analysis was used to validate the cluster analysis in an attempt to maximize the variance among the groups with respect to the variance within the clusters. "Community participation" and "trust in one's neighbors" contributed significantly to the development of the model with two distinct discriminant functions (p < 0.001). The majority of cases (95.0%) and non-cases (93.1%) were correctly classified by discriminant analysis. The two multivariate analyses (segmentation analysis and canonical discriminant analysis), used together, can be considered good choices for measuring social capital. Our results indicate that it is possible to form three social capital groups (low, medium and high) using the IQ-SC.

  6. Sequencing, Characterization, and Comparative Analyses of the Plastome of Caragana rosea var. rosea

    Directory of Open Access Journals (Sweden)

    Mei Jiang

    2018-05-01

    To exploit the drought-resistant Caragana species, we performed a comparative study of the plastomes of four species: Caragana rosea, C. microphylla, C. kozlowii, and C. korshinskii. The complete plastome sequence of C. rosea was obtained using next-generation DNA sequencing technology. The genome is a circular structure of 133,122 bases and lacks an inverted repeat. It contains 111 unique genes, including 76 protein-coding, 30 tRNA, and four rRNA genes. Repeat analyses identified 239, 244, 258, and 246 simple sequence repeats in C. rosea, C. microphylla, C. kozlowii, and C. korshinskii, respectively. Analyses of sequence divergence found two intergenic regions, trnI-CAU-ycf2 and trnN-GUU-ycf1, exhibiting a high degree of variation. Phylogenetic analyses showed that the four Caragana species belong to a monophyletic clade. Analyses of Ka/Ks ratios revealed that five genes (rpl16, rpl20, rps11, rps7, and ycf1) and several sites have undergone strong positive selection in the Caragana branch. The results lay the foundation for the development of molecular markers and the understanding of the evolutionary process underlying drought-resistant characteristics.
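
    Simple sequence repeats of the kind counted here are usually located with dedicated tools (e.g. MISA); as a minimal illustration, a mononucleotide-only scan can be written with a regular expression. The sequence and the repeat-length threshold below are invented.

    ```python
    import re

    def find_ssrs(seq, min_repeats=5):
        """Locate mononucleotide simple sequence repeats (SSRs) of at least
        min_repeats consecutive identical bases; returns (position, run) pairs."""
        pattern = re.compile(
            r"(A{%d,}|C{%d,}|G{%d,}|T{%d,})" % ((min_repeats,) * 4)
        )
        return [(m.start(), m.group()) for m in pattern.finditer(seq)]

    # toy sequence: one A-run of 6 and one T-run of 7 qualify at threshold 5
    seq = "ATGCAAAAAACGTTTTTTTGCGCGATATATATAT"
    ssrs = find_ssrs(seq, min_repeats=5)
    ```

    A real SSR scan would also cover di- to hexanucleotide motifs, which is where the (AT)n run at the end of the toy sequence would be picked up.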

  7. Estimating the effect of current, previous and never use of drugs in studies based on prescription registries

    DEFF Research Database (Denmark)

    Nielsen, Lars Hougaard; Løkkegaard, Ellen; Andreasen, Anne Helms

    2009-01-01

    PURPOSE: Many studies which investigate the effect of drugs categorize the exposure variable into never, current, and previous use of the study drug. When prescription registries are used to make this categorization, the exposure variable possibly gets misclassified, since the registries do not carry any information on the time of discontinuation of treatment. In this study, we investigated the amount of misclassification of exposure (never, current, previous use) to hormone therapy (HT) when the exposure variable was based on prescription data. Furthermore, we evaluated the significance of this misclassification for analysing the risk of breast cancer. MATERIALS AND METHODS: Prescription data were obtained from the Danish Registry of Medicinal Products Statistics and we applied various methods to approximate treatment episodes. We analysed the duration of HT episodes to study the ability to identify…
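
    The never/current/previous classification from prescription dates can be sketched as below. The 90-day supply window is an invented assumption standing in for the unknown discontinuation date, which is exactly the kind of guess that produces the misclassification the study quantifies.

    ```python
    from datetime import date, timedelta

    def classify_exposure(prescriptions, index_date, supply_days=90):
        """Classify drug exposure at index_date as 'never', 'current' or
        'previous', assuming each redeemed prescription covers supply_days
        (registries record no discontinuation date, so this is the key guess)."""
        if not prescriptions:
            return "never"
        last = max(prescriptions)
        if last > index_date:
            raise ValueError("prescription after index date")
        if index_date - last <= timedelta(days=supply_days):
            return "current"
        return "previous"

    idx = date(2005, 6, 1)
    status_a = classify_exposure([date(2005, 4, 20)], idx)  # within supply window
    status_b = classify_exposure([date(2004, 1, 5)], idx)   # window long elapsed
    status_c = classify_exposure([], idx)                   # no redemptions ever
    ```

    Shortening or lengthening `supply_days` moves users between the 'current' and 'previous' categories, which is how exposure misclassification arises.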

  8. CPN Tools for Editing, Simulating, and Analysing Coloured Petri Nets

    DEFF Research Database (Denmark)

    Ratzer, Anne Vinter; Wells, Lisa Marie; Lassen, Henry Michael

    2003-01-01

    CPN Tools is a tool for editing, simulating and analysing Coloured Petri Nets. The GUI is based on advanced interaction techniques, such as toolglasses, marking menus, and bi-manual interaction. Feedback facilities provide contextual error messages and indicate dependency relationships between net elements. The state space facilities provide information such as boundedness properties and liveness properties. The functionality of the simulation engine and state space facilities are similar to the corresponding components in Design/CPN, which is a widespread tool for Coloured Petri Nets.
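
    A coloured Petri net generalises the basic place/transition net, attaching data values to tokens. The uncoloured sketch below only illustrates the enabling and firing semantics that a simulator such as the one in CPN Tools builds on; the net itself (a two-transition producer/consumer) is invented.

    ```python
    def enabled(marking, transition):
        """A transition is enabled when every input place holds enough tokens."""
        return all(marking.get(p, 0) >= n for p, n in transition["in"].items())

    def fire(marking, transition):
        """Fire an enabled transition: consume input tokens, produce outputs."""
        m = dict(marking)
        for p, n in transition["in"].items():
            m[p] -= n
        for p, n in transition["out"].items():
            m[p] = m.get(p, 0) + n
        return m

    # toy producer/consumer net with a one-slot buffer
    t_produce = {"in": {"free": 1}, "out": {"buffer": 1}}
    t_consume = {"in": {"buffer": 1}, "out": {"free": 1, "done": 1}}

    marking = {"free": 1, "buffer": 0, "done": 0}
    marking = fire(marking, t_produce)
    assert not enabled(marking, t_produce)   # buffer slot now occupied
    marking = fire(marking, t_consume)
    ```

    State-space analysis of the kind the abstract mentions amounts to exhaustively exploring all markings reachable through `fire`.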

  9. Public Librarians as Partners in Problem-Based Learning in Secondary Schools: A Case Study in Finland

    Science.gov (United States)

    Pietikäinen, Virpi; Kortelainen, Terttu; Siklander, Pirkko

    2017-01-01

    Introduction: Teachers in Finland are demanded to develop students' competencies in information literacy. However, they can meet this demand only by collaborating with public librarians. The aim in this case study was to explore the perspectives of teachers, librarians and students in a problem-based project and to analyse the advantages and…

  10. Health economic studies: an introduction to cost-benefit, cost-effectiveness, and cost-utility analyses.

    Science.gov (United States)

    Angevine, Peter D; Berven, Sigurd

    2014-10-15

    Narrative overview. To provide clinicians with a basic understanding of economic studies, including cost-benefit, cost-effectiveness, and cost-utility analyses. As decisions regarding public health policy, insurance reimbursement, and patient care incorporate factors other than traditional outcomes such as satisfaction or symptom resolution, health economic studies are increasingly prominent in the literature. This trend will likely continue, and it is therefore important for clinicians to have a fundamental understanding of the common types of economic studies and be able to read them critically. In this brief article, the basic concepts of economic studies and the differences between cost-benefit, cost-effectiveness, and cost-utility studies are discussed. An overview of the field of health economic analysis is presented. Cost-benefit, cost-effectiveness, and cost-utility studies all integrate cost and outcome data into a decision analysis model. These different types of studies are distinguished mainly by the way in which outcomes are valued. Obtaining accurate cost data is often difficult and can limit the generalizability of a study. With a basic understanding of health economic analysis, clinicians can be informed consumers of these important studies.
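
    The quantity shared by cost-effectiveness and cost-utility analyses is the incremental cost-effectiveness ratio (ICER): the extra cost per extra unit of outcome, where cost-utility studies measure the outcome in quality-adjusted life years (QALYs). A toy calculation with invented figures:

    ```python
    def icer(cost_new, effect_new, cost_old, effect_old):
        """Incremental cost-effectiveness ratio: extra cost per extra unit of
        effect (e.g. cost per QALY gained in a cost-utility analysis)."""
        return (cost_new - cost_old) / (effect_new - effect_old)

    # hypothetical comparison: the new treatment costs more but adds 0.5 QALYs
    ratio = icer(cost_new=30_000, effect_new=6.5, cost_old=18_000, effect_old=6.0)
    ```

    A decision maker would then compare this ratio against a willingness-to-pay threshold to judge whether the new treatment is worth funding.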

  11. Study of CP Violation in Dalitz-Plot Analyses of B-Meson Decays to Three Kaons

    Energy Technology Data Exchange (ETDEWEB)

    Lindquist, Brian [Stanford Univ., CA (United States)

    2012-02-01

    The Standard Model (SM) explains CP violation in terms of the CKM matrix. The BABAR experiment was designed mainly to test the CKM model in B decays. B decays that proceed through b → s loop diagrams, of which B → KKK decays are an example, are sensitive to new physics effects that could lead to deviations from the CKM predictions for CP violation. We present studies of CP violation in the decays B+ → K+K-K+, B+ → KS0KS0K+, and B0 → K+K-KS0, using a Dalitz-plot amplitude analysis. These studies are based on approximately 470 million B B̄ decays collected by BABAR at the PEP-II collider at SLAC. We perform measurements of time-dependent CP violation in B0 → K+K-KS0, including B0 → ΦKS0. We measure a CP-violating phase βeff (ΦKS0) = 0.36 ± 0.11 ± 0.04 rad, in agreement with the SM. This is the world's most precise measurement of this quantity. We also measure direct CP asymmetries in all three decay modes, including the direct CP asymmetry ACP (ΦK+) = (12.8 ± 4.4 ± 1.3)%, which is 2.8 sigma away from zero. This measurement is in tension with the SM, which predicts an asymmetry of a few percent. We also study the resonant and nonresonant features in the B → KKK Dalitz plots. We find that the hypothetical scalar fX(1500) resonance, introduced by prior analyses to explain an unknown peak in the mKK spectrum, cannot adequately describe the data. We conclude instead that the fX(1500) can be explained as the sum of the f0(1500), f'2(1525), and f0(1710) resonances, removing the need for the hypothetical fX(1500). We also find that an exponential
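
    The quoted 2.8 sigma can be checked from the numbers in the abstract by adding the statistical and systematic uncertainties in quadrature:

    ```python
    import math

    def significance(value, stat_err, syst_err):
        """Deviation from zero in standard deviations, combining statistical
        and systematic uncertainties in quadrature."""
        return value / math.sqrt(stat_err ** 2 + syst_err ** 2)

    # A_CP(Phi K+) = (12.8 +/- 4.4 (stat) +/- 1.3 (syst))% from the abstract
    n_sigma = significance(12.8, 4.4, 1.3)
    ```

    This simple quadrature combination assumes the two uncertainty sources are independent and Gaussian, the usual convention for quoting such deviations.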

  12. Polyploidy in the Olive Complex (Olea europaea): Evidence from Flow Cytometry and Nuclear Microsatellite Analyses

    Science.gov (United States)

    Besnard, G.; Garcia-Verdugo, C.; Rubio De Casas, R.; Treier, U. A.; Galland, N.; Vargas, P.

    2008-01-01

    Background: Phylogenetic and phylogeographic investigations have previously been performed to study the evolution of the olive tree complex (Olea europaea). A particularly high genomic diversity has been found in north-west Africa. However, to date no exhaustive study has addressed putative polyploidization events and their evolutionary significance in the diversification of the olive tree and its relatives. Methods: Representatives of the six olive subspecies were investigated using (a) flow cytometry to estimate genome content, and (b) six highly variable nuclear microsatellites to assess the presence of multiple alleles at co-dominant loci. In addition, nine individuals from a controlled cross between two individuals of O. europaea subsp. maroccana were characterized with microsatellites to check chromosome inheritance. Key Results: Based on flow cytometry and genetic analyses, strong evidence for polyploidy was obtained in subspp. cerasiformis (tetraploid) and maroccana (hexaploid), whereas the other subspecies appeared to be diploid. Agreement between flow cytometry and genetic analyses provides an alternative to chromosome counting for determining the ploidy level of trees. Lastly, abnormalities in chromosome inheritance leading to aneuploid formation were revealed using microsatellite analyses in the offspring of the controlled cross in subsp. maroccana. Conclusions: This study constitutes the first report of multiple polyploidy in olive tree relatives. The formation of tetraploids and hexaploids may have played a major role in the diversification of the olive complex in north-west Africa. The fact that polyploidy is found in narrow endemic subspecies from Madeira (subsp. cerasiformis) and the Agadir Mountains (subsp. maroccana) suggests that polyploidization has been favoured to overcome inbreeding depression. Lastly, based on previous phylogenetic analyses, we hypothesize that subsp. cerasiformis resulted from hybridization between ancestors
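
    One of the genetic arguments used above, that an individual cannot carry more distinct alleles at a co-dominant microsatellite locus than its number of chromosome copies, gives a simple lower bound on ploidy. The allele sizes below are invented for illustration.

    ```python
    def min_ploidy(genotypes):
        """Lower bound on ploidy: the maximum number of distinct alleles
        observed at any single co-dominant locus in one individual."""
        return max(len(set(alleles)) for alleles in genotypes)

    # distinct microsatellite allele sizes per locus in one individual
    diploid_like = [(152, 158), (200, 200), (171, 175)]          # never > 2
    hexaploid_like = [(152, 154, 158, 160, 166), (200, 204, 208), (171,)]
    ```

    This is only a lower bound (a polyploid can be homozygous at a locus), which is why the study pairs the microsatellite evidence with flow-cytometry genome-size estimates.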

  13. Internal quality control of PCR-based genotyping methods in research studies and patient diagnostics

    DEFF Research Database (Denmark)

    Bladbjerg, Else-Marie; Gram, Jørgen; Jespersen, Jørgen

    2002-01-01

    Genetic analyses are increasingly integrated in the clinical laboratory, and internal quality control programmes are needed. We have focused on quality control aspects of selected polymorphism analyses used in thrombosis research. DNA was isolated from EDTA-blood (n = 500) by ammonium acetate… The quality control covered DNA-isolation (pre-analytical factors), DNA-amplification, digestion with restriction enzymes, electrophoresis (analytical factors), and result reading and entry into a database (post-analytical factors). Furthermore, we evaluated a procedure for result confirmation. Isolated DNA was of good quality (42 microg/ml blood… Control of data handling revealed 0.1% reading mistakes and 0.5% entry mistakes. Based on our experiences we propose an internal quality control programme for widely used PCR-based haemostasis polymorphism analyses.

  14. Feasibility study on sensor data fusion for the CP-140 aircraft: fusion architecture analyses

    Science.gov (United States)

    Shahbazian, Elisa

    1995-09-01

    Loral Canada completed, in May 1995, a Department of National Defense (DND) Chief of Research and Development (CRAD) contract to study the feasibility of implementing a multi-sensor data fusion (MSDF) system onboard the CP-140 Aurora aircraft. This system is expected to fuse data from: (a) attribute-measurement-oriented sensors (ESM, IFF, etc.); (b) imaging sensors (FLIR, SAR, etc.); (c) tracking sensors (radar, acoustics, etc.); (d) data from remote platforms (data links); and (e) non-sensor data (intelligence reports, environmental data, visual sightings, encyclopedic data, etc.). On purely theoretical grounds, a central-level fusion architecture should lead to a higher-performance fusion system. However, a number of system and fusion-architecture issues arise in fusing such dissimilar data: (1) the currently existing sensors are not designed to provide the type of data required by a fusion system; (2) the different types of data (attribute, imaging, tracking, etc.) may require different degrees of processing before they can be used efficiently within a fusion system; (3) the data quality from different sensors, and more importantly from remote platforms via the data links, must be taken into account before fusing; and (4) the non-sensor data may impose specific requirements on the fusion architecture (e.g. variable weight/priority for the data from different sensors). This paper presents the analyses performed for the selection of the fusion architecture for the enhanced sensor suite planned for the CP-140 aircraft, in the context of the mission requirements and environmental conditions.
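
    As a toy illustration of why central-level fusion is attractive in theory: fusing independent estimates of one quantity by inverse-variance weighting always yields a variance no larger than that of the best single sensor. The sensor values below are invented and stand in for, say, range estimates from radar, ESM and a data link.

    ```python
    def fuse_estimates(estimates):
        """Central-level fusion of independent (value, variance) estimates of
        one quantity by inverse-variance weighting; low-variance sensors
        dominate, and the fused variance shrinks below every input variance."""
        weights = [1.0 / var for _, var in estimates]
        total = sum(weights)
        fused = sum(w * x for w, (x, _) in zip(weights, estimates)) / total
        fused_var = 1.0 / total
        return fused, fused_var

    # hypothetical (value, variance) estimates from three dissimilar sources
    fused, var = fuse_estimates([(10.2, 0.04), (9.8, 0.25), (10.6, 1.0)])
    ```

    The practical issues listed in the abstract (dissimilar data types, unequal data quality, link latency) are precisely what complicates assigning these variances in a real system.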

  15. bc-GenExMiner 3.0: new mining module computes breast cancer gene expression correlation analyses.

    Science.gov (United States)

    Jézéquel, Pascal; Frénel, Jean-Sébastien; Campion, Loïc; Guérin-Charbonnel, Catherine; Gouraud, Wilfried; Ricolleau, Gabriel; Campone, Mario

    2013-01-01

    We recently developed a user-friendly web-based application called bc-GenExMiner (http://bcgenex.centregauducheau.fr), which offered the possibility to evaluate the prognostic informativity of genes in breast cancer by means of a 'prognostic module'. In this study, we develop a new module called the 'correlation module', which includes three kinds of gene expression correlation analyses. The first computes the correlation coefficient between 2 or more (up to 10) chosen genes. The second produces two lists of genes that are most correlated (positively and negatively) with a 'tested' gene. A gene ontology (GO) mining function is also proposed to explore enrichment of GO 'biological process', 'molecular function' and 'cellular component' terms in the output lists of most correlated genes. The third explores gene expression correlation between the 15 telomeric and 15 centromeric genes surrounding a 'tested' gene. These correlation analyses can be performed in different groups of patients: all patients (without any subtyping), molecular subtypes (basal-like, HER2+, luminal A and luminal B), and according to oestrogen receptor status. Validation tests based on published data showed that these automated analyses lead to results consistent with the studies' conclusions. In brief, this new module has been developed to help basic researchers explore the molecular mechanisms of breast cancer. DATABASE URL: http://bcgenex.centregauducheau.fr
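
    The pairwise gene-expression correlation such a module computes is, at its core, a Pearson coefficient. A self-contained sketch on invented expression values (the abstract does not specify which correlation coefficient the module uses, so Pearson is an assumption here):

    ```python
    import math

    def pearson(x, y):
        """Pearson correlation coefficient between two expression vectors."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    # hypothetical log-expression of two genes across six tumour samples
    gene_a = [2.1, 3.4, 1.8, 4.0, 2.9, 3.6]
    gene_b = [1.9, 3.1, 1.5, 4.2, 2.7, 3.3]
    r = pearson(gene_a, gene_b)
    ```

    Ranking every other gene by this coefficient against a 'tested' gene yields the most-correlated gene lists the module reports.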

  16. Treatment algorithm based on the multivariate survival analyses in patients with advanced hepatocellular carcinoma treated with trans-arterial chemoembolization.

    Directory of Open Access Journals (Sweden)

    Hasmukh J Prajapati

    To develop a treatment algorithm from multivariate survival analyses (MVA) in patients with Barcelona Clinic Liver Cancer stage C (BCLC C) advanced hepatocellular carcinoma (HCC) treated with trans-arterial chemoembolization (TACE). Consecutive unresectable and non-transplantable patients with advanced HCC who received DEB TACE were studied. A total of 238 patients (mean age, 62.4 years) was included in the study. Survivals were analyzed according to different parameters from the time of the first DEB TACE. Kaplan-Meier and Cox proportional hazards models were used for survival analysis. A staging system (SS) was constructed from the MVA and named the BCLC C HCC Prognostic (BCHP) SS. Overall median survival (OS) was 16.2 months. HCC patients with venous thrombosis (VT) of a large vein [main portal vein (PV), right or left PV, hepatic vein, inferior vena cava] (22.7%), of a small vein (segmental/subsegmental PV) (9.7%), and with no VT had OSs of 6.4 months, 20 months and 22.8 months, respectively (p<0.001). On MVA, the significant independent prognostic factors (PFs) of survival were CP class, Eastern Cooperative Oncology Group (ECOG) performance status (PS), single HCC<5 cm, site of VT, metastases, serum creatinine and serum alpha-fetoprotein. Based on these PFs, the BCHP staging system was constructed. The OSs of stages I, II and III were 28.4 months, 11.8 months and 2.4 months, respectively (p<0.001). A treatment plan was proposed according to the different stages. On MVA of patients with advanced HCC treated with TACE, the significant independent PFs of survival were CP class, ECOG PS, single HCC<5 cm or others, site of VT, metastases, serum creatinine and serum alpha-fetoprotein. The new BCHP SS was proposed based on the MVA data to identify advanced HCC patients suitable for TACE treatment.
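
The median survival figures above come from Kaplan-Meier estimation. A minimal sketch of the estimator on a toy cohort with hypothetical follow-up times (not the study's data):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimator: S(t) = prod over event times t_i <= t of (1 - d_i / n_i).

    times: follow-up time per patient (months); events: 1 = death observed, 0 = censored.
    Returns (time, survival probability) at each distinct event time.
    """
    curve = []
    s = 1.0
    for t in sorted({t for t, e in zip(times, events) if e == 1}):
        n = sum(1 for u in times if u >= t)                         # at risk just before t
        d = sum(1 for u, e in zip(times, events) if u == t and e == 1)  # deaths at t
        s *= 1.0 - d / n
        curve.append((t, s))
    return curve

def median_survival(curve):
    """First event time at which S(t) drops to 0.5 or below (None if it never does)."""
    for t, s in curve:
        if s <= 0.5:
            return t
    return None

# toy cohort (months): 1 = death observed, 0 = censored
times  = [2, 4, 4, 6, 8, 10, 12, 12]
events = [1, 1, 0, 1, 1, 0,  1,  0]
curve = kaplan_meier(times, events)
print(curve[0])            # (2, 0.875): one death among eight at risk
print(median_survival(curve))
```

The Cox proportional hazards model used for the MVA requires iterative partial-likelihood fitting and is omitted here.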

  17. Using Inequality Measures to Incorporate Environmental Justice into Regulatory Analyses

    Science.gov (United States)

    Harper, Sam; Ruder, Eric; Roman, Henry A.; Geggel, Amelia; Nweke, Onyemaechi; Payne-Sturges, Devon; Levy, Jonathan I.

    2013-01-01

    Formally evaluating how specific policy measures influence environmental justice is challenging, especially in the context of regulatory analyses in which quantitative comparisons are the norm. However, there is a large literature on developing and applying quantitative measures of health inequality in other settings, and these measures may be applicable to environmental regulatory analyses. In this paper, we provide information to assist policy decision makers in determining the viability of using measures of health inequality in the context of environmental regulatory analyses. We conclude that quantification of the distribution of inequalities in health outcomes across social groups of concern, considering both within-group and between-group comparisons, would be consistent with both the structure of regulatory analysis and the core definition of environmental justice. Appropriate application of inequality indicators requires thorough characterization of the baseline distribution of exposures and risks, leveraging data generally available within regulatory analyses. Multiple inequality indicators may be applicable to regulatory analyses, and the choice among indicators should be based on explicit value judgments regarding the dimensions of environmental justice of greatest interest. PMID:23999551
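
One concrete family of indicators with the within-group/between-group structure described above is the Theil index, whose total decomposes exactly into between-group and within-group components. A sketch under hypothetical exposure data (the paper discusses inequality indicators generally; the Theil index is used here only as an illustration):

```python
from math import log

def theil(x):
    """Theil T index: (1/n) * sum (x_i/mu) * ln(x_i/mu); 0 means perfect equality."""
    mu = sum(x) / len(x)
    return sum((v / mu) * log(v / mu) for v in x) / len(x)

def theil_decomposition(groups):
    """Split the total Theil T into between-group and within-group components.

    groups: dict of group name -> list of individual exposure/risk values.
    Identity: T_total = T_between + sum_g (group g's share of total exposure) * T_g
    """
    allx = [v for g in groups.values() for v in g]
    n, mu = len(allx), sum(allx) / len(allx)
    between = sum((len(g) / n) * (sum(g) / len(g) / mu) * log(sum(g) / len(g) / mu)
                  for g in groups.values())
    within = sum((sum(g) / (n * mu)) * theil(g) for g in groups.values())
    return between, within

# hypothetical exposure levels for two social groups of concern
groups = {"group_A": [1.0, 1.2, 0.9, 1.1],
          "group_B": [2.5, 3.0, 2.8, 2.7]}
b, w = theil_decomposition(groups)
total = theil([v for g in groups.values() for v in g])
print(abs(b + w - total) < 1e-12)  # the decomposition is exact
```

The between-group term captures the disparity the environmental-justice analysis targets, while the within-group term captures residual inequality inside each group.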

  18. Post-facta Analyses of Fukushima Accident and Lessons Learned

    Energy Technology Data Exchange (ETDEWEB)

    Tanabe, Fumiya [Sociotechnical Systems Safety Research Institute, Ichige (Japan)

    2014-08-15

    Independent analyses have been performed of the core melt behavior of the Unit 1, Unit 2 and Unit 3 reactors of Fukushima Daiichi Nuclear Power Station on 11-15 March 2011. The analyses are based on a phenomenological methodology combining investigation of the measured data with a simple physical model calculation. The time variations of core water level, core material temperature and hydrogen generation rate are estimated. The analyses have revealed characteristics of the accident process in each reactor. For the Unit 2 reactor, the calculated result suggests little hydrogen generation, because no steam was generated in the core for the zirconium-steam reaction during the fuel damage process. This could be the reason why no hydrogen explosion occurred in the Unit 2 reactor building. Analyses were also performed of the core material behavior in another chaotic period, 19-31 March 2011, and resulted in a re-melt hypothesis: the core material in each reactor should have melted again owing to a shortage of cooling water. The hypothesis is consistent with many observed features of the dispersion of radioactive materials into the environment.

  19. Application of the CALUX bioassay for epidemiological study. Analyses of Belgian human plasma

    Energy Technology Data Exchange (ETDEWEB)

    Wouwe, N. van; Debacker, N.; Sasse, A. [Scientific Institute of Public Health, Brussels (BE)] (and others)

    2004-09-15

    The CALUX bioassay is a promising screening method for the detection of dioxin-like compounds. Its good sensitivity, low number of false negative results, and good correlations with GC-HRMS TEQ-values in feed and food analyses make it a strong candidate among first-line assessment methods. The small amount of sample needed, in addition to these advantages, suggests that the CALUX bioassay could be a good screening method for epidemiological studies. The Belgian epidemiological study concerning the possible effect of the dioxin incident on the body burden of the Belgian population was an opportunity to test this method against the gold-standard reference: GC-HRMS. The first part of this abstract presents epidemiological parameters (sensitivity, specificity, etc.) of the CALUX bioassay, using CALUX TEQ-values as estimators of the TEQ-values of the 17 PCDD/Fs. The second part examines epidemiological determinants observed for CALUX and GC-HRMS TEQ-values.
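
The screening-performance parameters mentioned (sensitivity, specificity and the predictive values) follow directly from a 2x2 comparison of CALUX calls against GC-HRMS calls. A minimal sketch with hypothetical classification results, where 'positive' means a TEQ above some chosen decision threshold:

```python
def screening_performance(screen_pos, ref_pos):
    """Sensitivity, specificity and predictive values of a screening assay
    (e.g. CALUX) against a reference method (e.g. GC-HRMS), both given as
    per-sample positive/negative calls."""
    pairs = list(zip(screen_pos, ref_pos))
    tp = sum(1 for s, r in pairs if s and r)
    fp = sum(1 for s, r in pairs if s and not r)
    fn = sum(1 for s, r in pairs if not s and r)
    tn = sum(1 for s, r in pairs if not s and not r)
    return {"sensitivity": tp / (tp + fn),   # true positives found
            "specificity": tn / (tn + fp),   # true negatives found
            "ppv": tp / (tp + fp),           # positive predictive value
            "npv": tn / (tn + fn)}           # negative predictive value

# hypothetical calls for 10 plasma samples (1 = above threshold)
calux  = [1, 1, 1, 0, 0, 1, 0, 0, 1, 0]
gchrms = [1, 1, 1, 0, 0, 0, 0, 1, 1, 0]
perf = screening_performance(calux, gchrms)
print(perf)  # all four parameters equal 0.8 for this toy data
```

For a screening method, a low false-negative rate (high sensitivity) is the key requirement, as the abstract notes.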

  20. Agreement between the results of meta-analyses from case reports and from clinical studies regarding the efficacy of laronidase therapy in patients with mucopolysaccharidosis type I who initiated enzyme replacement therapy in adult age: An example of case reports meta-analyses as a useful tool for evidence-based medicine in rare diseases.

    Science.gov (United States)

    Sampayo-Cordero, Miguel; Miguel-Huguet, Bernat; Pardo-Mateos, Almudena; Moltó-Abad, Marc; Muñoz-Delgado, Cecilia; Pérez-López, Jordi

    2018-02-01

    Case reports might have a prominent role in the rare diseases field, due to the small number of patients affected by any one such disease. A previous systematic review regarding the efficacy of laronidase therapy in patients with mucopolysaccharidosis type I (MPS-I) who initiated enzyme replacement therapy (ERT) in adult age has been published. The review included a meta-analysis of 19 clinical studies and the description of eleven case reports. It was of interest to perform a meta-analysis of those case reports, to explore the role of such meta-analyses as a tool for evidence-based medicine in rare diseases. The study included all case reports with a standard treatment regimen. The primary analysis was the percentage of case reports showing an improvement in a specific outcome; the improvement was confirmed as such only when that percentage was statistically higher than 5%. The outcomes that met this criterion were ranked and compared to the GRADE criteria obtained by those same outcomes in the previous meta-analysis of clinical studies. Three outcomes showed a significant improvement: urine glycosaminoglycans, liver volume and the 6-minute walking test. Positive and negative predictive values, sensitivity and specificity for the results of the meta-analysis of case reports, as compared to that of clinical studies, were 100%, 88.9%, 75% and 100%, respectively. Accordingly, absolute (Rho=0.82, 95%CI: 0.47 to 0.95) and relative agreement (Kappa=0.79, 95%CI: 0.593 to 0.99) between the number of case reports with improvement in a specific outcome and the GRADE evidence score for that outcome were good. Sensitivity analysis showed that agreement between the meta-analysis of case reports and that of the clinical studies was good only when using a strong confirmatory strategy for outcome improvement in case reports. We found an agreement between the results of meta-analyses from case reports and from clinical studies in the efficacy of laronidase therapy in
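
The confirmation rule described, that the share of case reports showing improvement must be statistically higher than 5%, can be checked with a one-sided exact binomial test. A sketch with hypothetical counts (the paper does not specify which exact test it used, so this is an illustrative reading):

```python
from math import comb

def binom_sf(k, n, p):
    """Exact upper tail P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def outcome_confirmed(n_reports, n_improved, p0=0.05, alpha=0.05):
    """One-sided exact test: is the proportion of case reports showing
    improvement statistically higher than the 5% reference level?"""
    return binom_sf(n_improved, n_reports, p0) < alpha

# hypothetical counts for a series of 11 case reports
print(outcome_confirmed(11, 7))   # e.g. an outcome improved in 7 of 11 reports
print(outcome_confirmed(11, 1))   # a single improvement is not enough evidence
```

With 11 reports, 7 improvements give a tail probability far below 0.05, while a single improvement is entirely compatible with the 5% chance level.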

  1. The Network of Counterparty Risk: Analysing Correlations in OTC Derivatives.

    Science.gov (United States)

    Nanumyan, Vahan; Garas, Antonios; Schweitzer, Frank

    2015-01-01

    Counterparty risk denotes the risk that a party defaults in a bilateral contract. This risk not only depends on the two parties involved, but also on the risk from various other contracts each of these parties holds. In rather informal markets, such as the OTC (over-the-counter) derivative market, institutions only report their aggregated quarterly risk exposure, but no details about their counterparties. Hence, little is known about the diversification of counterparty risk. In this paper, we reconstruct the weighted and time-dependent network of counterparty risk in the OTC derivatives market of the United States between 1998 and 2012. To proxy unknown bilateral exposures, we first study the co-occurrence patterns of institutions based on their quarterly activity and ranking in the official report. The network obtained this way is further analysed by a weighted k-core decomposition, to reveal a core-periphery structure. This allows us to compare the activity-based ranking with a topology-based ranking, to identify the most important institutions and their mutual dependencies. We also analyse correlations in these activities, to show strong similarities in the behavior of the core institutions. Our analysis clearly demonstrates the clustering of counterparty risk in a small set of about a dozen US banks. This not only increases the default risk of the central institutions, but also the default risk of peripheral institutions which have contracts with the central ones. Hence, all institutions indirectly have to bear (part of) the counterparty risk of all others, which needs to be better reflected in the price of OTC derivatives.
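
The k-core decomposition used to reveal the core-periphery structure can be computed by iteratively peeling off minimum-degree nodes. A sketch on a toy unweighted network (the paper uses a weighted variant; institutions A-F here are hypothetical):

```python
def core_numbers(adj):
    """Core number of every node by iterative peeling of minimum-degree nodes.

    adj: dict node -> set of neighbours (undirected graph).
    A node with core number k belongs to the k-core: the maximal subgraph
    in which every node has at least k neighbours.
    """
    adj = {v: set(ns) for v, ns in adj.items()}   # work on a copy
    deg = {v: len(ns) for v, ns in adj.items()}
    core, k = {}, 0
    while deg:
        v = min(deg, key=deg.get)       # peel the current minimum-degree node
        k = max(k, deg[v])              # core number never decreases
        core[v] = k
        for u in adj[v]:
            adj[u].discard(v)
            deg[u] -= 1
        del deg[v], adj[v]
    return core

# toy market: four densely connected 'core' banks plus two peripheral ones
edges = [("A", "B"), ("A", "C"), ("A", "D"), ("B", "C"), ("B", "D"), ("C", "D"),
         ("E", "A"), ("F", "B")]
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)
cores = core_numbers(adj)
print(cores)  # A-D sit in the 3-core; E and F are 1-core periphery
```

In the weighted variant used in the paper, degree is replaced by a weighted degree, but the peeling logic is analogous.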

  2. Aleatoric and epistemic uncertainties in sampling based nuclear data uncertainty and sensitivity analyses

    International Nuclear Information System (INIS)

    Zwermann, W.; Krzykacz-Hausmann, B.; Gallner, L.; Klein, M.; Pautz, A.; Velkov, K.

    2012-01-01

    Sampling based uncertainty and sensitivity analyses due to epistemic input uncertainties, i.e. to an incomplete knowledge of uncertain input parameters, can be performed with arbitrary application programs to solve the physical problem under consideration. For the description of steady-state particle transport, direct simulations of the microscopic processes with Monte Carlo codes are often used. This introduces an additional source of uncertainty, the aleatoric sampling uncertainty, which is due to the randomness of the simulation process performed by sampling, and which adds to the total combined output sampling uncertainty. So far, this aleatoric part of the uncertainty has been minimized by running a sufficiently large number of Monte Carlo histories for each sample calculation, thus making its impact negligible compared to the impact of sampling the epistemic uncertainties. Obviously, this process may incur high computational costs. The present paper shows that in many applications reliable epistemic uncertainty results can also be obtained with substantially lower computational effort by performing and analyzing two appropriately generated series of samples, each with a much smaller number of Monte Carlo histories. The method is applied, along with the nuclear data uncertainty and sensitivity code package XSUSA in combination with the Monte Carlo transport code KENO-Va, to various critical assemblies and a full-scale reactor calculation. It is shown that the proposed method yields output uncertainties and sensitivities equivalent to the traditional approach, with a reduction of computing time by factors on the order of 100. (authors)
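
The key idea, that two sample series generated from the same epistemic samples but with independent Monte Carlo noise let one isolate the epistemic variance, can be illustrated with synthetic data: the covariance of the paired series estimates the epistemic part, because the aleatoric noise is independent between the two runs. This is an illustrative reading of the approach, not the XSUSA implementation:

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples = 200

# stand-in for a Monte Carlo transport calculation: a 'true' response per
# sampled nuclear-data set (epistemic spread of 0.005), plus statistical
# (aleatoric) noise that would shrink with more Monte Carlo histories
true_keff = rng.normal(1.000, 0.005, n_samples)
sigma_mc = 0.003                                  # few histories -> large noise

run1 = true_keff + rng.normal(0.0, sigma_mc, n_samples)
run2 = true_keff + rng.normal(0.0, sigma_mc, n_samples)  # same samples, new seeds

# covariance of the paired series isolates the epistemic variance, since the
# aleatoric noise terms of run1 and run2 are independent
var_epistemic = float(np.cov(run1, run2)[0, 1])
print(round(float(np.sqrt(var_epistemic)), 4))  # close to the 0.005 input spread
```

Each total sample variance, by contrast, would overestimate the epistemic spread by the aleatoric contribution sigma_mc squared.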

  3. Analysing and evaluating the task of automatic tweet generation: Knowledge to business

    OpenAIRE

    Lloret, Elena; Palomar, Manuel

    2016-01-01

    In this paper a study concerning the evaluation and analysis of natural language tweets is presented. Based on our experience in text summarisation, we carry out a deep analysis of users' perception through the evaluation of tweets manually and automatically generated from news. Specifically, we consider two key aspects of a tweet: its informativeness and its interestingness. Therefore, we analyse: (1) do users perceive manual and automatic tweets equally?; (2) what linguistic features a good tw...

  4. SuperTRI: A new approach based on branch support analyses of multiple independent data sets for assessing reliability of phylogenetic inferences.

    Science.gov (United States)

    Ropiquet, Anne; Li, Blaise; Hassanin, Alexandre

    2009-09-01

    Supermatrix and supertree are two methods for constructing a phylogenetic tree from multiple data sets. However, these methods are not a panacea, as conflicting signals between data sets can lead to misinterpretation of the evolutionary history of taxa. In particular, the supermatrix approach is expected to be misleading if the species-tree signal is not dominant after the combination of the data sets. Moreover, most current supertree methods suffer from two limitations: (i) they ignore or misinterpret secondary (non-dominant) phylogenetic signals of the different data sets; and (ii) the logical basis of node robustness measures is unclear. To overcome these limitations, we propose a new approach, called SuperTRI, which is based on branch support analyses of the independent data sets, and where the reliability of the nodes is assessed using three measures: the supertree bootstrap percentage and two other values calculated from the separate analyses: the mean branch support (mean bootstrap percentage or mean posterior probability) and the reproducibility index. The SuperTRI approach is tested on a data matrix including seven genes for 82 taxa of the family Bovidae (Mammalia, Ruminantia), and the results are compared to those found with the supermatrix approach. The phylogenetic analyses of the supermatrix and independent data sets were done using four methods of tree reconstruction: Bayesian inference, maximum likelihood, and unweighted and weighted maximum parsimony. The results indicate, firstly, that the SuperTRI approach is less sensitive to the choice among the four phylogenetic methods; secondly, that it interprets the relationships among taxa more accurately; and thirdly, that interesting conclusions on introgression and radiation can be drawn from the comparisons between SuperTRI and supermatrix analyses.
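
The two per-node measures computed from the separate analyses, the mean branch support and the reproducibility index, can be sketched as follows. The bootstrap values are hypothetical, and treating a data set that does not recover the node as contributing zero to the mean support is an assumption of this sketch:

```python
def node_reliability(supports):
    """Summarise support for one node across independent data-set analyses.

    supports: bootstrap percentage per data set, or None when that data
    set's analysis does not recover the node at all.
    Returns (mean branch support, reproducibility index), the latter being
    the fraction of analyses that recover the node.
    """
    n = len(supports)
    recovered = [s for s in supports if s is not None]
    # assumption: an analysis that misses the node contributes 0 support
    mean_support = sum(recovered) / n if recovered else 0.0
    reproducibility = len(recovered) / n
    return mean_support, reproducibility

# node recovered in 5 of 7 hypothetical single-gene analyses
supports = [98, 85, None, 72, 90, None, 64]
ms, rep = node_reliability(supports)
print(round(ms, 2), round(rep, 3))  # roughly 58.43 and 0.714
```

Nodes with both high mean support and high reproducibility across genes are the ones SuperTRI treats as reliable.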

  5. Performance and Vibration Analyses of Lift-Offset Helicopters

    Directory of Open Access Journals (Sweden)

    Jeong-In Go

    2017-01-01

    A validation study on the performance and vibration analyses of the XH-59A compound helicopter is conducted to establish techniques for the comprehensive analysis of lift-offset compound helicopters. This study considers the XH-59A lift-offset compound helicopter using a rigid coaxial rotor system as a verification model. CAMRAD II (Comprehensive Analytical Method of Rotorcraft Aerodynamics and Dynamics II), a comprehensive analysis code, is used as a tool for the performance, vibration, and loads analyses. A general free wake model, which is a more sophisticated wake model than other wake models, is used to obtain good results for the comprehensive analysis. Performance analyses of the XH-59A helicopter with and without auxiliary propulsion are conducted in various flight conditions. In addition, vibration analyses of the XH-59A compound helicopter configuration are conducted in the forward flight condition. The present comprehensive analysis results are in good agreement with the flight test and previous analyses. Therefore, techniques for the comprehensive analysis of lift-offset compound helicopters are appropriately established. Furthermore, the rotor lifts are calculated for the XH-59A lift-offset compound helicopter in the forward flight condition to investigate the airloads characteristics of the ABC™ (Advancing Blade Concept) rotor.

  6. A framework for the meta-analysis of Bland-Altman studies based on a limits of agreement approach.

    Science.gov (United States)

    Tipton, Elizabeth; Shuster, Jonathan

    2017-10-15

    Bland-Altman method comparison studies are common in the medical sciences and are used to compare a new measure to a gold-standard (often costlier or more invasive) measure. The distribution of these differences is summarized by two statistics, the 'bias' and standard deviation, and these measures are combined to provide estimates of the limits of agreement (LoA). When these LoA are within the bounds of clinically insignificant differences, the new non-invasive measure is preferred. Very often, multiple Bland-Altman studies have been conducted comparing the same two measures, and random-effects meta-analysis provides a means to pool these estimates. We provide a framework for the meta-analysis of Bland-Altman studies, including methods for estimating the LoA and measures of uncertainty (i.e., confidence intervals). Importantly, these LoA are likely to be wider than those typically reported in Bland-Altman meta-analyses. Frequently, Bland-Altman studies report results based on repeated measures designs but do not properly adjust for this design in the analysis. Meta-analyses of Bland-Altman studies frequently exclude these studies for this reason. We provide a meta-analytic approach that allows inclusion of estimates from these studies. This includes adjustments to the estimate of the standard deviation and a method for pooling the estimates based upon robust variance estimation. An example is included based on a previously published meta-analysis. Copyright © 2017 John Wiley & Sons, Ltd.
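
The per-study summaries are simple: the bias is the mean of the paired differences, and the 95% LoA are bias ± 1.96 SD. A sketch with hypothetical differences, using plain inverse-variance fixed-effect pooling of the biases as a simplified stand-in for the paper's random-effects and robust-variance approach:

```python
import math

def limits_of_agreement(diffs):
    """Bland-Altman summary for one study: bias, SD and the 95% LoA."""
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, sd, (bias - 1.96 * sd, bias + 1.96 * sd)

def pool_bias(biases, sds, ns):
    """Inverse-variance fixed-effect pooling of per-study biases.
    (The paper itself uses random-effects / robust variance estimation.)"""
    weights = [n / sd**2 for sd, n in zip(sds, ns)]   # 1 / SE(bias)^2
    pooled = sum(w * b for w, b in zip(weights, biases)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# hypothetical paired differences (new measure minus gold standard) for one study
diffs = [0.4, -0.2, 0.1, 0.5, -0.3, 0.2, 0.0, 0.1]
bias, sd, loa = limits_of_agreement(diffs)
print(round(bias, 3), round(sd, 3), [round(v, 3) for v in loa])

# pooling three hypothetical studies (bias, SD, sample size)
pooled, se = pool_bias([0.1, 0.2, 0.05], [0.3, 0.4, 0.25], [40, 25, 60])
print(round(pooled, 3))
```

The paper's central warning is that pooled LoA must reflect between-study heterogeneity, so they come out wider than the LoA of any single study; the fixed-effect pooling above deliberately ignores that and is for illustration only.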

  7. Analysis of individual drug use as a time-varying determinant of exposure in prospective population-based cohort studies

    NARCIS (Netherlands)

    B.H.Ch. Stricker (Bruno); Th. Stijnen (Theo)

    2010-01-01

    In pharmaco-epidemiology, the use of drugs is the determinant of interest when studying exposure-outcome associations. The increased availability of computerized information about drug use on an individual basis has greatly facilitated analyses of drug effects on a population-based

  8. Eating and drinking habits of young London-based Irish men: a qualitative study.

    OpenAIRE

    Kelly, Aidan; Ciclitira, Karen

    2011-01-01

    This qualitative study is based on interviews with young Irish men living in London about their diets and their views on healthy eating. The data were analysed using combined thematic and discourse analysis. Interviewees gave various reasons for not adopting healthy eating habits, including the cost of healthy foods, their lack of time and ability to cook, and their prioritisation of drinking. Views about the status of different foods also affected their eating habits: they considered red mea...

  9. A Corpus-based Study of EFL Learners’ Errors in IELTS Essay Writing

    Directory of Open Access Journals (Sweden)

    Hoda Divsar

    2017-03-01

    The present study analyzed different types of errors in EFL learners’ IELTS essays. In order to determine the major types of errors, a corpus of 70 IELTS examinees’ writings was collected, and their errors were extracted and categorized qualitatively. Errors were categorized, based on a researcher-developed error-coding scheme, into 13 aspects. Based on the descriptive statistical analyses, the frequency of each error type was calculated and the commonest errors committed by the EFL learners in IELTS essays were identified. The results indicated that the two most frequent error types committed by IELTS candidates were related to word choice and verb forms. Based on the research results, pedagogical implications highlight analyzing EFL learners’ writing errors as a useful basis for instructional purposes, including creating pedagogical teaching materials that are in line with learners’ linguistic strengths and weaknesses.
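
The descriptive statistics behind such an error analysis are a straightforward tally of coded errors. A sketch with hypothetical coded errors (the study's full 13-category scheme is not reproduced here):

```python
from collections import Counter

# hypothetical coded errors extracted from a handful of essays; in the study,
# each error was tagged with one of 13 categories from the coding scheme
coded_errors = ["word choice", "verb form", "word choice", "article",
                "preposition", "verb form", "word choice", "spelling",
                "verb form", "word choice", "punctuation", "article"]

freq = Counter(coded_errors)
total = sum(freq.values())
for err, n in freq.most_common(3):
    print(f"{err}: {n} ({100 * n / total:.1f}%)")
```

Ranking categories by relative frequency is what lets the study single out word choice and verb forms as the dominant error types.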

  10. Large-scale genome-wide association studies and meta-analyses of longitudinal change in adult lung function.

    Directory of Open Access Journals (Sweden)

    Wenbo Tang

    Genome-wide association studies (GWAS) have identified numerous loci influencing cross-sectional lung function, but less is known about genes influencing longitudinal change in lung function. We performed GWAS of the rate of change in forced expiratory volume in the first second (FEV1) in 14 longitudinal, population-based cohort studies comprising 27,249 adults of European ancestry, using linear mixed effects models, and combined cohort-specific results using fixed effect meta-analysis to identify novel genetic loci associated with longitudinal change in lung function. Gene expression analyses were subsequently performed for the identified genetic loci. As a secondary aim, we estimated the mean rate of decline in FEV1 by smoking pattern, irrespective of genotypes, across these 14 studies using meta-analysis. The overall meta-analysis produced suggestive evidence for association at the novel IL16/STARD5/TMC3 locus on chromosome 15 (P = 5.71 × 10^-7). In addition, meta-analysis using the five cohorts with ≥3 FEV1 measurements per participant identified the novel ME3 locus on chromosome 11 (P = 2.18 × 10^-8) at genome-wide significance. Neither locus was associated with FEV1 decline in two additional cohort studies. We confirmed gene expression of IL16, STARD5, and ME3 in multiple lung tissues. Publicly available microarray data confirmed differential expression of all three genes in lung samples from COPD patients compared with controls. Irrespective of genotypes, the combined estimate for FEV1 decline was 26.9, 29.2 and 35.7 mL/year in never, former, and persistent smokers, respectively. In this large-scale GWAS, we identified two novel genetic loci associated with the rate of change in FEV1 that harbor candidate genes with biologically plausible functional links to lung function.
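
The cohort-specific estimates were combined by fixed effect (inverse-variance) meta-analysis. A minimal sketch with hypothetical per-cohort slopes and standard errors (not the study's estimates):

```python
import math

def fixed_effect_meta(betas, ses):
    """Inverse-variance fixed-effect meta-analysis of per-cohort estimates.

    Returns the pooled effect, its standard error, and the Z statistic that
    is compared against the genome-wide significance threshold (P < 5e-8).
    """
    w = [1.0 / se**2 for se in ses]                       # inverse-variance weights
    beta = sum(wi * b for wi, b in zip(w, betas)) / sum(w)
    se = math.sqrt(1.0 / sum(w))
    return beta, se, beta / se

# hypothetical per-cohort slopes: extra mL/year of FEV1 decline per allele
betas = [-2.1, -1.8, -2.5, -1.6]
ses = [0.9, 1.1, 1.2, 0.8]
beta, se, z = fixed_effect_meta(betas, ses)
print(round(beta, 3), round(se, 3), round(z, 2))
```

The precise cohorts are weighted most heavily, and the pooled Z statistic (here about -4) is what gets translated into the meta-analysis P-value.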

  11. Analyses of microstructural and elastic properties of porous SOFC cathodes based on focused ion beam tomography

    Science.gov (United States)

    Chen, Zhangwei; Wang, Xin; Giuliani, Finn; Atkinson, Alan

    2015-01-01

    Mechanical properties of porous SOFC electrodes are largely determined by their microstructures. Measurements of the elastic properties and microstructural parameters can be achieved by modelling digitally reconstructed 3D volumes based on the real electrode microstructures. However, the reliability of such measurements depends greatly on the processing of the raw images acquired for reconstruction. In this work, the actual microstructures of La0.6Sr0.4Co0.2Fe0.8O3-δ (LSCF) cathodes sintered at an elevated temperature were reconstructed based on dual-beam FIB/SEM tomography. Key microstructural and elastic parameters were estimated and correlated, and their sensitivity to the grayscale threshold value applied in the image segmentation was analysed. The important microstructural parameters included porosity, tortuosity, specific surface area, particle and pore size distributions, and inter-particle neck size distribution, which may have varying effects on the elastic properties simulated from the microstructures using FEM. Results showed that different threshold value ranges resulted in different degrees of sensitivity for a specific parameter: the estimated porosity and tortuosity were more sensitive than the surface area to volume ratio, and pore and neck sizes were less sensitive than particle size. Results also showed that the modulus was essentially sensitive to the porosity, which was largely controlled by the threshold value.
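
The threshold sensitivity of porosity can be illustrated directly: after segmentation, porosity is just the voxel fraction classified as pore, so sweeping the grayscale threshold shows how strongly the estimate moves. A sketch on a synthetic grayscale stack (hypothetical intensity distributions, not the FIB/SEM data):

```python
import numpy as np

def porosity(volume, threshold):
    """Fraction of voxels classified as pore (grayscale value below threshold)."""
    return float(np.mean(volume < threshold))

# synthetic 8-bit-like stack: dark pore voxels (~60) and bright LSCF voxels (~180),
# with a true pore fraction of 0.4
rng = np.random.default_rng(2)
pores = rng.normal(60, 15, 40_000)
solid = rng.normal(180, 15, 60_000)
volume = np.concatenate([pores, solid])   # flattened stand-in for a 3-D volume

# sweep the segmentation threshold to probe the sensitivity of the estimate
for t in (100, 120, 140):
    print(t, round(porosity(volume, t), 3))
```

With well-separated intensity peaks, as here, porosity is nearly flat across a wide threshold window; the sensitivity reported in the paper arises when the pore and solid gray-level distributions overlap.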

  12. Hydrogen-combustion analyses of large-scale tests

    International Nuclear Information System (INIS)

    Gido, R.G.; Koestel, A.

    1986-01-01

    This report uses results of the large-scale tests with turbulence performed by the Electric Power Research Institute at the Nevada Test Site to evaluate hydrogen burn-analysis procedures based on lumped-parameter codes like COMPARE-H2 and associated burn-parameter models. The test results: (1) confirmed, in a general way, the procedures for application to pulsed burning, (2) increased significantly our understanding of the burn phenomenon by demonstrating that continuous burning can occur, and (3) indicated that steam can terminate continuous burning. Future actions recommended include: (1) modification of the code to perform continuous-burn analyses, which is demonstrated, (2) analyses to determine the type of burning (pulsed or continuous) that will exist in nuclear containments and the stable location if the burning is continuous, and (3) changes to the models for estimating burn parameters

  14. Regional analyses of labor markets and demography: a model based Norwegian example.

    Science.gov (United States)

    Stambol, L S; Stolen, N M; Avitsland, T

    1998-01-01

    The authors discuss the regional REGARD model, developed by Statistics Norway to analyze the regional implications of macroeconomic development of employment, labor force, and unemployment. "In building the model, empirical analyses of regional producer behavior in manufacturing industries have been performed, and the relation between labor market development and regional migration has been investigated. Apart from providing a short description of the REGARD model, this article demonstrates the functioning of the model, and presents some results of an application." excerpt

  15. Understanding the Greenhouse Effect by Embodiment - Analysing and Using Students' and Scientists' Conceptual Resources

    Science.gov (United States)

    Niebert, Kai; Gropengießer, Harald

    2014-01-01

    Over the last 20 years, science education studies have reported very different understandings among students of science regarding the key aspects of climate change. We used the cognitive linguistic framework of experientialism to shed new light on this valuable pool of studies and to identify the conceptual resources for understanding climate change. In our study, we interviewed 35 secondary school students on their understanding of the greenhouse effect and analysed the conceptions of climate scientists as drawn from textbooks and research reports. We analysed all data by metaphor analysis and qualitative content analysis to gain insight into students' and scientists' resources for understanding. In our analysis, we found that students and scientists refer to the same schemata to understand the greenhouse effect. We categorised their conceptions according to three underlying principles: warming by more input, warming by less output, and warming by a new equilibrium. By interrelating students' and scientists' conceptions, we identified the students' learning demand: first, our students were provided with experiences regarding the interactions of electromagnetic radiation and CO2; second, our students reflected on the experience-based schemata they use as source domains for metaphorical understanding of the greenhouse effect. By uncovering the (mostly unconsciously) deployed schemata, we gave students access to their source domains. We implemented these teaching guidelines in interventions and evaluated them in teaching experiments to develop evidence-based and theory-guided learning activities on the greenhouse effect.

  16. Sao Paulo Lightning Mapping Array (SP-LMA): Network Assessment and Analyses for Intercomparison Studies and GOES-R Proxy Activities

    Science.gov (United States)

    Bailey, J. C.; Blakeslee, R. J.; Carey, L. D.; Goodman, S. J.; Rudlosky, S. D.; Albrecht, R.; Morales, C. A.; Anselmo, E. M.; Neves, J. R.; Buechler, D. E.

    2014-01-01

    A 12-station Lightning Mapping Array (LMA) network was deployed during October 2011 in the vicinity of Sao Paulo, Brazil (SP-LMA) to contribute total lightning measurements to an international field campaign [CHUVA - Cloud processes of tHe main precipitation systems in Brazil: A contribUtion to cloud resolVing modeling and to the GPM (GlobAl Precipitation Measurement)]. The SP-LMA was operational from November 2011 through March 2012 during the Vale do Paraiba campaign. Sensor spacing was on the order of 15-30 km, with a network diameter on the order of 40-50 km. The SP-LMA provides good 3-D lightning mapping out to 150 km from the network center, with 2-D coverage considerably farther. In addition to supporting CHUVA science/mission objectives, the SP-LMA is supporting the generation of unique proxy data for the Geostationary Lightning Mapper (GLM) and Advanced Baseline Imager (ABI) on NOAA's Geostationary Operational Environmental Satellite-R (GOES-R: scheduled for a 2015 launch). These proxy data will be used to develop and validate operational algorithms so that they will be ready to use on "day 1" following the GOES-R launch. As the CHUVA Vale do Paraiba campaign opportunity was formulated, broad community-based interest developed in a comprehensive Lightning Location System (LLS) intercomparison and assessment study, leading to the participation and/or deployment of eight other ground-based networks and the space-based Lightning Imaging Sensor (LIS). The SP-LMA data are being intercompared with lightning observations from the other deployed networks to advance our understanding of the capabilities and contributions of each of these networks toward GLM proxy and validation activities. This paper addresses the network assessment, including noise reduction criteria, detection efficiency estimates, and statistical and climatological (both temporal and spatial) analyses for intercomparison studies and GOES-R proxy activities.

  17. Detailed semantic analyses of human error incidents occurring at domestic nuclear power plants to fiscal year 2000

    International Nuclear Information System (INIS)

    Tsuge, Tadashi; Hirotsu, Yuko; Takano, Kenichi; Ebisu, Mitsuhiro; Tsumura, Joji

    2003-01-01

    Analysing and evaluating observed human error incidents, with emphasis on the human factors and behavior involved, is essential for preventing their recurrence. CRIEPI has been conducting detailed and structured analyses, based on J-HPES, of all incidents reported over the last 35 years, from the beginning of operation of the first Tokai nuclear power plant through fiscal year 2000, in which a total of 212 human error cases were identified. The results of these analyses have been stored in the J-HPES database. This report summarizes the semantic analyses of all case studies stored in that database, carried out to grasp the concrete content and trends of the more frequently observed human errors (called trigger actions here), their causal factors, and preventive measures. The semantic analyses were executed by classifying all items into categories of roughly equivalent meaning using the KJ method. The following typical results were obtained: (1) Trigger actions could be classified into operation or maintenance categories. 'Operational timing errors' and 'operational quantitative errors' were the major trigger actions in operation, together accounting for about 20% of all actions; among maintenance trigger actions, 'maintenance quantitative errors' were the major category, accounting for a quarter of all actions. (2) Causal factors: 'human internal status' factors were the most frequent, concretely 'improper persistence' and 'lack of knowledge'. (3) Preventive measures: the most frequent measures were job management changes through procedural (software) improvements, accounting for 70% to 80%. For operation, software improvements were implemented on 'organization and work practices' and 'individual consciousness'; for maintenance, improvements were implemented on 'organization and work practices'. (author)

  18. Transmission Characteristics of Primate Vocalizations: Implications for Acoustic Analyses

    Science.gov (United States)

    Maciej, Peter; Fischer, Julia; Hammerschmidt, Kurt

    2011-01-01

    Acoustic analyses have become a staple method in field studies of animal vocal communication, with nearly all investigations using computer-based approaches to extract specific features from sounds. Various algorithms can be used to extract acoustic variables that may then be related to variables such as individual identity, context or reproductive state. Habitat structure and recording conditions, however, have strong effects on the acoustic structure of sound signals. The purpose of this study was to identify which acoustic parameters reliably describe features of propagated sounds. We conducted broadcast experiments and examined the influence of habitat type, transmission height, and re-recording distance on the validity (deviation from the original sound) and reliability (variation within identical recording conditions) of acoustic features of different primate call types. Validity and reliability varied independently of each other in relation to habitat, transmission height, and re-recording distance, and depended strongly on the call type. The smallest deviations from the original sounds were obtained by a visually-controlled calculation of the fundamental frequency. Start- and end parameters of a sound were most susceptible to degradation in the environment. Because the recording conditions can have appreciable effects on acoustic parameters, it is advisable to validate the extraction method of acoustic variables from recordings over longer distances before using them in acoustic analyses. PMID:21829682

  19. Cost-effectiveness and harm-benefit analyses of risk-based screening strategies for breast cancer.

    Directory of Open Access Journals (Sweden)

    Ester Vilaprinyo

    The one-size-fits-all paradigm in organized screening of breast cancer is shifting towards a personalized approach. The present study has two objectives: (1) to perform an economic evaluation and to assess the harm-benefit ratios of screening strategies that vary in their intensity and interval ages based on breast cancer risk; and (2) to estimate the gain in terms of cost and harm reductions using risk-based screening with respect to the usual practice. We used a probabilistic model and input data from Spanish population registries and screening programs, as well as from clinical studies, to estimate the benefit, harm, and costs over time of 2,624 screening strategies, uniform or risk-based. We defined four risk groups, low, moderate-low, moderate-high and high, based on breast density, family history of breast cancer and personal history of breast biopsy. The risk-based strategies were obtained by combining the exam periodicity (annual, biennial, triennial and quinquennial), the starting ages (40, 45 and 50 years) and the ending ages (69 and 74 years) in the four risk groups. Incremental cost-effectiveness and harm-benefit ratios were used to select the optimal strategies. Compared to risk-based strategies, the uniform ones result in a much lower benefit for a specific cost. Reductions close to 10% in costs and higher than 20% in false-positive results and overdiagnosed cases were obtained for risk-based strategies. Optimal screening is characterized by quinquennial or triennial periodicities for the low or moderate risk groups and annual periodicity for the high-risk group. Risk-based strategies can reduce harm and costs. It is necessary to develop accurate measures of individual risk and to work on how to implement risk-based screening strategies.

  20. Diagnostic Comparison of Meteorological Analyses during the 2002 Antarctic Winter

    Science.gov (United States)

    Manney, Gloria L.; Allen, Douglas R.; Kruger, Kirstin; Naujokat, Barbara; Santee, Michelle L.; Sabutis, Joseph L.; Pawson, Steven; Swinbank, Richard; Randall, Cora E.; Simmons, Adrian J.; hide

    2005-01-01

    Several meteorological datasets, including U.K. Met Office (MetO), European Centre for Medium-Range Weather Forecasts (ECMWF), National Centers for Environmental Prediction (NCEP), and NASA's Goddard Earth Observation System (GEOS-4) analyses, are being used in studies of the 2002 Southern Hemisphere (SH) stratospheric winter and Antarctic major warming. Diagnostics are compared to assess how these studies may be affected by the meteorological data used. While the overall structure and evolution of temperatures, winds, and wave diagnostics in the different analyses provide a consistent picture of the large-scale dynamics of the SH 2002 winter, several significant differences may affect detailed studies. The NCEP-NCAR reanalysis (REAN) and NCEP-Department of Energy (DOE) reanalysis-2 (REAN-2) datasets are not recommended for detailed studies, especially those related to polar processing, because of lower-stratospheric temperature biases that result in underestimates of polar processing potential, and because their winds and wave diagnostics show increasing differences from other analyses between ~30 and 10 hPa (their top level). Southern Hemisphere polar stratospheric temperatures in the ECMWF 40-Yr Re-analysis (ERA-40) show unrealistic vertical structure, so this long-term reanalysis is also unsuited for quantitative studies. The NCEP/Climate Prediction Center (CPC) objective analyses give an inferior representation of the upper-stratospheric vortex. Polar vortex transport barriers are similar in all analyses, but there is large variation in the amount, patterns, and timing of mixing, even among the operational assimilated datasets (ECMWF, MetO, and GEOS-4). The higher-resolution GEOS-4 and ECMWF assimilations provide significantly better representation of filamentation and small-scale structure than the other analyses, even when fields gridded at reduced resolution are studied. The choice of which analysis to use is most critical for detailed transport

  1. Environmental Sound Perception: Metadescription and Modeling Based on Independent Primary Studies

    Directory of Open Access Journals (Sweden)

    Stephen McAdams

    2010-01-01

    The aim of the study is to transpose and extend to a set of environmental sounds the notion of sound descriptors usually used for musical sounds. Four separate primary studies dealing with interior car sounds, air-conditioning units, car horns, and closing car doors are considered collectively. The corpus formed by these initial stimuli is submitted to new experimental studies and analyses, both for revealing metacategories and for defining more precisely the limits of each of the resulting categories. In a second step, the new structure is modeled: common and specific dimensions within each category are derived from the initial results and new investigations of audio features are performed. Furthermore, an automatic classifier based on two audio descriptors and a multinomial logistic regression procedure is implemented and validated with the corpus.
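The automatic classifier described above pairs two audio descriptors with multinomial logistic regression. A minimal sketch of that model type, fitted by plain gradient descent on synthetic data (the descriptor values and the four categories are invented for illustration, not the study's corpus or its actual descriptors):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each sound is summarized by two audio descriptors and
# belongs to one of four categories, simulated as well-separated clusters.
centers = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 3.0], [3.0, 3.0]])
X = np.vstack([c + 0.4 * rng.standard_normal((50, 2)) for c in centers])
y = np.repeat(np.arange(4), 50)

def fit_softmax(X, y, n_classes, lr=0.1, n_iter=2000):
    """Multinomial logistic regression via batch gradient descent."""
    Xb = np.hstack([np.ones((len(X), 1)), X])        # add intercept column
    W = np.zeros((Xb.shape[1], n_classes))
    Y = np.eye(n_classes)[y]                          # one-hot targets
    for _ in range(n_iter):
        logits = Xb @ W
        logits -= logits.max(axis=1, keepdims=True)   # numerical stability
        P = np.exp(logits)
        P /= P.sum(axis=1, keepdims=True)
        W -= lr * Xb.T @ (P - Y) / len(X)             # averaged gradient step
    return W

def predict(W, X):
    Xb = np.hstack([np.ones((len(X), 1)), X])
    return np.argmax(Xb @ W, axis=1)

W = fit_softmax(X, y, n_classes=4)
accuracy = np.mean(predict(W, X) == y)
print(round(accuracy, 2))
```

With clusters this well separated, the linear decision boundaries of the multinomial model classify essentially every sound correctly; real descriptor spaces overlap far more.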

  2. Aging of monolithic zirconia dental prostheses: Protocol for a 5-year prospective clinical study using ex vivo analyses.

    Science.gov (United States)

    Koenig, Vinciane; Wulfman, Claudine P; Derbanne, Mathieu A; Dupont, Nathalie M; Le Goff, Stéphane O; Tang, Mie-Leng; Seidel, Laurence; Dewael, Thibaut Y; Vanheusden, Alain J; Mainjot, Amélie K

    2016-12-15

    Recent introduction of computer-aided design/computer-aided manufacturing (CAD/CAM) monolithic zirconia dental prostheses raises the issue of material low thermal degradation (LTD), a well-known problem with zirconia hip prostheses. This phenomenon could be accentuated by masticatory mechanical stress. Until now the zirconia LTD process has only been studied in vitro. This work introduces an original protocol to evaluate the LTD process of monolithic zirconia prostheses in the oral environment and to study their general clinical behavior, notably in terms of wear. A total of 101 posterior monolithic zirconia tooth elements (molars and premolars) are included in a 5-year prospective clinical trial. On each element, several areas between 1 and 2 mm² (6 on molars, 4 on premolars) are determined on the restoration surface: areas submitted or non-submitted to mastication mechanical stress, glazed or non-glazed. Before prosthesis placement, ex vivo analyses regarding LTD and wear are performed using Raman spectroscopy, SEM imagery and 3D laser profilometry. After placement, restorations are clinically evaluated following criteria of the World Dental Federation (FDI), complemented by the analysis of fracture clinical risk factors. Two independent examiners perform the evaluations. Clinical evaluation and ex vivo analyses are carried out after 6 months and then each year for up to 5 years. For clinicians and patients, the results of this trial will justify the use of monolithic zirconia restorations in dental practice. For researchers, the originality of a clinical study including ex vivo analyses of material aging will provide important data regarding zirconia properties. Trial registration: ClinicalTrials.gov Identifier: NCT02150226.

  3. Comparison of Numerical Analyses with a Static Load Test of a Continuous Flight Auger Pile

    Science.gov (United States)

    Hoľko, Michal; Stacho, Jakub

    2014-12-01

    The article deals with numerical analyses of a Continuous Flight Auger (CFA) pile. The analyses include a comparison of calculated and measured load-settlement curves as well as a comparison of the load distribution over the pile's length. The numerical analyses were executed using two types of software, Ansys and Plaxis, both based on FEM calculations. The two programs differ in the way they create numerical models, in how they model the interface between the pile and soil, and in the constitutive material models they use. The analyses have been prepared in the form of a parametric study, in which the method of modelling the interface and the material models of the soil are compared and analysed. Our analyses show that both types of software permit the modelling of pile foundations. The Plaxis software offers advanced material models as well as the modelling of the impact of groundwater or overconsolidation. The load-settlement curve calculated using Plaxis matches the results of a static load test with more than 95% accuracy. In comparison, the load-settlement curve calculated using Ansys provides only an approximate estimate, but that software allows large structural systems to be modelled together with the foundation system.

  4. [Cost-Effectiveness and Cost-Utility Analyses of Antireflux Medicine].

    Science.gov (United States)

    Gockel, Ines; Lange, Undine Gabriele; Schürmann, Olaf; Jansen-Winkeln, Boris; Sibbel, Rainer; Lyros, Orestis; von Dercks, Nikolaus

    2018-04-12

    Laparoscopic antireflux surgery and medical therapy with proton pump inhibitors are the gold standards of gastroesophageal reflux treatment. Given limited resources and increasing healthcare needs and costs, this analysis evaluates not only the optimal medical results but also the health-economic superiority of these two methods. We performed an electronic literature survey in MEDLINE, PubMed, the Cochrane Library and the ISRCTN registry (International Standard Randomised Controlled Trial Number) as well as in the NHS Economic Evaluation Database, including studies published until 1/2017. Only studies comparing laparoscopic fundoplication and medical therapy that considered the effect size of QALYs (Quality-Adjusted Life Years, with respect to different quality-of-life scores) as the primary outcome were included. The criteria of comparison were the ICER (Incremental Cost-Effectiveness Ratio) and the ICUR (Incremental Cost-Utility Ratio). The superiority of the respective treatment option was determined for each publication. In total, 18 comparative studies matching the above search terms and qualifying for the defined inclusion criteria were identified in the current literature; 6 studies were finally selected for analysis. Of these 6 publications, 3 showed superiority of laparoscopic fundoplication over long-term medical management based on current cost-effectiveness data. Limitations related to the different time intervals, the levels of evidence of the studies, the underlying resources/costs of the analyses, the healthcare systems, and the quality-of-life instruments applied. Future prospective, randomized trials should examine this comparison in greater detail. Additionally, there is large potential for further research in the health-economic assessment of early diagnosis and prevention measures for reflux disease and Barrett's esophagus/carcinoma. © Georg Thieme Verlag KG Stuttgart · New York.
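The ICER used as a comparison criterion above is simply the cost difference between two strategies divided by their difference in effect (QALYs). A minimal sketch with entirely hypothetical figures (not taken from the studies reviewed):

```python
# Hypothetical lifetime cost and QALY figures for the two strategies;
# these numbers are illustrative only, not from the cited publications.
cost_surgery, qaly_surgery = 12000.0, 8.1   # laparoscopic fundoplication
cost_medical, qaly_medical = 9000.0, 7.6    # long-term proton pump inhibitors

# ICER: extra cost per extra QALY gained by the more effective option.
icer = (cost_surgery - cost_medical) / (qaly_surgery - qaly_medical)

# The costlier option is deemed cost-effective if its ICER falls below a
# willingness-to-pay threshold (also hypothetical here).
willingness_to_pay = 20000.0
cost_effective = icer < willingness_to_pay
print(round(icer), cost_effective)
```

Here the surgery costs about 6000 per additional QALY, below the assumed threshold; the ICUR is computed the same way with utility-weighted outcomes in the denominator.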

  5. Quality of meta-analyses in major leading gastroenterology and hepatology journals: A systematic review.

    Science.gov (United States)

    Liu, Pengfei; Qiu, Yuanyu; Qian, Yuting; Chen, Xiao; Wang, Yiran; Cui, Jin; Zhai, Xiao

    2017-01-01

    To appraise the current reporting and methodological quality of meta-analyses in five leading gastroenterology and hepatology journals, and to identify the variables associated with reporting quality. We systematically searched the literature for meta-analyses in Gastroenterology, Gut, Hepatology, Journal of Hepatology (J HEPATOL) and American Journal of Gastroenterology (AM J GASTROENTEROL) from 2006 to 2008 and from 2012 to 2014. Characteristics were extracted based on the PRISMA statement and the AMSTAR tool. Country, number of patients and funding source were also recorded and descriptively reported. A total of 127 meta-analyses were included in this study and compared among journals, study years, and other characteristics. Compliance with the PRISMA statement and the AMSTAR checklist was 20.8 ± 4.2 out of a maximum of 27 and 7.6 ± 2.4 out of a maximum of 11, respectively. Some domains were poorly reported: for PRISMA, describing a protocol and/or registration (item 5, 0.0%) and describing methods and giving results of additional analyses (item 16, 45.7%, and item 23, 48.0%); for AMSTAR, duplicating study selection and data extraction (item 2, 53.5%) and providing a list of included and excluded studies (item 5, 14.2%). Publications in recent years showed significantly better methodological quality than those published in earlier years. This study shows that the methodological reporting quality of meta-analyses in the major gastroenterology and hepatology journals has improved in recent years, after publication of the PRISMA statement, and that it can be further improved. © 2016 Journal of Gastroenterology and Hepatology Foundation and John Wiley & Sons Australia, Ltd.

  6. Web-based office ergonomics intervention on work-related complaints: a field study.

    Science.gov (United States)

    Meinert, Marina; König, Mirjam; Jaschinski, Wolfgang

    2013-01-01

    The aim of this study was a proof of concept to examine the effects of a Web-based office ergonomics intervention on subjects' individual workplace adjustments. An intervention study was conducted with 24 office workers lasting 6 weeks with three consecutive phases (before, 1 and 5 weeks after the intervention). Employees used a purpose-made website for adjusting their computer workplaces without any personal support of ergonomics experts. Workplace measurements were taken directly on site and by analysing photos taken of the employee. Self-reported complaints were assessed by filling in a questionnaire. It was found that 96% of the employees changed their workplaces on their own and retained them mostly unchanged after the intervention. Furthermore, self-reported musculoskeletal complaints and headache symptoms decreased significantly after the intervention. These findings suggest an improvement of workplace conditions so that cost-effective ergonomic Web-based interventions appear promising in further research and application.

  7. Effect of an evidence-based website on healthcare usage: an interrupted time-series study

    Science.gov (United States)

    Spoelman, Wouter A; Bonten, Tobias N; de Waal, Margot W M; Drenthen, Ton; Smeele, Ivo J M; Nielen, Markus M J; Chavannes, Niels H

    2016-01-01

    Objectives Healthcare costs and usage are rising. Evidence-based online health information may reduce healthcare usage, but the evidence is scarce. The objective of this study was to determine whether the release of a nationwide evidence-based health website was associated with a reduction in healthcare usage. Design Interrupted time series analysis of observational primary care data on healthcare use in the Netherlands from 2009 to 2014. Setting General community primary care. Population 912 000 patients who visited their general practitioners 18.1 million times during the study period. Intervention In March 2012, an evidence-based health information website was launched by the Dutch College of General Practitioners. It was easily accessible and understandable, using plain language. At the end of the study period, the website had 2.9 million unique page views per month. Main outcome measures The primary outcome was the change in consultation rate (consultations/1000 patients/month) before and after the release of the website. Additionally, a reference group was created by including consultations about topics not viewed on the website. Subgroup analyses were performed by type of consultation, sex, age and socioeconomic status. Results After the launch of the website, the trend in consultation rate decreased by 1.620 consultations/1000 patients/month (p<0.001). This corresponds to a 12% decline in consultations 2 years after the launch of the website. The trend in consultation rate of the reference group showed no change. The subgroup analyses showed a specific decline in consultations by phone, and the decline was significant in all other subgroups except the youngest age group. Conclusions Healthcare usage decreased by 12% after providing high-quality evidence-based online health information. These findings show that e-Health can be effective to improve self-management and reduce healthcare usage in times of increasing healthcare costs. PMID:28186945
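The interrupted time-series design described above fits a pre-intervention level and trend plus a change in trend after the intervention. A minimal segmented-regression sketch on synthetic data (the monthly rates, launch month, and slope change are invented, chosen only to mirror the reported decline of about 1.6 consultations/1000 patients/month):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic monthly consultation rates (consultations/1000 patients/month);
# the hypothetical website launch occurs at month 36 of 72.
months = np.arange(72)
launch = 36
rate = 250 + 0.5 * months - 1.6 * np.clip(months - launch, 0, None)
rate += rng.normal(0.0, 2.0, size=72)               # observational noise

# Segmented regression: intercept, pre-launch trend, post-launch trend change.
after = (months >= launch).astype(float)
X = np.column_stack([np.ones(72), months, after * (months - launch)])
beta, *_ = np.linalg.lstsq(X, rate, rcond=None)
print(round(beta[2], 2))   # estimated change in slope, about -1.6 by construction
```

The coefficient on the third column is the quantity the study reports: how much the consultation-rate trend changed after the launch, net of the pre-existing trend.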

  8. Workload analysis of an assembling process

    Science.gov (United States)

    Ghenghea, L. D.

    2015-11-01

    The workload is the most important indicator for managers responsible for industrial technological processes, no matter whether these are automated, mechanized or simply manual; in each case, machines or workers will be the focus of workload measurements. The paper deals with a workload analysis of a mostly manual assembly technology for a roller bearing assembly process, executed in a big company with integrated bearing manufacturing processes. In this analysis, the delay sample (work sampling) technique was used to identify and divide all the bearing assemblers' activities and to obtain information about what part of the 480-minute working day workers devote to each activity. The study shows some ways to increase process productivity without supplementary investments and also indicates that process automation could be the solution to gain maximum productivity.
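The delay sample (work sampling) technique mentioned above estimates how the 480-minute workday is split across activities from the proportion of random observation instants in which each activity is seen. A minimal sketch with invented activity names and observation counts:

```python
from collections import Counter

# Invented activity-sampling observations for one assembler over a shift;
# activity names and counts are illustrative only, not the study's data.
observations = (
    ["assemble"] * 70 + ["fetch parts"] * 30 +
    ["inspect"] * 20 + ["wait"] * 25 + ["personal"] * 15
)
counts = Counter(observations)
total = sum(counts.values())

# Each activity's share of observation instants estimates its share of the
# 480-minute working day.
minutes = {activity: 480 * n / total for activity, n in counts.items()}
print(minutes["assemble"])   # 210.0 estimated minutes on direct assembly
```

With 160 random observations, 70 sightings of direct assembly translate into an estimated 210 of the 480 minutes; the "wait" and "personal" shares are where productivity gains without extra investment would be sought.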

  9. Conserved regulators of nucleolar size revealed by global phenotypic analyses.

    Science.gov (United States)

    Neumüller, Ralph A; Gross, Thomas; Samsonova, Anastasia A; Vinayagam, Arunachalam; Buckner, Michael; Founk, Karen; Hu, Yanhui; Sharifpoor, Sara; Rosebrock, Adam P; Andrews, Brenda; Winston, Fred; Perrimon, Norbert

    2013-08-20

    Regulation of cell growth is a fundamental process in development and disease that integrates a vast array of extra- and intracellular information. A central player in this process is RNA polymerase I (Pol I), which transcribes ribosomal RNA (rRNA) genes in the nucleolus. Rapidly growing cancer cells are characterized by increased Pol I-mediated transcription and, consequently, nucleolar hypertrophy. To map the genetic network underlying the regulation of nucleolar size and of Pol I-mediated transcription, we performed comparative, genome-wide loss-of-function analyses of nucleolar size in Saccharomyces cerevisiae and Drosophila melanogaster coupled with mass spectrometry-based analyses of the ribosomal DNA (rDNA) promoter. With this approach, we identified a set of conserved and nonconserved molecular complexes that control nucleolar size. Furthermore, we characterized a direct role of the histone information regulator (HIR) complex in repressing rRNA transcription in yeast. Our study provides a full-genome, cross-species analysis of a nuclear subcompartment and shows that this approach can identify conserved molecular modules.

  10. Conserved Regulators of Nucleolar Size Revealed by Global Phenotypic Analyses

    Science.gov (United States)

    Neumüller, Ralph A.; Gross, Thomas; Samsonova, Anastasia A.; Vinayagam, Arunachalam; Buckner, Michael; Founk, Karen; Hu, Yanhui; Sharifpoor, Sara; Rosebrock, Adam P.; Andrews, Brenda; Winston, Fred; Perrimon, Norbert

    2014-01-01

    Regulation of cell growth is a fundamental process in development and disease that integrates a vast array of extra- and intracellular information. A central player in this process is RNA polymerase I (Pol I), which transcribes ribosomal RNA (rRNA) genes in the nucleolus. Rapidly growing cancer cells are characterized by increased Pol I–mediated transcription and, consequently, nucleolar hypertrophy. To map the genetic network underlying the regulation of nucleolar size and of Pol I–mediated transcription, we performed comparative, genome-wide loss-of-function analyses of nucleolar size in Saccharomyces cerevisiae and Drosophila melanogaster coupled with mass spectrometry–based analyses of the ribosomal DNA (rDNA) promoter. With this approach, we identified a set of conserved and nonconserved molecular complexes that control nucleolar size. Furthermore, we characterized a direct role of the histone information regulator (HIR) complex in repressing rRNA transcription in yeast. Our study provides a full-genome, cross-species analysis of a nuclear subcompartment and shows that this approach can identify conserved molecular modules. PMID:23962978

  11. Benchmark exercises on PWR level-1 PSA (step 3). Analyses of accident sequence and conclusions

    International Nuclear Information System (INIS)

    Niwa, Yuji; Takahashi, Hideaki.

    1996-01-01

    The results of level-1 PSA fluctuate owing to assumptions based on engineering judgements made at several stages of the PSA analysis. To investigate the uncertainties due to these assumptions, three standard problems, which we call benchmark exercises, have been set. This report treats sensitivity studies (benchmark exercises) of the sequence analyses and states conclusions. The treatment of inter-system dependency can generate uncertainty in PSA. In addition, as conclusions of the PSA benchmark exercise, several findings in the sequence analysis, together with previous benchmark analyses in earlier INSS Journals, are treated. (author)

  12. Publication bias in dermatology systematic reviews and meta-analyses.

    Science.gov (United States)

    Atakpo, Paul; Vassar, Matt

    2016-05-01

    Systematic reviews and meta-analyses in dermatology provide high-level evidence for clinicians and policy makers that influences clinical decision making and treatment guidelines. One methodological problem with systematic reviews is the underrepresentation of unpublished studies, a problem due in part to publication bias. Omission of statistically non-significant data from meta-analyses may result in overestimation of treatment effect sizes, which may have clinical consequences. Our goal was to assess whether systematic reviewers in dermatology evaluate and report publication bias, and further to conduct our own evaluation of publication bias on the meta-analyses that failed to do so. Our study considered systematic reviews and meta-analyses from ten dermatology journals from 2006 to 2016. A PubMed search was conducted, and all full-text articles that met our inclusion criteria were retrieved and coded by the primary author. A total of 293 articles were included in our analysis. Additionally, we formally evaluated publication bias in the meta-analyses that had failed to do so, using the trim-and-fill and cumulative meta-analysis by precision methods. Publication bias was mentioned in 107 articles (36.5%) and formally evaluated in 64 articles (21.8%). Visual inspection of a funnel plot was the most common method of evaluating publication bias. Publication bias was present in 45 articles (15.3%), not present in 57 articles (19.5%) and not determined in 191 articles (65.2%). Using the trim-and-fill method, 7 meta-analyses (33.3%) showed evidence of publication bias. Although the trim-and-fill method found evidence of publication bias in only 7 meta-analyses, the cumulative meta-analysis by precision method found evidence of publication bias in 15 meta-analyses (71.4%). Many of the reviews in our study did not mention or evaluate publication bias. Further, of the 42 articles that stated they followed the PRISMA reporting guidelines, 19 (45.2%) evaluated publication bias. In
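The funnel-plot asymmetry probed visually and by trim-and-fill above can also be quantified with Egger's regression: the standardized effect is regressed on precision, and an intercept far from zero suggests small-study (publication) bias. A sketch on simulated data (all effect sizes are invented, deliberately constructed so that smaller studies report larger effects):

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated meta-analysis of 30 studies: standard errors vary, and the
# built-in bias makes reported effects grow with the standard error.
se = rng.uniform(0.05, 0.5, size=30)
effect = 0.2 + 1.5 * se + rng.normal(0.0, 0.3 * se)

# Egger's regression: standardized effect vs. precision. An unbiased funnel
# gives an intercept near zero; asymmetry shifts it away from zero.
precision = 1.0 / se
z = effect / se
X = np.column_stack([np.ones_like(precision), precision])
(intercept, slope), *_ = np.linalg.lstsq(X, z, rcond=None)
print(intercept > 0.5)   # True for this deliberately biased simulation
```

The slope estimates the underlying effect while the intercept captures the small-study distortion, which is why a clearly nonzero intercept is read as evidence of publication bias.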

  13. Sensitivity studies for 3-D rod ejection analyses on axial power shape

    Energy Technology Data Exchange (ETDEWEB)

    Park, Min-Ho; Park, Jin-Woo; Park, Guen-Tae; Ryu, Seok-Hee; Um, Kil-Sup; Lee, Jae-Il [KEPCO NF, Daejeon (Korea, Republic of)

    2015-10-15

    The current safety analysis methodology, which combines the point kinetics model with numerous conservative assumptions, results in unrealistic predictions of the transient behavior and wastes a huge margin in safety analyses, while the safety regulation criteria for the reactivity-initiated accident are becoming stricter. To deal with this, KNF is developing a 3-D rod ejection analysis methodology using the multi-dimensional code coupling system CHASER. The CHASER system couples the three-dimensional core neutron kinetics code ASTRA, the sub-channel analysis code THALES, and the fuel performance analysis code FROST using the message passing interface (MPI). A sensitivity study for 3-D rod ejection analysis on the axial power shape (APS) is carried out to survey the tendency of safety parameters under different power distributions and to build up a realistic safety analysis methodology while maintaining conservatism. The 3-D rod ejection analysis methodology currently being developed with the multi-dimensional core transient analysis code system CHASER was shown to reasonably reflect the conservative assumptions by tuning kinetic parameters.

  14. Does health differ between participants and non-participants in the MRI-HUNT study, a population based neuroimaging study? The Nord-Trøndelag health studies 1984–2009

    International Nuclear Information System (INIS)

    Honningsvåg, Lasse-Marius; Linde, Mattias; Håberg, Asta; Stovner, Lars Jacob; Hagen, Knut

    2012-01-01

    Bias with regard to participation in epidemiological studies can have a large impact on the generalizability of results. Our aim was to investigate the direction and magnitude of potential bias by comparing health-related factors among participants and non-participants in an MRI study based on HUNT, a large Norwegian health survey. Of 14,033 individuals aged 50–65 who had participated in all three large public health surveys within the Norwegian county of Nord-Trøndelag (HUNT 1, 2 and 3), 1,560 who lived within 45 minutes of travel from the city of Levanger were invited to an MRI study (MRI-HUNT). The sample of participants in MRI-HUNT (n = 1,006) was compared with those who were invited but did not participate (n = 554) and with those who were eligible but not invited (n = 12,473), using univariate analyses and logistic regression analyses adjusting for age and education level. Self-reported health did not differ between the three groups, but participants had a higher education level and were somewhat younger than the two other groups. In the adjusted multivariate analyses, obesity was consistently less prevalent among participants. Significant differences in blood pressure and cholesterol were also found. This is the first large population-based study comparing participants and non-participants in an MRI study with regard to general health. The groups were not widely different, but participants had a higher level of education, were less likely to be obese and have hypertension, and were slightly younger than non-participants. The observed differences between participants and non-invited individuals are probably partly explained by the inclusion criterion that participants had to live within 45 minutes of transport to where the MRI examination took place. One would expect the participants to have somewhat fewer brain morphological changes related to cardiovascular risk factors than the general population. Such consequences underline the crucial importance

  15. Birth Spacing of Pregnant Women in Nepal: A Community-Based Study.

    Science.gov (United States)

    Karkee, Rajendra; Lee, Andy H

    2016-01-01

    Optimal birth spacing has health advantages for both mother and child. In developing countries, shorter birth intervals are common and associated with social, cultural, and economic factors, as well as a lack of family planning. This study investigated the first birth interval after marriage and preceding interbirth interval in Nepal. A community-based prospective cohort study was conducted in the Kaski district of Nepal. Information on birth spacing, demographic, and obstetric characteristics was obtained from 701 pregnant women using a structured questionnaire. Logistic regression analyses were performed to ascertain factors associated with short birth spacing. About 39% of primiparous women gave their first child birth within 1 year of marriage and 23% of multiparous women had short preceding interbirth intervals (gender equality in society.

  16. The Oslo Health Study: The impact of self-selection in a large, population-based survey

    Science.gov (United States)

    Søgaard, Anne Johanne; Selmer, Randi; Bjertness, Espen; Thelle, Dag

    2004-01-01

    Background Research on health equity, which mainly utilises population-based surveys, may be hampered by serious selection bias due to a considerable number of invitees declining to participate. Sufficient information from all the non-responders is rarely available to quantify this bias. Predictors of attendance, and the magnitude and direction of non-response bias in prevalence estimates and association measures, were investigated based on information from all 40 888 invitees to the Oslo Health Study. Methods The analyses were based on linkage between public registers in Statistics Norway and the Oslo Health Study, a population-based survey conducted in 2000/2001 inviting all citizens aged 30, 40, 45, 59–60 and 75–76 years. Attendance was 46%. Weighted analyses, logistic regression and sensitivity analyses were performed to evaluate possible selection bias. Results The response rate was positively associated with age, educational attainment, total income, female gender, being married, being born in a Western country, living in the outer city residential regions and not receiving disability benefit. However, self-rated health, smoking, BMI and mental health (HSCL) in the attendees differed only slightly from estimated prevalence values in the target population when weighted by the inverse of the probability of attendance. Observed values differed only moderately provided that the non-attending individuals differed from those attending by no more than 50%. Even though persons receiving disability benefit had lower attendance, the associations between disability and education, residential region and marital status were found to be unbiased. The association between country of birth and disability benefit was somewhat more evident among attendees. Conclusions Self-selection according to sociodemographic variables had little impact on prevalence estimates. As indicated by disability benefit, unhealthy persons attended to a lesser degree than healthy individuals, but social inequality in
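The inverse-probability-of-attendance weighting described in the Results can be illustrated with a toy simulation. The attendance probabilities and smoking prevalence below are invented for the example, not drawn from the Oslo data:

```python
import numpy as np

def weighted_prevalence(outcome, attended, p_attend):
    """Prevalence among attendees, reweighted by inverse attendance probability."""
    w = 1.0 / p_attend[attended]
    return np.average(outcome[attended], weights=w)

# Hypothetical population: smokers attend less often, biasing the naive estimate.
rng = np.random.default_rng(0)
n = 100_000
smoker = rng.random(n) < 0.30             # true prevalence 30%
p_attend = np.where(smoker, 0.35, 0.55)   # smokers less likely to attend
attended = rng.random(n) < p_attend

naive = smoker[attended].mean()           # biased downward by self-selection
adjusted = weighted_prevalence(smoker, attended, p_attend)
print(round(naive, 3), round(adjusted, 3))
```

Weighting each attendee by the inverse of their attendance probability pulls the attendee-only estimate back toward the target-population prevalence, which is the mechanism behind the paper's "only slightly" biased prevalence estimates.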

  17. The Oslo Health Study: The impact of self-selection in a large, population-based survey

    Directory of Open Access Journals (Sweden)

    Bjertness Espen

    2004-05-01

    Full Text Available Abstract Background Research on health equity, which mainly utilises population-based surveys, may be hampered by serious selection bias due to a considerable number of invitees declining to participate. Sufficient information from all the non-responders is rarely available to quantify this bias. Predictors of attendance, and the magnitude and direction of non-response bias in prevalence estimates and association measures, were investigated based on information from all 40 888 invitees to the Oslo Health Study. Methods The analyses were based on linkage between public registers in Statistics Norway and the Oslo Health Study, a population-based survey conducted in 2000/2001 inviting all citizens aged 30, 40, 45, 59–60 and 75–76 years. Attendance was 46%. Weighted analyses, logistic regression and sensitivity analyses were performed to evaluate possible selection bias. Results The response rate was positively associated with age, educational attainment, total income, female gender, being married, being born in a Western country, living in the outer city residential regions and not receiving disability benefit. However, self-rated health, smoking, BMI and mental health (HSCL) in the attendees differed only slightly from estimated prevalence values in the target population when weighted by the inverse of the probability of attendance. Observed values differed only moderately provided that the non-attending individuals differed from those attending by no more than 50%. Even though persons receiving disability benefit had lower attendance, the associations between disability and education, residential region and marital status were found to be unbiased. The association between country of birth and disability benefit was somewhat more evident among attendees. Conclusions Self-selection according to sociodemographic variables had little impact on prevalence estimates. As indicated by disability benefit, unhealthy persons attended to a lesser degree than healthy individuals

  18. A Fuzzy Logic Based Method for Analysing Test Results

    Directory of Open Access Journals (Sweden)

    Le Xuan Vinh

    2017-11-01

    Full Text Available Network operators must perform many tasks to ensure smooth operation of the network, such as planning, monitoring, etc. Among those tasks, regular testing of network performance, network errors and troubleshooting is very important. Meaningful test results allow operators to evaluate network performance, identify any shortcomings, and better plan for network upgrades. Due to the diverse and mainly unquantifiable nature of network testing results, there is a need to develop a method for systematically and rigorously analysing these results. In this paper, we present STAM (System Test-result Analysis Method), which employs a bottom-up hierarchical processing approach using fuzzy logic. STAM is capable of combining all test results into a quantitative description of the network performance in terms of network stability, the significance of various network errors, and the performance of each functional block within the network. The validity of this method has been successfully demonstrated in assisting the testing of a VoIP system at the Research Institute of Post and Telecoms in Vietnam. The paper is organized as follows. The first section gives an overview of fuzzy logic theory, the concepts of which will be used in the development of STAM. The next section describes STAM. The last section, demonstrating STAM's capability, presents a success story in which STAM is successfully applied.
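STAM's actual rule base is not reproduced in the abstract, so the following is only a minimal sketch of the general idea: fuzzify individual test scores into linguistic terms, aggregate the memberships bottom-up, and defuzzify into one quantitative network-quality figure. The membership functions and term centroids are invented for the demo.

```python
def tri(x, a, b, c):
    """Triangular membership function over [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify(score):
    """Map a raw 0-100 test score to memberships in three linguistic terms."""
    return {
        "poor": tri(score, -1, 0, 50),
        "fair": tri(score, 25, 50, 75),
        "good": tri(score, 50, 100, 101),
    }

def aggregate(scores):
    """Bottom-up combination: sum the memberships of all test results, then
    defuzzify with fixed term centroids (poor=0, fair=50, good=100)."""
    centroids = {"poor": 0.0, "fair": 50.0, "good": 100.0}
    total = {t: 0.0 for t in centroids}
    for s in scores:
        for term, mu in fuzzify(s).items():
            total[term] += mu
    num = sum(total[t] * c for t, c in centroids.items())
    den = sum(total.values())
    return num / den

quality = aggregate([90, 85, 40, 95])   # three good results, one mediocre
print(round(quality, 1))                # overall score leaning towards "good"
```

Real systems would use expert-tuned membership functions and a proper rule base per functional block, but the fuzzify/aggregate/defuzzify pipeline is the core pattern.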

  19. Visualizing Confidence in Cluster-Based Ensemble Weather Forecast Analyses.

    Science.gov (United States)

    Kumpf, Alexander; Tost, Bianca; Baumgart, Marlene; Riemer, Michael; Westermann, Rudiger; Rautenhaus, Marc

    2018-01-01

    In meteorology, cluster analysis is frequently used to determine representative trends in ensemble weather predictions in a selected spatio-temporal region, e.g., to reduce a set of ensemble members to simplify and improve their analysis. Identified clusters (i.e., groups of similar members), however, can be very sensitive to small changes of the selected region, so that clustering results can be misleading and bias subsequent analyses. In this article, we, a team of visualization scientists and meteorologists, deliver visual analytics solutions to analyze the sensitivity of clustering results with respect to changes of a selected region. We propose an interactive visual interface that enables simultaneous visualization of a) the variation in composition of identified clusters (i.e., their robustness), b) the variability in cluster membership for individual ensemble members, and c) the uncertainty in the spatial locations of identified trends. We demonstrate that our solution shows meteorologists how representative a clustering result is, and with respect to which changes in the selected region it becomes unstable. Furthermore, our solution helps to identify those ensemble members which stably belong to a given cluster and can thus be considered similar. In a real-world application case we show how our approach is used to analyze the clustering behavior of different regions in a forecast of "Tropical Cyclone Karl", guiding the user towards the cluster robustness information required for subsequent ensemble analysis.
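The region-sensitivity idea can be sketched numerically: cluster the ensemble twice, once over the full region and once over a shrunken one, and measure how much the member grouping changes. The two-regime synthetic data and the pair-counting agreement score below are stand-ins for the paper's ensemble data and robustness measures.

```python
import numpy as np

def kmeans2(X, iters=50):
    """Minimal two-cluster k-means with farthest-point initialisation."""
    c0 = X[0]
    c1 = X[np.argmax(((X - c0) ** 2).sum(-1))]
    centers = np.stack([c0, c1]).astype(float)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in (0, 1):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

def agreement(a, b):
    """Fraction of member pairs grouped identically by two clusterings
    (invariant to label permutation)."""
    return ((a[:, None] == a[None, :]) == (b[:, None] == b[None, :])).mean()

# Synthetic ensemble: 40 members x 100 grid points, two well-separated regimes.
rng = np.random.default_rng(1)
ensemble = np.vstack([rng.normal(0, 1, (20, 100)), rng.normal(3, 1, (20, 100))])

labels_full = kmeans2(ensemble)          # cluster over the full region
labels_crop = kmeans2(ensemble[:, :80])  # cluster over a shrunken region
stability = agreement(labels_full, labels_crop)
print(stability)                         # near 1.0 here: a robust clustering
```

With overlapping regimes the agreement score drops as the region shrinks, which is exactly the instability the paper's visual interface is designed to expose.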

  20. Auto-ignition generated combustion. Pt. 2. Experimental analysis; Verbrennungssteuerung durch Selbstzuendung. T. 2. Experimentelle Analyse

    Energy Technology Data Exchange (ETDEWEB)

    Guibert, P. [Paris-6 Univ. (France). Lab. de Mecanique Physique; Morin, C. [Paris-6 Univ. (France); Mokhtari, S.

    2004-02-01

    The initiation of combustion by auto-ignition demonstrates benefits in NO{sub x} reduction and process stability for both spark-ignited and compression-ignited engines. Based on the thermodynamic particularities of auto-ignition, which were presented in the first part, the characteristics of this process are demonstrated in the second part by experimental analysis. For comparability with similar studies, the analyses have been carried out on the basis of a two-stroke loop-scavenged spark-ignition single-cylinder engine. (orig.)

  1. The impact of parent-child interaction on brain structures: cross-sectional and longitudinal analyses.

    Science.gov (United States)

    Takeuchi, Hikaru; Taki, Yasuyuki; Hashizume, Hiroshi; Asano, Kohei; Asano, Michiko; Sassa, Yuko; Yokota, Susumu; Kotozaki, Yuka; Nouchi, Rui; Kawashima, Ryuta

    2015-02-04

    There is a vast amount of evidence from psychological studies that the amount of parent-child interaction affects the development of children's verbal skills and knowledge. However, despite the vast amount of literature, brain structural development associated with the amount of parent-child interaction has never been investigated. In the present human study, we used voxel-based morphometry to measure regional gray matter density (rGMD) and examined cross-sectional correlations between the amount of time spent with parents and rGMD among 127 boys and 135 girls. We also assessed correlations between the amount of time spent with parents and longitudinal changes that occurred a few years later among 106 boys and 102 girls. After correcting for confounding factors, we found negative effects of spending time with parents on rGMD in areas in the bilateral superior temporal gyrus (STG) via cross-sectional analyses as well as in the contingent areas of the right STG. We also confirmed positive effects of spending time with parents on the Verbal Comprehension score in cross-sectional and longitudinal analyses. rGMD in partly overlapping or contingent areas of the right STG was negatively correlated with age and the Verbal Comprehension score in cross-sectional analyses. Subsequent analyses revealed that verbal parent-child interactions have similar effects on Verbal Comprehension scores and rGMD in the right STG in both cross-sectional and longitudinal analyses. These findings indicate that parent-child interactions affect the right STG, which may be associated with verbal skills. Copyright © 2015 the authors.

  2. On the use of uncertainty analyses to test hypotheses regarding deterministic model predictions of environmental processes

    International Nuclear Information System (INIS)

    Gilbert, R.O.; Bittner, E.A.; Essington, E.H.

    1995-01-01

    This paper illustrates the use of Monte Carlo parameter uncertainty and sensitivity analyses to test hypotheses regarding predictions of deterministic models of environmental transport, dose, risk and other phenomena. The methodology is illustrated by testing whether {sup 238}Pu is transferred more readily than {sup 239+240}Pu from the gastrointestinal (GI) tract of cattle to their tissues (muscle, liver and blood). This illustration is based on a study wherein beef cattle grazed for up to 1064 days on a fenced plutonium (Pu)-contaminated arid site in Area 13 near the Nevada Test Site in the United States. Periodically, cattle were sacrificed and their tissues analyzed for Pu and other radionuclides. Conditional sensitivity analyses of the model predictions were also conducted. These analyses indicated that Pu cattle tissue concentrations had the largest impact of any model parameter on the pdf of predicted Pu fractional transfers. Issues that arise in conducting uncertainty and sensitivity analyses of deterministic models are discussed. (author)
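The hypothesis-testing idea — propagate parameter uncertainty through Monte Carlo draws, then read the hypothesis probability off the resulting predictive distributions — can be sketched as follows. The lognormal transfer-fraction parameters are purely illustrative, not the study's values.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 50_000

# Assumed lognormal parameter uncertainties for the GI-tract-to-tissue
# transfer fraction of each isotope (illustrative numbers only).
f_238 = rng.lognormal(mean=np.log(5e-4), sigma=0.5, size=n)
f_239 = rng.lognormal(mean=np.log(2e-4), sigma=0.5, size=n)

# Monte Carlo test of H0: 238Pu is NOT transferred more readily than 239+240Pu.
# The fraction of joint draws with f_238 <= f_239 approximates P(H0) under the
# assumed (independent) parameter distributions.
p_h0 = np.mean(f_238 <= f_239)
print(f"P(f_238 <= f_239) ~= {p_h0:.3f}")
```

A small probability for H0 supports the hypothesis that the 238 isotope transfers more readily; in a full analysis the draws would feed the deterministic transport model rather than being compared directly.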

  3. Aroma profile of Garnacha Tintorera-based sweet wines by chromatographic and sensorial analyses.

    Science.gov (United States)

    Noguerol-Pato, R; González-Álvarez, M; González-Barreiro, C; Cancho-Grande, B; Simal-Gándara, J

    2012-10-15

    The aroma profiles of three Garnacha Tintorera-based wines were studied: a base wine, a naturally sweet wine, and a mixture of the naturally sweet wine with another sweet wine obtained by fortification with spirits. The aroma fingerprint was traced by GC-MS analysis of volatile compounds and by sensorial analysis of odours and tastes. Among the volatile compounds, sotolon (73 μg/L) and acetoin (122 μg/L) were the two main compounds found in the naturally sweet wine. With regard to the odorant series, those most dominant for the Garnacha Tintorera base wine were floral, fruity and spicy. In turn, the odorant series most markedly affected by off-vine drying of the grapes were floral, caramelized and vegetal-wood. Finally, the odorant series affected by the switch-off of alcoholic fermentation with ethanol 96% (v/v) fit for human consumption, followed by oak barrel aging, were caramelized and vegetal-wood. A partial least squares analysis (PLS-2) was used to detect correlations between sets of sensory data (those obtained with mouth and nose) with the ultimate aim of improving our current understanding of the flavour of Garnacha Tintorera red wines, both base and sweet. Based on the sensory dataset analysis, the descriptors with the highest weight for separating base and sweet wines from Garnacha Tintorera were sweetness, dried fruit and caramel (for sweet wines) vs. bitterness, astringency and geranium (for base wines). Copyright © 2012 Elsevier Ltd. All rights reserved.
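The PLS-2 step — finding directions in the nose (odour) and mouth (taste) score blocks whose projections covary maximally — can be sketched via the first component, obtained from the SVD of the cross-covariance matrix. The panel scores below are simulated, and the descriptor names merely echo the abstract.

```python
import numpy as np

def pls2_first_component(X, Y):
    """First PLS-2 weight vectors: the leading singular vectors of the
    cross-covariance matrix of the two column-centred data blocks."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc.T @ Yc)
    return U[:, 0], Vt[0]

# Simulated panel: 30 wines; a latent "sweetness" drives caramel odour,
# sweet taste (+) and bitterness (-).
rng = np.random.default_rng(3)
latent = rng.random(30)
nose = np.column_stack([latent + rng.normal(0, 0.1, 30),     # caramel
                        rng.normal(0, 0.3, 30)])             # geranium (noise)
mouth = np.column_stack([latent + rng.normal(0, 0.1, 30),    # sweetness
                         -latent + rng.normal(0, 0.1, 30)])  # bitterness

w_nose, w_mouth = pls2_first_component(nose, mouth)
print(np.round(w_nose, 2), np.round(w_mouth, 2))
```

The dominant nose weight falls on the caramel column, and the mouth weights for sweetness and bitterness carry opposite signs, mirroring the sweet-vs-base descriptor split reported in the abstract.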

  4. Trends in population-based studies of human genetics in infectious diseases.

    Science.gov (United States)

    Rowell, Jessica L; Dowling, Nicole F; Yu, Wei; Yesupriya, Ajay; Zhang, Lyna; Gwinn, Marta

    2012-01-01

    Pathogen genetics is already a mainstay of public health investigation and control efforts; now advances in technology make it possible to investigate the role of human genetic variation in the epidemiology of infectious diseases. To describe trends in this field, we analyzed articles that were published from 2001 through 2010 and indexed by the HuGE Navigator, a curated online database of PubMed abstracts in human genome epidemiology. We extracted the principal findings from all meta-analyses and genome-wide association studies (GWAS) with an infectious disease-related outcome. Finally, we compared the representation of diseases in HuGE Navigator with their contributions to morbidity worldwide. We identified 3,730 articles on infectious diseases, including 27 meta-analyses and 23 GWAS. The number published each year increased from 148 in 2001 to 543 in 2010 but remained a small fraction (about 7%) of all studies in human genome epidemiology. Most articles were by authors from developed countries, but the percentage by authors from resource-limited countries increased from 9% to 25% during the period studied. The most commonly studied diseases were HIV/AIDS, tuberculosis, hepatitis B infection, hepatitis C infection, sepsis, and malaria. As genomic research methods become more affordable and accessible, population-based research on infectious diseases will be able to examine the role of variation in human as well as pathogen genomes. This approach offers new opportunities for understanding infectious disease susceptibility, severity, treatment, control, and prevention.

  5. A Cyber-Attack Detection Model Based on Multivariate Analyses

    Science.gov (United States)

    Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi

    In the present paper, we propose a novel cyber-attack detection model that applies two multivariate-analysis methods to the audit data observed on a host machine. The statistical techniques used here are the well-known Hayashi's quantification method IV and the cluster analysis method. We quantify the observed qualitative audit event sequence via quantification method IV, and collect similar audit event sequences into the same groups based on cluster analysis. It is shown in simulation experiments that our model can improve cyber-attack detection accuracy in some realistic cases where both normal and attack activities are intermingled.
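As a rough sketch of the pipeline — quantify qualitative audit sequences into numeric vectors, then group similar ones — the following uses simple event frequencies in place of quantification method IV and a greedy leader clustering in place of the paper's cluster analysis. Event names and the distance threshold are invented.

```python
import numpy as np

EVENTS = ["login", "read", "write", "exec", "sudo"]

def quantify(seq):
    """Crude stand-in for quantification method IV: map a qualitative audit
    event sequence to a normalised frequency vector."""
    v = np.array([seq.count(e) for e in EVENTS], float)
    return v / max(v.sum(), 1.0)

def cluster(vectors, threshold=0.3):
    """Greedy leader clustering: each vector joins the first group whose
    centroid lies within `threshold`, otherwise it starts a new group."""
    groups = []   # list of (running_sum, members)
    labels = []
    for v in vectors:
        for i, (s, m) in enumerate(groups):
            if np.linalg.norm(v - s / len(m)) < threshold:
                groups[i] = (s + v, m + [v])
                labels.append(i)
                break
        else:
            groups.append((v.copy(), [v]))
            labels.append(len(groups) - 1)
    return labels

normal = ["login", "read", "read", "write"]
attack = ["login", "sudo", "exec", "exec", "exec"]
labels = cluster([quantify(normal), quantify(normal + ["read"]), quantify(attack)])
print(labels)   # [0, 0, 1]: the attack-like sequence lands in its own group
```

Sequences whose frequency profiles deviate from the established "normal" groups end up isolated, which is the signal a detector of this kind alarms on.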

  6. Methodological challenges in carbohydrate analyses

    Directory of Open Access Journals (Sweden)

    Mary Beth Hall

    2007-07-01

    Full Text Available Carbohydrates can provide up to 80% of the dry matter in animal diets, yet their specific evaluation for research and diet formulation is only now becoming a focus in the animal sciences. Partitioning of dietary carbohydrates for nutritional purposes should reflect differences in digestion and fermentation characteristics and effects on animal performance. Key challenges to designating nutritionally important carbohydrate fractions include classifying the carbohydrates in terms of nutritional characteristics, and selecting analytical methods that describe the desired fraction. The relative lack of information on the digestion characteristics of various carbohydrates and their interactions with other fractions in diets means that fractions will not soon be perfectly established. Developing a system of carbohydrate analysis that could be used across animal species could enhance the utility of analyses and the amount of data we can obtain on the dietary effects of carbohydrates. Based on the quantities present in diets and apparent effects on animal performance, some nutritionally important classes of carbohydrates that may be valuable to measure include sugars, starch, fructans, insoluble fiber, and soluble fiber. Essential to the selection of methods for these fractions is agreement on precisely which carbohydrates should be included in each. Each of these fractions has analyses that could potentially be used to measure it, but most of the available methods have weaknesses that must be evaluated to see whether they are fatal and the assay is unusable, or whether the assay may still be made workable. Factors we must consider as we seek to analyze carbohydrates to describe diets: Does the assay accurately measure the desired fraction? Is the assay for research, regulatory, or field use (this affects considerations of acceptable costs and throughput)? What are acceptable accuracy and variability of measures? Is the assay robust (which enhances the accuracy of values)?
For some carbohydrates, we

  7. Fluid-structure-interaction analyses of reactor vessel using improved hybrid Lagrangian Eulerian code ALICE-II

    Energy Technology Data Exchange (ETDEWEB)

    Wang, C.Y.

    1993-06-01

    This paper describes fluid-structure-interaction and structure response analyses of a reactor vessel subjected to loadings associated with postulated accidents, using the hybrid Lagrangian-Eulerian code ALICE-II. This code has been improved recently to accommodate many features associated with innovative designs of reactor vessels. Calculational capabilities have been developed to treat water in the reactor cavity outside the vessel, internal shield structures and internal thin shells. The objective of the present analyses is to study the cover response and potential for missile generation in response to a fuel-coolant interaction in the core region. Three calculations were performed using the cover weight as a parameter. To study the effect of the cavity water, vessel response calculations for both wet- and dry-cavity designs are compared. Results indicate that for all cases studied and for the design parameters assumed, the calculated cover displacements are all smaller than the bolts' ultimate displacement and no missile generation of the closure head is predicted. Also, solutions reveal that the cavity water of the wet-cavity design plays an important role of restraining the downward displacement of the bottom head. Based on these studies, the analyses predict that the structure integrity is maintained throughout the postulated accident for the wet-cavity design.

  8. Fluid-structure-interaction analyses of reactor vessel using improved hybrid Lagrangian Eulerian code ALICE-II

    Energy Technology Data Exchange (ETDEWEB)

    Wang, C.Y.

    1993-01-01

    This paper describes fluid-structure-interaction and structure response analyses of a reactor vessel subjected to loadings associated with postulated accidents, using the hybrid Lagrangian-Eulerian code ALICE-II. This code has been improved recently to accommodate many features associated with innovative designs of reactor vessels. Calculational capabilities have been developed to treat water in the reactor cavity outside the vessel, internal shield structures and internal thin shells. The objective of the present analyses is to study the cover response and potential for missile generation in response to a fuel-coolant interaction in the core region. Three calculations were performed using the cover weight as a parameter. To study the effect of the cavity water, vessel response calculations for both wet- and dry-cavity designs are compared. Results indicate that for all cases studied and for the design parameters assumed, the calculated cover displacements are all smaller than the bolts' ultimate displacement and no missile generation of the closure head is predicted. Also, solutions reveal that the cavity water of the wet-cavity design plays an important role of restraining the downward displacement of the bottom head. Based on these studies, the analyses predict that the structure integrity is maintained throughout the postulated accident for the wet-cavity design.

  9. Creativity and borderline personality disorder: evidence from a voxel-based morphometry study.

    Science.gov (United States)

    Leutgeb, Verena; Ille, Rottraut; Wabnegger, Albert; Schienle, Anne; Schöggl, Helmut; Weber, Bernhard; Papousek, Ilona; Weiss, Elisabeth M; Fink, Andreas

    2016-05-01

    Throughout history, various examples of eminent creative people suffering from mental disorders, along with some empirical research reports, have strengthened the idea of a potential link between creativity and psychopathology. This study investigated different facets of psychometrically determined creativity in 20 females diagnosed with borderline personality disorder (BPD) relative to 19 healthy female controls. In addition, group differences in grey matter (GM) were examined. Behavioural findings revealed no significant differences between the BPD group and healthy controls with respect to verbal and figural-graphic creative task performance and creativity-related personality characteristics. Whole-brain voxel-based morphometry analyses revealed a distinct pattern of GM reductions in the BPD group (relative to controls) in a network of brain regions closely associated with various cognitive and emotional functions (including the bilateral orbital inferior frontal gyri and the left superior temporal gyrus), partly overlapping with creativity-related brain regions. Correlation analyses moreover revealed that in the BPD group GM reductions in the orbital parts of the inferior and middle frontal gyri were associated with lower levels of creativity. This study provides no indications in favour of the putative link between creativity and psychopathology, as sometimes reported in the literature.

  10. Diffraction Studies from Minerals to Organics - Lessons Learned from Materials Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Whitfield, Pamela S [ORNL

    2014-01-01

    In many regards the studies of materials and minerals by powder diffraction techniques are complementary, with techniques honed in one field equally applicable to the other. As a long-time materials researcher, I draw many of the examples from techniques developed for materials analysis and applied to minerals. However, in a couple of cases the study of new minerals was the introduction to techniques later used in materials-based studies. Hopefully these examples will show that the study of new mineral structures can provide opportunities to add new methodologies and approaches to future problems. In keeping with the AXAA, many of the examples have an Australian connection, with materials ranging from organics to battery materials.

  11. Theoretical study of the structure and reactivity of lanthanide and actinide based organometallic complexes

    International Nuclear Information System (INIS)

    Barros, N.

    2007-06-01

    In this PhD thesis, lanthanide- and actinide-based organometallic complexes are studied using quantum chemistry methods. In the first part, the catalytic properties of organo-lanthanide compounds are evaluated by studying two types of reactions: the catalytic hydro-functionalization of olefins and the polymerisation of polar monomers. The reaction mechanisms are theoretically determined and validated, and the influence of possible non-productive secondary reactions is considered. The second part focuses on uranium-based complexes. First, the electronic structure of uranium metallocenes is analysed, and an analogy with uranyl compounds is proposed. In the second chapter, two isoelectronic complexes of uranium(IV) are studied. After validating the use of DFT methods for describing the electronic structure and reactivity of these compounds, it is shown that their difference in reactivity can be related to the different nature of the chemical bonding in these complexes. (author)

  12. Wind Power Forecasting Error Frequency Analyses for Operational Power System Studies: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Florita, A.; Hodge, B. M.; Milligan, M.

    2012-08-01

    The examination of wind power forecasting errors is crucial for optimal unit commitment and economic dispatch of power systems with significant wind power penetrations. This scheduling process includes both renewable and nonrenewable generators, and the incorporation of wind power forecasts will become increasingly important as wind fleets constitute a larger portion of generation portfolios. This research considers the Western Wind and Solar Integration Study database of wind power forecasts and numerical actualizations. This database comprises more than 30,000 locations spread over the western United States, with a total wind power capacity of 960 GW. Error analyses for individual sites and for specific balancing areas are performed using the database, quantifying the fit to theoretical distributions through goodness-of-fit metrics. Insights into wind-power forecasting error distributions are established for various levels of temporal and spatial resolution, contrasts made among the frequency distribution alternatives, and recommendations put forth for harnessing the results. Empirical data are used to produce more realistic site-level forecasts than previously employed, such that higher resolution operational studies are possible. This research feeds into a larger work of renewable integration through the links wind power forecasting has with various operational issues, such as stochastic unit commitment and flexible reserve level determination.
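A minimal numeric sketch of the distribution-fitting question: wind power forecast errors are often heavier-tailed than Gaussian, which a goodness-of-fit check can detect through excess kurtosis. The Laplace error model and its scale below are assumptions for the demo, not WWSIS values.

```python
import numpy as np

def excess_kurtosis(x):
    """Sample excess kurtosis: 0 for a Gaussian, 3 for a Laplace distribution."""
    z = (x - x.mean()) / x.std()
    return (z ** 4).mean() - 3.0

# Hypothetical site-level forecast errors (MW), modelled as heavy-tailed
# Laplace noise rather than Gaussian noise.
rng = np.random.default_rng(7)
errors = rng.laplace(loc=0.0, scale=12.0, size=200_000)

k = excess_kurtosis(errors)
print(round(k, 2))   # well above 0, so a Gaussian fit would be rejected
```

In a full analysis one would compare several candidate distributions with formal goodness-of-fit metrics, but a tail-heaviness statistic like this already discriminates the main alternatives.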

  13. Fourier Transform Mass Spectrometry: The Transformation of Modern Environmental Analyses

    Science.gov (United States)

    Lim, Lucy; Yan, Fangzhi; Bach, Stephen; Pihakari, Katianna; Klein, David

    2016-01-01

    Unknown compounds in environmental samples are difficult to identify using standard mass spectrometric methods. Fourier transform mass spectrometry (FTMS) has revolutionized how environmental analyses are performed. With its unsurpassed mass accuracy, high resolution and sensitivity, it gives researchers a tool for difficult and complex environmental analyses. Two features of FTMS are responsible for changing how complex analyses are accomplished. The first is the ability to determine, quickly and with high mass accuracy, the presence of unknown chemical residues in samples. For years, the field had been limited by mass spectrometric methods that depended on knowing in advance which compounds were of interest. Secondly, by utilizing the high resolution capabilities coupled with the low detection limits of FTMS, analysts can also dilute the sample sufficiently to minimize the ionization changes caused by varied matrices. PMID:26784175

  14. A methodology for eliciting, representing, and analysing stakeholder knowledge for decision making on complex socio-ecological systems: from cognitive maps to agent-based models.

    Science.gov (United States)

    Elsawah, Sondoss; Guillaume, Joseph H A; Filatova, Tatiana; Rook, Josefine; Jakeman, Anthony J

    2015-03-15

    This paper aims to contribute to developing better ways for incorporating essential human elements in decision-making processes for the modelling of complex socio-ecological systems. It presents a step-wise methodology for integrating perceptions of stakeholders (qualitative) into formal simulation models (quantitative), with the ultimate goal of improving understanding and communication about decision making in complex socio-ecological systems. The methodology integrates cognitive mapping and agent-based modelling. It cascades through a sequence of qualitative/soft and numerical methods comprising: (1) interviews to elicit mental models; (2) cognitive maps to represent and analyse individual and group mental models; (3) time-sequence diagrams to chronologically structure the decision-making process; (4) an all-encompassing conceptual model of decision making; and (5) a computational (in this case agent-based) model. We apply the proposed methodology (labelled ICTAM) in a case study of viticulture irrigation in South Australia. Finally, we use strengths-weaknesses-opportunities-threats (SWOT) analysis to reflect on the methodology. Results show that the methodology leverages the use of cognitive mapping to capture the richness of decision making and mental models, and provides a combination of divergent and convergent analysis methods leading to the construction of an agent-based model. Copyright © 2014 Elsevier Ltd. All rights reserved.
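The final step of the cascade, turning an elicited decision rule into an agent-based model, can be sketched with a toy simulation. The agents, the scarcity threshold, and the area-adjustment factors below are hypothetical stand-ins for rules that would come out of the cognitive-mapping stages, not ICTAM's actual model.

```python
import random

class Irrigator:
    """Toy agent whose rule mimics one path through an elicited cognitive map:
    perceived water scarcity leads to a reduced planted area next season."""
    def __init__(self, area):
        self.area = area   # hectares under vine (hypothetical)

    def step(self, allocation):
        if allocation < 0.5:        # scarcity threshold (assumed)
            self.area *= 0.8        # cut back plantings
        else:
            self.area *= 1.05       # cautious expansion
        return self.area

random.seed(0)
agents = [Irrigator(random.uniform(5.0, 15.0)) for _ in range(100)]
seasons = [0.9, 0.7, 0.4, 0.3, 0.8]  # hypothetical fractional water allocations

for allocation in seasons:
    total_area = sum(agent.step(allocation) for agent in agents)
print(round(total_area, 1))          # aggregate planted area after 5 seasons
```

Even this caricature shows the methodology's point: qualitative "if scarce, then retrench" statements from interviews become executable rules whose aggregate consequences can be simulated and discussed with stakeholders.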

  15. Radiation physics and shielding codes and analyses applied to design-assist and safety analyses of CANDUR and ACRTM reactors

    International Nuclear Information System (INIS)

    Aydogdu, K.; Boss, C. R.

    2006-01-01

    heavily on experience and engineering judgement, consistent with the ALARA philosophy. Special care is taken to ensure that the best estimate dose rates are used to the extent possible when applying ALARA. Provisions for safeguards equipment are made throughout the fuel-handling route in CANDU and ACR reactors. For example, the fuel bundle counters rely on the decay gammas from the fission products in spent-fuel bundles to record the number of fuel movements. The International Atomic Energy Agency (IAEA) Safeguards system for CANDU and ACR reactors is based on item (fuel bundle) accounting. It involves a combination of IAEA inspection with containment and surveillance, and continuous unattended monitoring. The spent fuel bundle counter monitors spent fuel bundles as they are transferred from the fuelling machine to the spent fuel bay. The shielding and dose-rate analyses need to be carried out so that the bundle counter functions properly. This paper also covers two codes used in criticality safety analyses. Criticality safety is a unique phenomenon, and codes that address criticality issues demand specific validations. However, it is recognised that some of the codes used in radiation physics will also be used in criticality safety assessments. (authors)

  16. Remote sensing and spatial analysis based study for detecting deforestation and the associated drivers

    Science.gov (United States)

    El-Abbas, Mustafa M.; Csaplovics, Elmar; Deafalla, Taisser H.

    2013-10-01

    Remote-sensing technologies are becoming increasingly interlinked with the issue of deforestation. They offer a systematized and objective strategy to document, understand and simulate the deforestation process and its associated causes. In this context, the main goal of this study, conducted in the Blue Nile region of Sudan, where most of the natural habitats have been dramatically destroyed, was to develop spatial methodologies to assess deforestation dynamics and the associated factors. To achieve that, optical multispectral satellite scenes (i.e., ASTER and LANDSAT), integrated with a field survey and multiple other data sources, were used for the analyses. Spatiotemporal Object Based Image Analysis (STOBIA) was applied to assess the change dynamics within the period of study. Broadly, the above-mentioned analyses include: Object Based (OB) classifications, post-classification change detection, data fusion, information extraction and spatial analysis. Hierarchical multi-scale segmentation thresholds were applied, and each class was delimited with semantic meanings by a set of rules associated with membership functions. Consequently, the fused multi-temporal data were used to create detailed objects of change classes from the input LU/LC classes. The dynamic changes were quantified and spatially located, and the spatial and contextual relations with adjacent areas were analyzed. The main finding of the present study is that forest areas drastically decreased, with the conversion of forest into agricultural fields and grassland under the agrarian structure being the main force of deforestation. In contrast, the capability of the area to recover was clearly observed. The study concludes with a brief assessment of an 'oriented' framework, focused on the alarming areas where serious dynamics are located and where urgent plans and interventions are most critical, guided by potential solutions based on the identified driving forces.

  17. Effect of periodontal treatment on preterm birth rate: a systematic review of meta-analyses.

    Science.gov (United States)

    López, Néstor J; Uribe, Sergio; Martinez, Benjamín

    2015-02-01

    Preterm birth is a major cause of neonatal morbidity and mortality in both developed and developing countries. Preterm birth is a highly complex syndrome that includes distinct clinical subtypes in which many different causes may be involved. The results of epidemiological, molecular, microbiological and animal-model studies support a positive association between maternal periodontal disease and preterm birth. However, the results of intervention studies carried out to determine the effect of periodontal treatment on reducing the risk of preterm birth are controversial. This systematic review critically analyzes the methodological issues of meta-analyses of studies on the effect of periodontal treatment in reducing preterm birth. The quality of the individual randomized clinical trials selected is of the highest relevance for a systematic review. This article describes the methodological features that should be identified a priori and assessed individually to determine the quality of a randomized controlled trial performed to evaluate the effect of periodontal treatment on pregnancy outcomes. The AMSTAR and the PRISMA checklist tools were used to assess the quality of the six meta-analyses selected, and the bias domain of the Cochrane Collaboration's Tool was applied to evaluate each of the trials included in the meta-analyses. In addition, the methodological characteristics of each clinical trial were assessed. The majority of the trials included in the meta-analyses have significant methodological flaws that threaten their internal validity. The lack of effect of periodontal treatment on preterm birth rate concluded by four meta-analyses, and the positive effect of treatment for reducing preterm birth risk concluded by the remaining two meta-analyses, are not based on consistent scientific evidence. 
Well-conducted randomized controlled trials using rigorous methodology, including appropriate definition of the exposure, adequate control of confounders for

  18. Analysing the operative experience of basic surgical trainees in Ireland using a web-based logbook

    LENUS (Irish Health Repository)

    Lonergan, Peter E

    2011-09-25

    Abstract Background There is concern about the adequacy of operative exposure in surgical training programmes, in the context of changing work practices. We aimed to quantify the operative exposure of all trainees on the National Basic Surgical Training (BST) programme in Ireland and compare the results with arbitrary training targets. Methods Retrospective analysis of data obtained from a web-based logbook (http://www.elogbook.org) for all general surgery and orthopaedic training posts between July 2007 and June 2009. Results 104 trainees recorded 23,918 operations across two 6-month general surgery posts. The most common general surgery operation performed was simple skin excision, with trainees performing an average of 19.7 (± 9.9) over the 2-year training programme. Trainees most frequently assisted with cholecystectomy, with an average of 16.0 (± 11.0) per trainee. Comparison of trainee operative experience to arbitrary training targets found that 2-38% of trainees achieved the targets for 9 emergency index operations and 24-90% of trainees achieved the targets for 8 index elective operations. 72 trainees also completed a 6-month post in orthopaedics and recorded 7,551 operations. The most common orthopaedic operation that trainees performed was removal of metal, with an average of 2.90 (± 3.27) per trainee. The most common orthopaedic operation that trainees assisted with was total hip replacement, with an average of 10.46 (± 6.21) per trainee. Conclusions A centralised web-based logbook provides valuable data to analyse training programme performance. Analysis of logbooks raises concerns about operative experience at junior trainee level. The provision of adequate operative exposure for trainees should be a key performance indicator for training programmes.
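
The descriptive statistics reported above (mean ± SD per trainee, percentage of trainees meeting a target) are straightforward to reproduce. The counts below are invented for illustration, not taken from the logbook data.

```python
from statistics import mean, stdev

# Hypothetical per-trainee counts of one index operation (not elogbook.org data).
counts = [12, 25, 19, 30, 14, 18]
target = 20  # illustrative training target

avg, sd = mean(counts), stdev(counts)
pct_meeting_target = 100 * sum(c >= target for c in counts) / len(counts)
print(f"{avg:.1f} ± {sd:.1f}; {pct_meeting_target:.0f}% met the target")
```

The same computation, run per operation type, yields the attainment ranges the abstract compares against targets.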

  19. Northern Marshall Islands Radiological Survey: a quality-control program for radiochemical analyses

    International Nuclear Information System (INIS)

    Jennings, C.D.; Mount, M.E.

    1983-08-01

    More than 16,000 radiochemical analyses were performed on about 5400 samples of soils, vegetation, animals, fish, invertebrates, and water to establish amounts of 90Sr, 137Cs, 241Am, and plutonium isotopes in the Northern Marshall Islands. Three laboratories were contracted by Lawrence Livermore National Laboratory to perform the radiochemical analyses: Environmental Analysis Laboratory (EAL), Richmond, California; Eberline Instrument Corporation (EIC), Albuquerque, New Mexico; and Laboratory of Radiation Ecology (LRE), University of Washington, Seattle, Washington. The analytical precision and accuracy were monitored by regularly including duplicate samples and natural matrix standards in each group of about 100 samples analyzed. Based on the duplicates and standards, over 83% of the radiochemical analyses in this survey were acceptable: 97% of the analyses by EAL, 45% of the analyses by EIC, and 98% of the analyses by LRE.
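
A duplicate-sample acceptance check of the kind such a QC program relies on can be sketched with a relative-percent-difference criterion. The 20% limit and the measurement pairs below are assumptions for illustration, not the survey's actual acceptance criteria.

```python
def relative_percent_difference(a, b):
    """RPD between a duplicate pair of activity measurements."""
    return 100.0 * abs(a - b) / ((a + b) / 2.0)

def acceptance_flags(duplicate_pairs, limit=20.0):
    # True where the duplicate pair agrees within the assumed RPD limit.
    return [relative_percent_difference(a, b) <= limit for a, b in duplicate_pairs]

pairs = [(10.2, 10.8), (5.0, 7.5), (0.91, 0.95)]
flags = acceptance_flags(pairs)
print(flags)  # [True, False, True]
```

Tallying such flags across every batch of ~100 samples gives per-laboratory acceptance rates like the 97%/45%/98% figures reported.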

  20. Implementation of analyses based on social media data for marketing purposes in academic and scientific organizations in practice – opportunities and limitations

    Directory of Open Access Journals (Sweden)

    Magdalena Grabarczyk-Tokaj

    2013-12-01

    The article focuses on the practical use of analyses based on data collected in social media for institutions' communication and marketing purposes. The subject is discussed from the perspective of Digital Darwinism: the situation in which the development of technologies and new means of communication significantly outpaces the growth of knowledge and digital skills among organizations eager to implement those solutions. To diminish the negative consequences of Digital Darwinism, institutions can broaden their knowledge with analyses of data from cyberspace to optimize operations, and make use of ongoing dialogue and cooperation with prosumers to face dynamic changes in trends, technologies and society. Information acquired from user-generated content in social media can be employed as a guideline in planning, running and evaluating communication and marketing activities. The article presents examples of tools and solutions that can be implemented in practice to support actions taken by institutions.

  1. Ab Initio Calculations and Raman and SERS Spectral Analyses of Amphetamine Species

    DEFF Research Database (Denmark)

    Berg, Rolf W.; Nørbygaard, Thomas; White, Peter C.

    2011-01-01

    For the first time, the differences between the spectra of amphetamine and amphetamine-H+ and between different conformers are thoroughly studied by ab initio model calculations, and Raman and surface-enhanced Raman spectroscopy (SERS) spectra are measured for different species of amphetamine. The spectra of amphetamine and amphetamine-H+ samples were obtained and assigned according to a comparison of the experimental spectra and the ab initio MO calculations, performed using the Gaussian 03W program (Gaussian, Inc., Pittsburgh, PA). The analyses were based on complete geometry minimization...

  2. Seismic criteria studies and analyses. Quarterly progress report No. 3. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    1975-01-03

    Information is presented concerning the extent to which vibratory motions at the subsurface foundation level might differ from motions at the ground surface and the effects of the various subsurface materials on the overall Clinch River Breeder Reactor site response; seismic analyses of LMFBR type reactors to establish analytical procedures for predicting structure stresses and deformations; and aspects of the current technology regarding the representation of energy losses in nuclear power plants as equivalent viscous damping.

  3. Social relations and healthcare utilisation among middle-aged and older people: study protocol for an implementation and register-based study in Denmark

    Directory of Open Access Journals (Sweden)

    Anne Sophie Bech Mikkelsen

    2017-11-01

    Abstract Background While previous research establishes an association between social relations, health and use of healthcare services among older people, how to implement this knowledge in real-life settings has received much less attention. This study will explore the relationship between social relations, health and use of healthcare services in a Danish mid-life population sample. In addition, the study will explore individual and contextual factors affecting the implementation of a group-based life story intervention aimed at establishing and strengthening social relations among older people at nursing homes in Denmark. Methods/design A combined quantitative register-based approach and a qualitative implementation approach will be applied in this study. First, we will quantitatively analyse the relationship between social relations, health status and use of healthcare services among middle-aged people in Denmark by linking survey data on social relations, loneliness, self-perceived health and disease status from the Copenhagen Aging and Midlife Biobank (CAMB) (n = 7191) with national registries through the Public Health Database on use of healthcare services and demographic and socioeconomic factors. Second, we will qualitatively analyse individual and contextual factors affecting the implementation process of the group-based life story intervention based on semi-structured interviews (n = 16), observations and field notes with and among intervention stakeholders, i.e., participants and group leaders facilitating the intervention. Discussion The results of this study are expected to improve knowledge about mechanisms through which social relations are associated with health status and use of healthcare services and to inform the implementation of future interventions targeting social relations among older people at nursing homes. Trial registration The study has been registered and approved by the Danish Data Protection Agency. Separate

  4. Social relations and healthcare utilisation among middle-aged and older people: study protocol for an implementation and register-based study in Denmark.

    Science.gov (United States)

    Mikkelsen, Anne Sophie Bech; Lund, Rikke; Kristiansen, Maria

    2017-11-15

    While previous research establishes an association between social relations, health and use of healthcare services among older people, how to implement this knowledge in real-life settings has received much less attention. This study will explore the relationship between social relations, health and use of healthcare services in a Danish mid-life population sample. In addition, the study will explore individual and contextual factors affecting the implementation of a group-based life story intervention aimed at establishing and strengthening social relations among older people at nursing homes in Denmark. A combined quantitative register-based approach and a qualitative implementation approach will be applied in this study. First, we will quantitatively analyse the relationship between social relations, health status and use of healthcare services among middle-aged people in Denmark by linking survey data on social relations, loneliness, self-perceived health and disease status from the Copenhagen Aging and Midlife Biobank (CAMB) (n = 7191) with national registries through the Public Health Database on use of healthcare services and demographic and socioeconomic factors. Second, we will qualitatively analyse individual and contextual factors affecting the implementation process of the group-based life story intervention based on semi-structured interviews (n = 16), observations and field notes with and among intervention stakeholders, i.e., participants and group leaders facilitating the intervention. The results of this study are expected to improve knowledge about mechanisms through which social relations are associated with health status and use of healthcare services and to inform the implementation of future interventions targeting social relations among older people at nursing homes. The study has been registered and approved by the Danish Data Protection Agency. Separate approvals have been obtained for the qualitative data (Approval No. SUND-2016

  5. Treatment of visceral leishmaniasis: model-based analyses on the spread of antimony-resistant L. donovani in Bihar, India.

    Directory of Open Access Journals (Sweden)

    Anette Stauch

    BACKGROUND: Pentavalent antimonials have been the mainstay of antileishmanial therapy for decades, but increasing failure rates under antimonial treatment have challenged further use of these drugs in the Indian subcontinent. Experimental evidence has suggested that parasites which are resistant against antimonials have superior survival skills compared with sensitive ones, even in the absence of antimonial treatment. METHODS AND FINDINGS: We use simulation studies based on a mathematical L. donovani transmission model to identify parameters which can explain why treatment failure rates under antimonial treatment increased up to 65% in Bihar between 1980 and 1997. Model analyses suggest that resistance to treatment alone cannot explain the observed treatment failure rates. We explore two hypotheses referring to an increased fitness of antimony-resistant parasites: the additional fitness is (i) disease-related, by causing more clinical cases (higher pathogenicity) or more severe disease (higher virulence), or (ii) transmission-related, by increasing the transmissibility from sand flies to humans or vice versa. CONCLUSIONS: Both hypotheses can potentially explain the Bihar observations. However, increased transmissibility as an explanation appears more plausible because it can occur in the background of asymptomatically transmitted infection, whereas disease-related factors would most probably be observable. Irrespective of the cause of fitness, parasites with a higher fitness will finally replace sensitive parasites, even if antimonials are replaced by another drug.
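
The competitive-replacement argument in the conclusions can be illustrated with a toy two-strain prevalence model. This is not the authors' transmission model; all rates and initial values are invented, and it only shows that the strain with higher transmissibility eventually displaces the other.

```python
def simulate(beta_sensitive, beta_resistant, recovery=0.1, steps=2000, dt=0.1):
    """Euler-integrated toy competition between two parasite strains."""
    s, r = 0.10, 0.01  # initial prevalences of sensitive and resistant strains
    for _ in range(steps):
        susceptible = 1.0 - s - r
        s += dt * (beta_sensitive * s * susceptible - recovery * s)
        r += dt * (beta_resistant * r * susceptible - recovery * r)
    return s, r

# Hypothesis (ii): the resistant strain transmits more readily.
s, r = simulate(beta_sensitive=0.3, beta_resistant=0.4)
print(r > s)  # the fitter (resistant) strain dominates in the long run
```

Despite starting at a tenth of the sensitive strain's prevalence, the resistant strain wins, mirroring the abstract's conclusion that higher-fitness parasites finally replace sensitive ones.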

  6. Active teaching methods, studying responses and learning

    DEFF Research Database (Denmark)

    Christensen, Hans Peter; Vigild, Martin Etchells; Thomsen, Erik Vilain

    Students’ study strategies when exposed to activating teaching methods are measured, analysed and compared to study strategies in more traditional lecture-based teaching.

  7. A Visualization Tool to Analyse Usage of Web-Based Interventions: The Example of Positive Online Weight Reduction (POWeR)

    Science.gov (United States)

    Smith, Emily; Bradbury, Katherine; Morrison, Leanne; Dennison, Laura; Michaelides, Danius; Yardley, Lucy

    2015-01-01

    Background Attrition is a significant problem in Web-based interventions. Consequently, this research aims to identify the relation between Web usage and benefit from such interventions. A visualization tool has been developed that enables researchers to more easily examine large datasets on intervention usage that can be difficult to make sense of using traditional descriptive or statistical techniques alone. Objective This paper demonstrates how the visualization tool was used to explore patterns in participants’ use of a Web-based weight management intervention, termed "positive online weight reduction (POWeR)." We also demonstrate how the visualization tool can be used to perform subsequent statistical analyses of the association between usage patterns, participant characteristics, and intervention outcome. Methods The visualization tool was used to analyze data from 132 participants who had accessed at least one session of the POWeR intervention. Results There was a drop in usage of optional sessions after participants had accessed the initial, core POWeR sessions, but many users nevertheless continued to complete goal and weight reviews. The POWeR tools relating to the food diary and steps diary were reused most often. Differences in participant characteristics and usage of other intervention components were identified between participants who did and did not choose to access optional POWeR sessions (in addition to the initial core sessions) or reuse the food and steps diaries. Reuse of the steps diary and the getting support tools was associated with greater weight loss. Conclusions The visualization tool provided a quick and efficient method for exploring patterns of Web usage, which enabled further analyses of whether different usage patterns were associated with participant characteristics or differences in intervention outcome. Further usage of visualization techniques is recommended to (1) make sense of large datasets more quickly and efficiently; (2

  8. Methodological Quality Assessment of Meta-analyses in Endodontics.

    Science.gov (United States)

    Kattan, Sereen; Lee, Su-Min; Kohli, Meetu R; Setzer, Frank C; Karabucak, Bekir

    2018-01-01

    The objectives of this review were to assess the methodological quality of published meta-analyses related to endodontics using the assessment of multiple systematic reviews (AMSTAR) tool and to provide a follow-up to previously published reviews. Three electronic databases were searched for eligible studies according to the inclusion and exclusion criteria: Embase via Ovid, The Cochrane Library, and Scopus. The electronic search was amended by a hand search of 6 dental journals (International Endodontic Journal; Journal of Endodontics; Australian Endodontic Journal; Oral Surgery, Oral Medicine, Oral Pathology, Oral Radiology; Endodontics and Dental Traumatology; and Journal of Dental Research). The searches were conducted to include articles published after July 2009, and the deadline for inclusion of the meta-analyses was November 30, 2016. The AMSTAR assessment tool was used to evaluate the methodological quality of all included studies. A total of 36 reports of meta-analyses were included. The overall quality of the meta-analyses reports was found to be medium, with an estimated mean overall AMSTAR score of 7.25 (95% confidence interval, 6.59-7.90). The most poorly assessed areas were providing an a priori design, the assessment of the status of publication, and publication bias. In recent publications in the field of endodontics, the overall quality of the reported meta-analyses is medium according to AMSTAR. Copyright © 2017 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
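
The summary statistic reported (a mean AMSTAR score with a 95% confidence interval) follows from a normal approximation. The eight scores below are invented for illustration (chosen to give the same 7.25 mean), not the 36 actual ratings.

```python
from math import sqrt
from statistics import mean, stdev

scores = [6, 8, 7, 9, 5, 8, 7, 8]  # hypothetical AMSTAR scores out of 11

m, sd, n = mean(scores), stdev(scores), len(scores)
half_width = 1.96 * sd / sqrt(n)  # normal-approximation 95% confidence interval
print(f"mean {m:.2f}, 95% CI ({m - half_width:.2f}, {m + half_width:.2f})")
```

With the full set of 36 reviews, the same computation yields the reported 7.25 (6.59-7.90).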

  9. Phylogenomic analyses data of the avian phylogenomics project

    DEFF Research Database (Denmark)

    Jarvis, Erich D; Mirarab, Siavash; Aberer, Andre J

    2015-01-01

    BACKGROUND: Determining the evolutionary relationships among the major lineages of extant birds has been one of the biggest challenges in systematic biology. To address this challenge, we assembled or collected the genomes of 48 avian species spanning most orders of birds, including all Neognathae...... and two of the five Palaeognathae orders. We used these genomes to construct a genome-scale avian phylogenetic tree and perform comparative genomic analyses. FINDINGS: Here we present the datasets associated with the phylogenomic analyses, which include sequence alignment files consisting of nucleotides......ML algorithm or when using statistical binning with the coalescence-based MP-EST algorithm (which we refer to as MP-EST*). Other data sets, such as the coding sequence of some exons, revealed other properties of genome evolution, namely convergence. CONCLUSIONS: The Avian Phylogenomics Project is the largest...

  10. Data analyses and modelling for risk based monitoring of mycotoxins in animal feed

    NARCIS (Netherlands)

    Ine van der Fels-Klerx, H.J.; Adamse, Paulien; Punt, Ans; Asselt, van Esther D.

    2018-01-01

    Following legislation, European Member States should have multi-annual control programs for contaminants, such as mycotoxins, in feed and food. These programs need to be risk-based, implying that checks are regular and proportional to the estimated risk for animal and human health. This study

  11. Thermal analyses for the design of the ITER-NBI arc driven ion source

    International Nuclear Information System (INIS)

    Anaclerio, G.; Peruzzo, S.; Dal Bello, S.; Palma, M.D.; Nocentini, R.; Zaccaria, P.

    2006-01-01

    The design of the first ITER NB Injector and the ITER NB Test Facility is presently in progress in the framework of EFDA contracts with the contribution of several European Associations. One of the components currently studied by the Consorzio RFX team is the arc driven negative ion source, which is designed to produce a D⁻ beam of 40 A at 1 MeV for 3600 s pulses, generated in the ion source via a surface production process in a caesium-seeded arc discharge of 790 kW total power. This paper will focus in particular on the thermal analyses carried out in order to evaluate the thermal behaviour in nominal operating conditions of the main components of the ion source: the arc-chamber and the filament cassette assembly. The study is based on hydraulic, thermo-mechanical and thermo-electrical calculations performed by means of 2D and 3D finite element models, with inputs coming partly from the ITER reference design documentation and partly from the design review activities presently in progress. Moreover, a complete modelling of all the components of the beam source assembly by means of new 3D CAD models was carried out to demonstrate the feasibility of the proposed design. For the arc chamber, an assessment of the cooling circuit has been performed and hydraulic analyses have been carried out to calculate water flow rates and pressures inside the cooling channels. Thermo-mechanical analyses have been carried out considering several load cases and different water flow rates. The maximum and average temperatures of the arc chamber walls have been calculated to verify the operational conditions and the fulfilment of physics requirements for the negative ion generation. For the filament cassette assembly, an assessment of the effectiveness of the cooling system has been carried out considering two different design solutions: the first based on the reference design, with a dedicated active cooling system integrated in the filament cassette; the other based on a simplified
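
A first-cut check behind hydraulic sizing of this kind is the coolant energy balance ṁ = Q/(c_p·ΔT). The 790 kW arc power is taken from the abstract; the 20 K temperature rise is an assumed value, not a design figure from the paper.

```python
CP_WATER = 4186.0  # specific heat of water, J/(kg*K)

def required_mass_flow(power_w, delta_t_k):
    """Cooling-water mass flow (kg/s) to remove power_w with a delta_t_k rise."""
    return power_w / (CP_WATER * delta_t_k)

flow = required_mass_flow(790e3, delta_t_k=20.0)  # 790 kW arc power, assumed 20 K rise
print(f"{flow:.2f} kg/s")
```

The detailed finite-element analyses then distribute such a bulk flow over individual channels and load cases.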

  12. Quantifying and analysing food waste generated by Indonesian undergraduate students

    Science.gov (United States)

    Mandasari, P.

    2018-03-01

    Although the environmental consequences of food waste are widely known, the amount of food waste generated and its influencing factors have received relatively little attention. Addressing this gap, this paper aimed to quantify the monthly avoidable food waste generated by Indonesian undergraduate students and to analyse factors influencing the occurrence of avoidable food waste. Based on data from 106 undergraduate students, descriptive statistics and logistic regression were applied in this study. The results indicated that 4,987.5 g of food waste was generated in a month (equal to 59,850 g yearly), or 47.05 g per person monthly (equal to 564.62 g per person per year). Meanwhile, eating-out frequency and gender were found to be significant predictors of food waste occurrence.
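
A minimal version of the logistic-regression step, run here on made-up records rather than the 106 students' data, can be written as a plain gradient-ascent fit of a single predictor:

```python
from math import exp

def sigmoid(z):
    return 1.0 / (1.0 + exp(-z))

def fit_logistic(xs, ys, lr=0.5, epochs=2000):
    """Single-predictor logistic regression via stochastic gradient ascent."""
    b0 = b1 = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = sigmoid(b0 + b1 * x)
            b0 += lr * (y - p)
            b1 += lr * (y - p) * x
    return b0, b1

# Made-up data: weekly eating-out frequency vs. avoidable-waste occurrence (0/1).
freq = [0, 1, 1, 2, 3, 4, 5, 6]
waste = [0, 0, 0, 0, 1, 1, 1, 1]
b0, b1 = fit_logistic(freq, waste)
print(b1 > 0)  # positive slope: eating out more often predicts waste occurrence
```

In practice a multi-predictor fit (e.g. adding gender as a dummy variable) with a statistics package would also report significance tests, which this sketch omits.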

  13. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

    performance difficult. Likewise, a demonstration of the magnitude of conservatisms in the dose estimates that result from conservative inputs is difficult to determine. To respond to these issues, the DOE explored the significance of uncertainties and the magnitude of conservatisms in the SSPA Volumes 1 and 2 (BSC 2001 [DIRS 155950]; BSC 2001 [DIRS 154659]). The three main goals of this report are: (1) To briefly summarize and consolidate the discussion of much of the work that has been done over the past few years to evaluate, clarify, and improve the representation of uncertainties in the TSPA and performance projections for a potential repository. This report does not contain any new analyses of those uncertainties, but it summarizes in one place the main findings of that work. (2) To develop a strategy for how uncertainties may be handled in the TSPA and supporting analyses and models to support a License Application, should the site be recommended. It should be noted that the strategy outlined in this report is based on current information available to DOE. The strategy may be modified pending receipt of additional pertinent information, such as the Yucca Mountain Review Plan. (3) To discuss issues related to communication about uncertainties, and propose some approaches the DOE may use in the future to improve how it communicates uncertainty in its models and performance assessments to decision-makers and to technical audiences

  14. Variability Abstractions: Trading Precision for Speed in Family-Based Analyses

    DEFF Research Database (Denmark)

    Dimovski, Aleksandar; Brabrand, Claus; Wasowski, Andrzej

    2015-01-01

    Family-based (lifted) data-flow analysis for Software Product Lines (SPLs) is capable of analyzing all valid products (variants) without generating any of them explicitly. It takes as input only the common code base, which encodes all variants of a SPL, and produces analysis results corresponding...

  15. Handbook of methods for risk-based analyses of technical specifications

    International Nuclear Information System (INIS)

    Samanta, P.K.; Kim, I.S.; Mankamo, T.; Vesely, W.E.

    1994-12-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements are based on deterministic analysis and engineering judgments. Experiences with plant operation indicate that some elements of the requirements are unnecessarily restrictive, while a few may not be conducive to safety. The US Nuclear Regulatory Commission (USNRC) Office of Research has sponsored research to develop systematic risk-based methods to improve various aspects of TS requirements. This handbook summarizes these risk-based methods. The scope of the handbook includes reliability and risk-based methods for evaluating allowed outage times (AOTs), scheduled or preventive maintenance, action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), and management of plant configurations resulting from outages of systems, or components. For each topic, the handbook summarizes analytic methods with data needs, outlines the insights to be gained, lists additional references, and gives examples of evaluations
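
One simple quantity behind AOT evaluations of this kind is the incremental core-damage probability accrued while a component is out of service. The frequencies below are invented, and the formula is a generic screening calculation rather than the handbook's specific method.

```python
BASELINE_CDF = 5e-5  # assumed core-damage frequency per year, all equipment available
DOWNED_CDF = 2e-4    # assumed conditional frequency with one component out of service

def incremental_risk(aot_hours):
    """Extra core-damage probability accumulated over one outage of aot_hours."""
    return (DOWNED_CDF - BASELINE_CDF) * aot_hours / 8760.0

print(f"{incremental_risk(72):.2e}")  # risk contribution of a 72-hour AOT
```

Comparing such contributions across candidate AOTs (and against the baseline risk) is the kind of trade-off the reliability- and risk-based methods support.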

  16. Handbook of methods for risk-based analyses of technical specifications

    Energy Technology Data Exchange (ETDEWEB)

    Samanta, P.K.; Kim, I.S. [Brookhaven National Lab., Upton, NY (United States); Mankamo, T. [Avaplan Oy, Espoo (Finland); Vesely, W.E. [Science Applications International Corp., Dublin, OH (United States)

    1994-12-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements are based on deterministic analysis and engineering judgments. Experiences with plant operation indicate that some elements of the requirements are unnecessarily restrictive, while a few may not be conducive to safety. The US Nuclear Regulatory Commission (USNRC) Office of Research has sponsored research to develop systematic risk-based methods to improve various aspects of TS requirements. This handbook summarizes these risk-based methods. The scope of the handbook includes reliability and risk-based methods for evaluating allowed outage times (AOTs), scheduled or preventive maintenance, action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), and management of plant configurations resulting from outages of systems, or components. For each topic, the handbook summarizes analytic methods with data needs, outlines the insights to be gained, lists additional references, and gives examples of evaluations.

  17. Analyses of containment structures with corrosion damage

    International Nuclear Information System (INIS)

    Cherry, J.L.

    1997-01-01

    Corrosion damage that has been found in a number of nuclear power plant containment structures can degrade the pressure capacity of the vessel. This has prompted concerns regarding the capacity of corroded containments to withstand accident loadings. To address these concerns, finite element analyses have been performed for a typical PWR Ice Condenser containment structure. Using ABAQUS, the pressure capacity was calculated for a typical vessel with no corrosion damage. Multiple analyses were then performed with the location of the corrosion and the amount of corrosion varied in each analysis. Using a strain-based failure criterion, a "lower bound", "best estimate", and "upper bound" failure level was predicted for each case. These limits were established by: determining the amount of variability that exists in material properties of typical containments, estimating the amount of uncertainty associated with the level of modeling detail and modeling assumptions, and estimating the effect of corrosion on the material properties.

  18. Exergetic and thermoeconomic analyses of power plants

    International Nuclear Information System (INIS)

    Kwak, H.-Y.; Kim, D.-J.; Jeon, J.-S.

    2003-01-01

    Exergetic and thermoeconomic analyses were performed for a 500-MW combined cycle plant. In these analyses, mass and energy conservation laws were applied to each component of the system. Quantitative balances of the exergy and exergetic cost for each component, and for the whole system, were carefully considered. The exergoeconomic model, which represented the productive structure of the system considered, was used to visualize the cost formation process and the productive interaction between components. The computer program developed in this study can determine the production costs of power plants, such as gas- and steam-turbine plants and gas-turbine cogeneration plants. The program can also be used to study plant characteristics, namely, thermodynamic performance and sensitivity to changes in process and/or component design variables.
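The component-level balances described above pair an exergy balance (fuel exergy in minus product exergy out equals exergy destruction) with a cost balance (product cost rate equals fuel cost rate plus capital and operating charges). A minimal sketch of both for a single component; the figures and names are hypothetical, not from the cited study:

```python
def exergy_destruction(exergy_fuel_kw, exergy_product_kw):
    """Exergy balance for one component: E_D = E_F - E_P (kW)."""
    return exergy_fuel_kw - exergy_product_kw

def unit_product_cost(c_fuel_per_kwh, exergy_fuel_kw, z_dot_per_h, exergy_product_kw):
    """Cost balance c_P * E_P = c_F * E_F + Z_dot, solved for the unit
    exergetic cost of the product ($/kWh). z_dot_per_h lumps capital
    and operating charges ($/h)."""
    return (c_fuel_per_kwh * exergy_fuel_kw + z_dot_per_h) / exergy_product_kw

# Example: a combustor-like component with hypothetical figures.
e_d = exergy_destruction(100_000.0, 70_000.0)              # kW destroyed
c_p = unit_product_cost(0.04, 100_000.0, 200.0, 70_000.0)  # $/kWh of product
```

Chaining such balances component by component, from fuel input to net power output, is what lets the cost formation process be traced through the plant's productive structure.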

  19. IDEA: Interactive Display for Evolutionary Analyses.

    Science.gov (United States)

    Egan, Amy; Mahurkar, Anup; Crabtree, Jonathan; Badger, Jonathan H; Carlton, Jane M; Silva, Joana C

    2008-12-08

    The availability of complete genomic sequences for hundreds of organisms promises to make obtaining genome-wide estimates of substitution rates, selective constraints and other molecular evolution variables of interest an increasingly important approach to addressing broad evolutionary questions. Two of the programs most widely used for this purpose are codeml and baseml, parts of the PAML (Phylogenetic Analysis by Maximum Likelihood) suite. A significant drawback of these programs is their lack of a graphical user interface, which can limit their user base and considerably reduce their efficiency. We have developed IDEA (Interactive Display for Evolutionary Analyses), an intuitive graphical input and output interface which interacts with PHYLIP for phylogeny reconstruction and with codeml and baseml for molecular evolution analyses. IDEA's graphical input and visualization interfaces eliminate the need to edit and parse text input and output files, reducing the likelihood of errors and improving processing time. Further, its interactive output display gives the user immediate access to results. Finally, IDEA can process data in parallel on a local machine or computing grid, allowing genome-wide analyses to be completed quickly. IDEA provides a graphical user interface that allows the user to follow a codeml or baseml analysis from parameter input through to the exploration of results. Novel options streamline the analysis process, and post-analysis visualization of phylogenies, evolutionary rates and selective constraint along protein sequences simplifies the interpretation of results. The integration of these functions into a single tool eliminates the need for lengthy data handling and parsing, significantly expediting access to global patterns in the data.
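Part of what a front end like IDEA automates is the preparation of PAML control files, which users would otherwise write and edit by hand for every analysis. A minimal sketch of rendering a codeml control file as text; the option subset and defaults shown are illustrative, and the PAML documentation should be consulted for the full set:

```python
def codeml_ctl(seqfile, treefile, outfile, model=0, ns_sites=0):
    """Render a minimal codeml control file as text."""
    options = [
        ("seqfile", seqfile),    # codon alignment
        ("treefile", treefile),  # tree in Newick format
        ("outfile", outfile),    # main results file
        ("noisy", 0),
        ("verbose", 0),
        ("runmode", 0),          # use the tree provided
        ("seqtype", 1),          # codon sequences
        ("model", model),        # branch model for omega
        ("NSsites", ns_sites),   # site-class model
    ]
    return "".join(f"{key} = {value}\n" for key, value in options)

ctl_text = codeml_ctl("aln.phy", "tree.nwk", "mlc.out")
```

Generating one such file per gene and launching the runs concurrently is the kind of bookkeeping that, done by hand, dominates the cost of genome-wide analyses.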

  20. Shock Transmission Analyses of a Simplified Frigate Compartment Using LS-DYNA

    National Research Council Canada - National Science Library

    Trouwborst, W

    1999-01-01

    This report gives results as obtained with finite element analyses using the explicit finite element program LS-DYNA for a longitudinal slice of a frigate's compartment loaded with a shock pulse based...