WorldWideScience

Sample records for analyser based spectromicroscope

  1. Spectromicroscope for the PHotoelectron Imaging of Nanostructures with X-rays (SPHINX): performance in biology, medicine and geology

    Energy Technology Data Exchange (ETDEWEB)

    Frazer, B.H.; Girasole, Marco; Wiese, L.M.; Franz, Torsten; De Stasio, G.

    2004-05-15

    Several X-ray PhotoElectron Emission spectroMicroscopes (X-PEEMs) exist around the world at this time. We present recent performance and resolution tests of one of them, the Spectromicroscope for PHotoelectron Imaging of Nanostructures with X-rays (SPHINX) X-PEEM, installed at the University of Wisconsin Synchrotron Radiation Center. With this state-of-the-art instrument we demonstrate chemical analysis capabilities on conducting and insulating specimens of diverse interest, and an unprecedented lateral resolution of 10 nm with monochromatic X-rays and 7.2 nm with ultraviolet illumination.

  2. Unsupervised Data Mining in nanoscale X-ray Spectro-Microscopic Study of NdFeB Magnet.

    Science.gov (United States)

    Duan, Xiaoyue; Yang, Feifei; Antono, Erin; Yang, Wenge; Pianetta, Piero; Ermon, Stefano; Mehta, Apurva; Liu, Yijin

    2016-09-29

    Novel developments in X-ray based spectro-microscopic characterization techniques have increased the rate of acquisition of spatially resolved spectroscopic data by several orders of magnitude over what was possible a few years ago. This accelerated data acquisition, with high spatial resolution at the nanoscale and sensitivity to subtle differences in chemistry and atomic structure, provides a unique opportunity to investigate hierarchically complex and structurally heterogeneous systems found in functional devices and materials systems. However, handling and analyzing the large data volumes generated poses significant challenges. Here we apply an unsupervised data-mining algorithm known as DBSCAN to study a rare-earth-based permanent magnet material, Nd2Fe14B. We are able to reduce a large spectro-microscopic dataset of over 300,000 spectra to three characteristic spectra while preserving much of the underlying information, which scientists can then easily and quickly analyze in detail. Our approach can rapidly provide a concise representation of a large and complex dataset to materials scientists and chemists. For example, it shows that the surface of a common Nd2Fe14B magnet is chemically and structurally very different from the bulk, suggesting a surface alteration effect, possibly due to corrosion, which could affect the material's overall properties.
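The data-reduction idea in this record — cluster a huge stack of spectra with DBSCAN, then summarize each cluster by its mean spectrum — can be sketched with a minimal pure-Python implementation. The toy "spectra", eps, and min_pts below are invented for illustration; the authors' actual pipeline and parameters are not given here, and a real analysis would use an optimized library implementation over hundreds of thousands of spectra.

```python
# Toy sketch: cluster spectra with DBSCAN, then represent each cluster
# by its mean spectrum. Data and parameters are invented illustrations.
from math import dist

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns a cluster label per point (-1 = noise)."""
    labels = [None] * len(points)
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        neighbors = [j for j in range(len(points)) if dist(points[i], points[j]) <= eps]
        if len(neighbors) < min_pts:
            labels[i] = -1           # noise (may be claimed by a cluster later)
            continue
        cluster += 1
        labels[i] = cluster
        seeds = [j for j in neighbors if j != i]
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster  # border point: joins the cluster, not expanded
            if labels[j] is not None:
                continue
            labels[j] = cluster
            jn = [k for k in range(len(points)) if dist(points[j], points[k]) <= eps]
            if len(jn) >= min_pts:   # core point: expand the cluster
                seeds.extend(jn)
    return labels

def mean_spectrum(spectra, labels, cluster):
    members = [s for s, l in zip(spectra, labels) if l == cluster]
    return [sum(vals) / len(members) for vals in zip(*members)]

# Two synthetic "chemical phases": spectra near (1, 0, 2) and near (4, 5, 1).
spectra = [(1.0, 0.1, 2.0), (1.1, 0.0, 2.1), (0.9, 0.2, 1.9),
           (4.0, 5.0, 1.0), (4.1, 5.1, 0.9), (3.9, 4.9, 1.1)]
labels = dbscan(spectra, eps=0.5, min_pts=2)
representatives = {c: mean_spectrum(spectra, labels, c) for c in set(labels) if c != -1}
```

Six input spectra collapse to two representative mean spectra, one per phase — the same kind of reduction the abstract describes from 300,000 spectra down to three.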

  3. Unsupervised Data Mining in nanoscale X-ray Spectro-Microscopic Study of NdFeB Magnet

    Science.gov (United States)

    Duan, Xiaoyue; Yang, Feifei; Antono, Erin; Yang, Wenge; Pianetta, Piero; Ermon, Stefano; Mehta, Apurva; Liu, Yijin

    2016-09-01

    Novel developments in X-ray based spectro-microscopic characterization techniques have increased the rate of acquisition of spatially resolved spectroscopic data by several orders of magnitude over what was possible a few years ago. This accelerated data acquisition, with high spatial resolution at the nanoscale and sensitivity to subtle differences in chemistry and atomic structure, provides a unique opportunity to investigate hierarchically complex and structurally heterogeneous systems found in functional devices and materials systems. However, handling and analyzing the large data volumes generated poses significant challenges. Here we apply an unsupervised data-mining algorithm known as DBSCAN to study a rare-earth-based permanent magnet material, Nd2Fe14B. We are able to reduce a large spectro-microscopic dataset of over 300,000 spectra to three characteristic spectra while preserving much of the underlying information, which scientists can then easily and quickly analyze in detail. Our approach can rapidly provide a concise representation of a large and complex dataset to materials scientists and chemists. For example, it shows that the surface of a common Nd2Fe14B magnet is chemically and structurally very different from the bulk, suggesting a surface alteration effect, possibly due to corrosion, which could affect the material's overall properties.

  4. Energy-filtered real- and k-space secondary and energy-loss electron imaging with Dual Emission Electron spectro-Microscope: Cs/Mo(110)

    Energy Technology Data Exchange (ETDEWEB)

    Grzelakowski, Krzysztof P., E-mail: k.grzelakowski@opticon-nanotechnology.com

    2016-05-15

    Since its introduction, the importance of complementary k∥-space (LEED) and real-space (LEEM) information in the investigation of surface science phenomena has been widely demonstrated over the last five decades. In this paper we report the application of a novel kind of electron spectromicroscope, the Dual Emission Electron spectroMicroscope (DEEM), with two independent electron-optical channels for quasi-simultaneous reciprocal- and real-space imaging, to the investigation of a Cs-covered Mo(110) single crystal using an 800 eV electron beam from an "in-lens" electron gun system developed for sample illumination. With the DEEM spectromicroscope it is possible to observe dynamic, irreversible processes at surfaces in energy-filtered real space and in the corresponding energy-filtered k∥-space quasi-simultaneously in two independent imaging columns. The novel concept of high-energy electron-beam sample illumination in cathode-lens-based microscopes allows chemically selective imaging and analysis under laboratory conditions. - Highlights: • A novel concept of electron sample illumination with an "in-lens" e-gun is realized. • Quasi-simultaneous energy-selective observation of real and k-space in EELS mode. • Observation of energy-filtered Auger electron diffraction at Cs atoms on Mo(110). • Energy-loss, Auger and secondary electron momentum microscopy is realized.

  5. Applications of high lateral and energy resolution imaging XPS with a double hemispherical analyser based spectromicroscope

    International Nuclear Information System (INIS)

    Escher, M.; Winkler, K.; Renault, O.; Barrett, N.

    2010-01-01

    The design and applications of an instrument for imaging X-ray photoelectron spectroscopy (XPS) are reviewed. The instrument is based on a photoelectron microscope and a double hemispherical analyser whose symmetric configuration avoids the spherical aberration (α²-term) inherent to standard analysers. The analyser allows high-transmission imaging without sacrificing the lateral and energy resolution of the instrument. The importance of high transmission, especially for highest-resolution imaging XPS with monochromated laboratory X-ray sources, is outlined, and the close interrelation of energy resolution, lateral resolution and analyser transmission is illustrated. Chemical imaging applications using a monochromatic laboratory Al Kα source are shown, with a lateral resolution of 610 nm. Examples of measurements made using synchrotron and laboratory ultraviolet light show the broad field of applications, from imaging of core-level electrons with chemical-shift identification to high-resolution threshold photoelectron emission microscopy (PEEM), work-function imaging and band-structure imaging.

  6. Instrumentation for BESSY II and temperature programmed desorption of CO, NO and water of the (100)-single crystal cleavage planes of the metal oxides NiO and MgO

    OpenAIRE

    Wichtendahl, Ralph

    2010-01-01

    For the spectromicroscope SMART, an independent preparation chamber and a vibration isolation system accurate to 2 mm despite lateral forces have been set up. In line with the SMART at BESSY, a spectrometer system has been built. It contains a high-resolution energy analyser, a partial-yield detector and a fluorescence detector, sample preparation tools and helium cooling. XPS investigations of tantalum deposits on Al2O3/NiAl(110) thin films revealed that submonolayer deposits show ...

  7. Real-time Bacterial Detection by Single Cell Based Sensors Using Synchrotron FTIR Spectromicroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Veiseh, Mandana; Veiseh, Omid; Martin, Michael C.; Bertozzi, Carolyn; Zhang, Miqin

    2005-08-10

    Microarrays of single macrophage cell based sensors were developed and demonstrated for real-time bacterium detection by synchrotron FTIR microscopy. The cells were patterned on gold-SiO2 substrates via a surface engineering technique by which the gold electrodes were immobilized with fibronectin to mediate cell adhesion and the silicon oxide background was passivated with PEG to resist protein adsorption and cell adhesion. Cellular morphology and IR spectra of single, double, and triple cells on gold electrodes exposed to lipopolysaccharide (LPS) of different concentrations were compared to reveal the detection capabilities of these biosensors. The single-cell based sensors were found to generate the most significant IR wavenumber variation and thus provide the highest detection sensitivity. Changes in morphology and IR spectrum for single cells exposed to LPS were found to be time- and concentration-dependent and correlated with each other very well. FTIR spectra from single cell arrays of gold electrodes with surface areas of 25 μm², 100 μm², and 400 μm² were acquired using both synchrotron and conventional FTIR spectromicroscopes to study the sensitivity of detection. The results indicated that the developed single-cell platform can be used with conventional FTIR spectromicroscopy. This technique provides real-time, label-free, and rapid bacterial detection, and may allow for statistical and high-throughput analyses, and portability.

  8. Pathway-based analyses.

    Science.gov (United States)

    Kent, Jack W

    2016-02-03

    New technologies for acquisition of genomic data, while offering unprecedented opportunities for genetic discovery, also impose severe burdens of interpretation and penalties for multiple testing. The Pathway-based Analyses Group of the Genetic Analysis Workshop 19 (GAW19) sought reduction of the multiple-testing burden through various approaches to aggregation of high-dimensional data in pathways informed by prior biological knowledge. Experimental methods tested included the use of "synthetic pathways" (random sets of genes) to estimate the power and false-positive error rate of methods applied to simulated data; data reduction via independent components analysis, single-nucleotide polymorphism (SNP)-SNP interaction, and use of gene sets to estimate genetic similarity; and general assessment of the efficacy of prior biological knowledge to reduce the dimensionality of complex genomic data. The work of this group explored several promising approaches to managing high-dimensional data, with the caveat that these methods are necessarily constrained by the quality of external bioinformatic annotation.

  9. GIS-based Approaches to Catchment Area Analyses of Mass Transit

    DEFF Research Database (Denmark)

    Andersen, Jonas Lohmann Elkjær; Landex, Alex

    2009-01-01

    Catchment area analyses of stops or stations are used to investigate the potential number of travelers to public transportation. These analyses are considered a strong decision tool in the planning process of mass transit, especially railroads. Catchment area analyses are GIS-based buffer and overlay analyses with different approaches depending on the desired level of detail. A simple but straightforward approach to implement is the Circular Buffer Approach, where catchment areas are circular. A more detailed approach is the Service Area Approach, where catchment areas are determined by a street network search to simulate the actual walking distances. A refinement of the Service Area Approach is to implement additional time resistance in the network search to simulate obstacles in the walking environment. This paper reviews and compares the different GIS-based catchment area approaches, their level...
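The Circular Buffer Approach described in this record can be sketched in a few lines: the catchment of a stop is everything within a straight-line radius. The station coordinates, address points, and 600 m radius below are invented for illustration; a real GIS analysis would use projected coordinates, polygon buffers, and overlay operations rather than point-in-circle tests.

```python
# Minimal sketch of the Circular Buffer Approach: an address belongs to a
# stop's catchment if its straight-line distance to the stop is <= radius.
from math import hypot

def circular_catchment(stop, addresses, radius_m):
    """Return the addresses within straight-line radius_m of the stop."""
    return [a for a in addresses if hypot(a[0] - stop[0], a[1] - stop[1]) <= radius_m]

station = (0.0, 0.0)  # hypothetical stop, metres in a local planar grid
addresses = [(100, 50), (400, 300), (700, 100), (0, 590), (900, 900)]
inside = circular_catchment(station, addresses, radius_m=600)
# (700, 100) lies about 707 m away and (900, 900) about 1273 m, so both fall outside
```

The Service Area Approach mentioned above would replace the `hypot` distance with a shortest-path search over a street network, which shrinks the catchment wherever the walking route is longer than the straight line.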

  10. Analyser-based phase contrast image reconstruction using geometrical optics.

    Science.gov (United States)

    Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A

    2007-07-21

    Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 μm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser.

  11. Meta-Analyses of Human Cell-Based Cardiac Regeneration Therapies

    DEFF Research Database (Denmark)

    Gyöngyösi, Mariann; Wojakowski, Wojciech; Navarese, Eliano P

    2016-01-01

    In contrast to multiple publication-based meta-analyses involving clinical cardiac regeneration therapy in patients with recent myocardial infarction, a recently published meta-analysis based on individual patient data reported no effect of cell therapy on left ventricular function or clinical...

  12. Analyser-based phase contrast image reconstruction using geometrical optics

    International Nuclear Information System (INIS)

    Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A

    2007-01-01

    Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 μm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser.
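The symmetric Pearson type VII profile used in this record to fit the analyser rocking curve can be written down directly. The parameterization below (peak intensity i0, centre x0, half-width-at-half-maximum w, shape exponent m) is one common convention, assumed here for illustration; the paper does not specify which form the authors used.

```python
# Symmetric Pearson VII profile: m = 1 gives a Lorentzian, and as m grows
# the profile approaches a Gaussian. The (2**(1/m) - 1) factor makes w the
# half-width at half maximum for every m (an assumed, common convention).
def pearson_vii(x, i0, x0, w, m):
    return i0 * (1.0 + ((x - x0) / w) ** 2 * (2.0 ** (1.0 / m) - 1.0)) ** (-m)

# The profile peaks at x0 and falls to half maximum at x0 +/- w:
peak = pearson_vii(5.0, i0=2.0, x0=5.0, w=0.3, m=1.7)
half = pearson_vii(5.3, i0=2.0, x0=5.0, w=0.3, m=1.7)
```

Fitting this function to a measured rocking curve (e.g. by nonlinear least squares) yields the smooth analytic curve from which the two-image phase retrieval described above proceeds.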

  13. A bromine-based dichroic X-ray polarization analyser

    CERN Document Server

    Collins, S P; Brown, S D; Thompson, P

    2001-01-01

    We have demonstrated the advantages offered by dichroic X-ray polarization filters for linear polarization analysis, and describe such a device, based on a dibromoalkane/urea inclusion compound. The polarizer has been successfully tested by analysing the polarization of magnetic diffraction from holmium.

  14. Model-based Recursive Partitioning for Subgroup Analyses

    OpenAIRE

    Seibold, Heidi; Zeileis, Achim; Hothorn, Torsten

    2016-01-01

    The identification of patient subgroups with differential treatment effects is the first step towards individualised treatments. A current draft guideline by the EMA discusses potentials and problems in subgroup analyses and formulates challenges for the development of appropriate statistical procedures for the data-driven identification of patient subgroups. We introduce model-based recursive partitioning as a procedure for the automated detection of patient subgroups that are identifiable by...

  15. In service monitoring based on fatigue analyses, possibilities and limitations

    International Nuclear Information System (INIS)

    Dittmar, S.; Binder, F.

    2004-01-01

    German LWR reactors are equipped with monitoring systems which are to enable a comparison of real transients with load case catalogues and fatigue catalogues for fatigue analyses. The information accuracy depends on the accuracy of measurements, on the consideration of parameters influencing fatigue (medium, component surface, component size, etc.), and on the accuracy of the load analyses. The contribution attempts a critical evaluation, also in view of the fact that real fatigue damage is often impossible to quantify on the basis of fatigue analyses at a later stage. The effects of the consideration or non-consideration of various influencing factors are discussed, as well as the consequences of the scatter of the material characteristics on which the analyses are based. Possible measures to be taken in operational monitoring are derived. (orig.) [de]

  16. Coalescent-based genome analyses resolve the early branches of the Euarchontoglires.

    Directory of Open Access Journals (Sweden)

    Vikas Kumar

    Full Text Available Despite numerous large-scale phylogenomic studies, certain parts of the mammalian tree are extraordinarily difficult to resolve. We used the coding regions from 19 completely sequenced genomes to study the relationships within the super-clade Euarchontoglires (Primates, Rodentia, Lagomorpha, Dermoptera and Scandentia), because the placement of Scandentia within this clade is controversial. The difficulty in resolving this issue is due to the short time spans between the early divergences of Euarchontoglires, which may cause incongruent gene trees. The conflict in the data can be depicted by network analyses, and the contentious relationships are best reconstructed by coalescent-based analyses. This method is expected to be superior to analyses of concatenated data in reconstructing a species tree from numerous gene trees. The total concatenated dataset used to study the relationships in this group comprises 5,875 protein-coding genes (9,799,170 nucleotides) from all orders except Dermoptera (flying lemurs). Reconstruction of the species tree from 1,006 gene trees using coalescent models placed Scandentia as sister group to the primates, which is in agreement with maximum likelihood analyses of concatenated nucleotide sequence data. Additionally, both analytical approaches favoured the Tarsier as sister taxon to Anthropoidea, thus belonging to the Haplorrhine clade. When divergence times are short, such as in radiations over periods of a few million years, even genome-scale analyses struggle to resolve phylogenetic relationships. On these short branches, processes such as incomplete lineage sorting and possibly hybridization occur and make it preferable to base phylogenomic analyses on coalescent methods.

  17. PCA-based algorithm for calibration of spectrophotometric analysers of food

    International Nuclear Information System (INIS)

    Morawski, Roman Z; Miekina, Andrzej

    2013-01-01

    Spectrophotometric analysers of food, being instruments for determining the composition of food products and ingredients, are today of growing importance for the food industry, as well as for food distributors and consumers. Their metrological performance significantly depends on the numerical performance of the available means for spectrophotometric data processing, in particular the means for calibration of the analysers. In this paper, a new algorithm for this purpose is proposed, viz. an algorithm using principal component analysis (PCA). It is almost as efficient as PLS-based calibration algorithms, but much simpler.
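The general shape of a PCA-based calibration like the one this record proposes can be sketched as follows: project the mean-centred training spectra onto their leading principal component(s), then regress the known analyte concentrations on the resulting scores. The synthetic three-wavelength "spectra" and single-component model below are invented for illustration and do not reproduce the paper's algorithm; a real calibration would use many wavelengths and several components.

```python
# Toy PCA calibration: leading principal component via power iteration,
# then ordinary least squares from the PCA score to the concentration.
def pca_first_component(rows, iters=200):
    """Leading principal component of mean-centred rows, via power iteration."""
    n, d = len(rows), len(rows[0])
    mean = [sum(r[j] for r in rows) / n for j in range(d)]
    x = [[r[j] - mean[j] for j in range(d)] for r in rows]
    v = [1.0] * d
    for _ in range(iters):
        # multiply v by X^T X, then normalise
        s = [sum(x[i][j] * v[j] for j in range(d)) for i in range(n)]
        v = [sum(x[i][j] * s[i] for i in range(n)) for j in range(d)]
        norm = sum(c * c for c in v) ** 0.5
        v = [c / norm for c in v]
    scores = [sum((rows[i][j] - mean[j]) * v[j] for j in range(d)) for i in range(n)]
    return mean, v, scores

# Synthetic "absorbance spectra": each is concentration * spectral signature.
signature = [0.2, 1.0, 0.5]
concs = [1.0, 2.0, 3.0, 4.0]
spectra = [[c * s for s in signature] for c in concs]

mean, comp, scores = pca_first_component(spectra)
# Calibrate: least-squares slope/intercept mapping PCA score -> concentration.
n = len(concs)
sx, sy = sum(scores) / n, sum(concs) / n
slope = sum((scores[i] - sx) * (concs[i] - sy) for i in range(n)) / sum((s - sx) ** 2 for s in scores)
intercept = sy - slope * sx
predicted = [slope * s + intercept for s in scores]
```

Because the toy spectra vary along a single direction, the one-component model recovers the concentrations essentially exactly; PLS would instead choose components that maximize covariance with the concentrations, which is why PCA calibration is simpler but can need more components on real data.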

  18. Audio-visual perception of 3D cinematography: an fMRI study using condition-based and computation-based analyses.

    Directory of Open Access Journals (Sweden)

    Akitoshi Ogawa

    Full Text Available The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard "condition-based" designs, as well as "computational" methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-sources signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli.

  19. Audio-visual perception of 3D cinematography: an fMRI study using condition-based and computation-based analyses.

    Science.gov (United States)

    Ogawa, Akitoshi; Bordier, Cecile; Macaluso, Emiliano

    2013-01-01

    The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard "condition-based" designs, as well as "computational" methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-sources signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli.

  20. Advanced exergy-based analyses applied to a system including LNG regasification and electricity generation

    Energy Technology Data Exchange (ETDEWEB)

    Morosuk, Tatiana; Tsatsaronis, George; Boyano, Alicia; Gantiva, Camilo [Technische Univ. Berlin (Germany)

    2012-07-01

    Liquefied natural gas (LNG) will contribute more in the future than in the past to the overall energy supply in the world. The paper discusses the application of advanced exergy-based analyses to a recently developed LNG-based cogeneration system. These analyses include advanced exergetic, advanced exergoeconomic, and advanced exergoenvironmental analyses in which thermodynamic inefficiencies (exergy destruction), costs, and environmental impacts have been split into avoidable and unavoidable parts. With the aid of these analyses, the potentials for improving the thermodynamic efficiency and for reducing the overall cost and the overall environmental impact are revealed. The objectives of this paper are to demonstrate (a) the potential for generating electricity while regasifying LNG and (b) some of the capabilities associated with advanced exergy-based methods. The most important subsystems and components are identified, and suggestions for improving them are made. (orig.)
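The core bookkeeping of the advanced exergy-based analyses described in this record — splitting each component's exergy destruction into unavoidable and avoidable parts, the latter being the improvement potential — can be illustrated with a trivial calculation. The component names and numbers below are invented; the paper's actual system, components, and values are not reproduced here.

```python
# Toy sketch of the avoidable/unavoidable split: the unavoidable part is the
# exergy destruction that would remain even with the best achievable
# technology; whatever exceeds it is avoidable and marks improvement potential.
def split_exergy_destruction(e_d_total, e_d_unavoidable):
    """Return (unavoidable, avoidable) parts of exergy destruction, in kW."""
    avoidable = max(e_d_total - e_d_unavoidable, 0.0)
    return e_d_unavoidable, avoidable

components = {                      # hypothetical values, kW: (total, unavoidable)
    "compressor": (120.0, 80.0),
    "heat_exchanger": (200.0, 150.0),
}
report = {name: split_exergy_destruction(t, u) for name, (t, u) in components.items()}
potential = sum(avoidable for _, avoidable in report.values())  # total avoidable part
```

The same split is applied in the paper to costs (exergoeconomic) and environmental impacts (exergoenvironmental), so that design effort is directed at the components with the largest avoidable shares.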

  1. In vivo biodegradation of colloidal quantum dots by a freshwater invertebrate, Daphnia magna

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Dongwook; Kim, Min Jung; Park, Chansik; Park, Jaehong [Department of Chemistry, Hanyang University, Seoul 133-791 (Korea, Republic of); Choi, Kyungho [Department of Environmental Health, Seoul National University, Seoul 151-742 (Korea, Republic of); Yoon, Tae Hyun, E-mail: thyoon@gmail.com [Department of Chemistry, Hanyang University, Seoul 133-791 (Korea, Republic of)

    2012-06-15

    Impacts of a planktonic invertebrate, Daphnia magna, on the speciation of colloidal quantum dots (QD) were investigated using a fluorescence spectromicroscopic technique. Well-dispersed GA/TOPO-QD were prepared by forming a supramolecular assembly of hydrophobic TOPO-QD with biomacromolecules (i.e., Gum Arabic, GA). Biological degradation of this nanomaterial was monitored by fluorescence spectromicroscopic methods. Our study confirmed the major uptake pathway of manufactured nanomaterials and in vivo biodegradation processes in a well-known toxicity test organism, D. magna. In addition, we also found that D. magna can induce significant deterioration of aquatic media by releasing fragments of partially degraded QD colloids. These biological processes may significantly change the predicted toxicities of nanomaterials in aquatic environments. Thus, we propose that the impacts of aquatic living organisms on the environmental fate of manufactured nanomaterials (MNs) should be carefully taken into account when assessing the risk of MNs to the environment and human health.

  2. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to the automatic incrementalization of analyses that are specified as tabled logic prog...

  3. Quantitative metagenomic analyses based on average genome size normalization

    DEFF Research Database (Denmark)

    Frank, Jeremy Alexander; Sørensen, Søren Johannes

    2011-01-01

    provide not just a census of the community members but direct information on metabolic capabilities and potential interactions among community members. Here we introduce a method for the quantitative characterization and comparison of microbial communities based on the normalization of metagenomic data...... marine sources using both conventional small-subunit (SSU) rRNA gene analyses and our quantitative method to calculate the proportion of genomes in each sample that are capable of a particular metabolic trait. With both environments, to determine what proportion of each community they make up and how......). These analyses demonstrate how genome proportionality compares to SSU rRNA gene relative abundance and how factors such as average genome size and SSU rRNA gene copy number affect sampling probability and therefore both types of community analysis....

  4. A Server-Client-Based Graphical Development Environment for Physics Analyses (VISPA)

    International Nuclear Information System (INIS)

    Bretz, H-P; Erdmann, M; Fischer, R; Hinzmann, A; Klingebiel, D; Komm, M; Müller, G; Rieger, M; Steffens, J; Steggemann, J; Urban, M; Winchen, T

    2012-01-01

    The Visual Physics Analysis (VISPA) project provides a graphical development environment for data analysis. It addresses the typical development cycle of (re-)designing, executing, and verifying an analysis. We present the new server-client-based web application of the VISPA project to perform physics analyses via a standard internet browser. This enables individual scientists to work with a large variety of devices including touch screens, and teams of scientists to share, develop, and execute analyses on a server via the web interface.

  5. Automatic image-based analyses using a coupled quadtree-SBFEM/SCM approach

    Science.gov (United States)

    Gravenkamp, Hauke; Duczek, Sascha

    2017-10-01

    Quadtree-based domain decomposition algorithms offer an efficient option to create meshes for automatic image-based analyses. Without introducing hanging nodes the scaled boundary finite element method (SBFEM) can directly operate on such meshes by only discretizing the edges of each subdomain. However, the convergence of a numerical method that relies on a quadtree-based geometry approximation is often suboptimal due to the inaccurate representation of the boundary. To overcome this problem a combination of the SBFEM with the spectral cell method (SCM) is proposed. The basic idea is to treat each uncut quadtree cell as an SBFEM polygon, while all cut quadtree cells are computed employing the SCM. This methodology not only reduces the required number of degrees of freedom but also avoids a two-dimensional quadrature in all uncut quadtree cells. Numerical examples including static, harmonic, modal and transient analyses of complex geometries are studied, highlighting the performance of this novel approach.
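The quadtree decomposition underlying the coupled approach in this record can be sketched in a few lines: cells are recursively split until each leaf covers a single material, after which homogeneous ("uncut") cells would be treated as SBFEM polygons and cells containing a material boundary ("cut" cells) would go to the spectral cell method. The 4×4 pixel image below is an invented illustration, far smaller than any real image-based mesh.

```python
# Toy quadtree decomposition of a pixel image: split a square cell until
# every leaf is homogeneous (one material value) or a single pixel.
def quadtree(image, x, y, size):
    """Return leaf cells (x, y, size, material) covering the square region."""
    values = {image[y + j][x + i] for j in range(size) for i in range(size)}
    if len(values) == 1 or size == 1:
        return [(x, y, size, values.pop())]
    half = size // 2
    leaves = []
    for dx, dy in ((0, 0), (half, 0), (0, half), (half, half)):
        leaves += quadtree(image, x + dx, y + dy, half)
    return leaves

# 0 = matrix material, 1 = inclusion occupying the lower-right quadrant.
image = [
    [0, 0, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
leaves = quadtree(image, 0, 0, 4)
# One split suffices here: four homogeneous quadrants, three matrix and one inclusion.
```

In the coupled scheme described above, each homogeneous leaf becomes an SBFEM subdomain (only its edges are discretized, so hanging nodes between different-sized neighbours are harmless), while leaves that would otherwise have to be split down to single pixels along a curved boundary are handed to the SCM instead.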

  6. Inversion domain boundaries in GaN studied by X-ray microprobe

    Energy Technology Data Exchange (ETDEWEB)

    Martinez-Criado, Gema; Tucoulou, Remi; Cloetens, Peter; Sans, Juan Angel; Susini, Jean [European Synchrotron Radiation Facility, Experiments Division, Grenoble (France); Somogyi, Andrea [Experiments Division, Synchrotron SOLEIL, Gif-sur-Yvette (France); Alen, Benito [Microelectronics Institute Madrid, CNM-CSIC, Madrid (Spain); Miskys, Claudio [Walter Schottky Institute, Technical University Munich, Garching (Germany)

    2010-02-15

    In this study, we report on the application of synchrotron spectro-microscopic techniques to the examination of inversion domain boundaries formed intentionally in a GaN-based lateral polarity heterostructure. Using X-ray sub-microbeams, no evidence of field-driven electrodiffusion effects has been observed on spatially separated inversion domain boundaries. In addition, XANES data around the Ga K-edge strongly supported hexagonal Ga site configurations, suggesting high local order reconstruction. Based on inner-shell excited luminescence on the micrometer scale, the uniform spectral distribution of the radiative centers was discussed. (copyright 2010 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)

  7. Unconscious analyses of visual scenes based on feature conjunctions.

    Science.gov (United States)

    Tachibana, Ryosuke; Noguchi, Yasuki

    2015-06-01

    To efficiently process a cluttered scene, the visual system analyzes statistical properties or regularities of visual elements embedded in the scene. It is controversial, however, whether those scene analyses could also work for stimuli unconsciously perceived. Here we show that our brain performs the unconscious scene analyses not only using a single featural cue (e.g., orientation) but also based on conjunctions of multiple visual features (e.g., combinations of color and orientation information). Subjects foveally viewed a stimulus array (duration: 50 ms) where 4 types of bars (red-horizontal, red-vertical, green-horizontal, and green-vertical) were intermixed. Although a conscious perception of those bars was inhibited by a subsequent mask stimulus, the brain correctly analyzed the information about color, orientation, and color-orientation conjunctions of those invisible bars. The information of those features was then used for the unconscious configuration analysis (statistical processing) of the central bars, which induced a perceptual bias and illusory feature binding in visible stimuli at peripheral locations. While statistical analyses and feature binding are normally 2 key functions of the visual system to construct coherent percepts of visual scenes, our results show that a high-level analysis combining those 2 functions is correctly performed by unconscious computations in the brain. (c) 2015 APA, all rights reserved.

  8. Process for carrying out analyses based on concurrent reactions

    Energy Technology Data Exchange (ETDEWEB)

    Glover, J S; Shepherd, B P

    1980-01-03

    The invention relates to a process for carrying out analyses based on concurrent (competitive) reactions. A portion of the compound to be analysed is subjected, together with a standard quantity of the same compound in labelled form, to a common reaction with a standard quantity of a reagent, which must be less than the sum of the two portions of the reacting compound. The portions of the labelled reacting compound and of the labelled final compound resulting from the competition are separated in a tube (e.g. by centrifuging) after a forced phase change (precipitation, absorption, etc.), and the radioactivity of the two phases in contact is measured separately. The shielded measuring device developed for this purpose, suitable for centrifuge tubes of known dimensions, is also included in the patent claims. As an example of the application of the method, the insulin concentration of a defined serum is measured (radioimmunoassay).

  9. Spectro-microscopic study of the formation of supramolecular networks

    Science.gov (United States)

    Sadowski, Jerzy T.

    2015-03-01

    Metal-organic frameworks (MOFs) are emerging as a new class of materials for CO2 capture. There are many fundamental questions, including the optimum pore size and arrangement of the molecules in the structure to achieve highest CO2 uptake. As only the surface is of interest for potential applications such as heterogeneous catalysis, nano-templating, and sensing, 2D analogs of MOFs can serve as good model systems. Utilizing capabilities of LEEM/PEEM for non-destructive interrogation of the real-time molecular self-assembly, we investigated supramolecular systems based on carboxylic acid-metal complexes, such as trimesic and mellitic acid, doped with transition metals. Such 2D networks act as host systems for transition-metal phthalocyanines (MPc; M = Fe, Ti, Sc), and the electrostatic interactions of CO2 molecules with transition metal ions can be tuned by controlling the type of TM ion and the size of the pore in the host network. The understanding of directed self-assembly by controlling the molecule-substrate interaction can enable us to engineer the pore size and density, and thus tune the host's chemical activity. Research carried out at the Center for Functional Nanomaterials and National Synchrotron Light Source, Brookhaven National Laboratory, which are supported by the U.S. Department of Energy, Office of Basic Energy Sciences, under Contract No. DE-AC02-98CH10.

  10. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic programs and evaluated using incremental tabled evaluation, a technique for efficiently updating memo tables in response to changes in facts and rules. The approach has been implemented and integrated into the Eclipse IDE. Our measurements show that this technique is effective for automatically...

  11. FY01 Supplemental Science and Performance Analyses, Volume 1: Scientific Bases and Analyses, Part 1 and 2

    International Nuclear Information System (INIS)

    Dobson, David

    2001-01-01

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S and ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S and ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S and ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23 013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054] [DIRS 124754]). 
By making the large amount of information developed on Yucca Mountain available in stages, the DOE intends to provide the public and interested parties with time to review the available materials and to formulate

  12. How distributed processing produces false negatives in voxel-based lesion-deficit analyses.

    Science.gov (United States)

    Gajardo-Vidal, Andrea; Lorca-Puls, Diego L; Crinion, Jennifer T; White, Jitrachote; Seghier, Mohamed L; Leff, Alex P; Hope, Thomas M H; Ludersdorfer, Philipp; Green, David W; Bowman, Howard; Price, Cathy J

    2018-07-01

    In this study, we hypothesized that if the same deficit can be caused by damage to one or another part of a distributed neural system, then voxel-based analyses might miss critical lesion sites because preservation of each site will not be consistently associated with preserved function. The first part of our investigation used voxel-based multiple regression analyses of data from 359 right-handed stroke survivors to identify brain regions where lesion load is associated with picture naming abilities after factoring out variance related to object recognition, semantics and speech articulation so as to focus on deficits arising at the word retrieval level. A highly significant lesion-deficit relationship was identified in left temporal and frontal/premotor regions. Post-hoc analyses showed that damage to either of these sites caused the deficit of interest in less than half the affected patients (76/162 = 47%). After excluding all patients with damage to one or both of the identified regions, our second analysis revealed a new region, in the anterior part of the left putamen, which had not been previously detected because many patients had the deficit of interest after temporal or frontal damage that preserved the left putamen. The results illustrate how (i) false negative results arise when the same deficit can be caused by different lesion sites; (ii) some of the missed effects can be unveiled by adopting an iterative approach that systematically excludes patients with lesions to the areas identified in previous analyses, (iii) statistically significant voxel-based lesion-deficit mappings can be driven by a subset of patients; (iv) focal lesions to the identified regions are needed to determine whether the deficit of interest is the consequence of focal damage or much more extensive damage that includes the identified region; and, finally, (v) univariate voxel-based lesion-deficit mappings cannot, in isolation, be used to predict outcome in other patients

  13. A Versatile Software Package for Inter-subject Correlation Based Analyses of fMRI

    Directory of Open Access Journals (Sweden)

    Jukka-Pekka eKauppi

    2014-01-01

    In the inter-subject correlation (ISC) based analysis of the functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects in the corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modelling the stimulus, and thus ISC is a potential method to analyze fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC based approach to analyze complex fMRI data, no generic software tools have been made available for this purpose, limiting widespread use of ISC based analysis techniques in the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC based analyses. Many advanced computations such as comparison of ISCs between different stimuli, time window ISC, and inter-subject phase synchronization are supported by the toolbox. The analyses are coupled with re-sampling based statistical inference. The ISC based analyses are data and computation intensive, and the ISC toolbox is equipped with mechanisms to execute the parallel computations in a cluster environment automatically, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox and its implementation, and demonstrate possible uses of the toolbox by summarizing selected example applications. We also report computation time experiments using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces the computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/.

  14. A versatile software package for inter-subject correlation based analyses of fMRI.

    Science.gov (United States)

    Kauppi, Jukka-Pekka; Pajula, Juha; Tohka, Jussi

    2014-01-01

    In the inter-subject correlation (ISC) based analysis of the functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects in the corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modeling the stimulus, and thus ISC is a potential method to analyze fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC based approach to analyze complex fMRI data, no generic software tools have been made available for this purpose, limiting widespread use of ISC based analysis techniques in the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC based analyses. Many advanced computations such as comparison of ISCs between different stimuli, time window ISC, and inter-subject phase synchronization are supported by the toolbox. The analyses are coupled with re-sampling based statistical inference. The ISC based analyses are data and computation intensive, and the ISC toolbox is equipped with mechanisms to execute the parallel computations in a cluster environment automatically, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine, or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox and its implementation, and demonstrate possible uses of the toolbox by summarizing selected example applications. We also report computation time experiments using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces the computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/.
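
The core ISC computation the toolbox implements (correlating corresponding voxel time series across subjects) can be sketched as a simplified NumPy analogue; this is an illustration of the idea, not the toolbox's actual Matlab code:

```python
import numpy as np

def isc_map(data):
    """Mean pairwise Pearson correlation across subjects, per voxel.

    data: array of shape (n_subjects, n_voxels, n_timepoints).
    Returns an array of shape (n_voxels,) with the ISC for each voxel.
    """
    n_subj, n_vox, n_t = data.shape
    # z-score each time series so a scaled dot product equals Pearson r.
    z = data - data.mean(axis=2, keepdims=True)
    z /= z.std(axis=2, keepdims=True)
    isc = np.zeros(n_vox)
    n_pairs = 0
    for i in range(n_subj):
        for j in range(i + 1, n_subj):
            isc += (z[i] * z[j]).sum(axis=1) / n_t
            n_pairs += 1
    return isc / n_pairs

# Demo: voxel 0 carries an identical stimulus-driven signal in all
# subjects (ISC -> 1); voxel 1 is independent noise (ISC near 0).
rng = np.random.default_rng(0)
shared = rng.standard_normal(50)
data = np.stack([np.stack([shared, rng.standard_normal(50)])
                 for _ in range(3)])
vals = isc_map(data)
```

In practice the toolbox adds the re-sampling based statistical inference mentioned above (e.g., permutation of time series) to assess which ISC values are significant.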

  15. Features of Computer-Based Decision Aids: Systematic Review, Thematic Synthesis, and Meta-Analyses

    Science.gov (United States)

    Krömker, Dörthe; Meguerditchian, Ari N; Tamblyn, Robyn

    2016-01-01

    Background Patient information and education, such as decision aids, are gradually moving toward online, computer-based environments. Considerable research has been conducted to guide content and presentation of decision aids. However, given the relatively new shift to computer-based support, little attention has been given to how multimedia and interactivity can improve upon paper-based decision aids. Objective The first objective of this review was to summarize published literature into a proposed classification of features that have been integrated into computer-based decision aids. Building on this classification, the second objective was to assess whether integration of specific features was associated with higher-quality decision making. Methods Relevant studies were located by searching MEDLINE, Embase, CINAHL, and CENTRAL databases. The review identified studies that evaluated computer-based decision aids for adults faced with preference-sensitive medical decisions and reported quality of decision-making outcomes. A thematic synthesis was conducted to develop the classification of features. Subsequently, meta-analyses were conducted based on standardized mean differences (SMD) from randomized controlled trials (RCTs) that reported knowledge or decisional conflict. Further subgroup analyses compared pooled SMDs for decision aids that incorporated a specific feature to other computer-based decision aids that did not incorporate the feature, to assess whether specific features improved quality of decision making. Results Of 3541 unique publications, 58 studies met the target criteria and were included in the thematic synthesis. The synthesis identified six features: content control, tailoring, patient narratives, explicit values clarification, feedback, and social support. A subset of 26 RCTs from the thematic synthesis was used to conduct the meta-analyses. 
As expected, computer-based decision aids performed better than usual care or alternative aids; however

  16. Features of Computer-Based Decision Aids: Systematic Review, Thematic Synthesis, and Meta-Analyses.

    Science.gov (United States)

    Syrowatka, Ania; Krömker, Dörthe; Meguerditchian, Ari N; Tamblyn, Robyn

    2016-01-26

    Patient information and education, such as decision aids, are gradually moving toward online, computer-based environments. Considerable research has been conducted to guide content and presentation of decision aids. However, given the relatively new shift to computer-based support, little attention has been given to how multimedia and interactivity can improve upon paper-based decision aids. The first objective of this review was to summarize published literature into a proposed classification of features that have been integrated into computer-based decision aids. Building on this classification, the second objective was to assess whether integration of specific features was associated with higher-quality decision making. Relevant studies were located by searching MEDLINE, Embase, CINAHL, and CENTRAL databases. The review identified studies that evaluated computer-based decision aids for adults faced with preference-sensitive medical decisions and reported quality of decision-making outcomes. A thematic synthesis was conducted to develop the classification of features. Subsequently, meta-analyses were conducted based on standardized mean differences (SMD) from randomized controlled trials (RCTs) that reported knowledge or decisional conflict. Further subgroup analyses compared pooled SMDs for decision aids that incorporated a specific feature to other computer-based decision aids that did not incorporate the feature, to assess whether specific features improved quality of decision making. Of 3541 unique publications, 58 studies met the target criteria and were included in the thematic synthesis. The synthesis identified six features: content control, tailoring, patient narratives, explicit values clarification, feedback, and social support. A subset of 26 RCTs from the thematic synthesis was used to conduct the meta-analyses. 
As expected, computer-based decision aids performed better than usual care or alternative aids; however, some features performed better than

  17. Issues and approaches in risk-based aging analyses of passive components

    International Nuclear Information System (INIS)

    Uryasev, S.P.; Samanta, P.K.; Vesely, W.E.

    1994-01-01

    In previous NRC-sponsored work, a general methodology was developed to quantify the risk contributions from aging components at nuclear plants. The methodology allowed Probabilistic Risk Analyses (PRAs) to be modified to incorporate age-dependent component failure rates, and also aging maintenance models, to evaluate and prioritize the aging contributions from active components using the linear aging failure rate model and empirical component aging rates. In the present paper, this methodology is extended to passive components (for example, pipes, heat exchangers, and the vessel). The analyses of passive components raise issues different from those of active components. Here, we specifically focus on three aspects that need to be addressed in risk-based aging prioritization of passive components

  18. Model-Based Recursive Partitioning for Subgroup Analyses.

    Science.gov (United States)

    Seibold, Heidi; Zeileis, Achim; Hothorn, Torsten

    2016-05-01

    The identification of patient subgroups with differential treatment effects is the first step towards individualised treatments. A current draft guideline by the EMA discusses potentials and problems in subgroup analyses and formulates challenges for the development of appropriate statistical procedures for the data-driven identification of patient subgroups. We introduce model-based recursive partitioning as a procedure for the automated detection of patient subgroups that are identifiable by predictive factors. The method starts with a model for the overall treatment effect as defined for the primary analysis in the study protocol and uses measures for detecting parameter instabilities in this treatment effect. The procedure produces a segmented model with differential treatment parameters corresponding to each patient subgroup. The subgroups are linked to predictive factors by means of a decision tree. The method is applied to the search for subgroups of patients suffering from amyotrophic lateral sclerosis that differ with respect to their Riluzole treatment effect, the only currently approved drug for this disease.
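
A toy version of the split search can convey the idea. The sketch below greedily picks the cutpoint on one candidate variable that most improves the fit of subgroup-specific treatment-effect models; the real method uses formal parameter-instability (M-fluctuation) tests rather than this crude SSE criterion, and all names and data here are hypothetical:

```python
import numpy as np

def best_split(x, y, treat, min_size=10):
    """Find the cutpoint on candidate variable x that most improves the
    fit of subgroup-specific treatment-effect models (here simply
    separate outcome means for treated and control patients)."""
    def sse(mask):
        total = 0.0
        for t in (0, 1):
            sub = y[mask & (treat == t)]
            if sub.size:
                total += ((sub - sub.mean()) ** 2).sum()
        return total

    full = sse(np.ones(len(y), dtype=bool))
    best_cut, best_gain = None, 0.0
    for c in np.unique(x)[:-1]:
        left = x <= c
        if min(left.sum(), (~left).sum()) < min_size:
            continue
        gain = full - sse(left) - sse(~left)
        if gain > best_gain:
            best_cut, best_gain = c, gain
    return best_cut, best_gain

# Simulated trial: the treatment works only for patients with x > 0,
# so the search should recover a cutpoint near 0.
rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 200)
treat = rng.integers(0, 2, 200)
y = treat * (x > 0) + 0.1 * rng.standard_normal(200)
cut, gain = best_split(x, y, treat)
```

Applied recursively to each resulting subgroup, this yields the segmented model (a decision tree with a treatment-effect estimate in each leaf) described above.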

  19. Engineering design and exergy analyses for combustion gas turbine based power generation system

    International Nuclear Information System (INIS)

    Sue, D.-C.; Chuang, C.-C.

    2004-01-01

    This paper presents the engineering design and theoretical exergetic analyses of combustion gas turbine based power generation systems. Exergy analysis is performed based on the first and second laws of thermodynamics. The results show that the exergy analyses for a steam cycle system predict the plant efficiency more precisely. The plant efficiency for partial load operation is lower than for full load operation. Increasing the pinch points will decrease the combined cycle plant efficiency. The engineering design is based on inlet air cooling and natural gas preheating to increase the net power output and efficiency. To evaluate the energy utilization, one combined cycle unit and one cogeneration system, consisting of gas turbine generators, heat recovery steam generators, and one steam turbine generator with steam extracted for process use, have been analyzed. The analytical results are used for engineering design and component selection

  20. Analysing task design and students' responses to context-based problems through different analytical frameworks

    Science.gov (United States)

    Broman, Karolina; Bernholt, Sascha; Parchmann, Ilka

    2015-05-01

    Background: Context-based learning approaches are used to enhance students' interest in, and knowledge about, science. According to different empirical studies, students' interest is improved by applying these more non-conventional approaches, while effects on learning outcomes are less coherent. Hence, further insights are needed into the structure of context-based problems in comparison to traditional problems, and into students' problem-solving strategies. Therefore, a suitable framework is necessary, both for the analysis of tasks and strategies. Purpose: The aim of this paper is to explore traditional and context-based tasks as well as students' responses to exemplary tasks to identify a suitable framework for future design and analyses of context-based problems. The paper discusses different established frameworks and applies the Higher-Order Cognitive Skills/Lower-Order Cognitive Skills (HOCS/LOCS) taxonomy and the Model of Hierarchical Complexity in Chemistry (MHC-C) to analyse traditional tasks and students' responses. Sample: Upper secondary students (n=236) at the Natural Science Programme, i.e. possible future scientists, are investigated to explore learning outcomes when they solve chemistry tasks, both more conventional as well as context-based chemistry problems. Design and methods: A typical chemistry examination test has been analysed, first the test items in themselves (n=36), and thereafter 236 students' responses to one representative context-based problem. Content analysis using HOCS/LOCS and MHC-C frameworks has been applied to analyse both quantitative and qualitative data, allowing us to describe different problem-solving strategies. Results: The empirical results show that both frameworks are suitable to identify students' strategies, mainly focusing on recall of memorized facts when solving chemistry test items. Almost all test items were also assessing lower order thinking. The combination of frameworks with the chemistry syllabus has been

  1. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    Science.gov (United States)

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    Due to the upcoming deluge of genome data, the need for storing and processing large-scale genome data, easy access to biomedical analysis tools, and efficient data sharing and retrieval has presented significant challenges. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via the HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Operational Satellite-based Surface Oil Analyses (Invited)

    Science.gov (United States)

    Streett, D.; Warren, C.

    2010-12-01

    During the Deepwater Horizon spill, NOAA imagery analysts in the Satellite Analysis Branch (SAB) issued more than 300 near-real-time satellite-based oil spill analyses. These analyses were used by the oil spill response community for planning, issuing surface oil trajectories and tasking assets (e.g., oil containment booms, skimmers, overflights). SAB analysts used both Synthetic Aperture Radar (SAR) and high resolution visible/near-IR multispectral satellite imagery as well as a variety of ancillary datasets. Satellite imagery used included ENVISAT ASAR (ESA), TerraSAR-X (DLR), Cosmo-SkyMed (ASI), ALOS (JAXA), Radarsat (MDA), ENVISAT MERIS (ESA), SPOT (SPOT Image Corp.), ASTER (NASA), MODIS (NASA), and AVHRR (NOAA). Ancillary datasets included ocean current information, wind information, locations of natural oil seeps and a variety of in situ oil observations. The analyses were made available as jpegs, pdfs, shapefiles and Google Earth KML files, and were also posted on a variety of websites including Geoplatform and ERMA. From the very first analysis issued just 5 hours after the rig sank through the final analysis issued in August, the complete archive is still publicly available on the NOAA/NESDIS website http://www.ssd.noaa.gov/PS/MPS/deepwater.html. SAB personnel also served as the Deepwater Horizon International Disaster Charter Project Manager (at the official request of the USGS). The Project Manager's primary responsibility was to acquire and oversee the processing and dissemination of satellite data generously donated by numerous private companies and nations in support of the oil spill response, including some of the imagery described above. SAB has begun to address a number of goals that will improve our routine oil spill response as well as help assure that we are ready for the next spill of national significance. We hope to (1) secure a steady, abundant and timely stream of suitable satellite imagery even in the absence of large-scale emergencies such as

  3. How and for whom does web-based acceptance and commitment therapy work? Mediation and moderation analyses of web-based ACT for depressive symptoms.

    Science.gov (United States)

    Pots, Wendy T M; Trompetter, Hester R; Schreurs, Karlein M G; Bohlmeijer, Ernst T

    2016-05-23

    Acceptance and Commitment Therapy (ACT) has been demonstrated to be effective in reducing depressive symptoms. However, little is known about how and for whom therapeutic change occurs, specifically in web-based interventions. This study focuses on the mediators, moderators and predictors of change during a web-based ACT intervention. Data from 236 adults from the general population with mild to moderate depressive symptoms, randomized to either web-based ACT (n = 82) or one of two control conditions (web-based Expressive Writing (EW; n = 67) and a waiting list (n = 87)), were analysed. Single and multiple mediation analyses and exploratory linear regression analyses were performed using PROCESS to examine mediators, moderators and predictors of pre- to post-treatment and follow-up change in depressive symptoms. The treatment effect of ACT versus the waiting list was mediated by psychological flexibility and two mindfulness facets. The treatment effect of ACT versus EW was not significantly mediated. The moderator analyses demonstrated that the effects of web-based ACT did not vary according to baseline patient characteristics when compared to both control groups. However, higher baseline depressive symptoms and positive mental health and lower baseline anxiety were identified as predictors of outcome across all conditions. Similar results were found for follow-up. The findings of this study corroborate the evidence that psychological flexibility and mindfulness are distinct process mechanisms that mediate the effects of a web-based ACT intervention. The results indicate that there are no restrictions to the allocation of the web-based ACT intervention and that web-based ACT can work for different subpopulations. Netherlands Trial Register NTR2736. Registered 6 February 2011.
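
PROCESS is an SPSS/SAS macro; the single-mediator pattern it implements (indirect effect a×b with a percentile bootstrap) can be sketched in Python. This is a generic illustration with simulated data, not the study's actual models or variables:

```python
import numpy as np

def indirect_effect(x, m, y):
    """Indirect effect a*b from two OLS fits:
    m ~ x (path a) and y ~ m + x (path b, controlling for x)."""
    a = np.polyfit(x, m, 1)[0]
    design = np.column_stack([m, x, np.ones_like(x)])
    b = np.linalg.lstsq(design, y, rcond=None)[0][0]
    return a * b

def bootstrap_ci(x, m, y, n_boot=1000, seed=0):
    """Percentile bootstrap confidence interval for the indirect effect."""
    rng = np.random.default_rng(seed)
    n = len(x)
    draws = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)  # resample cases with replacement
        draws.append(indirect_effect(x[idx], m[idx], y[idx]))
    return np.percentile(draws, [2.5, 97.5])

# Simulated mediation: treatment x raises the mediator m (think
# psychological flexibility), which in turn reduces symptoms y.
rng = np.random.default_rng(42)
n = 300
x = rng.standard_normal(n)
m = 0.5 * x + 0.5 * rng.standard_normal(n)
y = 0.5 * m + 0.5 * rng.standard_normal(n)
lo, hi = bootstrap_ci(x, m, y)  # CI excluding 0 indicates mediation
```

Multiple-mediator analyses extend this by fitting several a-paths and a joint y-model, bootstrapping each specific indirect effect the same way.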

  4. A web-based endpoint adjudication system for interim analyses in clinical trials.

    Science.gov (United States)

    Nolen, Tracy L; Dimmick, Bill F; Ostrosky-Zeichner, Luis; Kendrick, Amy S; Sable, Carole; Ngai, Angela; Wallace, Dennis

    2009-02-01

    A data monitoring committee (DMC) is often employed to assess trial progress and review safety data and efficacy endpoints throughout a trial. Interim analyses performed for the DMC should use data that are as complete and verified as possible. Such analyses are complicated when data verification involves subjective study endpoints or requires clinical expertise to determine each subject's status with respect to the study endpoint. Therefore, procedures are needed to obtain adjudicated data for interim analyses in an efficient manner. In the past, methods for handling such data included using locally reported results as surrogate endpoints, adjusting analysis methods for unadjudicated data, or simply performing the adjudication as rapidly as possible. These methods all have inadequacies that make their sole usage suboptimal. For a study of prophylaxis for invasive candidiasis, adjudication of both study eligibility criteria and clinical endpoints prior to two interim analyses was required. Because the study was expected to enroll at a moderate rate and the sponsor required adjudicated endpoints to be used for interim analyses, an efficient process for adjudication was required. We created a web-based endpoint adjudication system (WebEAS) that allows for expedited review by the endpoint adjudication committee (EAC). This system automatically identifies when a subject's data are complete, creates a subject profile from the study data, and assigns EAC reviewers. The reviewers use the WebEAS to review the subject profile and submit their completed review form. The WebEAS then compares the reviews, assigns an additional review as a tiebreaker if needed, and stores the adjudicated data. The study for which this system was originally built was administratively closed after 10 months with only 38 subjects enrolled. The adjudication process was finalized and the WebEAS system activated prior to study closure. Some website accessibility issues arose initially. However

  5. The goal of value-based medicine analyses: comparability. The case for neovascular macular degeneration.

    Science.gov (United States)

    Brown, Gary C.; Brown, Melissa M.; Brown, Heidi C.; Kindermann, Sylvia; Sharma, Sanjay

    2007-01-01

    Purpose To evaluate the comparability of articles in the peer-reviewed literature assessing the (1) patient value and (2) cost-utility (cost-effectiveness) associated with interventions for neovascular age-related macular degeneration (ARMD). Methods A search was performed in the National Library of Medicine database of 16 million peer-reviewed articles using the key words cost-utility, cost-effectiveness, value, verteporfin, pegaptanib, laser photocoagulation, ranibizumab, and therapy. All articles that used an outcome of quality-adjusted life-years (QALYs) were studied in regard to (1) percent improvement in quality of life, (2) utility methodology, (3) utility respondents, (4) types of costs included (eg, direct healthcare, direct nonhealthcare, indirect), (5) cost bases (eg, Medicare, National Health Service in the United Kingdom), and (6) study cost perspective (eg, government, societal, third-party insurer). To qualify as a value-based medicine analysis, the patient value had to be measured using the outcome of the QALYs conferred by respective interventions. As with value-based medicine analyses, patient-based time tradeoff utility analysis had to be utilized, patient utility respondents were necessary, and direct medical costs were used. Results Among 21 cost-utility analyses performed on interventions for neovascular macular degeneration, 15 (71%) met value-based medicine criteria. The 6 others (29%) were not comparable owing to (1) varying utility methodology, (2) varying utility respondents, (3) differing costs utilized, (4) differing cost bases, and (5) varying study perspectives. Among value-based medicine studies, laser photocoagulation confers a 4.4% value gain (improvement in quality of life) for the treatment of classic subfoveal choroidal neovascularization. 
Intravitreal pegaptanib confers a 5.9% value gain (improvement in quality of life) for classic, minimally classic, and occult subfoveal choroidal neovascularization, and photodynamic therapy

  7. Conducting Meta-Analyses Based on p Values

    Science.gov (United States)

    van Aert, Robbie C. M.; Wicherts, Jelte M.; van Assen, Marcel A. L. M.

    2016-01-01

    Because of overwhelming evidence of publication bias in psychology, techniques to correct meta-analytic estimates for such bias are greatly needed. The methodology on which the p-uniform and p-curve methods are based has great promise for providing accurate meta-analytic estimates in the presence of publication bias. However, in this article, we show that in some situations, p-curve behaves erratically, whereas p-uniform may yield implausible estimates of negative effect size. Moreover, we show that (and explain why) p-curve and p-uniform result in overestimation of effect size under moderate-to-large heterogeneity and may yield unpredictable bias when researchers employ p-hacking. We offer hands-on recommendations on applying and interpreting results of meta-analyses in general and p-uniform and p-curve in particular. Both methods as well as traditional methods are applied to a meta-analysis on the effect of weight on judgments of importance. We offer guidance for applying p-uniform or p-curve using R and a user-friendly web application for applying p-uniform. PMID:27694466

  8. SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.

    Science.gov (United States)

    Chu, Annie; Cui, Jenny; Dinov, Ivo D

    2009-03-01

    The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include models commonly used in undergraduate statistics courses, such as linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test, and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for most
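
As a sketch of the sort of computation such a toolkit implements (SOCR Analyses is written in Java; Python is used here for brevity), the closed-form least-squares solution for simple linear regression is:

```python
# Least-squares fit y ≈ slope * x + intercept for simple linear regression,
# using the closed-form normal-equation solution (illustrative, not SOCR code).
def simple_linear_regression(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Centered cross-products give the covariance and variance terms
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope, intercept
```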

  9. Benefits of Exercise Training For Computer-Based Staff: A Meta Analyses

    Directory of Open Access Journals (Sweden)

    Mothna Mohammed

    2017-04-01

    Background: Office workers sit at work for approximately 8 hours a day and, as a result, many of them do not have enough time for any form of physical exercise. This can lead to musculoskeletal discomfort, especially low back pain, and recently many researchers have focused on home/office-based exercise training for the prevention/treatment of low back pain in this population. Objective: This meta-analysis discusses the latest exercises suggested for office workers, based on the mechanisms and theories behind low back pain among office workers. Method: The author collected relevant papers published previously on the subject. Google Scholar, Scopus, and PubMed were used as sources to find the articles. Only articles that used the same methodology, including the keywords office workers, musculoskeletal discomforts, low back pain, and exercise training, were selected. Studies that failed to report sufficient sample statistics, or lacked a substantial review of past academic scholarship and/or clear methodologies, were excluded. Results: Limited evidence is available regarding the prevention of, and treatment methods for, musculoskeletal discomfort, especially in the low back, among office workers. The findings showed that exercise training had a significant effect (p<0.05) on low back pain discomfort scores, with pain levels decreasing in response to office-based exercise training. Conclusion: Office-based exercise training can improve pain/discomfort scores among office workers through positive effects on the flexibility and strength of muscles. As such, it should be suggested to occupational therapists as a practical way to treat/prevent low back pain among office workers.

  10. Vocational Teachers and Professionalism - A Model Based on Empirical Analyses

    DEFF Research Database (Denmark)

    Duch, Henriette Skjærbæk; Andreasen, Karen E

    Several theorists have developed models to illustrate the processes of adult learning and professional development (e.g. Illeris, Argyris, Engeström; Wahlgren & Aarkorg, Kolb and Wenger). Models can sometimes be criticized … emphasis on the adult employee, the organization, its surroundings as well as other contextual factors. Our concern is adult vocational teachers attending a pedagogical course and teaching at vocational colleges. The aim of the paper is to discuss different models and develop a model concerning teachers … at vocational colleges based on empirical data in a specific context, the vocational teacher-training course in Denmark. By offering a basis and concepts for analysis of practice, such a model is meant to support the development of vocational teachers' professionalism at courses and in organizational contexts …

  11. Genome-based comparative analyses of Antarctic and temperate species of Paenibacillus.

    Directory of Open Access Journals (Sweden)

    Melissa Dsouza

    Antarctic soils represent a unique environment characterised by extremes of temperature, salinity, elevated UV radiation, low nutrient and low water content. Despite the harshness of this environment, members of 15 bacterial phyla have been identified in soils of the Ross Sea Region (RSR). However, the survival mechanisms and ecological roles of these phyla are largely unknown. The aim of this study was to investigate whether strains of Paenibacillus darwinianus owe their resilience to substantial genomic changes. For this, genome-based comparative analyses were performed on three P. darwinianus strains, isolated from gamma-irradiated RSR soils, together with nine temperate, soil-dwelling Paenibacillus spp. The genome of each strain was sequenced to over 1,000-fold coverage, then assembled into contigs totalling approximately 3 Mbp per genome. Based on the occurrence of essential, single-copy genes, genome completeness was estimated at approximately 88%. Genome analysis revealed between 3,043 and 3,091 protein-coding sequences (CDSs), primarily associated with two-component systems, sigma factors, transporters, sporulation and genes induced by cold-shock, oxidative and osmotic stresses. These comparative analyses provide an insight into the metabolic potential of P. darwinianus, revealing potential adaptive mechanisms for survival in Antarctic soils. However, a large proportion of these mechanisms were also identified in temperate Paenibacillus spp., suggesting that these mechanisms are beneficial for growth and survival in a range of soil environments. These analyses have also revealed that the P. darwinianus genomes contain significantly fewer CDSs and have a lower paralogous content. Notwithstanding the incompleteness of the assemblies, the large differences in genome sizes, determined by the number of genes in paralogous clusters and the CDS content, are indicative of genome content scaling. Finally, these sequences are a resource for further

  12. Direct observation of organic layer growth by dynamic spectro-microscopy using high-brilliance synchrotron

    International Nuclear Information System (INIS)

    Umbach, E.

    2004-01-01

    It has always been the dream of scientists to watch microscopic objects directly on an atomic scale, to follow their dynamical behaviour, and to know everything about them, i.e. to get as much spectroscopic information as possible. While instruments have become available which may fulfill two of these wishes simultaneously, it is much more difficult to get all three at once. The development of so-called spectro-microscopes which operate at 3rd generation synchrotron sources nourishes the hope that this dream will become true in the near future. The talk intends to show how much can be learned about organic thin films and interfaces if high-brilliance synchrotron radiation is combined with new instruments, for instance a high energy resolution beamline and a high-spatial resolution spectro-microscope. While the former is standard technology of today, the latter is a new development, combining brilliant undulator radiation of variable polarization with a specially developed, energy-filtered low energy electron microscope. First, it will be shown that many new details about the electronic structure of organic materials and their interaction with one another or with an interface can be obtained using high-resolution photoemission and x-ray absorption. For instance, from a careful analysis of the fine structure of photoemission spectra one can derive details about the interface bonding, about the interaction between molecules, and about the dynamic response of the molecular system upon creation of a core hole. Or, from a careful analysis of the fine structure of high resolution x-ray absorption spectra one gets insight into the intermolecular interaction, the coupling between electronic and vibronic excitations, and even about the shapes of potential curves. Second, the dynamic growth of highly-ordered organic thin films will be followed as a function of molecule and preparation conditions. The formation of islands, the inner structure of organic crystallites, diffusion

  13. Simultaneous acquisition of dual analyser-based phase contrast X-ray images for small animal imaging

    International Nuclear Information System (INIS)

    Kitchen, Marcus J.; Pavlov, Konstantin M.; Hooper, Stuart B.; Vine, David J.; Siu, Karen K.W.; Wallace, Megan J.; Siew, Melissa L.L.; Yagi, Naoto; Uesugi, Kentaro; Lewis, Rob A.

    2008-01-01

    Analyser-based phase contrast X-ray imaging can provide high-contrast images of biological tissues with exquisite sensitivity to the boundaries between tissues. The phase and absorption information can be extracted by processing multiple images acquired at different analyser orientations. Recording both the transmitted and diffracted beams from a thin Laue analyser crystal can make phase retrieval possible for dynamic systems by allowing full field imaging. This technique was used to image the thorax of a mechanically ventilated newborn rabbit pup using a 25 keV beam from the SPring-8 synchrotron radiation facility. The diffracted image was produced from the (1 1 1) planes of a 50 mm x 40 mm, 100 μm thick Si analyser crystal in the Laue geometry. The beam and analyser were large enough to image the entire chest, making it possible to observe changes in anatomy with high contrast and spatial resolution.

  14. Simultaneous acquisition of dual analyser-based phase contrast X-ray images for small animal imaging

    Energy Technology Data Exchange (ETDEWEB)

    Kitchen, Marcus J. [School of Physics, Monash University, Victoria 3800 (Australia)], E-mail: Marcus.Kitchen@sci.monash.edu.au; Pavlov, Konstantin M. [School of Physics, Monash University, Victoria 3800 (Australia); Monash Centre for Synchrotron Science, Monash University, Victoria 3800 (Australia); Physics and Electronics, School of Science and Technology, University of New England, NSW 2351 (Australia)], E-mail: Konstantin.Pavlov@sci.monash.edu.au; Hooper, Stuart B. [Department of Physiology, Monash University, Victoria 3800 (Australia)], E-mail: Stuart.Hooper@med.monash.edu.au; Vine, David J. [School of Physics, Monash University, Victoria 3800 (Australia)], E-mail: David.Vine@sci.monash.edu.au; Siu, Karen K.W. [School of Physics, Monash University, Victoria 3800 (Australia); Monash Centre for Synchrotron Science, Monash University, Victoria 3800 (Australia)], E-mail: Karen.Siu@sci.monash.edu.au; Wallace, Megan J. [Department of Physiology, Monash University, Victoria 3800 (Australia)], E-mail: Megan.Wallace@med.monash.edu.au; Siew, Melissa L.L. [Department of Physiology, Monash University, Victoria 3800 (Australia)], E-mail: Melissa.Siew@med.monash.edu.au; Yagi, Naoto [SPring-8/JASRI, Sayo (Japan)], E-mail: yagi@spring8.or.jp; Uesugi, Kentaro [SPring-8/JASRI, Sayo (Japan)], E-mail: ueken@spring8.or.jp; Lewis, Rob A. [School of Physics, Monash University, Victoria 3800 (Australia); Monash Centre for Synchrotron Science, Monash University, Victoria 3800 (Australia)], E-mail: Rob.Lewis@sync.monash.edu.au

    2008-12-15

    Analyser-based phase contrast X-ray imaging can provide high-contrast images of biological tissues with exquisite sensitivity to the boundaries between tissues. The phase and absorption information can be extracted by processing multiple images acquired at different analyser orientations. Recording both the transmitted and diffracted beams from a thin Laue analyser crystal can make phase retrieval possible for dynamic systems by allowing full field imaging. This technique was used to image the thorax of a mechanically ventilated newborn rabbit pup using a 25 keV beam from the SPring-8 synchrotron radiation facility. The diffracted image was produced from the (1 1 1) planes of a 50 mm x 40 mm, 100 μm thick Si analyser crystal in the Laue geometry. The beam and analyser were large enough to image the entire chest, making it possible to observe changes in anatomy with high contrast and spatial resolution.

  15. Basing assessment and treatment of problem behavior on behavioral momentum theory: Analyses of behavioral persistence.

    Science.gov (United States)

    Schieltz, Kelly M; Wacker, David P; Ringdahl, Joel E; Berg, Wendy K

    2017-08-01

    The connection, or bridge, between applied and basic behavior analysis has been long-established (Hake, 1982; Mace & Critchfield, 2010). In this article, we describe how clinical decisions can be based more directly on behavioral processes and how basing clinical procedures on behavioral processes can lead to improved clinical outcomes. As a case in point, we describe how applied behavior analyses of maintenance, and specifically the long-term maintenance of treatment effects related to problem behavior, can be adjusted and potentially enhanced by basing treatment on Behavioral Momentum Theory. We provide a brief review of the literature including descriptions of two translational studies that proposed changes in how differential reinforcement of alternative behavior treatments are conducted based on Behavioral Momentum Theory. We then describe current clinical examples of how these translations are continuing to impact the definitions, designs, analyses, and treatment procedures used in our clinical practice. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Orbitrap-based mass analyser for in-situ characterization of asteroids: ILMA, Ion Laser Mass Analyser

    Science.gov (United States)

    Briois, C.; Cotti, H.; Thirkell, L.; French Space Orbitrap Consortium (Aradj, K.; Bouabdellah, A.; Boukrara, A.; Carrasco, N.; Chalumeau, G.; Chapelon, O.; Colin, F.; Coll, P.; Engrand, C.; Grand, N.; Kukui, A.; Lebreton, J.-P.; Pennanech, C.; Szopa, C.; Thissen, R.; Vuitton, V.; Zapf, P.); Makarov, A.

    2014-07-01

    For about a decade, the boundaries between comets and carbonaceous asteroids have been fading [1,2]. No doubt that the Rosetta mission should bring a new wealth of data on the composition of comets. But as promising as it may look, the mass resolving power of the mass spectrometers onboard (so far the best on a space mission) will only be able to partially account for the diversity of chemical structures present. ILMA (Ion-Laser Mass Analyser) is a new generation high mass resolution LDI-MS (Laser Desorption-Ionization Mass Spectrometer) instrument concept using the Orbitrap technique, which has been developed in the frame of the two Marco Polo & Marco Polo-R proposals to the ESA Cosmic Vision program. Flagged by ESA as an instrument concept of interest for the mission in 2012, it has been under study for a few years in the frame of a Research and Technology (R&T) development programme between 5 French laboratories (LPC2E, IPAG, LATMOS, LISA, CSNSM) [3,4], partly funded by the French Space Agency (CNES). The work is undertaken in close collaboration with the Thermo Fisher Scientific Company, which commercialises Orbitrap-based laboratory instruments. The R&T activities are currently concentrating on the core elements of the Orbitrap analyser that are required to reach a sufficient maturity level for allowing design studies of future space instruments. A prototype is under development at LPC2E and a mass resolution (m/Δm FWHM) of 100,000 has been obtained at m/z = 150 for a background pressure of 10^{-8} mbar. ILMA would be a key instrument to measure the molecular, elemental and isotopic composition of objects such as carbonaceous asteroids, comets, or other bodies devoid of atmosphere such as the surface of an icy satellite, the Moon, or Mercury.

  17. An Integrated Software Suite for Surface-based Analyses of Cerebral Cortex

    Science.gov (United States)

    Van Essen, David C.; Drury, Heather A.; Dickson, James; Harwell, John; Hanlon, Donna; Anderson, Charles H.

    2001-01-01

    The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database. PMID:11522765

  19. Individual-based analyses reveal limited functional overlap in a coral reef fish community.

    Science.gov (United States)

    Brandl, Simon J; Bellwood, David R

    2014-05-01

    Detailed knowledge of a species' functional niche is crucial for the study of ecological communities and processes. The extent of niche overlap, functional redundancy and functional complementarity is of particular importance if we are to understand ecosystem processes and their vulnerability to disturbances. Coral reefs are among the most threatened marine systems, and anthropogenic activity is changing the functional composition of reefs. The loss of herbivorous fishes is particularly concerning as the removal of algae is crucial for the growth and survival of corals. Yet, the foraging patterns of the various herbivorous fish species are poorly understood. Using a multidimensional framework, we present novel individual-based analyses of species' realized functional niches, which we apply to a herbivorous coral reef fish community. In calculating niche volumes for 21 species, based on their microhabitat utilization patterns during foraging, and computing functional overlaps, we provide a measurement of functional redundancy or complementarity. Complementarity is the inverse of redundancy and is defined as less than 50% overlap in niche volumes. The analyses reveal extensive complementarity with an average functional overlap of just 15.2%. Furthermore, the analyses divide herbivorous reef fishes into two broad groups. The first group (predominantly surgeonfishes and parrotfishes) comprises species feeding on exposed surfaces and predominantly open reef matrix or sandy substrata, resulting in small niche volumes and extensive complementarity. In contrast, the second group consists of species (predominantly rabbitfishes) that feed over a wider range of microhabitats, penetrating the reef matrix to exploit concealed surfaces of various substratum types. These species show high variation among individuals, leading to large niche volumes, more overlap and less complementarity. 
These results may have crucial consequences for our understanding of herbivorous processes on
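
The overlap measure can be illustrated with a deliberately simplified, set-based proxy for the paper's multidimensional niche volumes (the actual analyses are built on individual-level microhabitat utilization distributions):

```python
# Simplified illustration (not the paper's method): treat each species' realized
# niche as the set of microhabitats used while foraging, and measure overlap as
# the shared fraction of the combined niche. Overlap below 50% is taken as
# functional complementarity rather than redundancy.
def niche_overlap(niche_a, niche_b):
    return len(niche_a & niche_b) / len(niche_a | niche_b)

def complementary(niche_a, niche_b, threshold=0.5):
    return niche_overlap(niche_a, niche_b) < threshold
```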

  20. Physical characterization of biomass-based pyrolysis liquids. Application of standard fuel oil analyses

    Energy Technology Data Exchange (ETDEWEB)

    Oasmaa, A; Leppaemaeki, E; Koponen, P; Levander, J; Tapola, E [VTT Energy, Espoo (Finland). Energy Production Technologies

    1998-12-31

    The main purpose of the study was to test the applicability of standard fuel oil methods, developed for petroleum-based fuels, to pyrolysis liquids. In addition, research on sampling, homogeneity, stability, miscibility and corrosivity was carried out. The standard methods have been tested for several different pyrolysis liquids. Recommendations on sampling, sample size and small modifications of standard methods are presented. In general, most of the methods can be used as such, but the accuracy of the analysis can be improved by minor modifications. Fuel oil analyses not suitable for pyrolysis liquids have been identified. Homogeneity of the liquids is the most critical factor in accurate analysis. The presence of air bubbles may disturb several analyses. Sample preheating and prefiltration should be avoided when possible. The former may cause changes in the composition and structure of the pyrolysis liquid; the latter may remove part of the organic material with the particles. The size of the sample should be determined on the basis of the homogeneity and the water content of the liquid. The basic analyses of the Technical Research Centre of Finland (VTT) include water, pH, solids, ash, Conradson carbon residue, heating value, CHN, density, viscosity, pour point, flash point, and stability. Additional analyses are carried out when needed. (orig.) 53 refs.

  1. Simulation-based Investigations of Electrostatic Beam Energy Analysers

    CERN Document Server

    Pahl, Hannes

    2015-01-01

    An energy analyser is needed to measure the beam energy profile behind the REX-EBIS at ISOLDE. The device should be able to operate with an accuracy of 1 V at voltages up to 30 kV. In order to find a working concept for an electrostatic energy analyser different designs were evaluated with simulations. A spherical device and its design issues are presented. The potential deformation effects of grids at high voltages and their influence on the energy resolution were investigated. First tests were made with a grid-free ring electrode device and show promising results.

  2. Analysing organic transistors based on interface approximation

    International Nuclear Information System (INIS)

    Akiyama, Yuto; Mori, Takehiko

    2014-01-01

    Temperature-dependent characteristics of organic transistors are analysed thoroughly using interface approximation. In contrast to amorphous silicon transistors, it is characteristic of organic transistors that the accumulation layer is concentrated on the first monolayer, and it is appropriate to consider interface charge rather than band bending. On the basis of this model, observed characteristics of hexamethylenetetrathiafulvalene (HMTTF) and dibenzotetrathiafulvalene (DBTTF) transistors with various surface treatments are analysed, and the trap distribution is extracted. In turn, starting from a simple exponential distribution, we can reproduce the temperature-dependent transistor characteristics as well as the gate voltage dependence of the activation energy, so we can investigate various aspects of organic transistors self-consistently under the interface approximation. Small deviation from such an ideal transistor operation is discussed assuming the presence of an energetically discrete trap level, which leads to a hump in the transfer characteristics. The contact resistance is estimated by measuring the transfer characteristics up to the linear region.

  3. The Seismic Reliability of Offshore Structures Based on Nonlinear Time History Analyses

    International Nuclear Information System (INIS)

    Hosseini, Mahmood; Karimiyani, Somayyeh; Ghafooripour, Amin; Jabbarzadeh, Mohammad Javad

    2008-01-01

    Regarding the damage caused by past earthquakes to offshore structures, which are vital structures in the oil and gas industries, it is important that their seismic design is performed with very high reliability. Accepting Nonlinear Time History Analysis (NLTHA) as the most reliable seismic analysis method, in this paper an offshore platform of jacket type with a height of 304 feet, having a deck of 96 feet by 94 feet, and weighing 290 million pounds has been studied. At first, some Push-Over Analyses (POA) were performed to recognize the more critical members of the jacket, based on the range of their plastic deformations. Then NLTHA were performed using the 3-component accelerograms of 100 earthquakes, covering a wide range of frequency content, and normalized to three Peak Ground Acceleration (PGA) levels of 0.3 g, 0.65 g, and 1.0 g. Using the results of NLTHA, the damage and rupture probabilities of critical members have been studied to assess the reliability of the jacket structure. Because different structural members of the jacket have different effects on the stability of the platform, an "importance factor" has been considered for each critical member based on its location and orientation in the structure, and the reliability of the whole structure has been obtained by combining the reliability of the critical members, each having its specific importance factor.
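
The final combination step can be sketched as follows; the abstract does not give the exact formula, so the weighted aggregation below (each member's reliability raised to its importance factor) is purely illustrative:

```python
# Hypothetical aggregation of per-member reliabilities from NLTHA into a single
# structure-level index. Importance factors in [0, 1] down-weight members whose
# failure matters less for platform stability; the formula is illustrative only.
def structure_reliability(member_reliability, importance):
    index = 1.0
    for member, reliability in member_reliability.items():
        index *= reliability ** importance[member]
    return index
```

With this choice, a member of importance 0 drops out of the index entirely, while a member of importance 1 contributes its full reliability.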

  4. Novel citation-based search method for scientific literature: application to meta-analyses.

    Science.gov (United States)

    Janssens, A Cecile J W; Gwinn, M

    2015-10-13

    Finding eligible studies for meta-analysis and systematic reviews relies on keyword-based searching as the gold standard, despite its inefficiency. Searching based on direct citations is not sufficiently comprehensive. We propose a novel strategy that ranks articles on their degree of co-citation with one or more "known" articles before reviewing their eligibility. In two independent studies, we aimed to reproduce the results of literature searches for sets of published meta-analyses (n = 10 and n = 42). For each meta-analysis, we extracted co-citations for the randomly selected 'known' articles from the Web of Science database, counted their frequencies and screened all articles with a score above a selection threshold. In the second study, we extended the method by retrieving direct citations for all selected articles. In the first study, we retrieved 82% of the studies included in the meta-analyses while screening only 11% as many articles as were screened for the original publications. Articles that we missed were published in non-English languages, published before 1975, published very recently, or available only as conference abstracts. In the second study, we retrieved 79% of included studies while screening half the original number of articles. Citation searching appears to be an efficient and reasonably accurate method for finding articles similar to one or more articles of interest for meta-analysis and reviews.
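
The co-citation ranking described above can be sketched as follows: score each candidate article by how often it appears alongside a "known" article in the reference list of the same citing paper, then screen everything above a threshold. The papers, reference lists, and threshold here are invented for illustration:

```python
from collections import Counter

def rank_by_cocitation(citing_papers, known_ids, threshold=2):
    """Rank candidate articles by their co-citation frequency with
    'known' articles.  citing_papers maps each citing paper to the
    set of articles it cites."""
    scores = Counter()
    for refs in citing_papers.values():
        if refs & known_ids:                  # paper cites a known article
            for article in refs - known_ids:  # credit each co-cited article
                scores[article] += 1
    # screen only articles whose score meets the selection threshold
    return [a for a, s in scores.most_common() if s >= threshold]

# toy example: four citing papers and their reference lists
citing = {
    "P1": {"known1", "A", "B"},
    "P2": {"known1", "A"},
    "P3": {"known2", "A", "C"},
    "P4": {"B", "C"},            # cites no known article -> ignored
}
print(rank_by_cocitation(citing, {"known1", "known2"}))  # -> ['A']
```

Lowering the threshold trades screening workload for recall, which mirrors the sensitivity/efficiency trade-off the authors report.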

  5. FY01 Supplemental Science and Performance Analysis: Volume 1, Scientific Bases and Analyses

    International Nuclear Information System (INIS)

    Bodvarsson, G.S.; Dobson, David

    2001-01-01

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S and ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S and ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S and ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054 [DIRS 124754

  6. FY01 Supplemental Science and Performance Analysis: Volume 1, Scientific Bases and Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Bodvarsson, G.S.; Dobson, David

    2001-05-30

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S&ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S&ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S&ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054 [DIRS 124754

  7. Analyser-based x-ray imaging for biomedical research

    International Nuclear Information System (INIS)

    Suortti, Pekka; Keyriläinen, Jani; Thomlinson, William

    2013-01-01

    Analyser-based imaging (ABI) is one of several phase-contrast x-ray imaging techniques being pursued at synchrotron radiation facilities. With advancements in compact source technology, there is a possibility that ABI will become a clinical imaging modality. This paper presents the history of ABI as it has developed from its laboratory origins to synchrotron imaging. The fundamental physics of phase-contrast imaging is presented both in a general sense and specifically for ABI. The technology is dependent on the use of perfect crystal monochromator optics. The theory of the x-ray optics is developed and presented in a way that will allow optimization of the imaging for specific biomedical systems. The advancement of analytical algorithms to produce separate images of the sample absorption, refraction angle map and small-angle x-ray scattering is detailed. Several detailed applications to biomedical imaging are presented to illustrate the broad range of systems and body sites studied preclinically to date: breast, cartilage and bone, soft tissue and organs. Ultimately, the application of ABI in clinical imaging will depend partly on the availability of compact sources with x-ray intensity comparable to that of the current synchrotron environment. (paper)
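
One of the analytical algorithms alluded to, the classic two-image DEI reconstruction of Chapman and co-workers, separates the apparent-absorption image from the refraction-angle map using the slope of the analyser rocking curve. A minimal sketch, ignoring small-angle scattering and using made-up rocking-curve values:

```python
def dei_reconstruct(I_lo, I_hi, R_lo, dR_lo, R_hi, dR_hi):
    """Two-image DEI/ABI reconstruction (after Chapman et al. 1997):
    images taken on the low/high-angle flanks of the rocking curve
    obey I = I_R * (R(theta) + R'(theta) * d_theta); solve the pair
    for apparent absorption I_R and refraction angle d_theta."""
    d_theta = (I_hi * R_lo - I_lo * R_hi) / (I_lo * dR_hi - I_hi * dR_lo)
    I_R = (I_lo * dR_hi - I_hi * dR_lo) / (R_lo * dR_hi - R_hi * dR_lo)
    return I_R, d_theta

# forward model at the half-reflectivity points of the rocking curve
R_lo, dR_lo = 0.5,  2.0e5       # reflectivity and slope, low-angle flank
R_hi, dR_hi = 0.5, -2.0e5       # high-angle flank (slope sign flips)
true_IR, true_dt = 2.0, 1.0e-6  # 1 microradian of refraction
I_lo = true_IR * (R_lo + dR_lo * true_dt)
I_hi = true_IR * (R_hi + dR_hi * true_dt)
I_R, d_theta = dei_reconstruct(I_lo, I_hi, R_lo, dR_lo, R_hi, dR_hi)
print(I_R, d_theta)  # recovers the true values 2.0 and 1e-06
```

In practice the same formulas are applied pixel-wise to two detector images, yielding the separate absorption and refraction images mentioned in the abstract.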

  8. Comparison based on energy and exergy analyses of the potential cogeneration efficiencies for fuel cells and other electricity generation devices

    Energy Technology Data Exchange (ETDEWEB)

    Rosen, M A [Ryerson Polytechnical Inst., Toronto, (CA). Dept. of Mechanical Engineering

    1990-01-01

    Comparisons of the potential cogeneration efficiencies are made, based on energy and exergy analyses, for several devices for electricity generation. The investigation considers several types of fuel cell system (Phosphoric Acid, Alkaline, Solid Polymer Electrolyte, Molten Carbonate and Solid Oxide), and several fossil-fuel and nuclear cogeneration systems based on steam power plants. In the analysis, each system is modelled as a device which fuel and air enter, and from which electrical- and thermal-energy products and material and thermal-energy wastes exit. The results for all systems considered indicate that exergy analyses should be used when analysing the cogeneration potential of systems for electricity generation, because they weigh the usefulness of heat and electricity on an equivalent basis. Energy analyses tend to present overly optimistic views of performance. These findings are particularly significant when large fractions of the heat output from a system are utilized for cogeneration. (author).
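
The heat-weighting argument can be made concrete: in an exergy balance, heat delivered at temperature T counts only for its Carnot factor (1 − T0/T), while electricity counts at full value, so the exergy efficiency of a cogenerator falls below its energy efficiency. A sketch with invented numbers, approximating fuel exergy by fuel energy:

```python
def cogeneration_efficiencies(W_elec, Q_heat, T_heat, fuel_energy, T0=298.15):
    """Energy vs exergy efficiency of a cogeneration device.
    Electricity is counted at full value in both balances; heat is
    worth only its Carnot factor (1 - T0/T_heat) in the exergy
    balance.  Fuel exergy is approximated by fuel energy here."""
    eta = (W_elec + Q_heat) / fuel_energy                          # energy
    psi = (W_elec + Q_heat * (1.0 - T0 / T_heat)) / fuel_energy    # exergy
    return eta, psi

# hypothetical cogenerator: 40 MW electricity plus 40 MW heat
# delivered at 400 K, from 100 MW of fuel input
eta, psi = cogeneration_efficiencies(40.0, 40.0, 400.0, 100.0)
print(round(eta, 3), round(psi, 3))  # -> 0.8 0.502
```

The gap between the two figures (80% vs roughly 50% here) illustrates why energy analyses look overly optimistic when a large fraction of the output is low-temperature heat.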

  9. CrusView: a Java-based visualization platform for comparative genomics analyses in Brassicaceae species.

    Science.gov (United States)

    Chen, Hao; Wang, Xiangfeng

    2013-09-01

    In plants and animals, chromosomal breakage and fusion events based on conserved syntenic genomic blocks lead to conserved patterns of karyotype evolution among species of the same family. However, karyotype information has not been well utilized in genomic comparison studies. We present CrusView, a Java-based bioinformatic application utilizing Standard Widget Toolkit/Swing graphics libraries and a SQLite database for performing visualized analyses of comparative genomics data in Brassicaceae (crucifer) plants. Compared with similar software and databases, one of the unique features of CrusView is its integration of karyotype information when comparing two genomes. This feature allows users to perform karyotype-based genome assembly and karyotype-assisted genome synteny analyses with preset karyotype patterns of the Brassicaceae genomes. Additionally, CrusView is a local program, which gives its users high flexibility when analyzing unpublished genomes and allows users to upload self-defined genomic information so that they can visually study the associations between genome structural variations and genetic elements, including chromosomal rearrangements, genomic macrosynteny, gene families, high-frequency recombination sites, and tandem and segmental duplications between related species. This tool will greatly facilitate karyotype, chromosome, and genome evolution studies using visualized comparative genomics approaches in Brassicaceae species. CrusView is freely available at http://www.cmbb.arizona.edu/CrusView/.

  10. Reviewing PSA-based analyses to modify technical specifications at nuclear power plants

    International Nuclear Information System (INIS)

    Samanta, P.K.; Martinez-Guridi, G.; Vesely, W.E.

    1995-12-01

    Changes to Technical Specifications (TSs) at nuclear power plants (NPPs) require review and approval by the United States Nuclear Regulatory Commission (USNRC). Currently, many requests for changes to TSs use analyses that are based on a plant's probabilistic safety assessment (PSA). This report presents an approach to reviewing such PSA-based submittals for changes to TSs. We discuss the basic objectives of reviewing a PSA-based submittal to modify NPP TSs; the methodology of reviewing a TS submittal, and the differing roles of a PSA review, a PSA Computer Code review, and a review of a TS submittal. To illustrate this approach, we discuss our review of changes to allowed outage time (AOT) and surveillance test interval (STI) in the TS for the South Texas Project Nuclear Generating Station. Based on this experience gained, a check-list of items is given for future reviewers; it can be used to verify that the submittal contains sufficient information, and also that the review has addressed the relevant issues. Finally, recommended steps in the review process and the expected findings of each step are discussed

  11. Ventilation/perfusion SPECT/CT in patients with pulmonary emphysema. Evaluation of software-based analysing.

    Science.gov (United States)

    Schreiter, V; Steffen, I; Huebner, H; Bredow, J; Heimann, U; Kroencke, T J; Poellinger, A; Doellinger, F; Buchert, R; Hamm, B; Brenner, W; Schreiter, N F

    2015-01-01

    The purpose of this study was to evaluate the reproducibility of a new software-based analysis system for ventilation/perfusion single-photon emission computed tomography/computed tomography (V/P SPECT/CT) in patients with pulmonary emphysema and to compare it with visual interpretation. 19 patients (mean age: 68.1 years) with pulmonary emphysema who underwent V/P SPECT/CT were included. Data were analysed by two independent observers using visual interpretation (VI) and a software-based analysis system (SBAS). For SBAS, PMOD version 3.4 (PMOD Technologies Ltd, Zurich, Switzerland) was used to assess counts and volume per lung lobe/per lung and to calculate the count density per lung lobe, the ratio of counts and the ratio of count density. VI was performed using a visual scale to assess the mean counts per lung lobe. Interobserver variability and association for SBAS and VI were analysed using Spearman's rho correlation coefficient. Interobserver agreement correlated highly in perfusion (rho: 0.982, 0.957, 0.90, 0.979) and ventilation (rho: 0.972, 0.924, 0.941, 0.936) for count/count density per lobe and ratio of counts/count density in SBAS. Interobserver agreement correlated clearly for perfusion (rho: 0.655) and weakly for ventilation (rho: 0.458) in VI. SBAS provides more reproducible measures than VI for the relative tracer uptake in V/P SPECT/CT in patients with pulmonary emphysema. However, SBAS has to be improved for routine clinical use.

  12. Simple Crosscutting Concerns Are Not So Simple : Analysing Variability in Large-Scale Idioms-Based Implementations

    NARCIS (Netherlands)

    Bruntink, M.; Van Deursen, A.; d’Hondt, M.; Tourwé, T.

    2007-01-01

    This paper describes a method for studying idioms-based implementations of crosscutting concerns, and our experiences with it in the context of a real-world, large-scale embedded software system. In particular, we analyse a seemingly simple concern, tracing, and show that it exhibits significant

  13. Comparative Analyses of Zebrafish Anxiety-Like Behavior Using Conflict-Based Novelty Tests.

    Science.gov (United States)

    Kysil, Elana V; Meshalkina, Darya A; Frick, Erin E; Echevarria, David J; Rosemberg, Denis B; Maximino, Caio; Lima, Monica Gomes; Abreu, Murilo S; Giacomini, Ana C; Barcellos, Leonardo J G; Song, Cai; Kalueff, Allan V

    2017-06-01

    Modeling of stress and anxiety in adult zebrafish (Danio rerio) is increasingly utilized in neuroscience research and central nervous system (CNS) drug discovery. Representing the most commonly used zebrafish anxiety models, the novel tank test (NTT) focuses on zebrafish diving in response to potentially threatening stimuli, whereas the light-dark test (LDT) is based on fish scototaxis (innate preference for dark vs. bright areas). Here, we systematically evaluate the utility of these two tests, combining meta-analyses of published literature with comparative in vivo behavioral and whole-body endocrine (cortisol) testing. Overall, the NTT and LDT behaviors demonstrate a generally good cross-test correlation in vivo, whereas meta-analyses of published literature show that both tests have similar sensitivity to zebrafish anxiety-like states. Finally, NTT evokes higher levels of cortisol, likely representing a more stressful procedure than LDT. Collectively, our study reappraises NTT and LDT for studying anxiety-like states in zebrafish, and emphasizes their developing utility for neurobehavioral research. These findings can help optimize drug screening procedures by choosing more appropriate models for testing anxiolytic or anxiogenic drugs.

  14. The effect of English-language restriction on systematic review-based meta-analyses: a systematic review of empirical studies.

    Science.gov (United States)

    Morrison, Andra; Polisena, Julie; Husereau, Don; Moulton, Kristen; Clark, Michelle; Fiander, Michelle; Mierzwinski-Urban, Monika; Clifford, Tammy; Hutton, Brian; Rabb, Danielle

    2012-04-01

    The English language is generally perceived to be the universal language of science. However, the exclusive reliance on English-language studies may not represent all of the evidence. Excluding languages other than English (LOE) may introduce a language bias and lead to erroneous conclusions. We conducted a comprehensive literature search using bibliographic databases and grey literature sources. Studies were eligible for inclusion if they measured the effect of excluding randomized controlled trials (RCTs) reported in LOE from systematic review-based meta-analyses (SR/MA) for one or more outcomes. None of the included studies found major differences between summary treatment effects in English-language restricted meta-analyses and LOE-inclusive meta-analyses. Findings differed about the methodological and reporting quality of trials reported in LOE. The precision of pooled estimates improved with the inclusion of LOE trials. Overall, we found no evidence of a systematic bias from the use of language restrictions in systematic review-based meta-analyses in conventional medicine. Further research is needed to determine the impact of language restriction on systematic reviews in particular fields of medicine.

  15. Trial sequential analyses of meta-analyses of complications in laparoscopic vs. small-incision cholecystectomy: more randomized patients are needed

    DEFF Research Database (Denmark)

    Keus, Frederik; Wetterslev, Jørn; Gluud, Christian

    2010-01-01

    Conclusions based on meta-analyses of randomized trials carry a status of "truth." Methodological components may identify trials with systematic errors ("bias"). Trial sequential analysis (TSA) evaluates random errors in meta-analysis. We analyzed meta-analyses on laparoscopic vs. small-incision cholecystectomy.

  16. Tracing common origins of Genomic Islands in prokaryotes based on genome signature analyses.

    Science.gov (United States)

    van Passel, Mark Wj

    2011-09-01

    Horizontal gene transfer constitutes a powerful and innovative force in evolution, but often little is known about the actual origins of transferred genes. Sequence alignments are generally of limited use in tracking the original donor, since still only a small fraction of the total genetic diversity is thought to be uncovered. Alternatively, approaches based on similarities in the genome specific relative oligonucleotide frequencies do not require alignments. Even though the exact origins of horizontally transferred genes may still not be established using these compositional analyses, it does suggest that compositionally very similar regions are likely to have had a common origin. These analyses have shown that up to a third of large acquired gene clusters that reside in the same genome are compositionally very similar, indicative of a shared origin. This brings us closer to uncovering the original donors of horizontally transferred genes, and could help in elucidating possible regulatory interactions between previously unlinked sequences.
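
A compositional comparison of this kind can be sketched simply: compute relative oligonucleotide (here tetranucleotide) frequency vectors for two sequences and compare them with a distance measure. The sequences and the mean-absolute-difference metric below are illustrative stand-ins for the genome-signature measures used in the literature:

```python
from itertools import product

def oligo_freqs(seq, k=4):
    """Relative k-mer (oligonucleotide) frequencies of a sequence,
    in a fixed lexicographic order over the ACGT alphabet."""
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    counts = {km: 0 for km in kmers}
    for i in range(len(seq) - k + 1):
        km = seq[i:i + k]
        if km in counts:
            counts[km] += 1
    total = max(sum(counts.values()), 1)
    return [counts[km] / total for km in kmers]

def signature_distance(seq_a, seq_b, k=4):
    """Mean absolute difference of two frequency vectors; small values
    suggest compositional similarity, hence possibly a shared origin
    of two genomic islands."""
    fa, fb = oligo_freqs(seq_a, k), oligo_freqs(seq_b, k)
    return sum(abs(a - b) for a, b in zip(fa, fb)) / len(fa)

island1 = "ATGCGC" * 200
island2 = "ATGCGC" * 150 + "ATGCGA" * 50   # compositionally close
foreign = "TTTTAA" * 200                   # very different signature
print(signature_distance(island1, island2) < signature_distance(island1, foreign))
```

Islands whose pairwise distance falls well below the genome-wide background would be flagged as candidates for a common donor.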

  17. Risk-based analyses in support of California hazardous site remediation

    International Nuclear Information System (INIS)

    Ringland, J.T.

    1995-08-01

    The California Environmental Enterprise (CEE) is a joint program of the Department of Energy (DOE), Lawrence Livermore National Laboratory, Lawrence Berkeley Laboratory, and Sandia National Laboratories. Its goal is to make DOE laboratory expertise accessible to hazardous site cleanups in the state. This support might involve working directly with parties responsible for individual cleanups, or it might involve working with the California Environmental Protection Agency to develop tools that would be applicable across a broad range of sites. As part of its initial year's activities, the CEE supported a review to examine where laboratory risk and risk-based systems analysis capabilities might be most effectively applied. From this review, the study draws the following observations. The labs have a clear role in analyses supporting the demonstration and transfer of laboratory characterization or remediation technologies. The labs may have opportunities in developing broadly applicable analysis tools and computer codes for problems such as site characterization or efficient management of resources. Analysis at individual sites, separate from supporting lab technologies or prototyping general tools, may be appropriate only in limited circumstances. In any of these roles, the labs' capabilities extend beyond health risk assessment to the broader areas of risk management and risk-based systems analysis

  18. CrusView: A Java-Based Visualization Platform for Comparative Genomics Analyses in Brassicaceae Species

    Science.gov (United States)

    Chen, Hao; Wang, Xiangfeng

    2013-01-01

    In plants and animals, chromosomal breakage and fusion events based on conserved syntenic genomic blocks lead to conserved patterns of karyotype evolution among species of the same family. However, karyotype information has not been well utilized in genomic comparison studies. We present CrusView, a Java-based bioinformatic application utilizing Standard Widget Toolkit/Swing graphics libraries and a SQLite database for performing visualized analyses of comparative genomics data in Brassicaceae (crucifer) plants. Compared with similar software and databases, one of the unique features of CrusView is its integration of karyotype information when comparing two genomes. This feature allows users to perform karyotype-based genome assembly and karyotype-assisted genome synteny analyses with preset karyotype patterns of the Brassicaceae genomes. Additionally, CrusView is a local program, which gives its users high flexibility when analyzing unpublished genomes and allows users to upload self-defined genomic information so that they can visually study the associations between genome structural variations and genetic elements, including chromosomal rearrangements, genomic macrosynteny, gene families, high-frequency recombination sites, and tandem and segmental duplications between related species. This tool will greatly facilitate karyotype, chromosome, and genome evolution studies using visualized comparative genomics approaches in Brassicaceae species. CrusView is freely available at http://www.cmbb.arizona.edu/CrusView/. PMID:23898041

  19. Deciphering chicken gut microbial dynamics based on high-throughput 16S rRNA metagenomics analyses.

    Science.gov (United States)

    Mohd Shaufi, Mohd Asrore; Sieo, Chin Chin; Chong, Chun Wie; Gan, Han Ming; Ho, Yin Wan

    2015-01-01

    Chicken gut microbiota has paramount roles in host performance, health and immunity. Understanding the topological differences in gut microbial community composition is crucial to provide knowledge on the functions of each member of the microbiota in the physiological maintenance of the host. Gut microbiota profiling of the chicken was previously performed using culture-dependent and early culture-independent methods, which had limited coverage and accuracy. Advances in technology based on next-generation sequencing (NGS) offer unparalleled coverage and depth in determining microbial gut dynamics. Thus, the aim of this study was to investigate ileal and caecal microbiota development as the chicken aged, which is important for future effective gut modulation. Ileal and caecal contents of broiler chicken were extracted from 7, 14, 21 and 42-day-old chicken. Genomic DNA was then extracted and amplified based on the V3 hyper-variable region of 16S rRNA. Bioinformatic, ecological and statistical analyses such as Principal Coordinate Analysis (PCoA) were performed in the mothur software and plotted using PRIMER 6. Additional analyses of predicted metagenomes were performed with the PICRUSt and STAMP software packages based on the Greengenes database. A distinctive difference in bacterial communities was observed between ilea and caeca as the chicken aged (P < 0.05). The microbial communities in the caeca were more diverse than the ileal communities. Potentially pathogenic bacteria such as Clostridium were elevated as the chicken aged, and the population of beneficial microbes such as Lactobacillus was low at all intervals. On the other hand, based on the predicted metagenomes analysed, clear distinctions in the functions and roles of the gut microbiota, such as gene pathways related to nutrient absorption (e.g. sugar and amino acid metabolism), and bacterial proliferation and colonization (e.g. bacterial motility proteins, two-component system and bacterial secretion system) were

  20. Cost and quality effectiveness of objective-based and statistically-based quality control for volatile organic compounds analyses of gases

    International Nuclear Information System (INIS)

    Bennett, J.T.; Crowder, C.A.; Connolly, M.J.

    1994-01-01

    Gas samples from drums of radioactive waste at the Department of Energy (DOE) Idaho National Engineering Laboratory are being characterized for 29 volatile organic compounds to determine the feasibility of storing the waste in DOE's Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. Quality requirements for the gas chromatography (GC) and GC/mass spectrometry chemical methods used to analyze the waste are specified in the Quality Assurance Program Plan for the WIPP Experimental Waste Characterization Program. Quality requirements consist of both objective criteria (data quality objectives, DQOs) and statistical criteria (process control). The DQOs apply to routine sample analyses, while the statistical criteria serve to determine and monitor precision and accuracy (P&A) of the analysis methods and are also used to assign upper confidence limits to measurement results close to action levels. After over two years and more than 1000 sample analyses, there are two general conclusions concerning the two approaches to quality control: (1) Objective criteria (e.g., ± 25% precision, ± 30% accuracy) based on customer needs and the usually prescribed criteria for similar EPA-approved methods are consistently attained during routine analyses. (2) Statistical criteria based on short-term method performance are almost an order of magnitude more stringent than objective criteria and are difficult to satisfy following the same routine laboratory procedures which satisfy the objective criteria. A more cost-effective and representative approach to establishing statistical method performance criteria would be either to utilize a moving average of P&A from control samples over a several-month time period, or to determine within-sample variation by one-way analysis of variance of several months' replicate sample analysis results, or both. Confidence intervals for results near action levels could also be determined by replicate analysis of the sample in
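
The suggested moving-average approach can be sketched simply: derive control limits from the mean and standard deviation of the most recent control-sample recoveries rather than from short-term method performance. The recovery data, window size, and 3-sigma limits below are hypothetical:

```python
import statistics

def moving_control_limits(recoveries, window=20, k=3.0):
    """Control limits from a moving window of control-sample
    percent-recovery results: mean +/- k * stdev over the last
    `window` points, echoing the report's suggestion to base
    statistical criteria on several months of control data."""
    recent = recoveries[-window:]
    mean = statistics.fmean(recent)
    sd = statistics.stdev(recent)
    return mean - k * sd, mean + k * sd

# hypothetical % recoveries of a VOC control sample over two months
recoveries = [98, 101, 97, 103, 99, 100, 102, 96, 104, 100,
              99, 101, 98, 102, 100, 97, 103, 99, 101, 100]
lo, hi = moving_control_limits(recoveries)
print(round(lo, 1), round(hi, 1))  # -> 93.5 106.5
```

Because the window spans months of data, the limits widen to reflect long-term variability instead of the unrealistically tight short-term precision the report criticizes.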

  1. PC based 8K multichannel analyser for nuclear spectroscopy

    International Nuclear Information System (INIS)

    Jain, S.K.; Gupta, J.D.; Suman Kumari, B.

    1989-01-01

    An IBM-PC based 8K multichannel analyser (MCA) has been developed which incorporates all the features of an advanced system, such as very high throughput for data acquisition in PHA as well as MCS modes, fast real-time display, extensive display manipulation facilities, various preset controls and concurrent data processing. The compact system hardware consists of a 2 bit wide NIM module and a PC add-on card. Because the acquisition hardware is external, the system, after initial programming by the PC, can acquire data independently, allowing the PC to be switched off. To attain very high throughput, the most desirable feature of an MCA, a dual-port memory architecture has been used. The asymmetric dual-port RAM, housed in the NIM module, offers 24 bit parallel access to the ADC and 8 bit wide access to the PC, which results in fast real-time histogram display on the monitor. The PC emulation software is menu-driven and user-friendly. It integrates a comprehensive set of commonly required application routines for concurrent data processing. After transfer of the know-how to the Electronics Corporation of India Ltd. (ECIL), this system is being produced at ECIL. (author). 5 refs., 4 figs

  2. Fracture analyses of WWER reactor pressure vessels

    International Nuclear Information System (INIS)

    Sievers, J.; Liu, X.

    1997-01-01

    In the paper first the methodology of fracture assessment based on finite element (FE) calculations is described and compared with simplified methods. The FE based methodology was verified by analyses of large scale thermal shock experiments in the framework of the international comparative study FALSIRE (Fracture Analyses of Large Scale Experiments) organized by GRS and ORNL. Furthermore, selected results from fracture analyses of different WWER type RPVs with postulated cracks under different loading transients are presented. 11 refs, 13 figs, 1 tab

  3. Fracture analyses of WWER reactor pressure vessels

    Energy Technology Data Exchange (ETDEWEB)

    Sievers, J; Liu, X [Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Koeln (Germany)

    1997-09-01

    In the paper first the methodology of fracture assessment based on finite element (FE) calculations is described and compared with simplified methods. The FE based methodology was verified by analyses of large scale thermal shock experiments in the framework of the international comparative study FALSIRE (Fracture Analyses of Large Scale Experiments) organized by GRS and ORNL. Furthermore, selected results from fracture analyses of different WWER type RPVs with postulated cracks under different loading transients are presented. 11 refs, 13 figs, 1 tab.

  4. Systematics of Plant-Pathogenic and Related Streptomyces Species Based on Phylogenetic Analyses of Multiple Gene Loci

    Science.gov (United States)

    The 10 species of Streptomyces implicated as the etiological agents in scab disease of potatoes or soft rot disease of sweet potatoes are distributed among 7 different phylogenetic clades in analyses based on 16S rRNA gene sequences, but high sequence similarity of this gene among Streptomyces speci...

  5. Design and development of microcontroller-based clinical chemistry analyser for measurement of various blood biochemistry parameters.

    Science.gov (United States)

    Taneja, S R; Gupta, R C; Kumar, Jagdish; Thariyan, K K; Verma, Sanjeev

    2005-01-01

    The clinical chemistry analyser is a high-performance microcontroller-based photometric biochemical analyser that measures various blood biochemistry parameters such as blood glucose, urea, protein, bilirubin, and so forth, and also measures and monitors the enzyme activity that develops while performing other biochemical tests such as ALT (alanine aminotransferase), amylase, AST (aspartate aminotransferase), and so forth. These tests are of great significance in biochemistry and are used for diagnostic purposes and for classifying various disorders and diseases such as diabetes, liver malfunction, renal diseases, and so forth. An inexpensive clinical chemistry analyser developed by the authors is described in this paper. It is an open system in which any reagent kit available in the market can be used. The system is based on the principle of absorbance-transmittance photometry. The system design is built around an 80C31 microcontroller with RAM, EPROM, and peripheral interface devices. The developed system incorporates a light source, an optical module, interference filters of various wavelengths, a Peltier device for maintaining the required temperature of the mixture in the flow cell, a peristaltic pump for sample aspiration, a graphic LCD for displaying blood parameters, patients' test results and kinetic test graphs, a 40-column mini thermal printer, and a 32-key keyboard for executing various functions. Laboratory tests conducted on the instrument assessed the versatility of the analyser, the flexibility of the software, and the treatment of samples. The prototype was tested and evaluated successfully on over 1000 blood samples for seventeen blood parameters. Evaluation was carried out at the Department of Biochemistry, Government Medical College and Hospital. The test results were found to be comparable with those of other standard instruments.
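
For an end-point assay, the absorbance-transmittance principle reduces to the Beer-Lambert law with single-point calibration against a standard of known concentration. A sketch with invented intensities (the glucose values and single-point calibration scheme are illustrative, not taken from the paper):

```python
import math

def absorbance(I_sample, I_blank):
    """Absorbance from transmitted intensities (Beer-Lambert law):
    A = log10(I_blank / I_sample)."""
    return math.log10(I_blank / I_sample)

def concentration(I_sample, I_blank, I_standard, c_standard):
    """End-point photometric assay: sample concentration is the
    standard concentration scaled by the ratio of absorbances, the
    usual single-point calibration in photometric analysers."""
    return c_standard * absorbance(I_sample, I_blank) / absorbance(I_standard, I_blank)

# hypothetical glucose assay: blank cuvette, 100 mg/dL standard, sample
print(round(concentration(I_sample=640.0, I_blank=1000.0,
                          I_standard=800.0, c_standard=100.0), 1))  # -> 200.0
```

Kinetic (enzyme) tests extend the same measurement by tracking absorbance change per minute rather than a single end-point reading.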

  6. A New Optimization Method for Centrifugal Compressors Based on 1D Calculations and Analyses

    Directory of Open Access Journals (Sweden)

    Pei-Yuan Li

    2015-05-01

This paper presents an optimization design method for centrifugal compressors based on one-dimensional calculations and analyses. It consists of two parts: (1) centrifugal compressor geometry optimization based on one-dimensional calculations and (2) matching optimization of the vaned diffuser with an impeller based on the required throat area. A low-pressure-stage centrifugal compressor in a MW-level gas turbine is optimized by this method. One-dimensional calculation results show that D3/D2 is too large in the original design, resulting in the low efficiency of the entire stage. Based on the one-dimensional optimization results, the geometry of the diffuser has been redesigned: the outlet diameter of the vaneless diffuser has been reduced, and the original single-stage diffuser has been replaced by a tandem vaned diffuser. After optimization, the entire stage pressure ratio is increased by approximately 4%, and the efficiency is increased by approximately 2%.

  7. Expanding the view into complex material systems: From micro-ARPES to nanoscale HAXPES

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, C.M. [Peter Gruenberg Institute (PGI-6) and JARA-FIT, Research Center Juelich, D-52425 Juelich (Germany); Fakultaet f. Physik and Center for Nanointegration Duisburg-Essen (CENIDE), Universitaet Duisburg-Essen, D-47048 Duisburg (Germany); Wiemann, C.; Patt, M. [Peter Gruenberg Institute (PGI-6) and JARA-FIT, Research Center Juelich, D-52425 Juelich (Germany); Feyer, V. [Peter Gruenberg Institute (PGI-6) and JARA-FIT, Research Center Juelich, D-52425 Juelich (Germany); Sincrotrone Trieste S.C.p.A., S.S. 14, km 163.5 in Area Science Park, 34012 Basovizza, Trieste (Italy); Plucinski, L.; Krug, I.P. [Peter Gruenberg Institute (PGI-6) and JARA-FIT, Research Center Juelich, D-52425 Juelich (Germany); Escher, M.; Weber, N.; Merkel, M. [FOCUS GmbH, D-65510 Huenstetten (Germany); Renault, O. [CEA, LETI, MINATEC Campus, 17 rue des Martyrs, 38054 Grenoble Cedex 9 (France); Barrett, N. [DSM/IRAMIS/SPCSI, CEA-Saclay, F-91191 Gif sur Yvette Cedex (France)

    2012-10-15

The analysis of chemical and electronic states in complex and nanostructured material systems requires electron spectroscopy to be carried out with nanometer lateral resolution, i.e. nanospectroscopy. This goal can be achieved by combining a parallel-imaging photoelectron emission microscope with a bandpass energy filter. In this contribution we describe selected experiments employing a dedicated spectromicroscope, the NanoESCA. This instrument places particular emphasis on the spectroscopic aspects and enables laterally resolved photoelectron spectroscopy from the VUV up into the hard X-ray regime.

  8. Devising a New Model of Demand-Based Learning Integrated with Social Networks and Analyses of its Performance

    Directory of Open Access Journals (Sweden)

    Bekim Fetaji

    2018-02-01

The focus of this research study is to devise a new model for demand-based learning integrated with social networks such as Facebook and Twitter. The study investigates this by reviewing the published literature and carrying out a case study analysis of the new model's practical implementation. It focuses on analyzing demand-based learning and investigating how it can be improved by devising a specific model that incorporates social network use. Statistical analyses of the questionnaire results, addressing the research questions and hypotheses, showed that there is a need to introduce new models into the teaching process. The originality lies in the introduction of the social login approach to an educational environment; this approach is counted as a contribution towards developing a demand-based web application that aims to modernize the educational pattern of communication, introduce the social login approach, increase knowledge transfer, and improve learners' performance and skills. Insights and recommendations are provided, argued, and discussed.

  9. Progress Report on Computational Analyses of Water-Based NSTF

    Energy Technology Data Exchange (ETDEWEB)

    Lv, Q. [Argonne National Lab. (ANL), Argonne, IL (United States); Kraus, A. [Argonne National Lab. (ANL), Argonne, IL (United States); Hu, R. [Argonne National Lab. (ANL), Argonne, IL (United States); Bucknor, M. [Argonne National Lab. (ANL), Argonne, IL (United States); Lisowski, D. [Argonne National Lab. (ANL), Argonne, IL (United States); Nunez, D. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-08-01

CFD analysis has focused on important component-level phenomena, using STAR-CCM+ to supplement the system-level analysis of integral system behavior. A notable area of interest was the cavity region, which is particularly suited to CFD analysis due to its multi-dimensional flow and complex heat transfer (thermal radiation and natural convection), neither of which is simulated directly by RELAP5. CFD simulations allow for the estimation of the boundary heat flux distribution along the riser tubes, which is needed in the RELAP5 simulations. The CFD results can also provide additional data to help establish what level of modeling detail is necessary in RELAP5. It was found that the flow profiles in the cavity region are simpler for the water-based concept than for the air-cooled concept. The local heat flux increases noticeably in the axial direction and is higher in the fins than in the riser tubes. These results were used in RELAP5 simulations as boundary conditions to provide better temperature predictions in the system-level analyses. It was also determined that temperatures were higher in the fins than in the riser tubes, but within design limits for thermal stresses. Higher temperatures were predicted in the edge fins, in part due to additional thermal radiation from the side cavity walls.

  10. Molecular genotyping of Colletotrichum species based on arbitrarily primed PCR, A + T-Rich DNA, and nuclear DNA analyses

    Science.gov (United States)

    Freeman, S.; Pham, M.; Rodriguez, R.J.

    1993-01-01

Molecular genotyping of Colletotrichum species based on arbitrarily primed PCR, A + T-rich DNA, and nuclear DNA analyses. Experimental Mycology 17, 309-322. Isolates of Colletotrichum were grouped into 10 separate species based on arbitrarily primed PCR (ap-PCR), A + T-rich DNA (AT-DNA), and nuclear DNA banding patterns. In general, the grouping of Colletotrichum isolates by these molecular approaches corresponded to that obtained by classical taxonomic identification; however, some exceptions were observed. PCR amplification of genomic DNA using four different primers allowed reliable differentiation between isolates of the 10 species. HaeIII digestion patterns of AT-DNA also distinguished between species of Colletotrichum by generating species-specific band patterns. In addition, hybridization of the repetitive DNA element (GcpR1) to genomic DNA identified a unique set of PstI-digested nuclear DNA fragments in each of the 10 species of Colletotrichum tested. Multiple isolates of C. acutatum, C. coccodes, C. fragariae, C. lindemuthianum, C. magna, C. orbiculare, C. graminicola from maize, and C. graminicola from sorghum showed 86-100% intraspecies similarity based on ap-PCR and AT-DNA analyses. Interspecies similarity determined by ap-PCR and AT-DNA analyses varied between 0 and 33%. Three distinct banding patterns were detected in isolates of C. gloeosporioides from strawberry. Similarly, three different banding patterns were observed among isolates of C. musae from diseased banana.
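Band-pattern similarity percentages of this kind are commonly derived from the fraction of bands two lanes share. A minimal sketch using the Dice band-sharing coefficient (the fragment sizes below and the choice of coefficient are illustrative assumptions, not the paper's exact method):

```python
def dice_similarity(bands_a: set, bands_b: set) -> float:
    """Dice band-sharing coefficient: 2 * shared / (nA + nB)."""
    shared = len(bands_a & bands_b)
    return 2 * shared / (len(bands_a) + len(bands_b))

# hypothetical fragment sizes (bp) scored as present for two isolates
a = {250, 400, 620, 900, 1300}
b = {250, 400, 620, 1100}
s = dice_similarity(a, b)   # 2*3 / (5+4) ≈ 0.67, i.e. ~67% similarity
```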

  11. Secondary Data Analyses of Subjective Outcome Evaluation Data Based on Nine Databases

    Directory of Open Access Journals (Sweden)

    Daniel T. L. Shek

    2012-01-01

The purpose of this study was to evaluate the effectiveness of the Tier 1 Program of the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) in Hong Kong by analyzing 1,327 school-based program reports submitted by program implementers. In each report, program implementers were invited to write down five conclusions based on an integration of the subjective outcome evaluation data collected from the program participants and program implementers. Secondary data analyses were carried out by aggregating nine databases, with 14,390 meaningful units extracted from 6,618 conclusions. Results showed that most of the conclusions were positive in nature. The findings generally showed that the workers perceived the program and program implementers positively, and they also pointed out that the program could promote holistic development of the program participants in societal, familial, interpersonal, and personal domains. However, difficulties encountered during program implementation (2.15%) and recommendations for improvement (16.26%) were also reported. In conjunction with the evaluation findings based on other strategies, the present study suggests that the Tier 1 Program of the Project P.A.T.H.S. is beneficial to the holistic development of the program participants.

  12. MULTI-DIMENSIONAL MASS SPECTROMETRY-BASED SHOTGUN LIPIDOMICS AND NOVEL STRATEGIES FOR LIPIDOMIC ANALYSES

    Science.gov (United States)

    Han, Xianlin; Yang, Kui; Gross, Richard W.

    2011-01-01

    Since our last comprehensive review on multi-dimensional mass spectrometry-based shotgun lipidomics (Mass Spectrom. Rev. 24 (2005), 367), many new developments in the field of lipidomics have occurred. These developments include new strategies and refinements for shotgun lipidomic approaches that use direct infusion, including novel fragmentation strategies, identification of multiple new informative dimensions for mass spectrometric interrogation, and the development of new bioinformatic approaches for enhanced identification and quantitation of the individual molecular constituents that comprise each cell’s lipidome. Concurrently, advances in liquid chromatography-based platforms and novel strategies for quantitative matrix-assisted laser desorption/ionization mass spectrometry for lipidomic analyses have been developed. Through the synergistic use of this repertoire of new mass spectrometric approaches, the power and scope of lipidomics has been greatly expanded to accelerate progress toward the comprehensive understanding of the pleiotropic roles of lipids in biological systems. PMID:21755525

  13. Airway management education: simulation based training versus non-simulation based training-A systematic review and meta-analyses.

    Science.gov (United States)

    Sun, Yanxia; Pan, Chuxiong; Li, Tianzuo; Gan, Tong J

    2017-02-01

Simulation-based training (SBT) has become a standard for medical education. However, the efficacy of simulation-based training in airway management education remains unclear. The aim of this study was to evaluate all published evidence comparing the effectiveness of SBT for airway management with non-simulation-based training (NSBT) on learner and patient outcomes. A systematic review with meta-analyses was used. Data were derived from PubMed, EMBASE, CINAHL, Scopus, the Cochrane Controlled Trials Register and the Cochrane Database of Systematic Reviews from inception to May 2016. Published comparative trials that evaluated the effect of SBT compared with NSBT on airway management training were considered. Effect sizes with 95% confidence intervals (CI) were calculated for the outcome measures. Seventeen eligible studies were included. SBT was associated with improved behavioral performance [standardized mean difference (SMD): 0.30, 95% CI: 0.06 to 0.54] in comparison with NSBT. However, the benefits of SBT were not seen in time-skill (SMD: -0.13, 95% CI: -0.82 to 0.52), written examination score (SMD: 0.39, 95% CI: -0.09 to 0.86) or success rate of procedure completion on patients [relative risk (RR): 1.26, 95% CI: 0.96 to 1.66]. SBT may not be superior to NSBT for airway management training.
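A standardized mean difference of the kind pooled above is computed per study from the two groups' means and standard deviations. A hedged sketch (plain Cohen's d with a normal-approximation CI; the scores are made up, and the review's exact estimator may differ, e.g. Hedges' g with a small-sample correction):

```python
import math

def smd_with_ci(m1, sd1, n1, m2, sd2, n2):
    """Cohen's d from two group summaries, with a normal-approximation 95% CI."""
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                                    # standardized mean difference
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d, (d - 1.96 * se, d + 1.96 * se)

# hypothetical airway-checklist scores: SBT group vs NSBT group
d, (lo, hi) = smd_with_ci(m1=82.0, sd1=10.0, n1=30, m2=78.0, sd2=10.0, n2=30)
# d = 0.40, but the CI crosses zero, so this single (made-up) study is inconclusive
```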

  14. Analyses of criticality and reactivity for TRACY experiments based on JENDL-3.3 data library

    International Nuclear Information System (INIS)

    Sono, Hiroki; Miyoshi, Yoshinori; Nakajima, Ken

    2003-01-01

The parameters on criticality and reactivity employed for computational simulations of the TRACY supercritical experiments were analyzed using a recently revised nuclear data library, JENDL-3.3. The parameters based on the JENDL-3.3 library were compared to those based on two previously used libraries, JENDL-3.2 and ENDF/B-VI. The computational codes MVP, MCNP version 4C, and TWOTRAN were used in the analyses. The following conclusions were obtained: (1) The computational biases of the effective neutron multiplication factor attributable to the nuclear data libraries and to the computational codes do not depend on the TRACY experimental conditions, such as fuel conditions. (2) The fractional discrepancies in the kinetic parameters and reactivity coefficients are within ∼5% between the three libraries. Comparison between calculated and measured parameters indicates that the JENDL-3.3 library gives values closer to the measurements than the JENDL-3.2 and ENDF/B-VI libraries. (3) While the reactivity worth of the transient rods expressed in the $ unit shows a ∼5% discrepancy between the three libraries, according to their respective β_eff values, there is little discrepancy when it is expressed in the Δk/k unit. (author)
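The unit dependence in conclusion (3) follows directly from the definition of the dollar: the same Δk/k reactivity maps to different dollar values under each library's β_eff. A minimal illustration (the k_eff and β_eff numbers below are hypothetical, not TRACY values):

```python
def reactivity_dk_over_k(k_eff: float) -> float:
    """Reactivity in absolute units: rho = (k - 1) / k."""
    return (k_eff - 1.0) / k_eff

def reactivity_dollars(rho: float, beta_eff: float) -> float:
    """Reactivity in dollars: rho($) = rho / beta_eff."""
    return rho / beta_eff

rho = reactivity_dk_over_k(1.0060)        # ≈ 0.00596 Δk/k, identical for all libraries
d1 = reactivity_dollars(rho, 0.0072)      # ≈ 0.83 $ with one library's beta_eff
d2 = reactivity_dollars(rho, 0.0069)      # ≈ 0.86 $ with another: ~4% apart in $ alone
```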

  15. Cost-Benefit Analyses of Transportation Investments

    DEFF Research Database (Denmark)

    Næss, Petter

    2006-01-01

    This paper discusses the practice of cost-benefit analyses of transportation infrastructure investment projects from the meta-theoretical perspective of critical realism. Such analyses are based on a number of untenable ontological assumptions about social value, human nature and the natural......-to-pay investigations. Accepting the ontological and epistemological assumptions of cost-benefit analysis involves an implicit acceptance of the ethical and political values favoured by these assumptions. Cost-benefit analyses of transportation investment projects tend to neglect long-term environmental consequences...

  16. Graphic-based musculoskeletal model for biomechanical analyses and animation.

    Science.gov (United States)

    Chao, Edmund Y S

    2003-04-01

    The ability to combine physiology and engineering analyses with computer sciences has opened the door to the possibility of creating the 'Virtual Human' reality. This paper presents a broad foundation for a full-featured biomechanical simulator for the human musculoskeletal system physiology. This simulation technology unites the expertise in biomechanical analysis and graphic modeling to investigate joint and connective tissue mechanics at the structural level and to visualize the results in both static and animated forms together with the model. Adaptable anatomical models including prosthetic implants and fracture fixation devices and a robust computational infrastructure for static, kinematic, kinetic, and stress analyses under varying boundary and loading conditions are incorporated on a common platform, the VIMS (Virtual Interactive Musculoskeletal System). Within this software system, a manageable database containing long bone dimensions, connective tissue material properties and a library of skeletal joint system functional activities and loading conditions are also available and they can easily be modified, updated and expanded. Application software is also available to allow end-users to perform biomechanical analyses interactively. This paper details the design, capabilities, and features of the VIMS development at Johns Hopkins University, an effort possible only through academic and commercial collaborations. Examples using these models and the computational algorithms in a virtual laboratory environment are used to demonstrate the utility of this unique database and simulation technology. This integrated system will impact on medical education, basic research, device development and application, and clinical patient care related to musculoskeletal diseases, trauma, and rehabilitation.

  17. Laser Beam Focus Analyser

    DEFF Research Database (Denmark)

    Nielsen, Peter Carøe; Hansen, Hans Nørgaard; Olsen, Flemming Ove

    2007-01-01

The quantitative and qualitative description of laser beam characteristics is important for process implementation and optimisation. In particular, a need for quantitative characterisation of beam diameter was identified when using fibre lasers for micro manufacturing. Here the beam diameter limits the obtainable features in direct laser machining as well as heat affected zones in welding processes. This paper describes the development of a measuring unit capable of analysing beam shape and diameter of lasers to be used in manufacturing processes. The analyser is based on the principle of a rotating mechanical wire being swept through the laser beam at varying Z-heights. The reflected signal is analysed and the resulting beam profile determined. The development comprised the design of a flexible fixture capable of providing both rotation and Z-axis movement, control software including data capture...
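A beam diameter can be recovered from a scanned profile of this kind. A sketch using the generic second-moment (D4σ) definition on a synthetic Gaussian profile (this estimator and the numbers are illustrative assumptions, not necessarily the signal processing used in the instrument):

```python
import math

def beam_diameter_d4sigma(x, p):
    """D4-sigma (second-moment) beam diameter from a sampled 1-D profile.
    x: positions, p: measured signal (assumed background-subtracted)."""
    total = sum(p)
    mean = sum(xi * pi for xi, pi in zip(x, p)) / total
    var = sum((xi - mean) ** 2 * pi for xi, pi in zip(x, p)) / total
    return 4.0 * math.sqrt(var)

# synthetic Gaussian profile with 1/e^2 radius w = 50 um (so D4sigma = 2w = 100 um)
xs = [i * 2.0 - 200.0 for i in range(201)]            # scan positions, -200..200 um
ps = [math.exp(-2 * (xi / 50.0) ** 2) for xi in xs]   # ideal detector signal
diameter = beam_diameter_d4sigma(xs, ps)              # ≈ 100 um
```

Repeating the estimate at each Z-height yields the caustic from which the focus position follows.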

  18. Safety assessment of historical masonry churches based on pre-assigned kinematic limit analysis, FE limit and pushover analyses

    International Nuclear Information System (INIS)

    Milani, Gabriele; Valente, Marco

    2014-01-01

This study presents some results of a comprehensive numerical analysis of three masonry churches damaged by the Emilia-Romagna (Italy) seismic events of May 2012. The numerical study comprises: (a) pushover analyses conducted with a commercial code, standard nonlinear material models, and two different horizontal load distributions; (b) FE kinematic limit analyses performed using a non-commercial software package based on a preliminary homogenization of the masonry materials and a subsequent limit analysis with triangular elements and interfaces; (c) kinematic limit analyses conducted in agreement with the Italian code and based on the a-priori assumption of pre-assigned failure mechanisms, where the masonry material is considered unable to withstand tensile stresses. All models are capable of giving information on the active failure mechanism and the base shear at failure, which, if properly made non-dimensional with the weight of the structure, also gives an indication of the horizontal peak ground acceleration causing the collapse of the church. The results obtained from all three models indicate that collapse is usually due to the activation of partial mechanisms (apse, façade, lateral walls, etc.). Moreover, the horizontal peak ground acceleration associated with collapse is considerably lower than that required in that seismic zone by the Italian code for ordinary buildings. These outcomes highlight that structural upgrading interventions would be extremely beneficial in reducing the seismic vulnerability of this kind of historical structure
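The non-dimensionalization mentioned in the abstract is simply the ratio of base shear at failure to structure weight, read as a fraction of gravity. A one-line sketch with hypothetical numbers:

```python
G = 9.81  # m/s^2

def collapse_pga(base_shear_kN: float, weight_kN: float) -> float:
    """Collapse acceleration a = (V / W) * g from the non-dimensional base shear."""
    return base_shear_kN / weight_kN * G

# e.g. a hypothetical mechanism activating at 12% of the structure's weight
a_c = collapse_pga(base_shear_kN=1200.0, weight_kN=10000.0)   # ≈ 1.18 m/s^2 (0.12 g)
```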

  19. Safety assessment of historical masonry churches based on pre-assigned kinematic limit analysis, FE limit and pushover analyses

    Energy Technology Data Exchange (ETDEWEB)

    Milani, Gabriele, E-mail: milani@stru.polimi.it; Valente, Marco, E-mail: milani@stru.polimi.it [Department of Architecture, Built Environment and Construction Engineering (ABC), Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milan (Italy)

    2014-10-06

This study presents some results of a comprehensive numerical analysis of three masonry churches damaged by the Emilia-Romagna (Italy) seismic events of May 2012. The numerical study comprises: (a) pushover analyses conducted with a commercial code, standard nonlinear material models, and two different horizontal load distributions; (b) FE kinematic limit analyses performed using a non-commercial software package based on a preliminary homogenization of the masonry materials and a subsequent limit analysis with triangular elements and interfaces; (c) kinematic limit analyses conducted in agreement with the Italian code and based on the a-priori assumption of pre-assigned failure mechanisms, where the masonry material is considered unable to withstand tensile stresses. All models are capable of giving information on the active failure mechanism and the base shear at failure, which, if properly made non-dimensional with the weight of the structure, also gives an indication of the horizontal peak ground acceleration causing the collapse of the church. The results obtained from all three models indicate that collapse is usually due to the activation of partial mechanisms (apse, façade, lateral walls, etc.). Moreover, the horizontal peak ground acceleration associated with collapse is considerably lower than that required in that seismic zone by the Italian code for ordinary buildings. These outcomes highlight that structural upgrading interventions would be extremely beneficial in reducing the seismic vulnerability of this kind of historical structure.

  20. Techniques for Scaling Up Analyses Based on Pre-interpretations

    DEFF Research Database (Denmark)

    Gallagher, John Patrick; Henriksen, Kim Steen; Banda, Gourinath

    2005-01-01

    a variety of analyses, both generic (such as mode analysis) and program-specific (with respect to a type describing some particular property of interest). Previous work demonstrated the approach using pre-interpretations over small domains. In this paper we present techniques that allow the method...

  1. Fast multichannel analyser

    Energy Technology Data Exchange (ETDEWEB)

    Berry, A; Przybylski, M M; Sumner, I [Science Research Council, Daresbury (UK). Daresbury Lab.

    1982-10-01

A fast multichannel analyser (MCA) capable of sampling at a rate of 10⁷ s⁻¹ has been developed. The instrument is based on an 8-bit parallel-encoding analogue-to-digital converter (ADC) reading into a fast histogramming random access memory (RAM) system, giving 256 channels of 64k count capacity. The prototype unit is in CAMAC format.
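The histogramming scheme of such an MCA can be modelled in software. This is an illustrative model only (the real instrument performs one RAM increment per ADC conversion in hardware; the saturation behaviour below is an assumption):

```python
def histogram_mca(samples, n_channels=256, capacity=64 * 1024):
    """Accumulate 8-bit ADC samples into an n_channels histogram,
    saturating each channel at its count capacity (256 x 64k here)."""
    hist = [0] * n_channels
    for s in samples:
        ch = s & 0xFF              # 8-bit ADC code selects the channel
        if hist[ch] < capacity:    # stop at capacity rather than wrap
            hist[ch] += 1
    return hist

h = histogram_mca([0, 255, 255, 17, 17, 17])
# h[17] == 3, h[255] == 2, h[0] == 1
```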

  2. In situ analyses of Ag speciation in tissues of cucumber and wheat using synchrotron-based X-ray absorption spectroscopy

    Data.gov (United States)

    U.S. Environmental Protection Agency — In situ analyses of Ag speciation in tissues of cucumber and wheat using synchrotron-based X-ray absorption spectroscopy showing spectral fitting and linear...

  3. Grey literature in meta-analyses.

    Science.gov (United States)

    Conn, Vicki S; Valentine, Jeffrey C; Cooper, Harris M; Rantz, Marilyn J

    2003-01-01

In meta-analysis, researchers combine the results of individual studies to arrive at cumulative conclusions. Meta-analysts sometimes include "grey literature" in their evidential base, which includes unpublished studies and studies published outside widely available journals. Because grey literature is a source of data that might not employ peer review, critics have questioned the validity of its data and the results of meta-analyses that include it. The purpose of this review is to examine evidence regarding whether grey literature should be included in meta-analyses, and strategies to manage grey literature in quantitative synthesis. This article reviews evidence on whether the results of studies published in peer-reviewed journals are representative of results from broader samplings of research on a topic, as a rationale for the inclusion of grey literature. Strategies to enhance access to grey literature are addressed. The most consistent and robust difference between published and grey literature is that published research is more likely to contain results that are statistically significant. Effect size estimates of published research are about one-third larger than those of unpublished studies. Unfunded and small-sample studies are less likely to be published. Importantly, however, methodological rigor does not differ between published and grey literature. Meta-analyses that exclude grey literature likely (a) over-represent studies with statistically significant findings, (b) inflate effect size estimates, and (c) provide less precise effect size estimates than meta-analyses including grey literature. Meta-analyses should include grey literature to fully reflect the existing evidential base and should assess the impact of methodological variations through moderator analysis.

  4. Quantitative Prediction of Coalbed Gas Content Based on Seismic Multiple-Attribute Analyses

    Directory of Open Access Journals (Sweden)

    Renfang Pan

    2015-09-01

Accurate prediction of the planar distribution of gas is crucial to the selection and development of new CBM exploration areas. Based on seismic attributes, well logging, and test data, we found that seismic absorption attenuation, after eliminating the effects of burial depth, shows an evident correlation with CBM gas content; positive structure curvature has a negative correlation with gas content; and density has a negative correlation with gas content. It is feasible to use the hydrocarbon index (P*G) and pseudo-Poisson ratio attributes for the detection of gas enrichment zones. Based on seismic multiple-attribute analyses, a multiple linear regression equation was established between the seismic attributes and the gas content at the drilling wells. Applying this equation to the seismic attributes at locations other than the drilling wells yielded a quantitative prediction of the planar gas distribution. Prediction calculations were performed for two different models, one using pre-stack inversion and the other disregarding it. A comparison of the results indicates that both models predicted a similar trend for the gas content distribution, except that the model using pre-stack inversion yielded a prediction with considerably higher precision.
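A regression of the kind described, fitted at the wells and then applied to attributes away from them, can be sketched with ordinary least squares. All attribute values, units, and the well count below are hypothetical, chosen only to mirror the correlations reported (gas content rising with attenuation, falling with curvature and density):

```python
import numpy as np

# rows: drilling wells; columns: attribute values at each well
X = np.array([[0.82, 1.4, 2.41],    # absorption attenuation, curvature, density
              [0.65, 2.1, 2.55],
              [0.91, 0.9, 2.38],
              [0.55, 2.6, 2.60],
              [0.74, 1.8, 2.47]])
y = np.array([14.2, 9.8, 16.1, 7.5, 11.9])   # gas content measured at the wells (m^3/t)

A = np.column_stack([np.ones(len(y)), X])    # prepend an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None) # least-squares regression coefficients

def predict(attrs):
    """Apply the fitted equation to seismic attributes away from the wells."""
    return coef[0] + np.dot(coef[1:], attrs)

print(predict([0.70, 1.9, 2.50]))            # predicted gas content at a new location
```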

  5. A Web-based Tool Combining Different Type Analyses

    DEFF Research Database (Denmark)

    Henriksen, Kim Steen; Gallagher, John Patrick

    2006-01-01

    of both, and they can be goal-dependent or goal-independent. We describe a prototype tool that can be accessed from a web browser, allowing various type analyses to be run. The first goal of the tool is to allow the analysis results to be examined conveniently by clicking on points in the original program...... the minimal "domain model" of the program with respect to the corresponding pre-interpretation, which can give more precise information than the original descriptive type....

  6. Fossil-based comparative analyses reveal ancient marine ancestry erased by extinction in ray-finned fishes.

    Science.gov (United States)

    Betancur-R, Ricardo; Ortí, Guillermo; Pyron, Robert Alexander

    2015-05-01

The marine-freshwater boundary is a major biodiversity gradient and few groups have colonised both systems successfully. Fishes have transitioned between habitats repeatedly, diversifying in rivers, lakes and oceans over evolutionary time. However, their history of habitat colonisation and diversification is unclear based on available fossil and phylogenetic data. We estimate ancestral habitats and diversification and transition rates using a large-scale phylogeny of extant fish taxa and one containing a massive number of extinct species. Extant-only phylogenetic analyses indicate freshwater ancestry, but inclusion of fossils reveals strong evidence of marine ancestry in lineages now restricted to freshwaters. Diversification and colonisation dynamics vary asymmetrically between habitats, as marine lineages colonise and flourish in rivers more frequently than the reverse. Our study highlights the importance of including fossils in comparative analyses, showing that freshwaters have played a role as refuges for ancient fish lineages, a signal erased by extinction in extant-only phylogenies. © 2015 John Wiley & Sons Ltd/CNRS.

  7. Integrated approach for fusion multi-physics coupled analyses based on hybrid CAD and mesh geometries

    Energy Technology Data Exchange (ETDEWEB)

    Qiu, Yuefeng, E-mail: yuefeng.qiu@kit.edu; Lu, Lei; Fischer, Ulrich

    2015-10-15

Highlights: • Integrated approach for neutronics, thermal and structural analyses was developed. • MCNP5/6, TRIPOLI-4 were coupled with CFX, Fluent and ANSYS Workbench. • A novel meshing approach has been proposed for describing MC geometry. - Abstract: Coupled multi-physics analyses of fusion reactor devices require high-fidelity neutronic models and flexible, accurate data exchange between the various calculation codes. An integrated coupling approach has been developed to enable the conversion of CAD, mesh, or hybrid geometries for the Monte Carlo (MC) codes MCNP5/6 and TRIPOLI-4, and the translation of nuclear heating data for the CFD codes Fluent and CFX and the structural mechanics software ANSYS Workbench. The coupling approach has been implemented on the SALOME platform, with CAD modeling, mesh generation, and data visualization capabilities. A novel meshing approach has been developed for generating meshes suitable for MC geometry descriptions. Verification calculations on several application cases showed the coupling approach to be reliable and efficient.

  8. A Systematic Review of Cardiovascular Outcomes-Based Cost-Effectiveness Analyses of Lipid-Lowering Therapies.

    Science.gov (United States)

    Wei, Ching-Yun; Quek, Ruben G W; Villa, Guillermo; Gandra, Shravanthi R; Forbes, Carol A; Ryder, Steve; Armstrong, Nigel; Deshpande, Sohan; Duffy, Steven; Kleijnen, Jos; Lindgren, Peter

    2017-03-01

    Previous reviews have evaluated economic analyses of lipid-lowering therapies using lipid levels as surrogate markers for cardiovascular disease. However, drug approval and health technology assessment agencies have stressed that surrogates should only be used in the absence of clinical endpoints. The aim of this systematic review was to identify and summarise the methodologies, weaknesses and strengths of economic models based on atherosclerotic cardiovascular disease event rates. Cost-effectiveness evaluations of lipid-lowering therapies using cardiovascular event rates in adults with hyperlipidaemia were sought in Medline, Embase, Medline In-Process, PubMed and NHS EED and conference proceedings. Search results were independently screened, extracted and quality checked by two reviewers. Searches until February 2016 retrieved 3443 records, from which 26 studies (29 publications) were selected. Twenty-two studies evaluated secondary prevention (four also assessed primary prevention), two considered only primary prevention and two included mixed primary and secondary prevention populations. Most studies (18) based treatment-effect estimates on single trials, although more recent evaluations deployed meta-analyses (5/10 over the last 10 years). Markov models (14 studies) were most commonly used and only one study employed discrete event simulation. Models varied particularly in terms of health states and treatment-effect duration. No studies used a systematic review to obtain utilities. Most studies took a healthcare perspective (21/26) and sourced resource use from key trials instead of local data. Overall, reporting quality was suboptimal. This review reveals methodological changes over time, but reporting weaknesses remain, particularly with respect to transparency of model reporting.

  9. A multi-criteria evaluation system for marine litter pollution based on statistical analyses of OSPAR beach litter monitoring time series.

    Science.gov (United States)

    Schulz, Marcus; Neumann, Daniel; Fleet, David M; Matthies, Michael

    2013-12-01

During the last decades, marine pollution with anthropogenic litter has become a major worldwide environmental concern. Standardized monitoring of litter since 2001 on 78 beaches selected within the framework of the Convention for the Protection of the Marine Environment of the North-East Atlantic (OSPAR) has been used to identify temporal trends of marine litter. Based on statistical analyses of this dataset, a two-part multi-criteria evaluation system for beach litter pollution of the North-East Atlantic and the North Sea is proposed. Canonical correlation analyses, linear regression analyses, and non-parametric analyses of variance were used to identify different temporal trends. A classification of beaches was derived from cluster analyses and served to define different states of beach quality according to abundances of 17 input variables. The evaluation system is easily applicable and relies on the above-mentioned classification and on significant temporal trends implied by significant rank correlations. Copyright © 2013 Elsevier Ltd. All rights reserved.
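The trend component of the evaluation system rests on significant rank correlations between litter abundance and time. A toy illustration of a Spearman rank correlation between year and yearly litter counts (beach data invented for the example, not OSPAR values):

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman rank correlation, used here as a simple trend test:
    correlate yearly litter counts against time. Assumes no ties."""
    rx = np.argsort(np.argsort(x)).astype(float)  # ranks of x
    ry = np.argsort(np.argsort(y)).astype(float)  # ranks of y
    rx -= rx.mean()
    ry -= ry.mean()
    return float(rx @ ry / np.sqrt((rx @ rx) * (ry @ ry)))

# hypothetical yearly litter totals for one beach, 2001-2012
years = np.arange(2001, 2013)
counts = np.array([120, 115, 130, 142, 150, 149, 160, 171, 165, 180, 190, 188])
rho = spearman_rho(years, counts)  # strongly positive -> increasing trend
```

A value of rho near +1 would, after a significance test, flag the beach as having an increasing litter trend.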

  10. A Game-based Corpus for Analysing the Interplay between Game Context and Player Experience

    DEFF Research Database (Denmark)

    Shaker, Noor; Yannakakis, Georgios N.; Asteriadis, Stylianos

    2011-01-01

present different types of information that have been extracted from game context, player preferences and perception of the game, as well as user features, automatically extracted from video recordings. We run a number of initial experiments to analyse players' behavior while playing video games as a case......Recognizing players' affective state while playing video games has been the focus of many recent research studies. In this paper we describe the process that has been followed to build a corpus based on game events and recorded video sessions from human players while playing Super Mario Bros. We...

  11. IMPROVING CONTROL ROOM DESIGN AND OPERATIONS BASED ON HUMAN FACTORS ANALYSES OR HOW MUCH HUMAN FACTORS UPGRADE IS ENOUGH ?

    Energy Technology Data Exchange (ETDEWEB)

    HIGGINS,J.C.; OHARA,J.M.; ALMEIDA,P.

    2002-09-19

The Jose Cabrera nuclear power plant is a one-loop Westinghouse pressurized water reactor. In the control room, the displays and controls used by operators for the emergency operating procedures are distributed on front and back panels. This configuration contributed to risk in the probabilistic safety assessment where important operator actions are required. This study was undertaken to evaluate the impact of the design on crew performance and plant safety and to develop design improvements. Five potential effects were identified. Then NUREG-0711 [1] programmatic human factors analyses were conducted to systematically evaluate the CR layout to determine if there was evidence of the potential effects. These analyses included operating experience review, PSA review, task analyses, and walkthrough simulations. Based on the results of these analyses, a variety of control room modifications were identified. From the alternatives, a selection was made that provided a reasonable balance between performance, risk and economics, and modifications were made to the plant.

  12. Quantitative X-ray Map Analyser (Q-XRMA): A new GIS-based statistical approach to Mineral Image Analysis

    Science.gov (United States)

    Ortolano, Gaetano; Visalli, Roberto; Godard, Gaston; Cirrincione, Rosolino

    2018-06-01

    We present a new ArcGIS®-based tool developed in the Python programming language for calibrating EDS/WDS X-ray element maps, with the aim of acquiring quantitative information of petrological interest. The calibration procedure is based on a multiple linear regression technique that takes into account interdependence among elements and is constrained by the stoichiometry of minerals. The procedure requires an appropriate number of spot analyses for use as internal standards and provides several test indexes for a rapid check of calibration accuracy. The code is based on an earlier image-processing tool designed primarily for classifying minerals in X-ray element maps; the original Python code has now been enhanced to yield calibrated maps of mineral end-members or the chemical parameters of each classified mineral. The semi-automated procedure can be used to extract a dataset that is automatically stored within queryable tables. As a case study, the software was applied to an amphibolite-facies garnet-bearing micaschist. The calibrated images obtained for both anhydrous (i.e., garnet and plagioclase) and hydrous (i.e., biotite) phases show a good fit with corresponding electron microprobe analyses. This new GIS-based tool package can thus find useful application in petrology and materials science research. Moreover, the huge quantity of data extracted opens new opportunities for the development of a thin-section microchemical database that, using a GIS platform, can be linked with other major global geoscience databases.
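The calibration step described above, multiple linear regression from raw element counts to concentrations using spot analyses as internal standards, can be sketched as follows. All counts and coefficients below are synthetic, not Q-XRMA data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical internal standards: raw X-ray counts for 3 elements at 20
# spots, plus the "true" wt% of one element from microprobe spot analyses.
counts = rng.uniform(100, 1000, size=(20, 3))
true_coeffs = np.array([0.02, -0.003, 0.001])   # interdependence among elements
wt_percent = counts @ true_coeffs + 1.5          # 1.5 = background/intercept

# Multiple linear regression: counts of all elements -> wt% of one element.
# Using all elements as predictors captures the interdependence noted above.
X = np.column_stack([counts, np.ones(len(counts))])
coeffs, *_ = np.linalg.lstsq(X, wt_percent, rcond=None)

# Apply the fitted calibration to a whole map (flattened pixels x elements)
map_counts = rng.uniform(100, 1000, size=(5000, 3))
calibrated_map = np.column_stack([map_counts, np.ones(len(map_counts))]) @ coeffs
```

The residuals of the fit at the standard spots would serve as the kind of accuracy check the tool reports.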

  13. Estimation of effective block conductivities based on discrete network analyses using data from the Aespoe site

    International Nuclear Information System (INIS)

    La Pointe, P.R.; Wallmann, P.; Follin, S.

    1995-09-01

    Numerical continuum codes may be used for assessing the role of regional groundwater flow in far-field safety analyses of a nuclear waste repository at depth. The focus of this project is to develop and evaluate one method based on Discrete Fracture Network (DFN) models to estimate block-scale permeability values for continuum codes. Data from the Aespoe HRL and surrounding area are used. 57 refs, 76 figs, 15 tabs

  14. Complementary Exploratory and Confirmatory Factor Analyses of the French WISC-V: Analyses Based on the Standardization Sample.

    Science.gov (United States)

    Lecerf, Thierry; Canivez, Gary L

    2017-12-28

Interpretation of the French Wechsler Intelligence Scale for Children-Fifth Edition (French WISC-V; Wechsler, 2016a) is based on a 5-factor model including Verbal Comprehension (VC), Visual Spatial (VS), Fluid Reasoning (FR), Working Memory (WM), and Processing Speed (PS). Evidence for the French WISC-V factorial structure was established exclusively through confirmatory factor analyses (CFAs). However, as recommended by Carroll (1995), Reise (2012), and Brown (2015), factorial structure should derive from both exploratory factor analysis (EFA) and CFA. The first goal of this study was to examine the factorial structure of the French WISC-V using EFA. The 15 French WISC-V primary and secondary subtest scaled scores intercorrelation matrix was used and factor extraction criteria suggested from 1 to 4 factors. To disentangle the contribution of first- and second-order factors, the Schmid and Leiman (1957) orthogonalization transformation (SLT) was applied. Overall, no EFA evidence for 5 factors was found. Results indicated that the g factor accounted for about 67% of the common variance and that the contributions of the first-order factors were weak (3.6 to 11.9%). CFA was used to test numerous alternative models. Results indicated that bifactor models produced better fit to these data than higher-order models. Consistent with previous studies, findings suggested dominance of the general intelligence factor and that users should thus emphasize the Full Scale IQ (FSIQ) when interpreting the French WISC-V. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
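The Schmid and Leiman (1957) orthogonalization used above has a compact matrix form: each subtest's general-factor loading is its first-order loading times the factor's second-order loading on g, and its residualized group loading is scaled by the second-order uniqueness. A sketch with made-up loadings (not the French WISC-V values):

```python
import numpy as np

# Hypothetical first-order pattern loadings: 8 subtests on 2 group factors.
F = np.array([
    [0.8, 0.0], [0.7, 0.0], [0.6, 0.0], [0.7, 0.0],
    [0.0, 0.8], [0.0, 0.7], [0.0, 0.6], [0.0, 0.5],
])
g2 = np.array([0.9, 0.8])  # second-order loadings of the group factors on g

# Schmid-Leiman orthogonalization:
# loading on g            = first-order loading * second-order loading
# residual group loading  = first-order loading * sqrt(1 - second-order^2)
g_loadings = F @ g2
group_loadings = F * np.sqrt(1.0 - g2**2)

# communalities are preserved, so we can apportion common variance to g
common = g_loadings**2 + (group_loadings**2).sum(axis=1)
omega_like = g_loadings.dot(g_loadings) / common.sum()  # share explained by g
```

With these illustrative loadings, g accounts for roughly 73% of the common variance, the same kind of figure the study reports (67%) for the real data.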

  15. Stress and deflection analyses of floating roofs based on a load-modifying method

    International Nuclear Information System (INIS)

    Sun Xiushan; Liu Yinghua; Wang Jianbin; Cen Zhangzhi

    2008-01-01

    This paper proposes a load-modifying method for the stress and deflection analyses of floating roofs used in cylindrical oil storage tanks. The formulations of loads and deformations are derived according to the equilibrium analysis of floating roofs. Based on these formulations, the load-modifying method is developed to conduct a geometrically nonlinear analysis of floating roofs with the finite element (FE) simulation. In the procedure with the load-modifying method, the analysis is carried out through a series of iterative computations until a convergence is achieved within the error tolerance. Numerical examples are given to demonstrate the validity and reliability of the proposed method, which provides an effective and practical numerical solution to the design and analysis of floating roofs
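The load-modifying iteration can be caricatured with a single degree of freedom: re-evaluate the load from the current deflection, re-solve, and repeat until successive deflections agree within tolerance. All quantities below are hypothetical stand-ins for the finite element solve, not values from the paper:

```python
def load_modified_deflection(f0, k, c, tol=1e-9, max_iter=200):
    """One-degree-of-freedom caricature of the load-modifying method:
    the supporting liquid pressure depends on the roof deflection, so the
    applied load is recomputed from the current deflection and the
    analysis repeated until convergence within the error tolerance.

    f0: deadweight load, k: stiffness, c: load reduction per unit
    deflection (all in hypothetical consistent units).
    """
    w = 0.0
    for i in range(max_iter):
        load = f0 - c * w          # modify the load using current deflection
        w_new = load / k           # linear-elastic solve (stand-in for FE)
        if abs(w_new - w) < tol:   # convergence check
            return w_new, i + 1
        w = w_new
    raise RuntimeError("load-modifying iteration did not converge")

w, iters = load_modified_deflection(f0=1000.0, k=50.0, c=10.0)
# fixed point: w = f0 / (k + c), reached in a handful of iterations
```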

  16. An Apple II -based bidimensional pulse height analyser

    International Nuclear Information System (INIS)

    Bateman, J.E.; Flesher, A.C.; Honeyman, R.N.; Pritchard, T.E.; Price, W.P.R.

    1984-06-01

The implementation of a pulse height analyser function in an Apple II microcomputer using minimal purpose-built hardware is described. Except for a small interface module, the system consists of two suites of software, one giving a conventional one-dimensional analysis on a span of 1024 channels, and the other a two-dimensional analysis on a 128 x 128 image format. Using the recently introduced ACCELERATOR coprocessor card, the system performs with a dead time per event of less than 50 μs. Full software facilities are provided for display, storage and processing of the data using standard Applesoft BASIC. (author)

  17. Parent-based adolescent sexual health interventions and effect on communication outcomes: a systematic review and meta-analyses.

    Science.gov (United States)

    Santa Maria, Diane; Markham, Christine; Bluethmann, Shirley; Mullen, Patricia Dolan

    2015-03-01

    Parent-based adolescent sexual health interventions aim to reduce sexual risk behaviors by bolstering parental protective behaviors. Few studies of theory use, methods, applications, delivery and outcomes of parent-based interventions have been conducted. A systematic search of databases for the period 1998-2013 identified 28 published trials of U.S. parent-based interventions to examine theory use, setting, reach, delivery mode, dose and effects on parent-child communication. Established coding schemes were used to assess use of theory and describe methods employed to achieve behavioral change; intervention effects were explored in meta-analyses. Most interventions were conducted with minority parents in group sessions or via self-paced activities; interventions averaged seven hours, and most used theory extensively. Meta-analyses found improvements in sexual health communication: Analysis of 11 controlled trials indicated a medium effect on increasing communication (Cohen's d, 0.5), and analysis of nine trials found a large effect on increasing parental comfort with communication (0.7); effects were positive regardless of delivery mode or intervention dose. Intervention participants were 68% more likely than controls to report increased communication and 75% more likely to report increased comfort. These findings point to gaps in the range of programs examined in published trials-for example, interventions for parents of sexual minority youth, programs for custodial grandparents and faith-based services. Yet they provide support for the effectiveness of parent-based interventions in improving communication. Innovative delivery approaches could extend programs' reach, and further research on sexual health outcomes would facilitate the meta-analysis of intervention effectiveness in improving adolescent sexual health behaviors. Copyright © 2015 by the Guttmacher Institute.
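The pooled effects reported above come from standard meta-analytic machinery. A minimal fixed-effect inverse-variance pooling of Cohen's d values, with trial effect sizes and variances invented for the example (not the 11 reviewed trials):

```python
import numpy as np

def pooled_effect(d, var):
    """Fixed-effect inverse-variance pooled effect size with a 95% CI.
    Each study is weighted by the inverse of its sampling variance."""
    w = 1.0 / np.asarray(var)
    d_pooled = float(np.sum(w * d) / np.sum(w))
    se = float(1.0 / np.sqrt(np.sum(w)))
    return d_pooled, (d_pooled - 1.96 * se, d_pooled + 1.96 * se)

d = np.array([0.42, 0.55, 0.61, 0.48])     # per-trial standardized mean differences
var = np.array([0.02, 0.03, 0.05, 0.02])   # per-trial sampling variances
d_pooled, ci = pooled_effect(d, var)
```

A pooled d near 0.5 is conventionally read as a medium effect, the same interpretation the review gives its communication outcome.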

  18. DESIGNING EAP MATERIALS BASED ON INTERCULTURAL CORPUS ANALYSES: THE CASE OF LOGICAL MARKERS IN RESEARCH ARTICLES

    Directory of Open Access Journals (Sweden)

    Pilar Mur Dueñas

    2009-10-01

    Full Text Available The ultimate aim of intercultural analyses in English for Academic Purposes is to help non-native scholars function successfully in the international disciplinary community in English. The aim of this paper is to show how corpus-based intercultural analyses can be useful to design EAP materials on a particular metadiscourse category, logical markers, in research article writing. The paper first describes the analysis carried out of additive, contrastive and consecutive logical markers in a corpus of research articles in English and in Spanish in a particular discipline, Business Management. Differences were found in their frequency and also in the use of each of the sub-categories. Then, five activities designed on the basis of these results are presented. They are aimed at raising Spanish Business scholars' awareness of the specific uses and pragmatic function of frequent logical markers in international research articles in English.

  19. Bench top and portable mineral analysers, borehole core analysers and in situ borehole logging

    International Nuclear Information System (INIS)

    Howarth, W.J.; Watt, J.S.

    1982-01-01

    Bench top and portable mineral analysers are usually based on balanced filter techniques using scintillation detectors or on low resolution proportional detectors. The application of radioisotope x-ray techniques to in situ borehole logging is increasing, and is particularly suited for logging for tin and higher atomic number elements

  20. A fast multichannel analyser

    International Nuclear Information System (INIS)

    Berry, A.; Przybylski, M.M.; Sumner, I.

    1982-01-01

A fast multichannel analyser (MCA) capable of sampling at a rate of 10⁷ s⁻¹ has been developed. The instrument is based on an 8-bit parallel-encoding analogue-to-digital converter (ADC) reading into a fast histogramming random access memory (RAM) system, giving 256 channels of 64 k count capacity. The prototype unit is in CAMAC format. (orig.)
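The analyser's core operation, histogramming 8-bit ADC samples into 256 channels, can be sketched in a few lines (simulated pulse data, not measurements from the instrument):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated 8-bit ADC samples: a Gaussian photopeak on a flat background,
# clipped to the 256-channel range of an 8-bit converter.
samples = np.concatenate([
    rng.normal(128, 6, size=20000),      # photopeak near channel 128
    rng.uniform(0, 256, size=5000),      # flat background
]).astype(int).clip(0, 255)

# The histogramming RAM: one counter per channel, incremented per event
spectrum = np.bincount(samples, minlength=256)

peak_channel = int(spectrum.argmax())    # locate the photopeak
```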

  1. Performance Analyses of Renewable and Fuel Power Supply Systems for Different Base Station Sites

    Directory of Open Access Journals (Sweden)

    Josip Lorincz

    2014-11-01

Full Text Available Base station sites (BSSs) powered with renewable energy sources have gained the attention of cellular operators during the last few years. This is because such “green” BSSs yield significant reductions in the operational expenditures (OPEX) of telecom operators due to the possibility of on-site renewable energy harvesting. In this paper, the green BSS power supply system parameters detected through remote and centralized real-time sensing are presented. An implemented sensing system based on a wireless sensor network enables reliable collection and post-processing analyses of many parameters, such as: total charging/discharging current of the power supply system, battery voltage and temperature, wind speed, etc. As an example, yearly sensing results for three different BSS configurations powered by solar and/or wind energy are discussed in terms of renewable energy supply (RES) system performance. In the case of powering those BSSs with standalone systems based on a fuel generator, fuel consumption models expressing the interdependence between generator load and fuel consumption are proposed. This has allowed energy-efficiency comparison of the fuel-powered and RES systems, which is presented in terms of the OPEX and carbon dioxide (CO2) reductions. Additionally, approaches based on different BSS air-conditioning systems and the on/off regulation of daily fuel generator activity are proposed and validated in terms of energy and capital expenditure (CAPEX) savings.
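A linear relation between generator load and fuel consumption is the usual form of such models. The sketch below uses generic coefficients of the kind often quoted for diesel generators in the literature, not the coefficients fitted in the paper:

```python
def fuel_per_hour(p_load, p_rated, a=0.246, b=0.084):
    """Linear diesel-generator fuel model (litres per hour):
    F = a * P_load + b * P_rated.
    The coefficients a and b are generic illustrative values in
    litres/kWh, not the ones fitted from the sensing data."""
    return a * p_load + b * p_rated

# yearly fuel OPEX for a hypothetical 10 kW-rated generator at 4 kW load
hours_per_year = 8760
fuel_price = 1.5                                    # currency units per litre
litres = fuel_per_hour(p_load=4.0, p_rated=10.0) * hours_per_year
opex_fuel = litres * fuel_price
```

Note the b * P_rated term: a lightly loaded generator still burns fuel, which is why on/off regulation of daily generator activity saves OPEX.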

  2. Comprehensive logic based analyses of Toll-like receptor 4 signal transduction pathway.

    Directory of Open Access Journals (Sweden)

    Mahesh Kumar Padwal

Full Text Available Among the 13 TLRs in the vertebrate systems, only TLR4 utilizes both Myeloid differentiation factor 88 (MyD88) and Toll/Interleukin-1 receptor (TIR) domain-containing adapter interferon-β-inducing factor (TRIF) adaptors to transduce signals triggering host-protective immune responses. Earlier studies on the pathway combined various experimental data in the form of one comprehensive map of TLR signaling. But in the absence of adequate kinetic parameters, quantitative mathematical models that reveal emerging systems-level properties and dynamic inter-regulation among the kinases/phosphatases of the TLR4 network are not yet available. So, here we used reaction stoichiometry-based and parameter-independent logical modeling formalism to build the TLR4 signaling network model that captured the feedback regulations, interdependencies between signaling kinases and phosphatases and the outcome of simulated infections. The analyses of the TLR4 signaling network revealed 360 feedback loops, 157 negative and 203 positive; of which, 334 loops had the phosphatase PP1 as an essential component. The network elements' interdependency (positive or negative dependencies) in perturbation conditions such as the phosphatase knockout conditions revealed interdependencies between the dual-specific phosphatases MKP-1 and MKP-3 and the kinases in MAPK modules and the role of PP2A in the auto-regulation of Calmodulin kinase-II. Our simulations under the specific kinase or phosphatase gene-deficiency or inhibition conditions corroborated with several previously reported experimental data. The simulations to mimic Yersinia pestis and E. coli infections identified the key perturbation in the network and potential drug targets. Thus, our analyses of TLR4 signaling highlights the role of phosphatases as key regulatory factors in determining the global interdependencies among the network elements; uncovers novel signaling connections; identifies potential drug targets for
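Counting positive and negative feedback loops reduces to enumerating simple cycles in a signed directed graph and multiplying the edge signs along each cycle. A sketch on a three-node toy network (illustrative wiring only, not the actual TLR4 reaction map):

```python
def signed_cycles(edges, n):
    """Enumerate simple cycles in a small signed digraph and classify each
    feedback loop as positive or negative by the product of edge signs.
    Each cycle is found once, rooted at its smallest node index."""
    adj = {u: [] for u in range(n)}
    for u, v, s in edges:
        adj[u].append((v, s))

    cycles = []

    def dfs(start, node, sign, path):
        for nxt, s in adj[node]:
            if nxt == start:
                cycles.append((tuple(path), sign * s))   # closed a loop
            elif nxt > start and nxt not in path:
                dfs(start, nxt, sign * s, path + [nxt])

    for start in range(n):
        dfs(start, start, 1, [start])
    pos = sum(1 for _, s in cycles if s > 0)
    neg = sum(1 for _, s in cycles if s < 0)
    return pos, neg

# 0 = receptor, 1 = kinase, 2 = phosphatase (hypothetical wiring):
# receptor activates kinase, kinase activates phosphatase, phosphatase
# inhibits kinase (negative loop); kinase self-activates (positive loop).
edges = [(0, 1, +1), (1, 2, +1), (2, 1, -1), (1, 1, +1)]
pos, neg = signed_cycles(edges, 3)
```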

  3. A Microsoft-Excel-based tool for running and critically appraising network meta-analyses--an overview and application of NetMetaXL.

    Science.gov (United States)

    Brown, Stephen; Hutton, Brian; Clifford, Tammy; Coyle, Doug; Grima, Daniel; Wells, George; Cameron, Chris

    2014-09-29

The use of network meta-analysis has increased dramatically in recent years. WinBUGS, a freely available Bayesian software package, has been the most widely used software package to conduct network meta-analyses. However, the learning curve for WinBUGS can be daunting, especially for new users. Furthermore, critical appraisal of network meta-analyses conducted in WinBUGS can be challenging given its limited data manipulation capabilities and the fact that generation of graphical output from network meta-analyses often relies on different software packages than the analyses themselves. We developed a freely available Microsoft-Excel-based tool called NetMetaXL, programmed in Visual Basic for Applications, which provides an interface for conducting a Bayesian network meta-analysis using WinBUGS from within Microsoft Excel. This tool allows the user to easily prepare and enter data, set model assumptions, and run the network meta-analysis, with results being automatically displayed in an Excel spreadsheet. It also contains macros that use NetMetaXL's interface to generate evidence network diagrams, forest plots, league tables of pairwise comparisons, probability plots (rankograms), and inconsistency plots within Microsoft Excel. All figures generated are publication quality, thereby increasing the efficiency of knowledge transfer and manuscript preparation. We demonstrate the application of NetMetaXL using data from a network meta-analysis published previously which compares combined resynchronization and implantable defibrillator therapy in left ventricular dysfunction. We replicate results from the previous publication while demonstrating result summaries generated by the software. Use of the freely available NetMetaXL successfully demonstrated its ability to make running network meta-analyses more accessible to novice WinBUGS users by allowing analyses to be conducted entirely within Microsoft Excel. NetMetaXL also allows for more efficient and transparent

  4. Analyses of Crime Patterns in NIBRS Data Based on a Novel Graph Theory Clustering Method: Virginia as a Case Study

    Directory of Open Access Journals (Sweden)

    Peixin Zhao

    2014-01-01

    Full Text Available This paper suggests a novel clustering method for analyzing the National Incident-Based Reporting System (NIBRS data, which include the determination of correlation of different crime types, the development of a likelihood index for crimes to occur in a jurisdiction, and the clustering of jurisdictions based on crime type. The method was tested by using the 2005 assault data from 121 jurisdictions in Virginia as a test case. The analyses of these data show that some different crime types are correlated and some different crime parameters are correlated with different crime types. The analyses also show that certain jurisdictions within Virginia share certain crime patterns. This information assists with constructing a pattern for a specific crime type and can be used to determine whether a jurisdiction may be more likely to see this type of crime occur in their area.
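The correlation part of the analysis can be illustrated with a synthetic jurisdiction-by-crime-type count table (the NIBRS data themselves are not reproduced here, and the z-score index below is only a crude stand-in for the paper's likelihood index):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical incident counts: rows = 40 jurisdictions, cols = 3 crime
# types; types 0 and 1 are constructed to co-vary, type 2 is independent.
base = rng.poisson(50, size=40).astype(float)
counts = np.column_stack([
    base + rng.normal(0, 3, 40),             # crime type 0
    0.8 * base + rng.normal(0, 3, 40),       # crime type 1, correlated with 0
    rng.poisson(30, size=40).astype(float),  # crime type 2, independent
])

corr = np.corrcoef(counts, rowvar=False)     # crime-type correlation matrix

# crude likelihood index: average z-score of each jurisdiction's counts
z = (counts - counts.mean(axis=0)) / counts.std(axis=0)
likelihood_index = z.mean(axis=1)
```

High entries in `corr` flag crime types that tend to occur together, and jurisdictions with a high index would be candidates for pattern-based clustering.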

  5. Finite strain analyses of deformations in polymer specimens

    DEFF Research Database (Denmark)

    Tvergaard, Viggo

    2016-01-01

    Analyses of the stress and strain state in test specimens or structural components made of polymer are discussed. This includes the Izod impact test, based on full 3D transient analyses. Also a long thin polymer tube under internal pressure has been studied, where instabilities develop, such as b...

  6. Genome based analyses of six hexacorallian species reject the “naked coral” hypothesis

    KAUST Repository

Wang, Xin; Drillon, Guénola; Ryu, Taewoo; Voolstra, Christian R.; Aranda, Manuel

    2017-01-01

    Scleractinian corals are the foundation species of the coral-reef ecosystem. Their calcium carbonate skeletons form extensive structures that are home to millions of species, making coral reefs one of the most diverse ecosystems of our planet. However, our understanding of how reef-building corals have evolved the ability to calcify and become the ecosystem builders they are today is hampered by uncertain relationships within their subclass Hexacorallia. Corallimorpharians have been proposed to originate from a complex scleractinian ancestor that lost the ability to calcify in response to increasing ocean acidification, suggesting the possibility for corals to lose and gain the ability to calcify in response to increasing ocean acidification. Here we employed a phylogenomic approach using whole-genome data from six hexacorallian species to resolve the evolutionary relationship between reef-building corals and their non-calcifying relatives. Phylogenetic analysis based on 1,421 single-copy orthologs, as well as gene presence/absence and synteny information, converged on the same topologies, showing strong support for scleractinian monophyly and a corallimorpharian sister clade. Our broad phylogenomic approach using sequence-based and sequence-independent analyses provides unambiguous evidence for the monophyly of scleractinian corals and the rejection of corallimorpharians as descendants of a complex coral ancestor.

  7. Genome based analyses of six hexacorallian species reject the “naked coral” hypothesis

    KAUST Repository

    Wang, Xin

    2017-09-23

    Scleractinian corals are the foundation species of the coral-reef ecosystem. Their calcium carbonate skeletons form extensive structures that are home to millions of species, making coral reefs one of the most diverse ecosystems of our planet. However, our understanding of how reef-building corals have evolved the ability to calcify and become the ecosystem builders they are today is hampered by uncertain relationships within their subclass Hexacorallia. Corallimorpharians have been proposed to originate from a complex scleractinian ancestor that lost the ability to calcify in response to increasing ocean acidification, suggesting the possibility for corals to lose and gain the ability to calcify in response to increasing ocean acidification. Here we employed a phylogenomic approach using whole-genome data from six hexacorallian species to resolve the evolutionary relationship between reef-building corals and their non-calcifying relatives. Phylogenetic analysis based on 1,421 single-copy orthologs, as well as gene presence/absence and synteny information, converged on the same topologies, showing strong support for scleractinian monophyly and a corallimorpharian sister clade. Our broad phylogenomic approach using sequence-based and sequence-independent analyses provides unambiguous evidence for the monophyly of scleractinian corals and the rejection of corallimorpharians as descendants of a complex coral ancestor.

  8. Loss of Flow Accident (LOFA) analyses using LabView-based NRR simulator

    Energy Technology Data Exchange (ETDEWEB)

    Arafa, Amany Abdel Aziz; Saleh, Hassan Ibrahim [Atomic Energy Authority, Cairo (Egypt). Radiation Engineering Dept.; Ashoub, Nagieb [Atomic Energy Authority, Cairo (Egypt). Reactor Physics Dept.

    2016-12-15

This paper presents a generic Loss of Flow Accident (LOFA) scenario module which is integrated into the LabView-based simulator to imitate the behavior of a nuclear research reactor (NRR) for different user-defined LOFA scenarios. It also provides analyses of a LOFA of a single fuel channel and its impact on operational transactions and on the behavior of the reactor. The generic LOFA scenario module includes graphs needed to clarify the effects of the LOFA under study. Furthermore, the percentage of the loss of mass flow rate, the mode of flow reduction and the start time and transient time of the LOFA are user defined to add flexibility to the LOFA scenarios. The objective of integrating such a generic LOFA module is to be able to deal with such incidents and avoid their significant effects. It is also useful in the development of expertise in this area and in reducing operator training and simulation costs. The results of the implemented generic LOFA module agree well with those of the COBRA-IIIC code and the earlier guidebook for this series of transients.

  9. Using a laser-based CO2 carbon isotope analyser to investigate gas transfer in geological media

    International Nuclear Information System (INIS)

    Guillon, S.; Pili, E.; Agrinier, P.

    2012-01-01

CO₂ stable carbon isotopes are very attractive in environmental research to investigate both natural and anthropogenic carbon sources. Laser-based CO₂ carbon isotope analysis provides continuous measurement at high temporal resolution and is a promising alternative to isotope ratio mass spectrometry (IRMS). We performed a thorough assessment of a commercially available CO₂ Carbon Isotope Analyser (CCIA DLT-100, Los Gatos Research) that allows in situ measurement of δ¹³C in CO₂. Using a set of reference gases of known CO₂ concentration and carbon isotopic composition, we evaluated the precision, long-term stability, temperature sensitivity and concentration dependence of the analyser. Despite good precision calculated from Allan variance (5.0 ppm for CO₂ concentration and 0.05‰ for δ¹³C at 60 s averaging), real performances are altered by two main sources of error: temperature sensitivity and the dependence of δ¹³C on CO₂ concentration. Data processing is required to correct for these errors. Following application of these corrections, we achieve an accuracy of 8.7 ppm for CO₂ concentration and 1.3‰ for δ¹³C, which is worse than mass spectrometry performance but still allows field applications. With this portable analyser we measured the CO₂ flux degassed from rock in an underground tunnel. The obtained carbon isotopic composition agrees with IRMS measurement and can be used to identify the carbon source. (authors)
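The precision figures quoted above are Allan-variance estimates at a given averaging time. A sketch of the computation on simulated 1 Hz δ13C readings (pure white noise, so the Allan deviation should fall roughly as 1/√m with averaging length m):

```python
import numpy as np

def allan_variance(y, m):
    """Non-overlapping Allan variance of a time series y for an averaging
    length of m samples: half the mean squared difference between
    consecutive block averages. This is how averaging-time precision
    figures like '0.05 permil at 60 s' are typically derived."""
    n = len(y) // m
    means = y[: n * m].reshape(n, m).mean(axis=1)
    return 0.5 * np.mean(np.diff(means) ** 2)

rng = np.random.default_rng(3)
# simulated 1 Hz delta-13C readings: white noise around -8 permil
delta13c = -8.0 + 0.5 * rng.standard_normal(3600)

adev_1 = np.sqrt(allan_variance(delta13c, 1))    # 1 s averaging
adev_60 = np.sqrt(allan_variance(delta13c, 60))  # 60 s averaging
```

On real instrument data, the Allan deviation eventually stops improving with m, which reveals the drift (e.g. temperature sensitivity) that the paper corrects for.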

  10. Chemometrical characterization of four italian rice varieties based on genetic and chemical analyses.

    Science.gov (United States)

    Brandolini, Vincenzo; Coïsson, Jean Daniel; Tedeschi, Paola; Barile, Daniela; Cereti, Elisabetta; Maietti, Annalisa; Vecchiati, Giorgio; Martelli, Aldo; Arlorio, Marco

    2006-12-27

    This paper describes a method for achieving qualitative identification of four rice varieties from two different Italian regions. To estimate the presence of genetic diversity among the four rice varieties, we used polymerase chain reaction-randomly amplified polymorphic DNA (PCR-RAPD) markers, and to elucidate whether a relationship exists between the ground and the specific characteristics of the product, we studied proximate composition, fatty acid composition, mineral content, and total antioxidant capacity. Using principal component analysis on genomic and compositional data, we were able to classify rice samples according to their variety and their district of production. This work also examined the discrimination ability of different parameters. It was found that genomic data give the best discrimination based on varieties, indicating that RAPD assays could be useful in discriminating among closely related species, while compositional analyses do not depend on the genetic characters only but are related to the production area.
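The classification step, principal component analysis on the combined genomic and compositional data, can be sketched with a synthetic two-variety feature table (the rice data themselves are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical feature table: 5 samples each of two rice varieties and
# 6 standardized genomic/compositional descriptors; variety B is offset
# on the first three features, so PCA should separate the groups.
a = rng.normal(0.0, 0.3, size=(5, 6))
b = rng.normal(0.0, 0.3, size=(5, 6))
b[:, :3] += 2.0
X = np.vstack([a, b])

# PCA via SVD of the centred data matrix
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T                 # principal-component scores

# the two varieties should separate cleanly along PC1
pc1_a, pc1_b = scores[:5, 0], scores[5:, 0]
separated = (pc1_a.max() < pc1_b.min()) or (pc1_b.max() < pc1_a.min())
```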

  11. PC based uranium enrichment analyser

    International Nuclear Information System (INIS)

    Madan, V.K.; Gopalakrishana, K.R.; Bairi, B.R.

    1991-01-01

It is important to measure the enrichment of unirradiated nuclear fuel elements during production as a quality control measure. An IBM PC based system has recently been tested for enrichment measurements for the Nuclear Fuel Complex (NFC), Hyderabad. As required by NFC, the system has ease of calibration. It is easy to switch the system from measuring the enrichment of fuel elements to pellets, and the system automatically stores the data and the results. The system uses an IBM PC plug-in card to acquire data. The card incorporates programmable interval timers (8253-5). The counter/timer devices are accessed through I/O-mapped I/O. A novel algorithm has been incorporated to make the system more reliable. The application software has been written in BASIC. (author). 9 refs., 1 fig

  12. A DNA microarray-based methylation-sensitive (MS)-AFLP hybridization method for genetic and epigenetic analyses.

    Science.gov (United States)

    Yamamoto, F; Yamamoto, M

    2004-07-01

    We previously developed a PCR-based DNA fingerprinting technique named the Methylation Sensitive (MS)-AFLP method, which permits comparative genome-wide scanning of methylation status with a manageable number of fingerprinting experiments. The technique uses the methylation sensitive restriction enzyme NotI in the context of the existing Amplified Fragment Length Polymorphism (AFLP) method. Here we report the successful conversion of this gel electrophoresis-based DNA fingerprinting technique into a DNA microarray hybridization technique (DNA Microarray MS-AFLP). By performing a total of 30 (15 x 2 reciprocal labeling) DNA Microarray MS-AFLP hybridization experiments on genomic DNA from two breast and three prostate cancer cell lines in all pairwise combinations, and Southern hybridization experiments using more than 100 different probes, we have demonstrated that the DNA Microarray MS-AFLP is a reliable method for genetic and epigenetic analyses. No statistically significant differences were observed in the number of differences between the breast-prostate hybridization experiments and the breast-breast or prostate-prostate comparisons.

  13. Structural changes in Parkinson's disease: voxel-based morphometry and diffusion tensor imaging analyses based on 123I-MIBG uptake.

    Science.gov (United States)

    Kikuchi, Kazufumi; Hiwatashi, Akio; Togao, Osamu; Yamashita, Koji; Somehara, Ryo; Kamei, Ryotaro; Baba, Shingo; Yamaguchi, Hiroo; Kira, Jun-Ichi; Honda, Hiroshi

    2017-12-01

Patients with Parkinson's disease (PD) may exhibit symptoms of sympathetic dysfunction that can be measured using ¹²³I-metaiodobenzylguanidine (MIBG) myocardial scintigraphy. We investigated the relationship between microstructural brain changes and ¹²³I-MIBG uptake in patients with PD using voxel-based morphometry (VBM) and diffusion tensor imaging (DTI) analyses. This retrospective study included 24 patients with PD who underwent 3 T magnetic resonance imaging and ¹²³I-MIBG scintigraphy. They were divided into two groups: 12 MIBG-positive and 12 MIBG-negative cases (10 men and 14 women; age range: 60-81 years, corrected for gender and age). The heart/mediastinum count (H/M) ratio was calculated on anterior planar ¹²³I-MIBG images obtained 4 h post-injection. VBM and DTI were performed to detect structural differences between these two groups. Patients with a low H/M ratio had significantly reduced brain volume at the right inferior frontal gyrus. Patients with low H/M ratios also exhibited significantly lower fractional anisotropy than those with high H/M ratios. • Voxel-based morphometry can detect grey matter changes in Parkinson's disease. • Diffusion tensor imaging can detect white matter changes in Parkinson's disease.

  14. Photosynthesis in a different light: Spectro-microscopy for in vivo characterisation of chloroplasts

    Directory of Open Access Journals (Sweden)

    Sébastien Peter

    2014-06-01

    Full Text Available During photosynthesis, energy conversion at the two photosystems is controlled by highly complex and dynamic adaptation processes triggered by external factors such as light quality, intensity, and duration, or internal cues such as carbon availability. These dynamics have remained largely concealed so far, because current analytical techniques are based on the investigation of isolated chloroplasts lacking full adaptation ability and are performed at non-physiologically low temperatures. Here, we use non-invasive in planta spectro-microscopic approaches to investigate living chloroplasts in their native environment at ambient temperatures. This is a valuable approach to study the complex function of these systems, because an intrinsic property – the fluorescence emission – is exploited and no additional external perturbations are introduced. Our analysis demonstrates a dynamic adjustment of not only the photosystem I/photosystem II (PSI/PSII) intensity ratio in the chloroplasts but also of the capacity of the LHCs for energy transfer in response to environmental and internal cues.

  15. Validation of a fully autonomous phosphate analyser based on a microfluidic lab-on-a-chip

    DEFF Research Database (Denmark)

    Slater, Conor; Cleary, J.; Lau, K.T.

    2010-01-01

    of long-term operation. This was proven by a bench top calibration of the analyser using standard solutions and also by comparing the analyser's performance to a commercially available phosphate monitor installed at a waste water treatment plant. The output of the microfluidic lab-on-a-chip analyser...

  16. Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.

    Science.gov (United States)

    Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg

    2009-11-01

    G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
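The correlation power analyses described above can be approximated outside G*Power with the Fisher z transform. A minimal sketch, assuming SciPy is available; the helper name and defaults are illustrative, not part of G*Power's interface:

```python
import math
from scipy.stats import norm

def n_for_correlation(r, alpha=0.05, power=0.80, two_sided=True):
    """Approximate sample size to detect a correlation r at the given
    alpha and power, using the Fisher z normal approximation."""
    z_r = math.atanh(r)                              # Fisher z of the effect
    z_a = norm.ppf(1 - alpha / (2 if two_sided else 1))
    z_b = norm.ppf(power)
    # solve |z_r| * sqrt(n - 3) = z_a + z_b for n
    n = ((z_a + z_b) / abs(z_r)) ** 2 + 3
    return math.ceil(n)

print(n_for_correlation(0.3))   # medium effect, two-sided test
```

The approximation agrees with dedicated power software to within a participant or two, since the z transform slightly differs from the exact distribution G*Power uses.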

  17. Biomass feedstock analyses

    Energy Technology Data Exchange (ETDEWEB)

    Wilen, C.; Moilanen, A.; Kurkela, E. [VTT Energy, Espoo (Finland). Energy Production Technologies

    1996-12-31

    The overall objectives of the project `Feasibility of electricity production from biomass by pressurized gasification systems` within the EC Research Programme JOULE II were to evaluate the potential of advanced power production systems based on biomass gasification and to study the technical and economic feasibility of these new processes with different types of biomass feedstock. This report was prepared as part of this R and D project. The objectives of this task were to perform fuel analyses of potential woody and herbaceous biomasses with specific regard to the gasification properties of the selected feedstocks. The analyses of 15 Scandinavian and European biomass feedstocks included density, proximate and ultimate analyses, trace compounds, ash composition and fusion behaviour in oxidizing and reducing atmospheres. The wood-derived fuels, such as whole-tree chips, forest residues, bark and to some extent willow, can be expected to have good gasification properties. Difficulties caused by ash fusion and sintering in straw combustion and gasification are generally known. The ash and alkali metal contents of the European biomasses harvested in Italy resembled those of the Nordic straws, and they are expected to behave much like straw in gasification. No direct relation was found in the laboratory determinations between the ash fusion behaviour (determined according to the standard method) and, for instance, the alkali metal content. A more profound characterisation of the fuels would require gasification experiments in a thermobalance and a PDU (Process Development Unit) rig. (orig.) (10 refs.)

  18. Optimization of a Centrifugal Boiler Circulating Pump's Casing Based on CFD and FEM Analyses

    Directory of Open Access Journals (Sweden)

    Zhigang Zuo

    2014-04-01

    Full Text Available It is important to evaluate the economic efficiency of boiler circulating pumps in the manufacturing process from the manufacturers' point of view. The possibility of optimizing the pump casing with respect to structural pressure integrity and hydraulic performance was discussed. CFD analyses of pump models with different pump casing sizes were first carried out for the hydraulic performance evaluation. The effects of the working temperature and the sealing ring on the hydraulic efficiency were discussed. A model with a casing diameter of 0.875D40 was selected for further analyses. FEM analyses were then carried out on different combinations of casing sizes, casing wall thicknesses, and materials to evaluate safety related to pressure integrity, with respect to both static and fatigue strength. Two models, with forging and cast materials, were selected as final results.

  19. [Clinical research XXIII. From clinical judgment to meta-analyses].

    Science.gov (United States)

    Rivas-Ruiz, Rodolfo; Castelán-Martínez, Osvaldo D; Pérez-Rodríguez, Marcela; Palacios-Cruz, Lino; Noyola-Castillo, Maura E; Talavera, Juan O

    2014-01-01

    Systematic reviews (SR) are studies designed to answer clinical questions based on original articles. Meta-analysis (MTA) is the mathematical analysis of SR. These analyses are divided into two groups: those which evaluate quantitative variables (for example, the body mass index, BMI) and those which evaluate qualitative variables (for example, whether a patient is alive or dead, or whether or not he is healing). Quantitative variables generally use the mean difference analysis, and qualitative variables can be analysed with several measures: odds ratio (OR), relative risk (RR), absolute risk reduction (ARR) and hazard ratio (HR). These analyses are represented through forest plots, which allow the evaluation of each individual study, as well as the heterogeneity between studies and the overall effect of the intervention. These analyses are mainly based on Student's t test and chi-squared. To take appropriate decisions based on the MTA, it is important to understand the characteristics of the statistical methods in order to avoid misinterpretations.
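The qualitative-outcome measures named above (OR, RR, ARR) all derive from a 2×2 table of events by treatment group. A minimal sketch with hypothetical trial counts:

```python
def effect_measures(a, b, c, d):
    """Effect measures from a 2x2 table:
               event   no event
    treated      a        b
    control      c        d
    """
    risk_t = a / (a + b)
    risk_c = c / (c + d)
    rr = risk_t / risk_c                      # relative risk
    odds_ratio = (a * d) / (b * c)            # odds ratio
    arr = risk_c - risk_t                     # absolute risk reduction
    nnt = 1 / arr if arr else float("inf")    # number needed to treat
    return {"RR": rr, "OR": odds_ratio, "ARR": arr, "NNT": nnt}

m = effect_measures(10, 90, 20, 80)           # hypothetical counts
print(m)
```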

  20. Kernel based eigenvalue-decomposition methods for analysing ham

    DEFF Research Database (Denmark)

    Christiansen, Asger Nyman; Nielsen, Allan Aasbjerg; Møller, Flemming

    2010-01-01

    methods, such as PCA, MAF or MNF. We therefore investigated the applicability of kernel based versions of these transformations. This meant implementing the kernel based methods and developing new theory, since kernel based MAF and MNF are not described in the literature yet. The traditional methods only...... have two factors that are useful for segmentation and none of them can be used to segment the two types of meat. The kernel based methods have a lot of useful factors and they are able to capture the subtle differences in the images. This is illustrated in Figure 1. You can see a comparison of the most...... useful factor of PCA and kernel based PCA respectively in Figure 2. The factor of the kernel based PCA turned out to be able to segment the two types of meat and in general that factor is much more distinct, compared to the traditional factor. After the orthogonal transformation a simple thresholding

  1. The CM SAF SSM/I-based total column water vapour climate data record: methods and evaluation against re-analyses and satellite

    Directory of Open Access Journals (Sweden)

    M. Schröder

    2013-03-01

    Full Text Available The European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) Satellite Application Facility on Climate Monitoring (CM SAF) aims at the provision and sound validation of well documented Climate Data Records (CDRs) in sustained and operational environments. In this study, a total column water vapour path (WVPA) climatology from CM SAF is presented and inter-compared to water vapour data records from various data sources. Based on homogenised brightness temperatures from the Special Sensor Microwave Imager (SSM/I), a climatology of WVPA has been generated within the Hamburg Ocean–Atmosphere Fluxes and Parameters from Satellite (HOAPS) framework. Within a research and operation transition activity the HOAPS data and operation capabilities have been successfully transferred to the CM SAF, where the complete HOAPS data and processing schemes are hosted in an operational environment. An objective analysis for interpolation, namely kriging, has been applied to the swath-based WVPA retrievals from the HOAPS data set. The resulting climatology consists of daily and monthly mean fields of WVPA over the global ice-free ocean. The temporal coverage ranges from July 1987 to August 2006. After a comparison to the precursor product, the CM SAF SSM/I-based climatology has been comprehensively compared to different types of meteorological analyses from the European Centre for Medium-Range Weather Forecasts (ECMWF: ERA40, ERA-INTERIM and operational analyses) and from the Japan Meteorological Agency (JMA: JRA). This inter-comparison shows an overall good agreement between the climatology and the analyses, with daily absolute biases generally smaller than 2 kg m−2. The absolute value of the bias relative to JRA and ERA-INTERIM is typically smaller than 0.5 kg m−2. For the period 1991–2006, the root mean square error (RMSE) for both reanalyses is approximately 2 kg m−2. As SSM/I WVPA and radiances are assimilated into JMA and all ECMWF analyses and
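The kriging step mentioned above interpolates scattered retrievals by solving a linear system built from a covariance model. A minimal 1-D ordinary-kriging sketch with an assumed Gaussian covariance and illustrative parameters (the operational HOAPS/CM SAF scheme is more elaborate and not reproduced here):

```python
import numpy as np

def ordinary_kriging(x_obs, y_obs, x_new, length_scale=1.0, sill=1.0):
    """Predict at x_new from observations, assuming a Gaussian covariance
    with a given (not fitted) length scale and sill."""
    def cov(a, b):
        d = a[:, None] - b[None, :]
        return sill * np.exp(-(d / length_scale) ** 2)

    n = len(x_obs)
    # augmented system enforces the kriging weights summing to 1 (unbiasedness)
    K = np.ones((n + 1, n + 1))
    K[:n, :n] = cov(x_obs, x_obs)
    K[n, n] = 0.0
    k = np.ones(n + 1)
    k[:n] = cov(x_obs, np.array([x_new]))[:, 0]
    w = np.linalg.solve(K, k)
    return float(w[:n] @ y_obs)

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 0.5, 1.5])
print(ordinary_kriging(x, y, 1.0))   # at an observed point the data value is reproduced
```

With no nugget term, kriging is an exact interpolator: predicting at an observed location returns the observation itself.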

  2. A protocol for analysing mathematics teacher educators' practices

    OpenAIRE

    Kuzle , Ana; Biehler , Rolf

    2015-01-01

    International audience; Studying practices in a teaching-learning environment, such as professional development programmes, is a complex and multi-faceted endeavour. While several frameworks exist to help researchers analyse teaching practices, none exist to analyse practices of those who organize professional development programmes, namely mathematics teacher educators. In this paper, based on theoretical as well as empirical results, we present a protocol for capturing different aspects of ...

  3. New insights into survival trend analyses in cancer population-based studies: the SUDCAN methodology.

    Science.gov (United States)

    Uhry, Zoé; Bossard, Nadine; Remontet, Laurent; Iwaz, Jean; Roche, Laurent

    2017-01-01

    The main objective of the SUDCAN study was to compare, for 15 cancer sites, the trends in net survival and excess mortality rates from cancer 5 years after diagnosis between six European Latin countries (Belgium, France, Italy, Portugal, Spain and Switzerland). The data were extracted from the EUROCARE-5 database. The study period ranged from 6 (Portugal, 2000-2005) to 18 years (Switzerland, 1989-2007). Trend analyses were carried out separately for each country and cancer site; the number of cases ranged from 1500 to 104 000 cases. We developed an original flexible excess rate modelling strategy that accounts for the continuous effects of age, year of diagnosis, time since diagnosis and their interactions. Nineteen models were constructed; they differed in the modelling of the effect of the year of diagnosis in terms of linearity, proportionality and interaction with age. The final model was chosen according to the Akaike Information Criterion. The fit was assessed graphically by comparing model estimates versus nonparametric (Pohar-Perme) net survival estimates. Out of the 90 analyses carried out, the effect of the year of diagnosis on the excess mortality rate depended on age in 61 and was nonproportional in 64; it was nonlinear in 27 out of the 75 analyses where this effect was considered. The model fit was overall satisfactory. We analysed successfully 15 cancer sites in six countries. The refined methodology proved necessary for detailed trend analyses. It is hoped that three-dimensional parametric modelling will be used more widely in net survival trend studies as it has major advantages over stratified analyses.

  4. What is needed to eliminate new pediatric HIV infections: The contribution of model-based analyses

    Science.gov (United States)

    Doherty, Katie; Ciaranello, Andrea

    2013-01-01

    Purpose of Review Computer simulation models can identify key clinical, operational, and economic interventions that will be needed to achieve the elimination of new pediatric HIV infections. In this review, we summarize recent findings from model-based analyses of strategies for prevention of mother-to-child HIV transmission (MTCT). Recent Findings In order to achieve elimination of MTCT (eMTCT), model-based studies suggest that scale-up of services will be needed in several domains: uptake of services and retention in care (the PMTCT “cascade”), interventions to prevent HIV infections in women and reduce unintended pregnancies (the “four-pronged approach”), efforts to support medication adherence through long periods of pregnancy and breastfeeding, and strategies to make breastfeeding safer and/or shorter. Models also project the economic resources that will be needed to achieve these goals in the most efficient ways to allocate limited resources for eMTCT. Results suggest that currently recommended PMTCT regimens (WHO Option A, Option B, and Option B+) will be cost-effective in most settings. Summary Model-based results can guide future implementation science, by highlighting areas in which additional data are needed to make informed decisions and by outlining critical interventions that will be necessary in order to eliminate new pediatric HIV infections. PMID:23743788

  5. The Influence of Study-Level Inference Models and Study Set Size on Coordinate-Based fMRI Meta-Analyses

    Directory of Open Access Journals (Sweden)

    Han Bossier

    2018-01-01

    Full Text Available Given the increasing amount of neuroimaging studies, there is a growing need to summarize published results. Coordinate-based meta-analyses use the locations of statistically significant local maxima, possibly with the associated effect sizes, to aggregate studies. In this paper, we investigate the influence of key characteristics of a coordinate-based meta-analysis on (1) the balance between false and true positives and (2) the activation reliability of the outcome from a coordinate-based meta-analysis. More particularly, we consider the influence of the chosen group level model at the study level [fixed effects, ordinary least squares (OLS), or mixed effects models], the type of coordinate-based meta-analysis [Activation Likelihood Estimation (ALE), which only uses peak locations, versus fixed effects and random effects meta-analysis, which take into account both peak location and height], and the number of studies included in the analysis (from 10 to 35). To do this, we apply a resampling scheme on a large dataset (N = 1,400) to create a test condition and compare this with an independent evaluation condition. The test condition corresponds to subsampling participants into studies and combining these using meta-analyses. The evaluation condition corresponds to a high-powered group analysis. We observe the best performance when using mixed effects models in individual studies combined with a random effects meta-analysis. Moreover, the performance increases with the number of studies included in the meta-analysis. When peak height is not taken into consideration, we show that the popular ALE procedure is a good alternative in terms of the balance between type I and II errors. However, it requires more studies compared to other procedures in terms of activation reliability. Finally, we discuss the differences, interpretations, and limitations of our results.
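The fixed-versus-random-effects contrast above can be sketched with the DerSimonian-Laird estimator, a standard moment-based random-effects pooling; the study effects and variances below are hypothetical:

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooling of study effect sizes (DerSimonian-Laird)."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Cochran's Q quantifies between-study heterogeneity
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)   # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se, tau2

pooled, se, tau2 = dersimonian_laird([0.4, 0.1, 0.3], [0.04, 0.09, 0.05])
print(round(pooled, 3), round(se, 3), round(tau2, 3))
```

When the studies are homogeneous, tau2 collapses to zero and the random-effects estimate reduces to the fixed-effects one.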

  6. Neural Spike-Train Analyses of the Speech-Based Envelope Power Spectrum Model

    Science.gov (United States)

    Rallapalli, Varsha H.

    2016-01-01

    Diagnosing and treating hearing impairment is challenging because people with similar degrees of sensorineural hearing loss (SNHL) often have different speech-recognition abilities. The speech-based envelope power spectrum model (sEPSM) has demonstrated that the signal-to-noise ratio (SNRenv) from a modulation filter bank provides a robust speech-intelligibility measure across a wider range of degraded conditions than many long-standing models. In the sEPSM, noise (N) is assumed to: (a) reduce S + N envelope power by filling in dips within clean speech (S) and (b) introduce an envelope noise floor from intrinsic fluctuations in the noise itself. While the promise of SNRenv has been demonstrated for normal-hearing listeners, it has not been thoroughly extended to hearing-impaired listeners because of limited physiological knowledge of how SNHL affects speech-in-noise envelope coding relative to noise alone. Here, envelope coding to speech-in-noise stimuli was quantified from auditory-nerve model spike trains using shuffled correlograms, which were analyzed in the modulation-frequency domain to compute modulation-band estimates of neural SNRenv. Preliminary spike-train analyses show strong similarities to the sEPSM, demonstrating feasibility of neural SNRenv computations. Results suggest that individual differences can occur based on differential degrees of outer- and inner-hair-cell dysfunction in listeners currently diagnosed into the single audiological SNHL category. The predicted acoustic-SNR dependence in individual differences suggests that the SNR-dependent rate of susceptibility could be an important metric in diagnosing individual differences. Future measurements of the neural SNRenv in animal studies with various forms of SNHL will provide valuable insight for understanding individual differences in speech-in-noise intelligibility.

  7. Modelling and analysing oriented fibrous structures

    International Nuclear Information System (INIS)

    Rantala, M; Lassas, M; Siltanen, S; Sampo, J; Takalo, J; Timonen, J

    2014-01-01

    A mathematical model for fibrous structures using a direction dependent scaling law is presented. The orientation of fibrous nets (e.g. paper) is analysed with a method based on the curvelet transform. The curvelet-based orientation analysis has been tested successfully on real data from paper samples: the major directions of fibre orientation can apparently be recovered. Similar results are achieved in tests on data simulated by the new model, allowing a comparison with ground truth.

  8. Exergy, exergoeconomic and environmental analyses and evolutionary algorithm based multi-objective optimization of combined cycle power plants

    International Nuclear Information System (INIS)

    Ahmadi, Pouria; Dincer, Ibrahim; Rosen, Marc A.

    2011-01-01

    A comprehensive exergy, exergoeconomic and environmental impact analysis and optimization of several combined cycle power plants (CCPPs) is reported. In the first part, thermodynamic analyses of the CCPPs based on energy and exergy are performed, and the effect of supplementary firing on the natural gas-fired CCPP is investigated. The latter step includes the effect of supplementary firing on the performance of the bottoming cycle and on CO2 emissions, and utilizes the first and second laws of thermodynamics. In the second part, a multi-objective optimization is performed to determine the 'best' design parameters, accounting for exergetic, economic and environmental factors. The optimization considers three objective functions: CCPP exergy efficiency, total cost rate of the system products and CO2 emissions of the overall plant. The environmental impact in terms of CO2 emissions is integrated with the exergoeconomic objective function as a new objective function. The results of both the exergy and exergoeconomic analyses show that the largest exergy destructions occur in the CCPP combustion chamber, and that increasing the gas turbine inlet temperature decreases the CCPP cost of exergy destruction. The optimization results demonstrate that CO2 emissions are reduced by selecting the best components and using a low fuel injection rate into the combustion chamber. -- Highlights: → Comprehensive thermodynamic modeling of a combined cycle power plant. → Exergy, economic and environmental analyses of the system. → Investigation of the role of multiobjective exergoenvironmental optimization as a tool for more environmentally-benign design.

  9. Anthocyanin analyses of Vaccinium fruit dietary supplements

    Science.gov (United States)

    Vaccinium fruit ingredients within dietary supplements were identified by comparisons with anthocyanin analyses of known Vaccinium profiles (demonstration of anthocyanin fingerprinting). Available Vaccinium supplements were purchased and analyzed; their anthocyanin profiles (based on HPLC separation...

  10. Neutronics-processing interface analyses for the Accelerator Transmutation of Waste (ATW) aqueous-based blanket system

    International Nuclear Information System (INIS)

    Davidson, J.W.; Battat, M.E.

    1993-01-01

    Neutronics-processing interface parameters have large impacts on the neutron economy and transmutation performance of an aqueous-based Accelerator Transmutation of Waste (ATW) system. A detailed assessment of the interdependence of these blanket neutronic and chemical processing parameters has been performed. Neutronic performance analyses require that neutron transport calculations for the ATW blanket systems be fully coupled with the blanket processing and include all neutron absorptions in candidate waste nuclides as well as in fission and transmutation products. The effects of processing rates, flux levels, flux spectra, and external-to-blanket inventories on blanket neutronic performance were determined. In addition, the inventories and isotopics in the various subsystems were also calculated for various actinide and long-lived fission product transmutation strategies

  11. Using meta-analytic path analysis to test theoretical predictions in health behavior: An illustration based on meta-analyses of the theory of planned behavior

    OpenAIRE

    Hagger, Martin; Chan, Dervin K. C.; Protogerou, Cleo; Chatzisarantis, Nikos L. D.

    2016-01-01

    Objective Synthesizing research on social cognitive theories applied to health behavior is an important step in the development of an evidence base of psychological factors as targets for effective behavioral interventions. However, few meta-analyses of research on social cognitive theories in health contexts have conducted simultaneous tests of theoretically-stipulated pattern effects using path analysis. We argue that conducting path analyses of meta-analytic effects among constructs fr...

  12. Analysing performance through value creation

    Directory of Open Access Journals (Sweden)

    Adrian TRIFAN

    2015-12-01

    Full Text Available This paper draws a parallel between two ways of measuring financial performance: the first uses data offered by accounting and lays emphasis on maximizing profit, while the second aims to create value. The traditional approach to performance is based on indicators drawn from accounting data: ROI, ROE, EPS. Traditional management, based on analysing accounting data, has shown its limits, and a new approach is needed, based on creating value. The evaluation of value-based performance tries to avoid the errors due to accounting data by using other specific indicators: EVA, MVA, TSR, CVA. The main objective shifts from maximizing income to maximizing the value created for shareholders. The theoretical part is accompanied by a practical analysis regarding the creation of value and an analysis of the main indicators which evaluate this concept.
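Two of the value-based indicators listed above are simple to compute once the inputs are known: EVA is the operating profit left after charging for the capital employed, and MVA is the market value in excess of that capital. A sketch with hypothetical firm figures:

```python
def economic_value_added(nopat, invested_capital, wacc):
    """EVA: net operating profit after taxes minus the capital charge."""
    return nopat - wacc * invested_capital

def market_value_added(market_value, invested_capital):
    """MVA: market value of the firm minus the capital invested in it."""
    return market_value - invested_capital

# hypothetical firm: NOPAT 120, invested capital 1000, cost of capital 9%
print(economic_value_added(120.0, 1000.0, 0.09))   # positive -> value created
print(market_value_added(1300.0, 1000.0))
```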

  13. Longitudinal Data Analyses Using Linear Mixed Models in SPSS: Concepts, Procedures and Illustrations

    Directory of Open Access Journals (Sweden)

    Daniel T. L. Shek

    2011-01-01

    Full Text Available Although different methods are available for the analyses of longitudinal data, analyses based on generalized linear models (GLM) are criticized as violating the assumption of independence of observations. Alternatively, linear mixed models (LMM) are commonly used to understand changes in human behavior over time. In this paper, the basic concepts surrounding LMM (or hierarchical linear models) are outlined. Although SPSS is a statistical analyses package commonly used by researchers, documentation on LMM procedures in SPSS is not thorough or user friendly. With reference to this limitation, the related procedures for performing analyses based on LMM in SPSS are described. To demonstrate the application of LMM analyses in SPSS, findings based on six waves of data collected in the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) in Hong Kong are presented.

  14. Longitudinal data analyses using linear mixed models in SPSS: concepts, procedures and illustrations.

    Science.gov (United States)

    Shek, Daniel T L; Ma, Cecilia M S

    2011-01-05

    Although different methods are available for the analyses of longitudinal data, analyses based on generalized linear models (GLM) are criticized as violating the assumption of independence of observations. Alternatively, linear mixed models (LMM) are commonly used to understand changes in human behavior over time. In this paper, the basic concepts surrounding LMM (or hierarchical linear models) are outlined. Although SPSS is a statistical analyses package commonly used by researchers, documentation on LMM procedures in SPSS is not thorough or user friendly. With reference to this limitation, the related procedures for performing analyses based on LMM in SPSS are described. To demonstrate the application of LMM analyses in SPSS, findings based on six waves of data collected in the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) in Hong Kong are presented.
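The random-intercept idea behind the LMM analyses described above can be illustrated without SPSS: simulate several waves per subject with a subject-specific intercept, then recover the variance components by moment estimation. A NumPy sketch (the data are simulated, not Project P.A.T.H.S. data):

```python
import numpy as np

rng = np.random.default_rng(0)

# simulate 6 waves for 200 subjects: y_ij = b0 + u_i + e_ij,
# the simplest linear mixed model (random intercept only)
n_subj, n_wave = 200, 6
sigma_u, sigma_e = 2.0, 1.0
u = rng.normal(0.0, sigma_u, n_subj)            # subject random intercepts
y = 5.0 + u[:, None] + rng.normal(0.0, sigma_e, (n_subj, n_wave))

# moment-based variance components (one-way random-effects ANOVA)
grand = y.mean()
ms_between = n_wave * ((y.mean(axis=1) - grand) ** 2).sum() / (n_subj - 1)
ms_within = ((y - y.mean(axis=1, keepdims=True)) ** 2).sum() / (n_subj * (n_wave - 1))
var_e = ms_within
var_u = (ms_between - ms_within) / n_wave
icc = var_u / (var_u + var_e)                   # intraclass correlation
print(round(icc, 2))                            # true value is 0.8 in this simulation
```

A large intraclass correlation is exactly the violation of independence that makes plain GLM inappropriate for such repeated-measures data.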

  15. Historical Weathering Based on Chemical Analyses of Two Spodosols in Southern Sweden

    International Nuclear Information System (INIS)

    Melkerud, Per-Arne; Bain, Derek C.; Olsson, Mats T.

    2003-01-01

    Chemical weathering losses were calculated for two conifer stands in relation to ongoing studies on liming effects and ash amendments on chemical status, soil solution chemistry and soil genesis. Weathering losses were based on elemental depletion trends in soil profiles since deglaciation and exposure to the weathering environment. Gradients in total geochemical composition were assumed to reflect alteration over time. Study sites were Horroed and Hassloev in southern Sweden. Both the Horroed and Hassloev sites are located on sandy loamy Weichselian till, at altitudes of 85 and 190 m a.s.l., respectively. Aliquots from volume-determined samples from a number of soil levels were fused with lithium metaborate, dissolved in HNO3, and analysed by ICP-AES. Results indicated the highest cumulative weathering losses at Hassloev. The weathering losses for the elements are in the following order: Si > Al > K > Na > Ca > Mg. Total annual losses for Ca+Mg+K+Na, expressed in mmolc m−2 yr−1, amounted to c. 28 and 58 at Horroed and Hassloev, respectively. Variations between study sites could not be explained by differences in bulk density, geochemistry or mineralogy. The accumulated weathering losses since deglaciation were larger in the uppermost 15 cm than in deeper B horizons for most elements studied.

  16. Precursor analyses - The use of deterministic and PSA based methods in the event investigation process at nuclear power plants

    International Nuclear Information System (INIS)

    2004-09-01

    The efficient feedback of operating experience (OE) is a valuable source of information for improving the safety and reliability of nuclear power plants (NPPs). It is therefore essential to collect information on abnormal events from both internal and external sources. Internal operating experience is analysed to obtain a complete understanding of an event and of its safety implications. Corrective or improvement measures may then be developed, prioritized and implemented in the plant if considered appropriate. Information from external events may also be analysed in order to learn lessons from others' experience and prevent similar occurrences at our own plant. The traditional ways of investigating operational events have been predominantly qualitative. In recent years, a PSA-based method called probabilistic precursor event analysis has been developed, used and applied on a significant scale in many places for a number of plants. The method enables a quantitative estimation of the safety significance of operational events to be incorporated. The purpose of this report is to outline a synergistic process that makes more effective use of operating experience event information by combining the insights and knowledge gained from both approaches, traditional deterministic event investigation and PSA-based event analysis. The PSA-based view on operational events and PSA-based event analysis can support the process of operational event analysis at the following stages of the operational event investigation: (1) Initial screening stage. (It introduces an element of quantitative analysis into the selection process. Quantitative analysis of the safety significance of nuclear plant events can be a very useful measure when it comes to selecting internal and external operating experience information for its relevance.) (2) In-depth analysis. (PSA based event evaluation provides a quantitative measure for judging the significance of operational events, contributors to

  17. [Research on fast classification based on LIBS technology and principal component analyses].

    Science.gov (United States)

    Yu, Qi; Ma, Xiao-Hong; Wang, Rui; Zhao, Hua-Feng

    2014-11-01

    Laser-induced breakdown spectroscopy (LIBS) and principal component analysis (PCA) were combined to study aluminum alloy classification in the present article. Classification experiments were carried out on thirteen different standard samples of aluminum alloy belonging to 4 different types, and the results suggested that the LIBS-PCA method can be used for fast classification of aluminum alloys. PCA was used to analyze the spectral data from the LIBS experiments; the three principal components that contribute the most were identified, the principal component scores of the spectra were calculated, and the scores were plotted in three-dimensional coordinates. It was found that the spectral sample points show a clear convergence phenomenon according to the type of aluminum alloy they belong to. This result established the three principal components and a preliminary zoning of the aluminum alloy types. To verify its accuracy, 20 different aluminum alloy samples were used in the same experiments to validate the type zoning. The experimental results showed that the spectral sample points all fell in the areas corresponding to their aluminum alloy types, which proved the correctness of the earlier type zoning based on standard samples. On this basis, the identification of an unknown type of aluminum alloy can be performed. All the experimental results showed that the accuracy of the principal component analysis method based on laser-induced breakdown spectroscopy is more than 97.14%, and it can classify the different types effectively. Compared to commonly used chemical methods, laser-induced breakdown spectroscopy can detect samples in situ and rapidly, with little sample preparation; therefore, combining LIBS and PCA in areas such as quality testing and on-line industrial control can save a lot of time and cost, and greatly improve detection efficiency.
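The PCA scoring step described above can be sketched on synthetic data: project mean-centred spectra onto the first three principal components obtained from an SVD. The "spectra" below are random stand-ins, not real LIBS emission lines:

```python
import numpy as np

rng = np.random.default_rng(42)

# hypothetical LIBS-like spectra: 3 alloy "types", each a distinct base
# spectrum plus measurement noise
n_channels, n_per_type = 100, 20
bases = rng.random((3, n_channels))
spectra = np.vstack([b + 0.05 * rng.standard_normal((n_per_type, n_channels))
                     for b in bases])

# PCA via SVD of the mean-centred data matrix
centred = spectra - spectra.mean(axis=0)
_, s, vt = np.linalg.svd(centred, full_matrices=False)
scores = centred @ vt[:3].T                  # first three principal components
explained = (s[:3] ** 2).sum() / (s ** 2).sum()
print(scores.shape, round(explained, 3))
```

Plotting the three score columns in 3-D reproduces the kind of per-type clustering the article reports for its alloy spectra.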

  18. Network class superposition analyses.

    Directory of Open Access Journals (Sweden)

    Carl A B Pearson

Full Text Available Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈ 10^30 for the yeast cell cycle process), considering dynamics beyond this primary function means picking a single network or a suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses.
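As a minimal illustration of the objects T summarizes, the sketch below enumerates the synchronous state transitions of a tiny boolean network and counts its point attractors (fixed points). The three-node rules are invented for illustration and are unrelated to the yeast cell cycle model:

```python
from itertools import product

# Hypothetical 3-node boolean network: each rule maps the current state
# (a, b, c) to one node's next value under synchronous update.
rules = [
    lambda a, b, c: b and not c,   # next a
    lambda a, b, c: a or c,        # next b
    lambda a, b, c: not a,         # next c
]

def step(state):
    return tuple(int(r(*state)) for r in rules)

# Transition map over all 2^3 states; point attractors are the fixed points.
transitions = {s: step(s) for s in product((0, 1), repeat=3)}
point_attractors = [s for s, t in transitions.items() if s == t]
print(len(transitions), point_attractors)  # 8 [(0, 1, 1), (1, 1, 0)]
```

The ensemble matrix T in the paper is, in effect, the normalized superposition of such transition maps over every network in the class, so attractor statistics can be read off without enumerating individual networks.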

  19. Theorising and Analysing Academic Labour

    Directory of Open Access Journals (Sweden)

    Thomas Allmer

    2018-01-01

    Full Text Available The aim of this article is to contextualise universities historically within capitalism and to analyse academic labour and the deployment of digital media theoretically and critically. It argues that the post-war expansion of the university can be considered as medium and outcome of informational capitalism and as a dialectical development of social achievement and advanced commodification. The article strives to identify the class position of academic workers, introduces the distinction between academic work and labour, discusses the connection between academic, information and cultural work, and suggests a broad definition of university labour. It presents a theoretical model of working conditions that helps to systematically analyse the academic labour process and to provide an overview of working conditions at universities. The paper furthermore argues for the need to consider the development of education technologies as a dialectics of continuity and discontinuity, discusses the changing nature of the forces and relations of production, and the impact on the working conditions of academics in the digital university. Based on Erik Olin Wright’s inclusive approach of social transformation, the article concludes with the need to bring together anarchist, social democratic and revolutionary strategies for establishing a socialist university in a commons-based information society.

  20. SieveSifter: a web-based tool for visualizing the sieve analyses of HIV-1 vaccine efficacy trials.

    Science.gov (United States)

    Fiore-Gartland, Andrew; Kullman, Nicholas; deCamp, Allan C; Clenaghan, Graham; Yang, Wayne; Magaret, Craig A; Edlefsen, Paul T; Gilbert, Peter B

    2017-08-01

    Analysis of HIV-1 virions from participants infected in a randomized controlled preventive HIV-1 vaccine efficacy trial can help elucidate mechanisms of partial protection. By comparing the genetic sequence of viruses from vaccine and placebo recipients to the sequence of the vaccine itself, a technique called 'sieve analysis', one can identify functional specificities of vaccine-induced immune responses. We have created an interactive web-based visualization and data access tool for exploring the results of sieve analyses performed on four major preventive HIV-1 vaccine efficacy trials: (i) the HIV Vaccine Trial Network (HVTN) 502/Step trial, (ii) the RV144/Thai trial, (iii) the HVTN 503/Phambili trial and (iv) the HVTN 505 trial. The tool acts simultaneously as a platform for rapid reinterpretation of sieve effects and as a portal for organizing and sharing the viral sequence data. Access to these valuable datasets also enables the development of novel methodology for future sieve analyses. Visualization: http://sieve.fredhutch.org/viz . Source code: https://github.com/nkullman/SIEVE . Data API: http://sieve.fredhutch.org/data . agartlan@fredhutch.org. © The Author(s) 2017. Published by Oxford University Press.

  1. Comparative physical-chemical characterization of encapsulated lipid-based isotretinoin products assessed by particle size distribution and thermal behavior analyses

    Energy Technology Data Exchange (ETDEWEB)

    Guimaraes, Carla Aiolfi, E-mail: carlaaiolfi@usp.br [Department of Pharmacy, Faculty of Pharmaceutical Sciences, University of Sao Paulo, Sao Paulo, SP 05508-000 (Brazil); Menaa, Farid [Department of Dermatology, School of Medicine Wuerzburg, Wuerzburg 97080 (Germany); Fluorotronics, Inc., 1425 Russ Bvld, San Diego Technology Incubator, San Diego, CA 92101 (United States); Menaa, Bouzid, E-mail: bouzid.menaa@gmail.com [Fluorotronics, Inc., 1425 Russ Bvld, San Diego Technology Incubator, San Diego, CA 92101 (United States); Quenca-Guillen, Joyce S. [Department of Pharmacy, Faculty of Pharmaceutical Sciences, University of Sao Paulo, Sao Paulo, SP 05508-000 (Brazil); Matos, Jivaldo do Rosario [Department of Fundamental Chemistry, Institute of Chemistry, University of Sao Paulo, Sao Paulo, SP 05508-000 (Brazil); Mercuri, Lucildes Pita [Department of Exact and Earth Sciences, Federal University of Sao Paulo, Diadema, SP 09972-270 (Brazil); Braz, Andre Borges [Department of Engineering of Mines and Oil, Polytechnical School, University of Sao Paulo, SP 05508-900 (Brazil); Rossetti, Fabia Cristina [Department of Pharmaceutical Sciences, Faculty of Pharmaceutical Sciences of Ribeirao Preto, University of Sao Paulo, Ribeirao Preto, SP 14015-120 (Brazil); Kedor-Hackmann, Erika Rosa Maria; Santoro, Maria Ines Rocha Miritello [Department of Pharmacy, Faculty of Pharmaceutical Sciences, University of Sao Paulo, Sao Paulo, SP 05508-000 (Brazil)

    2010-06-10

Isotretinoin is the drug of choice for the management of severe recalcitrant nodular acne. Nevertheless, some of its physical-chemical properties are still poorly known. Hence, the aim of our study was to comparatively evaluate the particle size distribution (PSD) and to characterize the thermal behavior of three encapsulated isotretinoin products in oil suspension (one reference and two generics) commercialized in Brazil. Here, we show that the PSD, estimated by laser diffraction and by polarized light microscopy, differed between the generics and the reference product. However, the thermal behavior of the three products, determined by thermogravimetry (TGA), differential thermal analysis (DTA) and differential scanning calorimetry (DSC), displayed no significant differences, and all three were more thermostable than the isotretinoin standard used as internal control. Thus, our study suggests that PSD analyses of isotretinoin lipid-based formulations should be routinely performed in order to improve their quality and bioavailability.

  2. Evaluation of an optoacoustic based gas analysing device

    Science.gov (United States)

    Markmann, Janine; Lange, Birgit; Theisen-Kunde, Dirk; Danicke, Veit; Mayorov, Fedor; Eckert, Sebastian; Kettmann, Pascal; Brinkmann, Ralf

    2017-07-01

The relative occurrence of volatile organic compounds in human respiratory gas is disease-specific (ppb range). A prototype gas analysing device using two tuneable laser systems, an OPO laser (2.5 to 10 μm) and a CO2 laser (9 to 11 μm), and an optoacoustic measurement cell was developed to detect concentrations in the ppb range. The sensitivity and resolution of the system were determined by test gas measurements, measuring ethylene and sulfur hexafluoride with the CO2 laser and butane with the OPO laser. The system sensitivity was found to be 13 ppb for sulfur hexafluoride and 17 ppb for ethylene. Respiratory gas samples of 8 healthy volunteers were investigated by irradiation with 17 laser lines of the CO2 laser. Several of those lines overlap with strong absorption bands of ammonia. As ammonia concentration is known to increase with age, a separation of subjects younger and older than 35 years was attempted. To evaluate the data, the first seven gas samples were used to train a discriminant analysis algorithm. The eighth subject, aged 49 years, was then assigned correctly to the group >35 years.
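The train-on-seven, classify-the-eighth scheme described above is ordinary linear discriminant analysis; a sketch with scikit-learn follows, using synthetic signals as stand-ins for the 17 CO2-laser-line measurements (the group offset is invented, not derived from the paper's data):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
# Synthetic stand-in: 7 training subjects x 17 laser-line signals, with a
# crude "ammonia" offset added for the over-35 group.
age_group = np.array([0, 0, 0, 0, 1, 1, 1])        # 0: <35 years, 1: >35 years
X_train = rng.normal(size=(7, 17)) + age_group[:, None] * 2.0

clf = LinearDiscriminantAnalysis().fit(X_train, age_group)

# Eighth subject, simulated as belonging to the older group.
x_new = rng.normal(size=(1, 17)) + 2.0
print(clf.predict(x_new))
```

With only seven training samples and 17 features, such a classifier can separate the training set perfectly, so the single held-out assignment reported in the abstract is encouraging rather than conclusive.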

  3. Analysing Trust Transitivity and The Effects of Unknown Dependence

    Directory of Open Access Journals (Sweden)

    Touhid Bhuiyan

    2010-03-01

Full Text Available Trust can be used to improve online automated recommendation within a given domain, and trust transitivity is used to make it successful. But trust transitivity has different interpretations. Trust and trust transitivity are both human mental phenomena, and for this reason there is no such thing as objective transitivity. Trust transitivity and trust fusion are both important elements in computational trust. This paper analyses the parameter dependence problem in trust transitivity and proposes some definitions considering the effects of base rate. In addition, it proposes belief functions based on subjective logic to analyse trust transitivity in three specified cases with sensitive and insensitive base rates. It then presents a quantitative analysis of the effects of the unknown dependence problem in an interconnected network environment such as the Internet.
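A minimal sketch of the subjective-logic machinery involved: opinions as (belief, disbelief, uncertainty, base rate) tuples and the standard uncertainty-favouring discounting operator commonly used for trust transitivity. This is the textbook operator, not necessarily the exact belief functions proposed in the paper:

```python
from typing import NamedTuple

class Opinion(NamedTuple):
    b: float  # belief
    d: float  # disbelief
    u: float  # uncertainty (b + d + u = 1)
    a: float  # base rate

def discount(trust_ab: Opinion, op_bx: Opinion) -> Opinion:
    """A's derived opinion about X via advisor B (uncertainty-favouring):
    B's opinion counts only in proportion to A's belief in B; A's disbelief
    and uncertainty about B become uncertainty about X."""
    return Opinion(
        b=trust_ab.b * op_bx.b,
        d=trust_ab.b * op_bx.d,
        u=trust_ab.d + trust_ab.u + trust_ab.b * op_bx.u,
        a=op_bx.a,
    )

a_trusts_b = Opinion(b=0.8, d=0.1, u=0.1, a=0.5)
b_about_x = Opinion(b=0.7, d=0.2, u=0.1, a=0.5)
derived = discount(a_trusts_b, b_about_x)
print(derived)  # belief 0.56, disbelief 0.16, uncertainty 0.28
```

Note how transitivity dilutes belief and inflates uncertainty; the base-rate effects analysed in the paper enter when such opinions are mapped to expected probabilities.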

4. Candelariella placodizans (Candelariaceae) reported new to mainland China and Taiwan based on morphological, chemical and molecular phylogenetic analyses

    Directory of Open Access Journals (Sweden)

    Lidia Yakovchenko

    2016-06-01

Full Text Available Candelariella placodizans is newly reported from China. It was collected on exposed rocks with mosses in alpine areas of Taiwan and Yunnan Province, China, at elevations between 3200 and 4400 m. Molecular phylogenetic analyses based on ITS rDNA sequences were performed to confirm the monophyly of the Chinese populations with respect to already existing sequences of the species, and further to examine their relationships to other members of the genus. An identification key to all 14 known taxa of Candelariella in China is provided.

  5. Sensitivity and uncertainty analyses in aging risk-based prioritizations

    International Nuclear Information System (INIS)

    Hassan, M.; Uryas'ev, S.; Vesely, W.E.

    1993-01-01

Aging risk evaluations of nuclear power plants using Probabilistic Risk Analyses (PRAs) involve assessments of the impact of aging structures, systems, and components (SSCs) on plant core damage frequency (CDF). These assessments can be used to prioritize the contributors to aging risk, reflecting the relative risk potential of the SSCs. Aging prioritizations are important for identifying the SSCs contributing most to plant risk and can provide a systematic basis on which aging risk control and management strategies for a plant can be developed. However, these prioritizations are subject to variabilities arising from uncertainties in data and/or from various modeling assumptions. The objective of this paper is to present an evaluation of the sensitivity of aging prioritizations of active components to uncertainties in aging risk quantifications. Approaches for robust prioritization of SSCs, which are less susceptible to these uncertainties, are also presented.

  6. Failure assessments of corroded pipelines with axial defects using stress-based criteria: Numerical studies and verification analyses

    International Nuclear Information System (INIS)

    Chiodo, Mario S.G.; Ruggieri, Claudio

    2009-01-01

Conventional procedures used to assess the integrity of corroded piping systems with axial defects generally employ simplified failure criteria based upon a plastic collapse failure mechanism incorporating the tensile properties of the pipe material. These methods establish acceptance criteria for defects based on limited experimental data for low strength structural steels which do not necessarily address specific requirements for the high grade steels currently used. For these cases, failure assessments may be overly conservative or provide significant scatter in their predictions, which leads to unnecessary repair or replacement of in-service pipelines. Motivated by these observations, this study examines the applicability of a stress-based criterion based upon plastic instability analysis to predict the failure pressure of corroded pipelines with axial defects. A central focus is to gain additional insight into the effects of defect geometry and material properties on the attainment of a local limit load to support the development of stress-based burst strength criteria. The work provides an extensive body of results which lend further support to adopting failure criteria for corroded pipelines based upon ligament instability analyses. A verification study conducted on burst testing of large-diameter pipe specimens with different defect lengths shows the effectiveness of a stress-based criterion using local ligament instability in burst pressure predictions, even though the adopted burst criterion exhibits a potential dependence on defect geometry and possibly on the material's strain-hardening capacity. Overall, the results presented here suggest that the use of stress-based criteria based upon plastic instability analysis of the defect ligament is a valid engineering tool for integrity assessments of pipelines with axial corroded defects.

  7. Failure assessments of corroded pipelines with axial defects using stress-based criteria: Numerical studies and verification analyses

    Energy Technology Data Exchange (ETDEWEB)

    Chiodo, Mario S.G. [Department of Naval Architecture and Ocean Engineering, University of Sao Paulo, Av. Prof. Mello Moraes, 2231 (PNV-EPUSP), Sao Paulo, SP 05508-030 (Brazil); Ruggieri, Claudio [Department of Naval Architecture and Ocean Engineering, University of Sao Paulo, Av. Prof. Mello Moraes, 2231 (PNV-EPUSP), Sao Paulo, SP 05508-030 (Brazil)], E-mail: claudio.ruggieri@poli.usp.br

    2009-02-15

Conventional procedures used to assess the integrity of corroded piping systems with axial defects generally employ simplified failure criteria based upon a plastic collapse failure mechanism incorporating the tensile properties of the pipe material. These methods establish acceptance criteria for defects based on limited experimental data for low strength structural steels which do not necessarily address specific requirements for the high grade steels currently used. For these cases, failure assessments may be overly conservative or provide significant scatter in their predictions, which leads to unnecessary repair or replacement of in-service pipelines. Motivated by these observations, this study examines the applicability of a stress-based criterion based upon plastic instability analysis to predict the failure pressure of corroded pipelines with axial defects. A central focus is to gain additional insight into the effects of defect geometry and material properties on the attainment of a local limit load to support the development of stress-based burst strength criteria. The work provides an extensive body of results which lend further support to adopting failure criteria for corroded pipelines based upon ligament instability analyses. A verification study conducted on burst testing of large-diameter pipe specimens with different defect lengths shows the effectiveness of a stress-based criterion using local ligament instability in burst pressure predictions, even though the adopted burst criterion exhibits a potential dependence on defect geometry and possibly on the material's strain-hardening capacity. Overall, the results presented here suggest that the use of stress-based criteria based upon plastic instability analysis of the defect ligament is a valid engineering tool for integrity assessments of pipelines with axial corroded defects.
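For orientation, the kind of simplified stress-based acceptance criterion this work benchmarks against is exemplified by the modified B31G burst-pressure estimate for a single axial metal-loss defect. The sketch below implements that textbook formula as an illustration of stress-based criteria in general, not the authors' local-ligament instability criterion; the pipe values are invented:

```python
import math

def modified_b31g_burst(D, t, L, d, smys):
    """Modified B31G burst pressure (0.85 dL effective area) for an axial defect.

    D (diameter), t (wall), L (defect length), d (defect depth) in mm;
    smys (specified minimum yield strength) in MPa. Short-defect form,
    valid for z = L**2 / (D * t) <= 50.
    """
    z = L**2 / (D * t)
    if z > 50:
        raise ValueError("long-defect form of the Folias factor required")
    M = math.sqrt(1 + 0.6275 * z - 0.003375 * z**2)   # Folias bulging factor
    sigma_flow = smys + 68.95                          # flow stress, MPa
    return (2 * t * sigma_flow / D) * (1 - 0.85 * d / t) / (1 - 0.85 * d / (t * M))

# Invented example: 508 mm pipe, 9.5 mm wall, 200 mm long defect, 38% deep, X65.
p_burst = modified_b31g_burst(D=508, t=9.5, L=200, d=0.38 * 9.5, smys=448)
print(round(p_burst, 1))  # MPa
```

The scatter such closed-form criteria show for high-grade steels is precisely what motivates the ligament-instability analyses pursued in the paper.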

  8. Design and Execution of make-like, distributed Analyses based on Spotify’s Pipelining Package Luigi

    Science.gov (United States)

    Erdmann, M.; Fischer, B.; Fischer, R.; Rieger, M.

    2017-10-01

In high-energy particle physics, workflow management systems are primarily used as tailored solutions in dedicated areas such as Monte Carlo production. However, physicists performing data analyses are usually required to steer their individual workflows manually, which is time-consuming and often leads to undocumented relations between particular workloads. We present a generic analysis design pattern that copes with the sophisticated demands of end-to-end HEP analyses and provides a make-like execution system. It is based on the open-source pipelining package Luigi, which was developed at Spotify and enables the definition of arbitrary workloads, so-called Tasks, and the dependencies between them in a lightweight and scalable structure. Further features are multi-user support, automated dependency resolution and error handling, central scheduling, and status visualization in the web. In addition to already built-in features for remote jobs and file systems like Hadoop and HDFS, we added support for WLCG infrastructure such as LSF and CREAM job submission, as well as remote file access through the Grid File Access Library. Furthermore, we implemented automated resubmission functionality, software sandboxing, and a command line interface with auto-completion for a convenient working environment. For the implementation of a ttH cross section measurement, we created a generic Python interface that provides programmatic access to all external information such as datasets, physics processes, statistical models, and additional files and values. In summary, the setup enables the execution of the entire analysis in a parallelized and distributed fashion with a single command.
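The make-like pattern described above (targets, dependencies, run only what is missing) can be sketched in a few lines of plain Python; this illustrates the execution model only, not Luigi's actual API:

```python
class Task:
    """A target with dependencies; 'done' flips once run() has produced output."""
    def __init__(self, name, requires=()):
        self.name, self.requires, self.done = name, list(requires), False

    def run(self, log):
        log.append(self.name)
        self.done = True

def build(task, log):
    """Depth-first, make-like execution: run dependencies first, skip done work."""
    if task.done:
        return
    for dep in task.requires:
        build(dep, log)
    task.run(log)

# A toy analysis chain: ntuple production feeds histograms, both feed the fit.
ntuples = Task("ntuples")
histograms = Task("histograms", requires=[ntuples])
fit = Task("fit", requires=[histograms, ntuples])

log = []
build(fit, log)
print(log)  # ['ntuples', 'histograms', 'fit'] — each task runs exactly once
```

In Luigi the same roles are played by `Task` subclasses with `requires()`, `output()` and `run()` methods, plus the scheduler features (multi-user support, error handling, web status) listed in the abstract.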

  9. Homeopathy: meta-analyses of pooled clinical data.

    Science.gov (United States)

    Hahn, Robert G

    2013-01-01

In the first decade of the evidence-based era, which began in the mid-1990s, meta-analyses were used to scrutinize homeopathy for evidence of beneficial effects in medical conditions. In this review, meta-analyses including pooled data from placebo-controlled clinical trials of homeopathy and the aftermath in the form of debate articles were analyzed. In 1997 Klaus Linde and co-workers identified 89 clinical trials that showed an overall odds ratio of 2.45 in favor of homeopathy over placebo. There was a trend toward smaller benefit from studies of the highest quality, but the 10 trials with the highest Jadad score still showed homeopathy had a statistically significant effect. These results challenged academics to perform alternative analyses that, to demonstrate the lack of effect, relied on extensive exclusion of studies, often to the degree that conclusions were based on only 5-10% of the material, or on virtual data. The ultimate argument against homeopathy is the 'funnel plot' published by Aijing Shang's research group in 2005. However, the funnel plot is flawed when applied to a mixture of diseases, because studies with expected strong treatment effects are, for ethical reasons, powered lower than studies with expected weak or unclear treatment effects. To conclude that homeopathy lacks clinical effect, more than 90% of the available clinical trials had to be disregarded. Alternatively, flawed statistical methods had to be applied. Future meta-analyses should focus on the use of homeopathy in specific diseases or groups of diseases instead of pooling data from all clinical trials. © 2013 S. Karger GmbH, Freiburg.

  10. Model-based performance and energy analyses of reverse osmosis to reuse wastewater in a PVC production site.

    Science.gov (United States)

    Hu, Kang; Fiedler, Thorsten; Blanco, Laura; Geissen, Sven-Uwe; Zander, Simon; Prieto, David; Blanco, Angeles; Negro, Carlos; Swinnen, Nathalie

    2017-11-10

A pilot-scale reverse osmosis (RO) unit downstream of a membrane bioreactor (MBR) was developed for desalination to reuse wastewater in a PVC production site. The solution-diffusion-film model (SDFM), based on the solution-diffusion model (SDM) and film theory, was proposed to describe rejections of electrolyte mixtures in the MBR effluent, which consists of dominant ions (Na+ and Cl-) and several trace ions (Ca2+, Mg2+, K+ and SO4^2-). A universal global optimisation method was used to estimate the ion permeability coefficients (B) and mass transfer coefficients (K) in the SDFM. The membrane performance was then evaluated based on the estimated parameters, which demonstrated that the theoretical simulations were in line with the experimental results for the dominant ions. Moreover, an energy analysis model taking into account the limitation imposed by the thermodynamic restriction was proposed to analyse the specific energy consumption of the pilot-scale RO system in various scenarios.
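The combination of solution-diffusion transport with film-theory concentration polarization can be sketched as follows. These are the textbook relations underlying an SDFM-type description, with invented coefficient values; the paper's fitted B and K are not reproduced here:

```python
import math

def observed_rejection(Jv, B, K):
    """Observed rejection R_obs = 1 - c_p/c_b from an SDFM-type description.

    Jv: permeate (water) flux, B: ion permeability coefficient,
    K: mass transfer coefficient, all in consistent units (e.g. m/s).
    Combining solute flux Js = B*(c_m - c_p) with the film-theory relation
    (c_m - c_p) = (c_b - c_p)*exp(Jv/K) gives the closed form below.
    """
    return Jv / (Jv + B * math.exp(Jv / K))

def real_rejection(Jv, B):
    """Rejection referred to the membrane-wall concentration c_m."""
    return Jv / (Jv + B)

# Invented values: a higher B (leakier ion) lowers rejection, and a finite K
# (concentration polarization) pushes observed rejection below the real one.
Jv, K = 1.5e-5, 3.0e-5
for B in (1e-7, 1e-6):
    print(B, round(real_rejection(Jv, B), 4), round(observed_rejection(Jv, B, K), 4))
```

Fitting B and K per ion against measured rejections at several fluxes, as done in the paper with a global optimiser, then reduces to ordinary parameter estimation on this model.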

  11. Theoretical analyses of superconductivity in iron based ...

    African Journals Online (AJOL)

This paper focuses on the theoretical analysis of superconductivity in the iron-based superconductor Ba1-xKxFe2As2. After reviewing the current findings on this system, we suggest that a combined phonon-exciton mechanism gives the right order of superconducting transition temperature (TC) for Ba1-xKxFe2As2. By developing ...

  12. The development of an on-line gold analyser

    International Nuclear Information System (INIS)

    Robert, R.V.D.; Ormrod, G.T.W.

    1982-01-01

An on-line analyser to monitor the gold in solutions from the carbon-in-pulp process is described. The automatic system is based on the delivery of filtered samples of the solutions to a distribution valve for measurement by flameless atomic-absorption spectrophotometry. The sample is introduced by the aerosol-deposition method. Operation of the analyser on a pilot plant and on a full-scale carbon-in-pulp plant has shown that the system is economically feasible and capable of providing a continuous indication of the efficiency of the extraction process.

  13. Voxel-based morphometry analyses of in-vivo MRI in the aging mouse lemur primate

    Directory of Open Access Journals (Sweden)

    Stephen John Sawiak

    2014-05-01

Full Text Available Cerebral atrophy is one of the most widespread brain alterations associated with aging, and a clear relationship has been established between age-associated cognitive impairments and cerebral atrophy. The mouse lemur (Microcebus murinus) is a small primate used as a model of age-related neurodegenerative processes. It is the first nonhuman primate in which cerebral atrophy has been correlated with cognitive deficits. Previous studies of cerebral atrophy in this model were based on time-consuming manual delineation or measurement of selected brain regions from magnetic resonance images (MRI). These measures could not be used to analyse regions that cannot be easily outlined, such as the nucleus basalis of Meynert or the subiculum. In humans, morphometric assessment of structural changes with age is generally performed with automated procedures such as voxel-based morphometry (VBM). The objective of our work was to perform user-independent assessment of age-related morphological changes in the whole brain of large mouse lemur populations using VBM. The study was based on the SPMMouse toolbox of SPM 8 and involved thirty mouse lemurs aged from 1.9 to 11.3 years. The automatic method revealed for the first time atrophy in regions where manual delineation is prohibitive (nucleus basalis of Meynert, subiculum, prepiriform cortex, Brodmann areas 13-16, hypothalamus, putamen, thalamus, corpus callosum). Some of these regions are described as particularly sensitive to age-associated alterations in humans. The method also revealed age-associated atrophy in cortical regions (cingulate, occipital, parietal), the nucleus septalis, and the caudate. Manual measures performed in some of these regions were in good agreement with the automatic results. The templates generated in this study, as well as the toolbox for SPM8, can be downloaded. These tools will be valuable for future evaluation of various treatments that are tested to modulate cerebral aging in lemurs.
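At its core, VBM is a mass-univariate test at every voxel of spatially normalized, smoothed tissue maps. A minimal sketch of the voxelwise age regression (synthetic data standing in for the lemur MRI; the SPM machinery for normalization, smoothing and multiple-comparison correction is omitted):

```python
import numpy as np

rng = np.random.default_rng(2)
n_subjects, shape = 30, (8, 8, 8)
ages = rng.uniform(1.9, 11.3, size=n_subjects)

# Synthetic "gray matter density" maps: one corner region atrophies with age.
maps = rng.normal(1.0, 0.05, size=(n_subjects,) + shape)
maps[:, :2, :2, :2] -= 0.02 * ages[:, None, None, None]

# Voxelwise least-squares slope of density vs age (design matrix: [age, 1]).
X = np.column_stack([ages, np.ones(n_subjects)])
Y = maps.reshape(n_subjects, -1)
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
slopes = beta[0].reshape(shape)

# Negative slopes flag candidate atrophy; a real VBM adds voxelwise t-statistics
# and correction for the huge number of tests.
print(slopes[:2, :2, :2].mean(), slopes[4:, 4:, 4:].mean())
```

This is why the approach is user-independent: no region needs to be outlined in advance, which is exactly what allowed structures like the nucleus basalis of Meynert to be examined.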

  14. Development of ITER 3D neutronics model and nuclear analyses

    International Nuclear Information System (INIS)

    Zeng, Q.; Zheng, S.; Lu, L.; Li, Y.; Ding, A.; Hu, H.; Wu, Y.

    2007-01-01

ITER nuclear analyses rely on calculations with a three-dimensional (3D) Monte Carlo code, e.g. the widely used MCNP. However, continuous changes in the design of the components require that the 3D neutronics model for nuclear analyses be updated. Modeling a complex geometry in MCNP by hand is a very time-consuming task, so an efficient approach is to develop CAD-based interface codes for automatic conversion from CAD models to MCNP input files. Based on the latest CAD model and the available interface codes, two approaches to updating the 3D neutronics model have been discussed by the ITER IT (International Team). The first is to start with the existing MCNP model 'Brand' and update it through a combination of direct modification of the MCNP input file and generation of models for some components directly from the CAD data. The second is to start from the full CAD model, make the necessary simplifications, and generate the MCNP model with one of the interface codes. MCAM, an advanced CAD-based MCNP interface code developed by the FDS Team in China, has been successfully applied to update the ITER 3D neutronics model by both approaches. The Brand model has been updated by generating portions of the geometry from the newest CAD model with MCAM. MCAM has also successfully converted to an MCNP neutronics model a full ITER CAD model which was simplified and issued by ITER IT to benchmark the above interface codes. Based on the two updated 3D neutronics models, the related nuclear analyses were performed. This paper presents the status of ITER 3D modeling using MCAM and its nuclear analyses, as well as a brief introduction to an advanced version of MCAM. (authors)

  15. Seismic fragility analyses of nuclear power plant structures based on the recorded earthquake data in Korea

    International Nuclear Information System (INIS)

    Cho, Sung Gook; Joe, Yang Hee

    2005-01-01

By nature, the seismic fragility analysis results will be considerably affected by the statistical data of design information and site-dependent ground motions. The engineering characteristics of small magnitude earthquake spectra recorded in the Korean peninsula during the last several years are analyzed in this paper. An improved method of seismic fragility analysis is evaluated by comparative analyses to verify its efficiency for practical application to nuclear power plant structures. The effects of the recorded earthquakes on the seismic fragilities of Korean nuclear power plant structures are also evaluated in the comparative studies. From the obtained results, the proposed method is more efficient for multi-mode structures. The case study results show that seismic fragility analysis based on Newmark's spectra might over-estimate the seismic capacities of Korean facilities.

  16. Seismic fragility analyses of nuclear power plant structures based on the recorded earthquake data in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Sung Gook [Department of Civil and Environmental System Engineering, University of Incheon, 177 Dohwa-dong, Nam-gu, Incheon 402-749 (Korea, Republic of)]. E-mail: sgcho@incheon.ac.kr; Joe, Yang Hee [Department of Civil and Environmental System Engineering, University of Incheon, 177 Dohwa-dong, Nam-gu, Incheon 402-749 (Korea, Republic of)

    2005-08-01

By nature, the seismic fragility analysis results will be considerably affected by the statistical data of design information and site-dependent ground motions. The engineering characteristics of small magnitude earthquake spectra recorded in the Korean peninsula during the last several years are analyzed in this paper. An improved method of seismic fragility analysis is evaluated by comparative analyses to verify its efficiency for practical application to nuclear power plant structures. The effects of the recorded earthquakes on the seismic fragilities of Korean nuclear power plant structures are also evaluated in the comparative studies. From the obtained results, the proposed method is more efficient for multi-mode structures. The case study results show that seismic fragility analysis based on Newmark's spectra might over-estimate the seismic capacities of Korean facilities.

  17. Implication of the cause of differences in 3D structures of proteins with high sequence identity based on analyses of amino acid sequences and 3D structures.

    Science.gov (United States)

    Matsuoka, Masanari; Sugita, Masatake; Kikuchi, Takeshi

    2014-09-18

Proteins that share high sequence homology while exhibiting drastically different 3D structures are investigated in this study. Recently, artificial proteins related to the sequences of the human serum albumin-binding GA domain and the IgG-binding GB domain have been designed. These artificial proteins, referred to as GA and GB, share 98% amino acid sequence identity but exhibit different 3D structures, namely a 3α bundle versus a 4β + α fold. Discriminating between their 3D structures based on their amino acid sequences alone is a very difficult problem. In the present work, in addition to standard bioinformatics techniques, an analysis based on inter-residue average distance statistics is used to address this problem. Ordinary analyses such as BLAST searches and conservation analyses alone could not distinguish which structure a given sequence would adopt. Supplemented by the analysis based on inter-residue average distance statistics and our sequence tendency analysis, however, we could infer which parts of the sequences play an important role in structural formation. The results suggest possible determinants of the different 3D structures for sequences with high sequence identity. The possibility of discriminating between the 3D structures based on the given sequences is also discussed.

  18. Spectromicroscopic Insights into the Morphology and Interfaces of Operational Organic Electronic Devices

    OpenAIRE

    Du, Xiaoyan

    2017-01-01

    Organic electronics, e.g., organic field-effect transistors (OFETs), organic solar cells (OSCs) and organic light-emitting diodes (OLEDs), have attracted strong interest in both academia and industry during the last decades due to their unique capabilities offered by organic semiconductors. The micro-/nano-structures in active layers and the interface engineering in organic electronics are extremely important for desired device functionalities. In this thesis, the structure-function relations...

  19. Seismic fragility analyses

    International Nuclear Information System (INIS)

    Kostov, Marin

    2000-01-01

    In the last two decades an increasing number of probabilistic seismic risk assessments have been performed. The basic ideas of the procedure for performing a Probabilistic Safety Analysis (PSA) of critical structures (NUREG/CR-2300, 1983) can also be applied to ordinary industrial and residential buildings, dams and other structures. The general formulation of the risk assessment procedure applied in this investigation is presented in Franzini et al., 1984. The probability of failure of a structure over an expected lifetime (for example 50 years) can be obtained from the annual frequency of failure β_E, determined by the relation β_E = ∫ |dβ(x)/dx| P(f|x) dx, where β(x) is the annual frequency of exceedance of load level x (the variable x may be, for example, peak ground acceleration) and P(f|x) is the conditional probability of structural failure at a given seismic load level x. The problem thus reduces to assessing the seismic hazard β(x) and the fragility P(f|x). The seismic hazard curves are obtained by probabilistic seismic hazard analysis. The fragility curves are obtained after the response of the structure is described probabilistically and its capacity, with the associated uncertainties, is assessed. Finally, the fragility curves are combined with the seismic loading to estimate the frequency of failure for each critical scenario. The frequency of failure due to a seismic event is represented by the scenario with the highest frequency. The tools usually applied for probabilistic safety analyses of critical structures can relatively easily be adapted to ordinary structures. The key problems are the seismic hazard definition and the fragility analyses. The fragility can be derived either by scaling procedures or by generation. Both approaches are presented in the paper. After the seismic risk (in terms of failure probability) is assessed there are several approaches for risk reduction. Generally the methods can be classified in two groups. The
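    As a sketch of the failure-frequency integral β_E = ∫ |dβ(x)/dx| P(f|x) dx described in this record, the following minimal Python example evaluates it numerically for a hypothetical power-law hazard curve and lognormal fragility curve (all numerical values are illustrative assumptions, not taken from the paper):

```python
import math

def annual_failure_frequency(hazard, fragility, x_lo, x_hi, n=2000):
    """Evaluate beta_E = integral |d beta(x)/dx| * P(f|x) dx by a midpoint rule."""
    dx = (x_hi - x_lo) / n
    total = 0.0
    for i in range(n):
        x = x_lo + (i + 0.5) * dx
        # central-difference slope of the hazard curve beta(x)
        slope = (hazard(x + dx) - hazard(x - dx)) / (2 * dx)
        total += abs(slope) * fragility(x) * dx
    return total

# Hypothetical hazard: annual exceedance frequency of peak ground acceleration x (in g)
hazard = lambda x: 1e-3 * (0.1 / x) ** 2

# Hypothetical lognormal fragility: median capacity 0.5 g, log-standard deviation 0.4
def fragility(x, median=0.5, beta=0.4):
    return 0.5 * (1 + math.erf(math.log(x / median) / (beta * math.sqrt(2))))

beta_E = annual_failure_frequency(hazard, fragility, 0.05, 2.0)
print(f"annual failure frequency ~ {beta_E:.2e}")
```

    The integrand is dominated by load levels near the median capacity, where the hazard is still appreciable and the fragility is no longer negligible.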

  20. Analyses of the soil surface dynamic of South African Kalahari salt pans based on hyperspectral and multitemporal data

    Science.gov (United States)

    Milewski, Robert; Chabrillat, Sabine; Behling, Robert; Mielke, Christian; Schleicher, Anja Maria; Guanter, Luis

    2016-04-01

    The consequences of climate change represent a major threat to sustainable development and growth in Southern Africa. Understanding the impact on the geo- and biosphere is therefore of great importance in this particular region. In this context, the Kalahari salt pans (also known as playas or sabkhas) and their peripheral saline and alkaline habitats are an ecosystem of major interest. They are very sensitive to environmental conditions, and thus hydrological, mineralogical and ecological responses to climatic variations can be analysed. Up to now, the soil composition of salt pans in this area has been assessed only mono-temporally and at a coarse regional scale. Furthermore, the dynamics of the salt pans, especially the formation of evaporites, are still uncertain and poorly understood. High-spectral-resolution remote sensing can estimate the evaporite content and mineralogy of soils based on analyses of the surface reflectance properties within the Visible-Near InfraRed (VNIR, 400-1000 nm) and Short-Wave InfraRed (SWIR, 1000-2500 nm) regions. In these wavelength regions, major chemical components of the soil interact with the electromagnetic radiation and produce characteristic absorption features that can be used to derive the properties of interest. Although such techniques are well established at the laboratory and field scale, the potential of current (Hyperion) and upcoming spaceborne sensors such as EnMAP for quantitative mineralogical and salt spectral mapping is still to be demonstrated. Combined with hyperspectral methods, multitemporal remote sensing techniques allow us to derive the recent dynamics of these salt pans and to link the mineralogical analysis of the pan surface to major physical processes in these dryland environments. In this study we focus on the analysis of the Namibian Omongwa salt pans based on satellite hyperspectral imagery and multispectral time-series data. First, a change detection analysis is applied using the Iterative

  1. Chapter No.4. Safety analyses

    International Nuclear Information System (INIS)

    2002-01-01

    for NPP V-1 Bohunice and on review of the impact of the modelling of selected components on the results of the calculational safety analysis (a sensitivity study for NPP Mochovce). In 2001 UJD joined a new European project, Alternative Approaches to Safety Performance Indicators. The project is aimed at collecting information and determining approaches and recommendations for the implementation of risk-oriented indicators, identification of the impact of the safety culture level and organisational culture on safety, and application of the indicators to the needs of regulators and operators. Within the framework of the PHARE project, UJD participated in the task focused on severe accident mitigation for nuclear power plants with VVER-440/V213 units. The main results of the analyses of nuclear power plant responses to severe accidents were summarised, and the state of the analytical base built in the past was evaluated within the project. Possible severe accident mitigation and prevention measures were proposed and their applicability to nuclear power plants with VVER-440/V213 units was investigated. The obtained results will be used in the assessment activities and accident management of UJD. UJD has also been involved in the EVITA project, which is part of the 5th EC Framework Programme. The project aims at validation of the European computer code ASTEC, dedicated to severe accident modelling. In 2001 the ASTEC computer code was tested on different platforms. The results of the testing are summarised in a technical report of the EC issued in September 2001. Further activities within this project focused on analyses of selected accident scenarios and comparison of the obtained results with analyses carried out with other computer codes. Work on the project will continue in 2002. In 2001, groundwork on establishing the Centre for Nuclear Safety in Central and Eastern Europe (CENS), the seat of which is going to be in Bratislava, continued. The

  2. How often do sensitivity analyses for economic parameters change cost-utility analysis conclusions?

    Science.gov (United States)

    Schackman, Bruce R; Gold, Heather Taffet; Stone, Patricia W; Neumann, Peter J

    2004-01-01

    There is limited evidence about the extent to which sensitivity analysis has been used in the cost-effectiveness literature. Sensitivity analyses for health-related quality of life (HR-QOL), cost and discount-rate economic parameters are of particular interest because they measure the effects of methodological and estimation uncertainties. Our objective was to investigate the use of sensitivity analyses in the pharmaceutical cost-utility literature in order to test whether a change in economic parameters could result in a different conclusion regarding the cost effectiveness of the intervention analysed. Cost-utility analyses of pharmaceuticals identified in a prior comprehensive audit (70 articles) were reviewed and further audited. For each base case for which sensitivity analyses were reported (n = 122), up to two sensitivity analyses for HR-QOL (n = 133), cost (n = 99) and discount rate (n = 128) were examined. Article mentions of thresholds for acceptable cost-utility ratios were recorded (36 in total). Cost-utility ratios were denominated in US dollars for the year reported in each of the original articles in order to determine whether a different conclusion would have been indicated at the time the article was published. Quality ratings from the original audit for articles in which sensitivity analysis results crossed the cost-utility ratio threshold above the base-case result were compared with those that did not. The most frequently mentioned cost-utility thresholds were $US20,000/QALY, $US50,000/QALY and $US100,000/QALY. The proportions of sensitivity analyses reporting quantitative results that crossed the threshold above the base-case results (or where the sensitivity analysis result was dominated) were 31% for HR-QOL sensitivity analyses, 20% for cost sensitivity analyses, and 15% for discount-rate sensitivity analyses. Almost half of the discount-rate sensitivity analyses did not report quantitative results. Articles that reported sensitivity analyses where results crossed the cost
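    The audit logic described in this record — checking whether a sensitivity-analysis result crosses a cost-utility threshold that the base-case ratio fell below — can be sketched as follows (the ratios are hypothetical; the thresholds are the ones mentioned in the abstract):

```python
THRESHOLDS = [20_000, 50_000, 100_000]  # $US/QALY thresholds mentioned in the audit

def crosses_threshold(base_case, sensitivity_result, threshold):
    """True if the sensitivity result lies above a threshold that the
    base-case ratio was at or below, i.e. the conclusion could change."""
    return base_case <= threshold < sensitivity_result

# Hypothetical base case of $US35,000/QALY, re-estimated at $US72,000/QALY
# when an HR-QOL weight is varied in a sensitivity analysis
base, sens = 35_000, 72_000
flips = [t for t in THRESHOLDS if crosses_threshold(base, sens, t)]
print(flips)  # the $US50,000/QALY threshold is crossed
```

    Only the middle threshold flips the conclusion here: the base case already exceeds $US20,000/QALY, and neither estimate exceeds $US100,000/QALY.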

  3. Distribution of Prochlorococcus Ecotypes in the Red Sea Basin Based on Analyses of rpoC1 Sequences

    KAUST Repository

    Shibl, Ahmed A.; Haroon, Mohamed; Ngugi, David; Thompson, Luke R.; Stingl, Ulrich

    2016-01-01

    The marine picocyanobacteria Prochlorococcus represent a significant fraction of the global pelagic bacterioplankton community. Specifically, in the surface waters of the Red Sea, they account for around 91% of the phylum Cyanobacteria. Previous work suggested a widespread presence of high-light (HL)-adapted ecotypes in the Red Sea with the occurrence of low-light (LL)-adapted ecotypes at intermediate depths in the water column. To obtain a more comprehensive dataset over a wider biogeographical scope, we used a 454-pyrosequencing approach to analyze the diversity of the Prochlorococcus rpoC1 gene from a total of 113 samples at various depths (up to 500 m) from 45 stations spanning the Red Sea basin from north to south. In addition, we analyzed 45 metagenomes from eight stations using hidden Markov models based on a set of reference Prochlorococcus genomes to (1) estimate the relative abundance of Prochlorococcus based on 16S rRNA gene sequences, and (2) identify and classify rpoC1 sequences as an assessment of the community structure of Prochlorococcus in the northern, central and southern regions of the basin without amplification bias. Analyses of metagenomic data indicated that Prochlorococcus occurs at a relative abundance of around 9% in samples from surface waters (25, 50, 75 m), 3% in intermediate waters (100 m) and around 0.5% in deep-water samples (200–500 m). Results based on rpoC1 sequences using both methods showed that HL II cells dominate surface waters and were also present in deep-water samples. Prochlorococcus communities in intermediate waters (100 m) showed a higher diversity and co-occurrence of low-light and high-light ecotypes. Prochlorococcus communities at each depth range (surface, intermediate, deep sea) did not change significantly over the sampled transects spanning most of the Saudi waters in the Red Sea. Statistical analyses of rpoC1 sequences from metagenomes indicated that the vertical distribution of Prochlorococcus in the water

  4. Distribution of Prochlorococcus Ecotypes in the Red Sea Basin Based on Analyses of rpoC1 Sequences

    KAUST Repository

    Shibl, Ahmed A.

    2016-06-25

    The marine picocyanobacteria Prochlorococcus represent a significant fraction of the global pelagic bacterioplankton community. Specifically, in the surface waters of the Red Sea, they account for around 91% of the phylum Cyanobacteria. Previous work suggested a widespread presence of high-light (HL)-adapted ecotypes in the Red Sea with the occurrence of low-light (LL)-adapted ecotypes at intermediate depths in the water column. To obtain a more comprehensive dataset over a wider biogeographical scope, we used a 454-pyrosequencing approach to analyze the diversity of the Prochlorococcus rpoC1 gene from a total of 113 samples at various depths (up to 500 m) from 45 stations spanning the Red Sea basin from north to south. In addition, we analyzed 45 metagenomes from eight stations using hidden Markov models based on a set of reference Prochlorococcus genomes to (1) estimate the relative abundance of Prochlorococcus based on 16S rRNA gene sequences, and (2) identify and classify rpoC1 sequences as an assessment of the community structure of Prochlorococcus in the northern, central and southern regions of the basin without amplification bias. Analyses of metagenomic data indicated that Prochlorococcus occurs at a relative abundance of around 9% in samples from surface waters (25, 50, 75 m), 3% in intermediate waters (100 m) and around 0.5% in deep-water samples (200–500 m). Results based on rpoC1 sequences using both methods showed that HL II cells dominate surface waters and were also present in deep-water samples. Prochlorococcus communities in intermediate waters (100 m) showed a higher diversity and co-occurrence of low-light and high-light ecotypes. Prochlorococcus communities at each depth range (surface, intermediate, deep sea) did not change significantly over the sampled transects spanning most of the Saudi waters in the Red Sea. Statistical analyses of rpoC1 sequences from metagenomes indicated that the vertical distribution of Prochlorococcus in the water

  5. A two-channel wave analyser for sounding rockets and satellites

    International Nuclear Information System (INIS)

    Brondz, E.

    1989-04-01

    Studies of low-frequency electromagnetic waves, produced originally by lightning discharges penetrating the ionosphere, provide an important source of valuable information about the earth's surrounding plasma. The use of rockets and satellites, supported by ground-based observations, offers a unique opportunity to measure a number of parameters in situ simultaneously in order to correlate data from various measurements. However, every rocket experiment has to be designed bearing in mind telemetry limitations and/or short flight duration. Typical flight duration for Norwegian rockets launched from Andoeya Rocket Range is 500 to 600 s. Therefore, the most desirable way to use a rocket or satellite is to carry out data analyses on board in real time. Recent achievements in Digital Signal Processing (DSP) technology have made it possible to undertake very complex on-board data manipulation. As part of a rocket instrumentation, a DSP-based unit able to carry out on-board analyses of low-frequency electromagnetic waves in the ionosphere has been designed. The unit can be seen as a general-purpose computer built around a fixed-point 16-bit signal processor. The unit is supplied with program code to perform wave analyses on two independent channels simultaneously. The analyser performs 256-point complex fast Fourier transforms and produces a spectral power density estimate on both channels every 85 ms. The design and construction of the DSP-based unit are described and results from the tests are presented
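    The on-board processing described here — a 256-point FFT yielding a spectral power density estimate per channel — can be sketched offline as follows. This is a minimal NumPy illustration; the sampling rate, Hann windowing and test tones are assumptions for the demo, not details taken from the instrument:

```python
import numpy as np

FFT_LEN = 256  # complex FFT length, as on the flight unit

def psd_estimate(block, fs):
    """One-sided power spectral density estimate of one 256-sample block."""
    assert len(block) == FFT_LEN
    win = np.hanning(FFT_LEN)
    spec = np.fft.fft(block * win)
    # normalise by window power and sampling rate; keep positive frequencies
    psd = (np.abs(spec[:FFT_LEN // 2]) ** 2) / (fs * np.sum(win ** 2))
    psd[1:] *= 2.0  # fold negative-frequency energy into the one-sided estimate
    return psd

# Two independent channels carrying test tones at 500 Hz and 1.2 kHz
fs = 6000.0
t = np.arange(FFT_LEN) / fs
ch1 = np.sin(2 * np.pi * 500.0 * t)
ch2 = np.sin(2 * np.pi * 1200.0 * t)
freqs = np.fft.fftfreq(FFT_LEN, 1 / fs)[:FFT_LEN // 2]

peaks = {}
for name, ch in (("ch1", ch1), ("ch2", ch2)):
    peaks[name] = freqs[np.argmax(psd_estimate(ch, fs))]
    print(name, "spectral peak near", peaks[name], "Hz")
```

    With a 256-point transform the frequency resolution is fs/256, so each tone is recovered to within one bin.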

  6. Sensitivity and uncertainty analyses for performance assessment modeling

    International Nuclear Information System (INIS)

    Doctor, P.G.

    1988-08-01

    Sensitivity and uncertainty analyses methods for computer models are being applied in performance assessment modeling in the geologic high level radioactive waste repository program. The models used in performance assessment tend to be complex physical/chemical models with large numbers of input variables. There are two basic approaches to sensitivity and uncertainty analyses: deterministic and statistical. The deterministic approach to sensitivity analysis involves numerical calculation or employs the adjoint form of a partial differential equation to compute partial derivatives; the uncertainty analysis is based on Taylor series expansions of the input variables propagated through the model to compute means and variances of the output variable. The statistical approach to sensitivity analysis involves a response surface approximation to the model with the sensitivity coefficients calculated from the response surface parameters; the uncertainty analysis is based on simulation. The methods each have strengths and weaknesses. 44 refs
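    The deterministic approach described in this record (Taylor-series expansion of the input variables propagated through the model to obtain means and variances of the output) can be illustrated with a minimal sketch, assuming independent inputs and a hypothetical two-parameter model:

```python
import math

def propagate(model, means, variances, eps=1e-6):
    """First-order Taylor propagation through a scalar model, assuming
    independent inputs: var(y) ~ sum_i (df/dx_i)^2 * var(x_i)."""
    y0 = model(*means)
    var_y = 0.0
    for i, (m, v) in enumerate(zip(means, variances)):
        shifted = list(means)
        shifted[i] = m + eps
        dfdx = (model(*shifted) - y0) / eps  # forward-difference sensitivity
        var_y += dfdx ** 2 * v
    return y0, var_y

# Hypothetical release model y = k * exp(-lam * t) evaluated at t = 10
model = lambda k, lam: k * math.exp(-lam * 10.0)
mean_y, var_y = propagate(model, means=[2.0, 0.05], variances=[0.01, 1e-4])
print(f"mean {mean_y:.4f}, standard deviation {math.sqrt(var_y):.4f}")
```

    The squared sensitivities weight each input variance, so the decay-rate uncertainty dominates here despite its small absolute variance.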

  7. Using meta-analytic path analysis to test theoretical predictions in health behavior: An illustration based on meta-analyses of the theory of planned behavior.

    Science.gov (United States)

    Hagger, Martin S; Chan, Derwin K C; Protogerou, Cleo; Chatzisarantis, Nikos L D

    2016-08-01

    Synthesizing research on social cognitive theories applied to health behavior is an important step in the development of an evidence base of psychological factors as targets for effective behavioral interventions. However, few meta-analyses of research on social cognitive theories in health contexts have conducted simultaneous tests of theoretically-stipulated pattern effects using path analysis. We argue that conducting path analyses of meta-analytic effects among constructs from social cognitive theories is important to test nomological validity, account for mediation effects, and evaluate unique effects of theory constructs independent of past behavior. We illustrate our points by conducting new analyses of two meta-analyses of a popular theory applied to health behaviors, the theory of planned behavior. We conducted meta-analytic path analyses of the theory in two behavioral contexts (alcohol and dietary behaviors) using data from the primary studies included in the original meta-analyses augmented to include intercorrelations among constructs and relations with past behavior missing from the original analysis. Findings supported the nomological validity of the theory and its hypotheses for both behaviors, confirmed important model processes through mediation analysis, demonstrated the attenuating effect of past behavior on theory relations, and provided estimates of the unique effects of theory constructs independent of past behavior. Our analysis illustrates the importance of conducting a simultaneous test of theory-stipulated effects in meta-analyses of social cognitive theories applied to health behavior. We recommend researchers adopt this analytic procedure when synthesizing evidence across primary tests of social cognitive theories in health. Copyright © 2016 Elsevier Inc. All rights reserved.
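    The core computation of a meta-analytic path analysis — solving for standardised path coefficients from a pooled correlation matrix — can be sketched as follows. The theory-of-planned-behavior construct correlations below are hypothetical placeholders, not the meta-analytic estimates from the paper:

```python
import numpy as np

# Hypothetical pooled correlations among TPB constructs:
# attitude (ATT), subjective norm (SN), perceived behavioral control (PBC)
labels = ["ATT", "SN", "PBC"]
R_xx = np.array([[1.00, 0.35, 0.30],
                 [0.35, 1.00, 0.25],
                 [0.30, 0.25, 1.00]])   # predictor intercorrelations
r_xy = np.array([0.55, 0.40, 0.45])     # correlations with intention (INT)

# Standardised path coefficients for INT regressed on ATT, SN and PBC:
# solve R_xx @ beta = r_xy
beta = np.linalg.solve(R_xx, r_xy)
r_squared = float(beta @ r_xy)
for lab, b in zip(labels, beta):
    print(f"{lab} -> INT: beta = {b:.3f}")
print(f"R^2 = {r_squared:.3f}")
```

    The same normal-equations step, applied to correlation matrices pooled across primary studies, is what lets a meta-analytic path model estimate each construct's unique effect while partialling out the others.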

  8. Localisation of nursery areas based on comparative analyses of the horizontal and vertical distribution patterns of juvenile Baltic cod (Gadus morhua)

    DEFF Research Database (Denmark)

    Nielsen, J. Rasmus; Lundgren, Bo; Kristensen, Kasper

    2013-01-01

    The horizontal and vertical distribution patterns of juvenile Baltic cod are determined, and their nursery areas are localised according to the environmental factors affecting them. Comparative statistical analyses of biological, hydrographic and hydroacoustic data are carried out based on standard ICES demersal trawl surveys and special integrated trawl...... and acoustic research surveys. Horizontal distribution maps for the 2001–2010 cohorts of juvenile cod are further generated by applying a statistical log-Gaussian Cox process model to the standard trawl survey data. The analyses indicate size-dependent horizontal and distinct vertical and diurnal distribution...... in deep-sea localities down to a 100 m depth and at oxygen concentrations between 2–4 ml O2 l−1. The vertical, diurnally stratified and repeated trawling and hydroacoustic target strength-depth distributions obtained from the special surveys show juvenile cod concentrations in frontal-zone water layers...

  9. Formalisation des bases méthodologiques et conceptuelles d'une analyse spatiale des accidents de la route

    Directory of Open Access Journals (Sweden)

    Florence Huguenin Richard

    1999-06-01

    Full Text Available This article lays out the methodological and conceptual foundations of a spatial analysis of road-accident risk. Studying this phenomenon requires a large amount of data describing the different dimensions of an accident, data that can be managed in a geographic information system. It also calls for methodological reflection on risk mapping, scales of observation, the aggregation of qualitative and quantitative data, the use of statistical methods suited to road risk, and the integration of space as a factor of insecurity.

  10. Statistical parameters of random heterogeneity estimated by analysing coda waves based on finite difference method

    Science.gov (United States)

    Emoto, K.; Saito, T.; Shiomi, K.

    2017-12-01

    Short-period (<2 s) seismograms are strongly affected by small-scale heterogeneities. We found that the energy of the coda of long-period (>2 s) seismograms shows a spatially flat distribution. This phenomenon is well known in short-period seismograms and results from scattering by small-scale heterogeneities. We estimate the statistical parameters that characterize the small-scale random heterogeneity by modelling the spatiotemporal energy distribution of long-period seismograms. We analyse three moderate-size earthquakes that occurred in southwest Japan. We calculate the spatial distribution of the energy density recorded by a dense seismograph network in Japan in the period bands of 8-16 s, 4-8 s and 2-4 s and model them using 3-D finite difference (FD) simulations. Compared to conventional methods based on statistical theories, we can calculate more realistic synthetics by using the FD simulation. It is not necessary to assume a uniform background velocity, body or surface waves, or the scattering properties considered in general scattering theories. By taking the ratio of the energy of the coda area to that of the entire area, we can separately estimate the scattering and the intrinsic absorption effects. Our result reveals the spectrum of the random inhomogeneity in a wide wavenumber range, including the intensity around the corner wavenumber, as P(m) = 8πε²a³/(1 + a²m²)², where ε = 0.05 and a = 3.1 km, even though past studies analysing higher-frequency records could not detect the corner. Finally, we estimate the intrinsic attenuation by modelling the decay rate of the energy. The method proposed in this study is suitable for quantifying the statistical properties of long-wavelength subsurface random inhomogeneity, which leads the way to characterizing a wider wavenumber range of spectra, including the corner wavenumber.
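    The reported heterogeneity spectrum P(m) = 8πε²a³/(1 + a²m²)² with ε = 0.05 and a = 3.1 km can be evaluated directly. The sketch below assumes the wavenumber m is in km⁻¹ and checks the behaviour around the corner wavenumber m_c = 1/a:

```python
import math

def heterogeneity_psd(m, eps=0.05, a=3.1):
    """Power spectral density of the random inhomogeneity reported above:
    P(m) = 8*pi*eps^2*a^3 / (1 + a^2*m^2)^2, with m in 1/km and a in km."""
    return 8 * math.pi * eps ** 2 * a ** 3 / (1 + (a * m) ** 2) ** 2

m_corner = 1 / 3.1  # corner wavenumber m_c = 1/a, in km^-1
print(heterogeneity_psd(0.0))            # flat level below the corner
print(heterogeneity_psd(m_corner))       # one quarter of the flat level at m_c
print(heterogeneity_psd(10 * m_corner))  # ~m^-4 roll-off well above the corner
```

    Below m_c the spectrum is flat, and above it the intensity falls off as m⁻⁴, which is why detecting the corner requires the long-wavelength sensitivity described in the abstract.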

  11. Mechanical analyses on the digital behaviour of the Tokay gecko (Gekko gecko) based on a multi-level directional adhesion model

    OpenAIRE

    Wu, Xuan; Wang, Xiaojie; Mei, Tao; Sun, Shaoming

    2015-01-01

    This paper proposes a multi-level hierarchical model of the Tokay gecko (Gekko gecko) adhesive system and analyses the digital behaviour of G. gecko at the macro/meso scale. The model describes the structures of G. gecko's adhesive system from the nano-level spatulae to the sub-millimetre-level lamella. The G. gecko seta is modelled as an inextensible fibril based on Euler's elastica theorem. Considering the side contact of the spatular pads of the seta on the flat and rigid subst...

  12. Ecology of Subglacial Lake Vostok (Antarctica, Based on Metagenomic/Metatranscriptomic Analyses of Accretion Ice

    Directory of Open Access Journals (Sweden)

    Tom D'Elia

    2013-03-01

    Full Text Available Lake Vostok is the largest of the nearly 400 subglacial Antarctic lakes and has been continuously buried by glacial ice for 15 million years. Extreme cold, heat (from possible hydrothermal activity), pressure (from the overriding glacier) and dissolved oxygen (delivered by melting meteoric ice), in addition to limited nutrients and complete darkness, combine to produce one of the most extreme environments on Earth. Metagenomic/metatranscriptomic analyses of ice that accreted over a shallow embayment and over the southern main lake basin indicate the presence of thousands of species of organisms (94% Bacteria, 6% Eukarya, and two Archaea). The predominant bacterial sequences were closest to those from species of Firmicutes, Proteobacteria and Actinobacteria, while the predominant eukaryotic sequences were most similar to those from species of ascomycetous and basidiomycetous Fungi. Based on the sequence data, the lake appears to contain a mixture of autotrophs and heterotrophs capable of performing nitrogen fixation, nitrogen cycling, carbon fixation and nutrient recycling. Sequences closest to those of psychrophiles and thermophiles indicate a cold lake with possible hydrothermal activity. Sequences most similar to those from marine and aquatic species suggest the presence of marine and freshwater regions.

  13. A new internet-based tool for reporting and analysing patient-reported outcomes and the feasibility of repeated data collection from patients with myeloproliferative neoplasms

    DEFF Research Database (Denmark)

    Brochmann, Nana; Zwisler, Ann-Dorthe; Kjerholt, Mette

    2016-01-01

    PURPOSE: An Internet-based tool for reporting and analysing patient-reported outcomes (PROs) has been developed. The tool enables merging PROs with blood test results and allows for computation of treatment responses. Data may be visualized by graphical analysis and may be exported for downstream...

  14. Direct view at colossal permittivity in donor-acceptor (Nb, In) co-doped rutile TiO2

    International Nuclear Information System (INIS)

    Mandal, Suman; Pal, Somnath; Hazarika, Abhijit; Kundu, Asish K.; Menon, Krishnakumar S. R.; Rioult, Maxime; Belkhou, Rachid

    2016-01-01

    Recent observations of colossal permittivity (CP) with low dielectric loss in donor-acceptor cation co-doped rutile TiO2 have opened up several possibilities in microelectronics and energy-storage devices. Yet the precise origin of the CP behavior, knowledge of which is essential for suitable device integration, is highly disputed in the literature. Using a spectromicroscopic approach in addition to dielectric measurements, we show that microscopic electronic inhomogeneities, along with nano-scale phase boundaries and low-temperature polaronic relaxation, are mostly responsible for this dielectric behavior, rather than the electron-pinned defect-dipole or grain-boundary effects usually proposed. Donor-acceptor co-doping results in controlled carrier hopping that inevitably influences the dielectric loss while invariably upholding the CP value.

  15. Direct view at colossal permittivity in donor-acceptor (Nb, In) co-doped rutile TiO{sub 2}

    Energy Technology Data Exchange (ETDEWEB)

    Mandal, Suman, E-mail: suman.mandal@sscu.iisc.ernet.in; Pal, Somnath; Hazarika, Abhijit [Solid State and Structural Chemistry Unit, Indian Institute of Science, Bengaluru 560012 (India); Kundu, Asish K.; Menon, Krishnakumar S. R. [Surface Physics and Material Science Division, Saha Institute of Nuclear Physics, Kolkata 700064 (India); Rioult, Maxime; Belkhou, Rachid [Synchrotron SOLEIL, L' Orme des Merisiers Saint-Aubin, 91192 Gif-sur-Yvette (France)

    2016-08-29

    Recent observations of colossal permittivity (CP) with low dielectric loss in donor-acceptor cation co-doped rutile TiO{sub 2} have opened up several possibilities in microelectronics and energy-storage devices. Yet the precise origin of the CP behavior, knowledge of which is essential for suitable device integration, is highly disputed in the literature. Using a spectromicroscopic approach in addition to dielectric measurements, we show that microscopic electronic inhomogeneities, along with nano-scale phase boundaries and low-temperature polaronic relaxation, are mostly responsible for this dielectric behavior, rather than the electron-pinned defect-dipole or grain-boundary effects usually proposed. Donor-acceptor co-doping results in controlled carrier hopping that inevitably influences the dielectric loss while invariably upholding the CP value.

  16. Prediction of Seismic Slope Displacements by Dynamic Stick-Slip Analyses

    International Nuclear Information System (INIS)

    Ausilio, Ernesto; Costanzo, Antonio; Silvestri, Francesco; Tropeano, Giuseppe

    2008-01-01

    A good working balance between simplicity and reliability in assessing seismic slope stability is offered by displacement-based methods, in which the effects of deformability and ductility can be either decoupled or coupled in the dynamic analyses. In this paper, a 1D lumped-mass ''stick-slip'' model is developed, accounting for soil heterogeneity and non-linear behaviour, with a base sliding mechanism at a potential rupture surface. The results of the preliminary calibration show good agreement with frequency-domain site response analysis under no-slip conditions. The comparison with rigid sliding-block analyses and with the decoupled approach proves that the stick-slip procedure can become increasingly unconservative for soft soils and deep sliding depths
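    The rigid sliding-block analyses used for comparison in this record can be sketched with a classical Newmark integration. This is a minimal illustration with a hypothetical sine-pulse input motion; the yield coefficient, time step and input amplitude are assumptions for the demo:

```python
import math

def newmark_displacement(accel, dt, ky, g=9.81):
    """Rigid sliding-block (Newmark) permanent displacement for a horizontal
    acceleration history `accel` (in g) and yield coefficient `ky`.
    Sliding accumulates while the relative velocity stays positive."""
    v, d = 0.0, 0.0
    for a in accel:
        if a > ky or v > 0.0:
            v += (a - ky) * g * dt   # relative acceleration integrated to velocity
            v = max(v, 0.0)          # downslope-only sliding in this sketch
            d += v * dt
    return d

# Hypothetical input: a 4 Hz, 0.3 g sine pulse train sampled at 200 Hz
dt = 0.005
accel = [0.3 * math.sin(2 * math.pi * 4.0 * i * dt) for i in range(400)]
print(f"permanent displacement ~ {newmark_displacement(accel, dt, ky=0.1):.4f} m")
```

    Raising the yield coefficient shrinks the sliding episodes and hence the permanent displacement, which is the basic trade-off the stick-slip model refines by adding deformability.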

  17. High-resolution monitoring of marine protists based on an observation strategy integrating automated on-board filtration and molecular analyses

    Science.gov (United States)

    Metfies, Katja; Schroeder, Friedhelm; Hessel, Johanna; Wollschläger, Jochen; Micheller, Sebastian; Wolf, Christian; Kilias, Estelle; Sprong, Pim; Neuhaus, Stefan; Frickenhaus, Stephan; Petersen, Wilhelm

    2016-11-01

    Information on recent biomass distribution and biogeography of photosynthetic marine protists with adequate temporal and spatial resolution is urgently needed to better understand the consequences of environmental change for marine ecosystems. Here we introduce and review a molecular-based observation strategy for high-resolution assessment of these protists in space and time. It is the result of extensive technology developments, adaptations and evaluations which are documented in a number of different publications, and the results of the recently completed field testing which are introduced in this paper. The observation strategy is organized at four different levels. At level 1, samples are collected at high spatiotemporal resolution using the remotely controlled automated filtration system AUTOFIM. Resulting samples can either be preserved for later laboratory analyses, or directly subjected to molecular surveillance of key species aboard the ship via an automated biosensor system or quantitative polymerase chain reaction (level 2). Preserved samples are analyzed at the next observational levels in the laboratory (levels 3 and 4). At level 3 this involves molecular fingerprinting methods for a quick and reliable overview of differences in protist community composition. Finally, selected samples can be used to generate a detailed analysis of taxonomic protist composition via the latest next generation sequencing technology (NGS) at level 4. An overall integrated dataset of the results based on the different analyses provides comprehensive information on the diversity and biogeography of protists, including all related size classes. At the same time the cost of the observation is optimized with respect to analysis effort and time.

  18. Reduction and technical simplification of testing protocol for walking based on repeatability analyses: An Interreg IVa pilot study

    Directory of Open Access Journals (Sweden)

    Nejc Sarabon

    2010-12-01

    Full Text Available The aim of this study was to define the most appropriate gait measurement protocols to be used in our future studies in the Mobility in Ageing project. A group of young healthy volunteers took part in the study. Each subject carried out a 10-metre walking test at five different speeds (preferred, very slow, very fast, slow, and fast). Each walking speed was repeated three times, making a total of 15 trials, which were carried out in a random order. Each trial was simultaneously analysed by three observers using three different technical approaches: a stopwatch, photocells and an electronic kinematic dress. In analysing the repeatability of the trials, the results showed that three of the five self-selected walking speeds (preferred, very fast, and very slow) had significantly higher repeatability of the average walking velocity, step length and cadence than the other two speeds. Additionally, the data showed that one of the three technical methods for gait assessment has better metric characteristics than the other two. In conclusion, based on repeatability and technical and organizational simplicity, this study helped us successfully define a simple and reliable walking test to be used in the main study of the project.
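    The gait parameters compared across trials (average walking velocity, step length and cadence) and a simple repeatability index can be computed as in this sketch; the trial times and step counts are hypothetical:

```python
def gait_parameters(distance_m, time_s, step_count):
    """Average walking velocity, cadence and step length from one
    timed walking trial over a known distance."""
    velocity = distance_m / time_s            # m/s
    cadence = step_count / time_s * 60.0      # steps/min
    step_length = distance_m / step_count     # m
    return velocity, cadence, step_length

def cv_percent(values):
    """Coefficient of variation across repeated trials, a simple
    repeatability index (lower = more repeatable)."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / (len(values) - 1)
    return (var ** 0.5) / mean * 100.0

# Hypothetical triplicate 10-metre trials at the preferred speed
times = [7.8, 8.1, 7.9]   # trial times, s
steps = [14, 15, 14]      # step counts
velocities = [gait_parameters(10.0, t, s)[0] for t, s in zip(times, steps)]
print(f"mean velocity {sum(velocities) / 3:.2f} m/s, CV {cv_percent(velocities):.1f}%")
```

    Comparing such coefficients of variation across the five speed conditions is one straightforward way to rank their repeatability, as the study does.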

  19. Experimental technique of stress analyses by neutron diffraction

    International Nuclear Information System (INIS)

    Sun, Guangai; Chen, Bo; Huang, Chaoqiang

    2009-09-01

    The structure and main components of the neutron diffraction stress analysis spectrometer SALSA, as well as the functions and parameters of each component, are presented. The technical characteristics and structural parameters of SALSA are described. On this basis, the choice of gauge volume, the method of positioning the sample, the determination of the diffraction plane and the measurement of the zero-stress reference spacing d0 are discussed. Drawing on practical experiments, the basic experimental measurements and related settings are introduced, including the adjustment of components, pattern scattering, data recording and checking, etc. The above can serve as a guide for stress analysis experiments by neutron diffraction and for the construction of neutron stress spectrometers. (authors)

  20. Molecular systematics of Indian Alysicarpus (Fabaceae) based on analyses of nuclear ribosomal DNA sequences.

    Science.gov (United States)

    Gholami, Akram; Subramaniam, Shweta; Geeta, R; Pandey, Arun K

    2017-06-01

    Alysicarpus Necker ex Desvaux (Fabaceae, Desmodieae) consists of ~30 species that are distributed in tropical and subtropical regions of the world. In India, the genus is represented by ca. 18 species, of which seven are endemic. Sequences of the nuclear internal transcribed spacer (ITS) from 38 accessions representing 16 Indian species were subjected to phylogenetic analyses. The ITS sequence data strongly support the monophyly of the genus Alysicarpus. Analyses revealed four major well-supported clades within Alysicarpus. Ancestral state reconstructions were done for two morphological characters, namely calyx length in relation to pod (macrocalyx and microcalyx) and pod surface ornamentation (transversely rugose and nonrugose). The present study is the first report on the molecular systematics of Indian Alysicarpus.

  1. Systematic review of model-based analyses reporting the cost-effectiveness and cost-utility of cardiovascular disease management programs.

    Science.gov (United States)

    Maru, Shoko; Byrnes, Joshua; Whitty, Jennifer A; Carrington, Melinda J; Stewart, Simon; Scuffham, Paul A

    2015-02-01

    The reported cost effectiveness of cardiovascular disease management programs (CVD-MPs) is highly variable, potentially leading to different funding decisions. This systematic review evaluates published modeled analyses to compare study methods and quality. Articles were included if an incremental cost-effectiveness ratio (ICER) or incremental cost-utility ratio (ICUR) was reported, the intervention was multi-component and designed to manage or prevent a cardiovascular disease condition, and it addressed all domains specified in the American Heart Association Taxonomy for Disease Management. Nine articles (reporting 10 clinical outcomes) were included. Eight cost-utility and two cost-effectiveness analyses targeted hypertension (n=4), coronary heart disease (n=2), coronary heart disease plus stroke (n=1), heart failure (n=2) and hyperlipidemia (n=1). Study perspectives included the healthcare system (n=5), society and fund holders (n=1), or a third-party payer (n=3), or were not explicitly stated (n=1). All analyses were modeled based on interventions of one to two years' duration. Time horizons were two years (n=1), 10 years (n=1) or lifetime (n=8). Model structures included Markov models (n=8), 'decision analytic models' (n=1), or were not explicitly stated (n=1). Considerable variation was observed in clinical and economic assumptions and reporting practices. Of all ICERs/ICURs reported, including those of subgroups (n=16), four were above a US$50,000 acceptability threshold, six were below and six were dominant. The majority of CVD-MPs were reported to have favorable economic outcomes, but 25% came at unacceptably high cost for the outcomes achieved. Use of standardized reporting tools should increase transparency and help identify what drives the cost-effectiveness of CVD-MPs. © The European Society of Cardiology 2014.
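
    The acceptability judgement in such reviews rests on the ICER formula: incremental cost divided by incremental effect, compared against a willingness-to-pay threshold. A minimal worked sketch with hypothetical costs and QALY gains (only the US$50,000 threshold comes from the abstract):

```python
def icer(cost_new, cost_comp, effect_new, effect_comp):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
    return (cost_new - cost_comp) / (effect_new - effect_comp)

# Hypothetical program vs. usual care; effects in QALYs.
ratio = icer(cost_new=12_000, cost_comp=8_000, effect_new=6.2, effect_comp=6.0)

# Below the US$50,000/QALY threshold -> acceptable. A program that costs less
# AND is more effective than its comparator would be "dominant" instead.
acceptable = ratio < 50_000
```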

  2. Comparative biochemical analyses of venous blood and peritoneal fluid from horses with colic using a portable analyser and an in-house analyser.

    Science.gov (United States)

    Saulez, M N; Cebra, C K; Dailey, M

    2005-08-20

    Fifty-six horses with colic were examined over a period of three months. The concentrations of glucose, lactate, sodium, potassium and chloride, and the pH of samples of blood and peritoneal fluid, were determined with a portable clinical analyser and with an in-house analyser, and the results were compared. Compared with the in-house analyser, the portable analyser gave higher pH values for blood and peritoneal fluid, with greater variability in the alkaline range and lower pH values in the acidic range. It also gave lower concentrations of glucose in the range below 8.3 mmol/l, and lower concentrations of lactate in venous blood below 5 mmol/l and in peritoneal fluid below 2 mmol/l, with less variability. On average, the portable analyser underestimated the concentrations of lactate and glucose in peritoneal fluid in comparison with the in-house analyser. Its measurements of the concentrations of sodium and chloride in peritoneal fluid had a higher bias and were more variable than the measurements in venous blood, whereas its measurements of potassium in venous blood and peritoneal fluid had a smaller bias and less variability than the measurements made with the in-house analyser.
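
    The bias and variability statements above are the kind produced by a Bland-Altman style comparison of paired measurements: the mean of the paired differences is the bias, and their standard deviation is the variability. A toy sketch (the lactate readings are invented for illustration, not the study's data):

```python
import statistics

def bias_and_sd(device, reference):
    """Mean difference (bias) and SD of paired differences, Bland-Altman style."""
    diffs = [d - r for d, r in zip(device, reference)]
    return statistics.mean(diffs), statistics.stdev(diffs)

# Hypothetical paired lactate readings (mmol/l) from the two analysers.
portable = [1.1, 1.8, 2.4, 3.0, 4.6]
in_house = [1.4, 2.0, 2.9, 3.3, 5.0]

bias, sd = bias_and_sd(portable, in_house)
# A negative bias means the portable analyser reads lower on average.
```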

  3. Measurements and simulations analysing the noise behaviour of grating-based X-ray phase-contrast imaging

    Energy Technology Data Exchange (ETDEWEB)

    Weber, T., E-mail: thomas.weber@physik.uni-erlangen.de [University of Erlangen-Nuremberg, ECAP - Erlangen Center for Astroparticle Physics, Erwin-Rommel-Str. 1, 91058 Erlangen (Germany); Bartl, P.; Durst, J. [University of Erlangen-Nuremberg, ECAP - Erlangen Center for Astroparticle Physics, Erwin-Rommel-Str. 1, 91058 Erlangen (Germany); Haas, W. [University of Erlangen-Nuremberg, ECAP - Erlangen Center for Astroparticle Physics, Erwin-Rommel-Str. 1, 91058 Erlangen (Germany); University of Erlangen-Nuremberg, Pattern Recognition Lab, Martensstr. 3, 91058 Erlangen (Germany); Michel, T.; Ritter, A.; Anton, G. [University of Erlangen-Nuremberg, ECAP - Erlangen Center for Astroparticle Physics, Erwin-Rommel-Str. 1, 91058 Erlangen (Germany)

    2011-08-21

    Over the last decade, phase-contrast imaging using a Talbot-Lau grating interferometer has become possible even with a low-brilliance X-ray source. With its potential to increase soft-tissue contrast, this method is on its way into medical imaging. For this purpose, knowledge of the underlying physics of the technique is necessary. With this paper, we would like to contribute to the understanding of grating-based phase-contrast imaging by presenting results of measurements and simulations regarding the noise behaviour of the differential phases. The measurements were done using a microfocus X-ray tube with a hybrid, photon-counting, semiconductor Medipix2 detector. The additional simulations were performed with our in-house developed phase-contrast simulation tool 'SPHINX', which combines both wave and particle contributions of the simulated photons. The results obtained by both methods show the same behaviour: increasing the number of photons leads to a linear decrease of the standard deviation of the phase, while the number of phase steps used has no influence on the standard deviation if the total number of photons is held constant. Furthermore, the probability density function (pdf) of the reconstructed differential phases was analysed. It turned out that the so-called von Mises distribution is the physically correct pdf, which was also confirmed by measurements. This information advances the understanding of grating-based phase-contrast imaging and can be used to improve image quality.
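
    The photon-number dependence described here can be reproduced in miniature: reconstruct the differential phase from a phase-stepping scan with photon noise, and watch the spread of the estimate shrink as the photon count grows. This standalone sketch is not the authors' SPHINX tool; it approximates Poisson counting noise by a Gaussian of matching variance, and all parameter values are assumptions:

```python
import math
import random
import statistics

random.seed(0)

def estimate_phase(n_photons, n_steps=8, vis=0.5, true_phase=0.7):
    """One phase-stepping reconstruction with (Gaussian-approximated) photon noise."""
    a = b = 0.0
    for k in range(n_steps):
        x = 2 * math.pi * k / n_steps
        expected = n_photons / n_steps * (1 + vis * math.cos(true_phase + x))
        counts = random.gauss(expected, math.sqrt(expected))  # ~Poisson noise
        a += counts * math.sin(x)
        b += counts * math.cos(x)
    return math.atan2(-a, b)  # recovered differential phase

def phase_std(n_photons, reps=400):
    """Empirical standard deviation of the reconstructed phase."""
    return statistics.stdev(estimate_phase(n_photons) for _ in range(reps))

# More photons -> smaller standard deviation of the reconstructed phase.
low, high = phase_std(1_000), phase_std(100_000)
```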

  4. Analysing co-articulation using frame-based feature trajectories

    CSIR Research Space (South Africa)

    Badenhorst, J

    2010-11-01

    The authors investigate several approaches aimed at a more detailed understanding of co-articulation in spoken utterances. They find that the Euclidean difference between instantaneous frame-based feature values and the mean values of these features...
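
    The measure described, the Euclidean distance between each frame's feature vector and the segment mean, can be sketched in a few lines. The feature values below are invented, not data from the study:

```python
import math

def euclidean(u, v):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def mean_vector(frames):
    """Per-dimension mean over a list of frame feature vectors."""
    n = len(frames)
    return [sum(col) / n for col in zip(*frames)]

# Hypothetical per-frame feature vectors for one phone segment.
frames = [[1.0, 2.0], [1.2, 1.8], [0.8, 2.2]]
mean = mean_vector(frames)
dists = [euclidean(f, mean) for f in frames]
# Larger distances at segment edges would suggest stronger co-articulation.
```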

  5. Socioeconomic issues and analyses for radioactive waste disposal facilities

    International Nuclear Information System (INIS)

    Ulland, L.

    1988-01-01

    Radioactive waste facility siting and development can raise major social and economic issues in the host area. Initial site screening and analyses have been conducted for both potential high-level and low-level radioactive waste facilities; more detailed characterization and analyses are being planned. The results of these assessments are key to developing community plans that identify and implement measures to mitigate adverse socioeconomic impacts. Preliminary impact analyses conducted at high-level sites in Texas and Nevada, and site screening activities for low-level facilities in Illinois and California, have identified a number of common socioeconomic issues and characteristics, as well as issues and characteristics that differ between the sites and the types of facilities. Based on these comparisons, implications for the selection of an appropriate impact assessment methodology and for elements of impact mitigation are identified.

  6. Analysing spatially extended high-dimensional dynamics by recurrence plots

    Energy Technology Data Exchange (ETDEWEB)

    Marwan, Norbert, E-mail: marwan@pik-potsdam.de [Potsdam Institute for Climate Impact Research, 14412 Potsdam (Germany); Kurths, Jürgen [Potsdam Institute for Climate Impact Research, 14412 Potsdam (Germany); Humboldt Universität zu Berlin, Institut für Physik (Germany); Nizhny Novgorod State University, Department of Control Theory, Nizhny Novgorod (Russian Federation); Foerster, Saskia [GFZ German Research Centre for Geosciences, Section 1.4 Remote Sensing, Telegrafenberg, 14473 Potsdam (Germany)

    2015-05-08

    Recurrence plot based measures of complexity are powerful tools for characterizing complex dynamics. In this letter we show the potential of selected recurrence plot measures for the investigation of even high-dimensional dynamics. We apply this method to spatially extended chaos, such as that derived from the Lorenz96 model, and show that the recurrence plot based measures can qualitatively characterize typical dynamical properties such as chaotic or periodic dynamics. Moreover, we demonstrate its power by analysing satellite image time series of vegetation cover with contrasting dynamics as a spatially extended and potentially high-dimensional example from the real world. - Highlights: • We use recurrence plots for analysing spatially extended dynamics. • We investigate the high-dimensional chaos of the Lorenz96 model. • The approach distinguishes different spatio-temporal dynamics. • We use the method for studying vegetation cover time series.
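
    A recurrence plot in its simplest, one-dimensional, threshold-based form is easy to sketch: two time points recur if their states lie within a distance epsilon, and the recurrence rate is the fraction of recurring pairs. Real applications first embed the series in phase space, which this toy example skips:

```python
import math

def recurrence_matrix(series, eps):
    """Binary recurrence matrix: R[i][j] = 1 when states i and j are within eps."""
    n = len(series)
    return [[1 if abs(series[i] - series[j]) < eps else 0 for j in range(n)]
            for i in range(n)]

def recurrence_rate(R):
    """Fraction of recurrent point pairs, a basic recurrence quantification measure."""
    n = len(R)
    return sum(map(sum, R)) / (n * n)

# A periodic signal yields a regular, diagonal-lined recurrence structure.
x = [math.sin(0.4 * t) for t in range(100)]
R = recurrence_matrix(x, eps=0.1)
rr = recurrence_rate(R)
```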

  7. Multi-person and multi-attribute design evaluations using evidential reasoning based on subjective safety and cost analyses

    International Nuclear Information System (INIS)

    Wang, J.; Yang, J.B.; Sen, P.

    1996-01-01

    This paper presents an approach for ranking proposed design options based on subjective safety and cost analyses. Hierarchical system safety analysis is carried out using fuzzy sets and evidential reasoning. This involves safety modelling by fuzzy sets at the bottom level of a hierarchy and safety synthesis by evidential reasoning at higher levels. Fuzzy sets are also used to model the cost incurred for each design option. An evidential reasoning approach is then employed to synthesise the estimates of safety and cost made by multiple designers. The developed approach is capable of dealing with problems involving multiple designers, multiple attributes and multiple design options in order to select the best design. Finally, a practical engineering example is presented to demonstrate the proposed multi-person and multi-attribute design selection approach.

  8. Population-based cost-offset analyses for disorder-specific treatment of anorexia nervosa and bulimia nervosa in Germany.

    Science.gov (United States)

    Bode, Katharina; Götz von Olenhusen, Nina Maria; Wunsch, Eva-Maria; Kliem, Sören; Kröger, Christoph

    2017-03-01

    Previous research has shown that anorexia nervosa (AN) and bulimia nervosa (BN) are expensive illnesses to treat. To reduce their economic burden, adequate interventions need to be established. Our objective was to conduct cost-offset analyses for evidence-based treatment of eating disorders, using outcome data from a psychotherapy trial involving cognitive behavioral therapy (CBT) and focal psychodynamic therapy (FPT) for AN and a trial involving CBT for BN. Assuming a currently running, ideal healthcare system, using a 12-month prevalence-based approach and varying the willingness to participate in treatment, we investigated whether the potential financial benefits of AN- and BN-related treatment outweigh the therapy costs at the population level. We developed a formula for calculating cost-benefit relationships, with parameters estimated from data of health institutions within the German healthcare system. Additional intangible benefits were calculated with the aid of Quality-Adjusted Life Years. The annual costs of an untreated eating disorder were 2.38 billion EUR for AN and 617.69 million EUR for BN. Independent of the willingness to participate in treatment, the cost-benefit relationships for the treatment remained constant at 2.51 (CBT) and 2.33 (FPT) for AN and 4.05 (CBT) for BN. This consistency implies that for each EUR invested in the treatment, between 2.33 and 4.05 EUR could be saved each year. Our findings suggest that the implementation of evidence-based psychotherapy treatments for AN and BN may achieve substantial cost savings at the population level. © 2017 Wiley Periodicals, Inc.
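
    The reported cost-benefit relationships translate directly into projected savings: multiplying the amount invested by the ratio gives the yearly return. A minimal arithmetic sketch using the ratios and cost figures quoted above (the invested amount is hypothetical):

```python
# Figures from the abstract (EUR): annual cost of the untreated illness and the
# reported cost-benefit ratio for CBT in bulimia nervosa.
untreated_cost_an = 2.38e9     # anorexia nervosa
untreated_cost_bn = 617.69e6   # bulimia nervosa
cbt_bn_ratio = 4.05            # EUR saved per year per EUR invested

def yearly_saving(invested_eur, cost_benefit_ratio):
    """EUR saved per year for a given treatment investment."""
    return invested_eur * cost_benefit_ratio

# A hypothetical 1 million EUR investment in CBT for BN.
saving = yearly_saving(1_000_000, cbt_bn_ratio)
```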

  9. Material analyses of foam-based SiC FCI after dynamic testing in PbLi in MaPLE loop at UCLA

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, Maria, E-mail: maria.gonzalez@ciemat.es [LNF-CIEMAT, Avda Complutense, 40, 28040 Madrid (Spain); Rapisarda, David; Ibarra, Angel [LNF-CIEMAT, Avda Complutense, 40, 28040 Madrid (Spain); Courtessole, Cyril; Smolentsev, Sergey; Abdou, Mohamed [Fusion Science and Technology Center, UCLA (United States)

    2016-11-01

    Highlights: • Samples from a foam-based SiC FCI were analyzed by examining their SEM microstructure and elemental composition. • After the dynamic experiments in flowing hot PbLi, liquid metal ingress was confirmed, caused by infiltration through local defects in the protective inner CVD layer. • No direct evidence of corrosion/erosion was observed; the defects could be related to the manufacturing process. - Abstract: Foam-based SiC flow channel inserts (FCIs) developed and manufactured by Ultramet, USA are currently under testing in flowing hot lead-lithium (PbLi) alloy in the MaPLE loop at UCLA to address chemical/physical compatibility and to assess the MHD pressure drop reduction. UCLA has finished the first experimental series, in which a single uninterrupted long-term (∼6500 h) test was performed on a 30-cm FCI segment in a magnetic field up to 1.8 T at a temperature of 300 °C and maximum flow velocities of ∼15 cm/s. After finishing the experiments, the FCI sample was extracted from the host stainless steel duct and cut into slices. A few of them have been analyzed at CIEMAT as part of the joint collaborative effort on the development of the DCLL blanket concept in the EU and the US. The initial inspection of the slices using optical microscopy at UCLA showed significant PbLi ingress into the bulk FCI material, which resulted in degradation of the insulating properties of the FCI. Current material analyses at CIEMAT are based on advanced techniques, including characterization of FCI samples by FESEM to study PbLi ingress, imaging of cross sections, composition analysis by EDX and crack inspection. These analyses suggest that the ingress was caused by local defects in the protective inner CVD layer that might have been originally present in the FCI or have occurred during testing.

  10. CPN Tools for Editing, Simulating, and Analysing Coloured Petri Nets

    DEFF Research Database (Denmark)

    Ratzer, Anne Vinter; Wells, Lisa Marie; Lassen, Henry Michael

    2003-01-01

    CPN Tools is a tool for editing, simulating and analysing Coloured Petri Nets. The GUI is based on advanced interaction techniques, such as toolglasses, marking menus, and bi-manual interaction. Feedback facilities provide contextual error messages and indicate dependency relationships between net elements. A standard state space report contains information such as boundedness properties and liveness properties. The functionality of the simulation engine and the state space facilities is similar to that of the corresponding components in Design/CPN, which is a widespread tool for Coloured Petri Nets.

  11. Using Inequality Measures to Incorporate Environmental Justice into Regulatory Analyses

    Science.gov (United States)

    Harper, Sam; Ruder, Eric; Roman, Henry A.; Geggel, Amelia; Nweke, Onyemaechi; Payne-Sturges, Devon; Levy, Jonathan I.

    2013-01-01

    Formally evaluating how specific policy measures influence environmental justice is challenging, especially in the context of regulatory analyses in which quantitative comparisons are the norm. However, there is a large literature on developing and applying quantitative measures of health inequality in other settings, and these measures may be applicable to environmental regulatory analyses. In this paper, we provide information to assist policy decision makers in determining the viability of using measures of health inequality in the context of environmental regulatory analyses. We conclude that quantification of the distribution of inequalities in health outcomes across social groups of concern, considering both within-group and between-group comparisons, would be consistent with both the structure of regulatory analysis and the core definition of environmental justice. Appropriate application of inequality indicators requires thorough characterization of the baseline distribution of exposures and risks, leveraging data generally available within regulatory analyses. Multiple inequality indicators may be applicable to regulatory analyses, and the choice among indicators should be based on explicit value judgments regarding the dimensions of environmental justice of greatest interest. PMID:23999551
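
    One widely used inequality indicator of the kind discussed here is the Gini coefficient; applied to group-level exposures it quantifies how unevenly a burden is distributed (0 means perfect equality). A toy sketch with invented exposure values; a real regulatory analysis would weight by population and also examine within-group distributions:

```python
def gini(values):
    """Gini coefficient of a distribution (0 = perfect equality)."""
    n = len(values)
    mean = sum(values) / n
    diff_sum = sum(abs(a - b) for a in values for b in values)
    return diff_sum / (2 * n * n * mean)

# Hypothetical mean pollutant exposures for four social groups,
# at baseline and under a candidate regulatory option.
baseline = [5.0, 6.0, 9.0, 12.0]
option   = [4.0, 5.0, 6.0, 7.0]

# The option both lowers exposures and distributes them more evenly.
improved = gini(option) < gini(baseline)
```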

  12. Post-facta Analyses of Fukushima Accident and Lessons Learned

    Energy Technology Data Exchange (ETDEWEB)

    Tanabe, Fumiya [Sociotechnical Systems Safety Research Institute, Ichige (Japan)

    2014-08-15

    Independent analyses have been performed of the core melt behavior of the Unit 1, Unit 2 and Unit 3 reactors of Fukushima Daiichi Nuclear Power Station on 11-15 March 2011. The analyses are based on a phenomenological methodology with measured data investigation and a simple physical model calculation. The time variation of core water level, core material temperature and hydrogen generation rate is estimated. The analyses have revealed characteristics of the accident process of each reactor. In the case of the Unit 2 reactor, the calculated results suggest little hydrogen generation, because no steam was generated in the core for the zirconium-steam reaction during the fuel damage process. This could be the reason no hydrogen explosion occurred in the Unit 2 reactor building. Analyses have also been performed on the core material behavior in another chaotic period, 19-31 March 2011, and resulted in a re-melt hypothesis: the core material in each reactor should have melted again owing to a shortage of cooling water. The hypothesis is consistent with many observed features of the dispersion of radioactive materials into the environment.

  13. TAXONOMY AND GENETIC RELATIONSHIPS OF PANGASIIDAE, ASIAN CATFISHES, BASED ON MORPHOLOGICAL AND MOLECULAR ANALYSES

    Directory of Open Access Journals (Sweden)

    Rudhy Gustiano

    2007-12-01

    Pangasiids are economically important riverine catfishes generally residing in fresh water from the Indian subcontinent to the Indonesian Archipelago. The systematics of this family are still poorly known. Consequently, the lack of such basic information impedes the understanding of the biology of the Pangasiids and the study of their aquaculture potential, as well as the improvement of seed production and growth performance. The objective of the present study is to clarify the phylogeny of this family based on a biometric analysis and molecular evidence using 12S ribosomal mtDNA, on a total of 1070 specimens. The study revealed that 28 species are recognised as valid in Pangasiidae. Four genera are also recognised, Helicophagus Bleeker 1858, Pangasianodon Chevey 1930, Pteropangasius Fowler 1937, and Pangasius Valenciennes 1840, instead of two as reported by previous workers. The phylogenetic analysis supported the recognised genera and the genetic relationships among taxa. Overall, trees from the different analyses show similar topologies and confirm the hypothesis derived from geological history, palaeontology, and similar models in other taxa of fishes from the same area. The oldest genus may already have existed when the Asian mainland was still connected to the islands in the southern part about 20 million years ago.

  14. Aleatoric and epistemic uncertainties in sampling based nuclear data uncertainty and sensitivity analyses

    International Nuclear Information System (INIS)

    Zwermann, W.; Krzykacz-Hausmann, B.; Gallner, L.; Klein, M.; Pautz, A.; Velkov, K.

    2012-01-01

    Sampling based uncertainty and sensitivity analyses due to epistemic input uncertainties, i.e. due to an incomplete knowledge of uncertain input parameters, can be performed with arbitrary application programs to solve the physical problem under consideration. For the description of steady-state particle transport, direct simulations of the microscopic processes with Monte Carlo codes are often used. This introduces an additional source of uncertainty, the aleatoric sampling uncertainty, which is due to the randomness of the simulation process performed by sampling, and which adds to the total combined output sampling uncertainty. So far, this aleatoric part of the uncertainty has been minimized by running a sufficiently large number of Monte Carlo histories for each sample calculation, thus making its impact negligible compared to the impact of sampling the epistemic uncertainties. Obviously, this process may incur high computational costs. The present paper shows that in many applications reliable epistemic uncertainty results can also be obtained with substantially lower computational effort by performing and analyzing two appropriately generated series of samples with a much smaller number of Monte Carlo histories each. The method is applied, along with the nuclear data uncertainty and sensitivity code package XSUSA in combination with the Monte Carlo transport code KENO-Va, to various critical assemblies and a full-scale reactor calculation. It is shown that the proposed method yields output uncertainties and sensitivities equivalent to the traditional approach, while reducing computing time by factors on the order of 100. (authors)
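
    One plausible reading of the two-series idea is a covariance trick: run the same epistemic samples twice with independent Monte Carlo noise, and the covariance between the two result series isolates the epistemic part of the output variance, since the aleatoric noise is independent across the runs and averages out. The toy model below is our own illustration of that principle, not the XSUSA/KENO-Va implementation:

```python
import random
import statistics

random.seed(1)

def model(param, n_histories):
    """Toy Monte Carlo solver: noisy estimate of 2*param, noise std ~ 1/sqrt(n)."""
    return 2.0 * param + random.gauss(0.0, 1.0 / n_histories ** 0.5)

def covariance(xs, ys):
    """Sample covariance of two paired series."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)

params = [random.gauss(1.0, 0.05) for _ in range(200)]   # epistemic samples
# Two series: identical epistemic samples, independent aleatoric noise.
series_a = [model(p, n_histories=100) for p in params]
series_b = [model(p, n_histories=100) for p in params]

# True epistemic variance is (2 * 0.05)**2 = 0.01; the covariance recovers it
# without needing a huge number of histories per run.
epistemic_var = covariance(series_a, series_b)
```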

  15. SuperTRI: A new approach based on branch support analyses of multiple independent data sets for assessing reliability of phylogenetic inferences.

    Science.gov (United States)

    Ropiquet, Anne; Li, Blaise; Hassanin, Alexandre

    2009-09-01

    Supermatrix and supertree are two methods for constructing a phylogenetic tree from multiple data sets. However, these methods are not a panacea, as conflicting signals between data sets can lead to misinterpretation of the evolutionary history of taxa. In particular, the supermatrix approach is expected to be misleading if the species-tree signal is not dominant after the combination of the data sets. Moreover, most current supertree methods suffer from two limitations: (i) they ignore or misinterpret secondary (non-dominant) phylogenetic signals of the different data sets; and (ii) the logical basis of node robustness measures is unclear. To overcome these limitations, we propose a new approach, called SuperTRI, which is based on branch support analyses of the independent data sets, and in which the reliability of the nodes is assessed using three measures: the supertree bootstrap percentage and two values calculated from the separate analyses, namely the mean branch support (mean bootstrap percentage or mean posterior probability) and the reproducibility index. The SuperTRI approach is tested on a data matrix including seven genes for 82 taxa of the family Bovidae (Mammalia, Ruminantia), and the results are compared to those found with the supermatrix approach. The phylogenetic analyses of the supermatrix and independent data sets were done using four methods of tree reconstruction: Bayesian inference, maximum likelihood, and unweighted and weighted maximum parsimony. The results indicate, firstly, that the SuperTRI approach is less sensitive to the choice among the four phylogenetic methods, secondly, that it interprets the relationships among taxa more accurately, and thirdly, that interesting conclusions on introgression and radiation can be drawn from the comparisons between the SuperTRI and supermatrix analyses.
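
    The two node-reliability measures taken from the separate analyses can be illustrated in a few lines. Here the reproducibility index is simplified to the fraction of independent data sets whose analysis recovers the clade, which is our reading for illustration only, and the support values are invented:

```python
import statistics

# Hypothetical bootstrap support (%) for one clade in four separate data sets;
# None marks a data set whose analysis did not recover the clade.
supports = [92, 85, None, 78]

recovered = [s for s in supports if s is not None]
mean_support = statistics.mean(recovered)         # mean branch support
reproducibility = len(recovered) / len(supports)  # fraction of data sets agreeing
```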

  16. Analyses of microstructural and elastic properties of porous SOFC cathodes based on focused ion beam tomography

    Science.gov (United States)

    Chen, Zhangwei; Wang, Xin; Giuliani, Finn; Atkinson, Alan

    2015-01-01

    The mechanical properties of porous SOFC electrodes are largely determined by their microstructures. Measurements of elastic properties and microstructural parameters can be achieved by modelling digitally reconstructed 3D volumes based on the real electrode microstructures. However, the reliability of such measurements depends greatly on the processing of the raw images acquired for reconstruction. In this work, the actual microstructures of La0.6Sr0.4Co0.2Fe0.8O3-δ (LSCF) cathodes sintered at an elevated temperature were reconstructed based on dual-beam FIB/SEM tomography. Key microstructural and elastic parameters were estimated and correlated, and their sensitivity to the grayscale threshold value applied in the image segmentation was analysed. The important microstructural parameters included porosity, tortuosity, specific surface area, particle and pore size distributions, and inter-particle neck size distribution, which may influence, to varying extents, the elastic properties simulated from the microstructures using FEM. Results showed that different threshold ranges resulted in different degrees of sensitivity for specific parameters. The estimated porosity and tortuosity were more sensitive than the surface-area-to-volume ratio, and pore and neck sizes were found to be less sensitive than particle size. Results also showed that the modulus was essentially sensitive to the porosity, which was largely controlled by the threshold value.
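
    The threshold sensitivity discussed above is easy to demonstrate: porosity from a segmented image is simply the fraction of voxels whose grayscale falls below the chosen threshold, so sweeping the threshold shifts the estimate. The slice below is a made-up miniature, not FIB/SEM data:

```python
def porosity(image, threshold):
    """Fraction of voxels classified as pore (grayscale below threshold)."""
    flat = [v for row in image for v in row]
    return sum(v < threshold for v in flat) / len(flat)

# Hypothetical 2D grayscale slice (0-255) from a tomographic stack.
slice_ = [
    [ 30,  40, 200, 210],
    [ 35, 180, 190, 220],
    [ 25,  45,  50, 230],
]

# Sweeping the segmentation threshold shows how sensitive porosity is to it.
sweep = {t: porosity(slice_, t) for t in (60, 120, 185)}
```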

  17. Hydrogen-combustion analyses of large-scale tests

    International Nuclear Information System (INIS)

    Gido, R.G.; Koestel, A.

    1986-01-01

    This report uses results of the large-scale tests with turbulence performed by the Electric Power Research Institute at the Nevada Test Site to evaluate hydrogen burn-analysis procedures based on lumped-parameter codes like COMPARE-H2 and associated burn-parameter models. The test results: (1) confirmed, in a general way, the procedures for application to pulsed burning, (2) increased significantly our understanding of the burn phenomenon by demonstrating that continuous burning can occur, and (3) indicated that steam can terminate continuous burning. Future actions recommended include: (1) modification of the code to perform continuous-burn analyses, which is demonstrated, (2) analyses to determine the type of burning (pulsed or continuous) that will exist in nuclear containments and the stable location if the burning is continuous, and (3) changes to the models for estimating burn parameters

  19. Regional analyses of labor markets and demography: a model based Norwegian example.

    Science.gov (United States)

    Stambol, L S; Stolen, N M; Avitsland, T

    1998-01-01

    The authors discuss the regional REGARD model, developed by Statistics Norway to analyze the regional implications of macroeconomic development of employment, labor force, and unemployment. "In building the model, empirical analyses of regional producer behavior in manufacturing industries have been performed, and the relation between labor market development and regional migration has been investigated. Apart from providing a short description of the REGARD model, this article demonstrates the functioning of the model, and presents some results of an application." excerpt

  20. Pseudogenes and DNA-based diet analyses: A cautionary tale from a relatively well sampled predator-prey system

    DEFF Research Database (Denmark)

    Dunshea, G.; Barros, N. B.; Wells, R. S.

    2008-01-01

    Mitochondrial ribosomal DNA is commonly used in DNA-based dietary analyses. In such studies, these sequences are generally assumed to be the only version present in DNA of the organism of interest. However, nuclear pseudogenes that display variable similarity to the mitochondrial versions are common in many taxa. The presence of nuclear pseudogenes that co-amplify with their mitochondrial paralogues can lead to several possible confounding interpretations when applied to estimating animal diet. Here, we investigate the occurrence of nuclear pseudogenes in fecal samples taken from bottlenose dolphins (Tursiops truncatus) that were assayed for prey DNA with a universal primer technique. We found pseudogenes in 13 of 15 samples and 1-5 pseudogene haplotypes per sample, representing 5-100% of all amplicons produced. The proportion of amplicons that were pseudogenes and the diversity of prey DNA...

  1. Clinical Research That Matters: Designing Outcome-Based Research for Older Adults to Qualify for Systematic Reviews and Meta-Analyses.

    Science.gov (United States)

    Lee, Jeannie K; Fosnight, Susan M; Estus, Erica L; Evans, Paula J; Pho, Victoria B; Reidt, Shannon; Reist, Jeffrey C; Ruby, Christine M; Sibicky, Stephanie L; Wheeler, Janel B

    2018-01-01

    Though older adults are more sensitive to the effects of medications than their younger counterparts, they are often excluded from manufacturer-based clinical studies. Practice-based research is a practical method to identify medication-related effects in older patients. This research also highlights the role of a pharmacist in improving care in this population. A single study rarely has strong enough evidence to change geriatric practice, unless it is a large-scale, multisite, randomized controlled trial that specifically targets older adults. It is important to design studies that may be used in systematic reviews or meta-analyses that build a stronger evidence base. Recent literature has documented a gap in advanced pharmacist training pertaining to research skills. In this paper, we hope to fill some of the educational gaps related to research in older adults. We define best practices when deciding on the type of study, inclusion and exclusion criteria, design of the intervention, how outcomes are measured, and how results are reported. Well-designed studies increase the pool of available data to further document the important role that pharmacists have in optimizing care of older patients.

  2. A Fuzzy Logic Based Method for Analysing Test Results

    Directory of Open Access Journals (Sweden)

    Le Xuan Vinh

    2017-11-01

    Full Text Available Network operators must perform many tasks to ensure smooth operation of the network, such as planning, monitoring, etc. Among those tasks, regular testing of network performance, network errors and troubleshooting is very important. Meaningful test results allow the operators to evaluate network performance, identify shortcomings, and better plan for network upgrades. Due to the diverse and mainly unquantifiable nature of network testing results, there is a need to develop a method for systematically and rigorously analysing these results. In this paper, we present STAM (System Test-result Analysis Method), which employs a bottom-up hierarchical processing approach using fuzzy logic. STAM is capable of combining all test results into a quantitative description of the network performance in terms of network stability, the significance of various network errors, and the performance of each functional block within the network. The validity of this method has been successfully demonstrated in assisting the testing of a VoIP system at the Research Institute of Posts and Telecoms in Vietnam. The paper is organized as follows. The first section gives an overview of fuzzy logic theory, the concepts of which will be used in the development of STAM. The next section describes STAM. The last section, demonstrating STAM's capability, presents a success story in which STAM is successfully applied.
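The bottom-up fuzzy aggregation that this abstract describes can be illustrated with a minimal sketch. This is not the actual STAM method: the triangular membership functions, the "poor"/"fair"/"good" labels, the averaging rule, and all scores below are illustrative assumptions.

```python
# Hedged sketch of a bottom-up fuzzy aggregation in the spirit of STAM.
# Membership functions, labels, and weights are illustrative, not the paper's.

def triangular(x, a, b, c):
    """Triangular membership function peaking at b on support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify(score):
    """Map a raw test score in [0, 100] to degrees of poor/fair/good."""
    return {
        "poor": triangular(score, -1, 0, 50),
        "fair": triangular(score, 25, 50, 75),
        "good": triangular(score, 50, 100, 101),
    }

def defuzzify(memberships):
    """Weighted average of representative values per linguistic label."""
    rep = {"poor": 0.0, "fair": 50.0, "good": 100.0}
    total = sum(memberships.values())
    if total == 0:
        return 0.0
    return sum(rep[k] * v for k, v in memberships.items()) / total

def aggregate(scores):
    """Bottom-up step: fuzzify each result, average memberships, defuzzify."""
    fuzzy = [fuzzify(s) for s in scores]
    merged = {k: sum(f[k] for f in fuzzy) / len(fuzzy) for k in fuzzy[0]}
    return defuzzify(merged)

# Two functional blocks, each scored by several individual tests (0-100),
# are first aggregated separately, then combined into a network-level score.
stability_block = aggregate([90, 85, 95])
error_block = aggregate([40, 55, 30])
overall = aggregate([stability_block, error_block])
print(stability_block, error_block, overall)
```

The same pattern extends to more hierarchy levels: each block's crisp score feeds the next aggregation stage up, which is what makes unquantifiable per-test observations combinable into one network-level figure.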

  3. Visualizing Confidence in Cluster-Based Ensemble Weather Forecast Analyses.

    Science.gov (United States)

    Kumpf, Alexander; Tost, Bianca; Baumgart, Marlene; Riemer, Michael; Westermann, Rudiger; Rautenhaus, Marc

    2018-01-01

    In meteorology, cluster analysis is frequently used to determine representative trends in ensemble weather predictions in a selected spatio-temporal region, e.g., to reduce a set of ensemble members to simplify and improve their analysis. Identified clusters (i.e., groups of similar members), however, can be very sensitive to small changes of the selected region, so that clustering results can be misleading and bias subsequent analyses. In this article, we - a team of visualization scientists and meteorologists-deliver visual analytics solutions to analyze the sensitivity of clustering results with respect to changes of a selected region. We propose an interactive visual interface that enables simultaneous visualization of a) the variation in composition of identified clusters (i.e., their robustness), b) the variability in cluster membership for individual ensemble members, and c) the uncertainty in the spatial locations of identified trends. We demonstrate that our solution shows meteorologists how representative a clustering result is, and with respect to which changes in the selected region it becomes unstable. Furthermore, our solution helps to identify those ensemble members which stably belong to a given cluster and can thus be considered similar. In a real-world application case we show how our approach is used to analyze the clustering behavior of different regions in a forecast of "Tropical Cyclone Karl", guiding the user towards the cluster robustness information required for subsequent ensemble analysis.

  4. Phylogenetic analyses of Vitis (Vitaceae) based on complete chloroplast genome sequences: effects of taxon sampling and phylogenetic methods on resolving relationships among rosids.

    Science.gov (United States)

    Jansen, Robert K; Kaittanis, Charalambos; Saski, Christopher; Lee, Seung-Bum; Tomkins, Jeffrey; Alverson, Andrew J; Daniell, Henry

    2006-04-09

    The Vitaceae (grape) is an economically important family of angiosperms whose phylogenetic placement is currently unresolved. Recent phylogenetic analyses based on one to several genes have suggested several alternative placements of this family, including sister to Caryophyllales, asterids, Saxifragales, Dilleniaceae, or the rest of the rosids, though support for these different results has been weak. There has been recent interest in using complete chloroplast genome sequences for resolving phylogenetic relationships among angiosperms. These studies have clarified relationships among several major lineages, but they have also emphasized the importance of taxon sampling and the effects of different phylogenetic methods for obtaining accurate phylogenies. We sequenced the complete chloroplast genome of Vitis vinifera and used these data to assess relationships among 27 angiosperms, including nine taxa of rosids. The Vitis vinifera chloroplast genome is 160,928 bp in length, including a pair of inverted repeats of 26,358 bp that are separated by small and large single copy regions of 19,065 bp and 89,147 bp, respectively. The gene content and order of Vitis are identical to those of many other unrearranged angiosperm chloroplast genomes, including tobacco. Phylogenetic analyses using maximum parsimony and maximum likelihood were performed on DNA sequences of 61 protein-coding genes for two datasets with 28 or 29 taxa, including eight or nine taxa from four of the seven currently recognized major clades of rosids. Parsimony and likelihood phylogenies of both data sets provide strong support for the placement of Vitaceae as sister to the remaining rosids. However, the position of the Myrtales and support for the monophyly of the eurosid I clade differ between the two data sets and the two methods of analysis. In parsimony analyses, the inclusion of Gossypium is necessary to obtain trees that support the monophyly of the eurosid I clade. However, maximum likelihood analyses place
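As a quick arithmetic check, the segment lengths quoted in this abstract are internally consistent with the stated total genome size (no assumptions beyond the abstract's own numbers):

```python
# The Vitis vinifera chloroplast genome is reported as 160,928 bp, built from
# a pair of inverted repeats plus small and large single copy regions.
inverted_repeat = 26_358    # bp, one copy of the inverted-repeat pair
small_single_copy = 19_065  # bp
large_single_copy = 89_147  # bp

total = 2 * inverted_repeat + small_single_copy + large_single_copy
print(total)  # 160928, matching the reported genome length
```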

  5. Phylogenetic analyses of Vitis (Vitaceae) based on complete chloroplast genome sequences: effects of taxon sampling and phylogenetic methods on resolving relationships among rosids

    Directory of Open Access Journals (Sweden)

    Alverson Andrew J

    2006-04-01

    Full Text Available Abstract Background The Vitaceae (grape) is an economically important family of angiosperms whose phylogenetic placement is currently unresolved. Recent phylogenetic analyses based on one to several genes have suggested several alternative placements of this family, including sister to Caryophyllales, asterids, Saxifragales, Dilleniaceae, or the rest of the rosids, though support for these different results has been weak. There has been recent interest in using complete chloroplast genome sequences for resolving phylogenetic relationships among angiosperms. These studies have clarified relationships among several major lineages, but they have also emphasized the importance of taxon sampling and the effects of different phylogenetic methods for obtaining accurate phylogenies. We sequenced the complete chloroplast genome of Vitis vinifera and used these data to assess relationships among 27 angiosperms, including nine taxa of rosids. Results The Vitis vinifera chloroplast genome is 160,928 bp in length, including a pair of inverted repeats of 26,358 bp that are separated by small and large single copy regions of 19,065 bp and 89,147 bp, respectively. The gene content and order of Vitis are identical to those of many other unrearranged angiosperm chloroplast genomes, including tobacco. Phylogenetic analyses using maximum parsimony and maximum likelihood were performed on DNA sequences of 61 protein-coding genes for two datasets with 28 or 29 taxa, including eight or nine taxa from four of the seven currently recognized major clades of rosids. Parsimony and likelihood phylogenies of both data sets provide strong support for the placement of Vitaceae as sister to the remaining rosids. However, the position of the Myrtales and support for the monophyly of the eurosid I clade differ between the two data sets and the two methods of analysis. In parsimony analyses, the inclusion of Gossypium is necessary to obtain trees that support the monophyly of the eurosid I clade.

  6. A Cyber-Attack Detection Model Based on Multivariate Analyses

    Science.gov (United States)

    Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi

    In the present paper, we propose a novel cyber-attack detection model that applies two multivariate-analysis methods to the audit data observed on a host machine. The statistical techniques used here are the well-known Hayashi's quantification method IV and the cluster analysis method. We quantify the observed qualitative audit event sequence via quantification method IV, and group similar audit event sequences together based on the cluster analysis. Simulation experiments show that our model can improve cyber-attack detection accuracy in some realistic cases where both normal and attack activities are intermingled.
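The two-stage idea in this abstract (quantify qualitative audit events, then group similar sequences) can be sketched minimally as follows. Note the substitutions: a simple event-frequency encoding stands in for Hayashi's quantification method IV, and a tiny deterministic k-means stands in for the paper's cluster analysis; all event names and sequences are invented for illustration.

```python
# Hedged sketch: encode qualitative audit event sequences as numeric vectors,
# then cluster them so similar activity patterns land in the same group.
import math

def encode(seq, vocab):
    """Event-frequency vector: fraction of the sequence taken by each event."""
    return [seq.count(e) / len(seq) for e in vocab]

def kmeans(points, k=2, iters=20):
    """Tiny k-means with naive deterministic init (first and last points)."""
    centers = [list(points[0]), list(points[-1])]
    for _ in range(iters):
        labels = [min(range(k), key=lambda c: math.dist(p, centers[c]))
                  for p in points]
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                centers[c] = [sum(col) / len(members) for col in zip(*members)]
    # Final assignment against the converged centers.
    return [min(range(k), key=lambda c: math.dist(p, centers[c])) for p in points]

vocab = ["login", "read", "write", "sudo", "net"]
normal = [["login", "read", "read", "write"], ["login", "read", "write", "read"]]
attack = [["sudo", "net", "net", "sudo"], ["net", "sudo", "net", "net"]]
labels = kmeans([encode(s, vocab) for s in normal + attack])
print(labels)  # sequences of the same activity type share a cluster label
```

In the paper's setting the interesting output is not the clusters themselves but the separation between groups dominated by normal activity and those dominated by attack activity, even when both are intermingled in the audit stream.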

  7. Methodological challenges in carbohydrate analyses

    Directory of Open Access Journals (Sweden)

    Mary Beth Hall

    2007-07-01

    Full Text Available Carbohydrates can provide up to 80% of the dry matter in animal diets, yet their specific evaluation for research and diet formulation is only now becoming a focus in the animal sciences. Partitioning of dietary carbohydrates for nutritional purposes should reflect differences in digestion and fermentation characteristics and effects on animal performance. Key challenges to designating nutritionally important carbohydrate fractions include classifying the carbohydrates in terms of nutritional characteristics, and selecting analytical methods that describe the desired fraction. The relative lack of information on digestion characteristics of various carbohydrates and their interactions with other fractions in diets means that fractions will not soon be perfectly established. Developing a system of carbohydrate analysis that could be used across animal species could enhance the utility of analyses and the amount of data we can obtain on dietary effects of carbohydrates. Based on quantities present in diets and apparent effects on animal performance, some nutritionally important classes of carbohydrates that may be valuable to measure include sugars, starch, fructans, insoluble fiber, and soluble fiber. Essential to selection of methods for these fractions is agreement on precisely what carbohydrates should be included in each. Each of these fractions has analyses that could potentially be used to measure them, but most of the available methods have weaknesses that must be evaluated to see if they are fatal and the assay is unusable, or if the assay still may be made workable. Factors we must consider as we seek to analyze carbohydrates to describe diets: Does the assay accurately measure the desired fraction? Is the assay for research, regulatory, or field use (this affects considerations of acceptable costs and throughput)? What are acceptable accuracy and variability of measures? Is the assay robust (this enhances the accuracy of values)? For some carbohydrates, we

  8. X-ray photoelectron emission spectromicroscopic analysis of arborescent lycopsid cell wall composition and Carboniferous coal ball preservation

    Energy Technology Data Exchange (ETDEWEB)

    Boyce, C. Kevin [Department of the Geophysical Sciences, University of Chicago, Chicago, IL 60637 (United States); Abrecht, Mike; Zhou, Dong; Gilbert, P.U.P.A. [Department of Physics, University of Wisconsin, Madison, WI 53706 (United States)

    2010-08-01

    Two alternative processes complicate understanding of the biochemical origins and geochemical alteration of organic matter over geologic time: selective preservation of original biopolymers and in situ generation of new geopolymers. One of the best constrained potential sources of bio- and geochemical information about extinct fossil plants is frequently overlooked. Permineralized anatomically preserved plant fossils allow analysis of individual cell and tissue types that have an original biochemical composition already known from living plants. The original composition of more enigmatic fossils can be constrained by geochemical comparisons to tissues of better understood fossils from the same locality. This strategy is possible using synchrotron-based techniques for submicron-scale imaging with X-rays over a range of frequencies in order to provide information concerning the relative abundance of different organic bonds with X-ray Absorption Near Edge Spectroscopy. In this study, X-ray PhotoElectron Emission spectroMicroscopy (X-PEEM) was used to analyze the tissues of Lepidodendron, one of the lycopsid trees that were canopy dominants of many Pennsylvanian coal swamp forests. Its periderm or bark - the single greatest biomass contributor to many Late Paleozoic coals - is found to have a greater aliphatic content and an overall greater density of organic matter than lignified wood. Because X-PEEM allows simultaneous analysis of organic matter and matrix calcite in fully mineralized fossils, this technique also has great potential for analysis of fossil preservation, including documentation of significant traces of organic matter entrained in the calcite crystal fabric that fills the cell lumens. (author)

  9. The Quality of Cost-Utility Analyses in Orthopedic Trauma.

    Science.gov (United States)

    Nwachukwu, Benedict U; Schairer, William W; O'Dea, Evan; McCormick, Frank; Lane, Joseph M

    2015-08-01

    As health care in the United States transitions toward a value-based model, there is increasing interest in applying cost-effectiveness analysis within orthopedic surgery. Orthopedic trauma care has traditionally underemphasized economic analysis. The goals of this review were to identify US-based cost-utility analysis in orthopedic trauma, to assess the quality of the available evidence, and to identify cost-effective strategies within orthopedic trauma. Based on a review of 971 abstracts, 8 US-based cost-utility analyses evaluating operative strategies in orthopedic trauma were identified. Study findings were recorded, and the Quality of Health Economic Studies (QHES) instrument was used to grade the overall quality. Of the 8 studies included in this review, 4 studies evaluated hip and femur fractures, 3 studies analyzed upper extremity fractures, and 1 study assessed open tibial fracture management. Cost-effective interventions identified in this review include total hip arthroplasty (over hemiarthroplasty) for femoral neck fractures in the active elderly, open reduction and internal fixation (over nonoperative management) for distal radius and scaphoid fractures, limb salvage (over amputation) for complex open tibial fractures, and systems-based interventions to prevent delay in hip fracture surgery. The mean QHES score of the studies was 79.25 (range, 67-89). Overall, there is a paucity of cost-utility analyses in orthopedic trauma; however, the available evidence suggests that certain operative interventions can be cost-effective. The quality of these studies, however, is fair, based on QHES grading. More attention should be paid to evaluating the cost-effectiveness of operative intervention in orthopedic trauma. Copyright 2015, SLACK Incorporated.

  10. Fourier Transform Mass Spectrometry: The Transformation of Modern Environmental Analyses

    Science.gov (United States)

    Lim, Lucy; Yan, Fangzhi; Bach, Stephen; Pihakari, Katianna; Klein, David

    2016-01-01

    Unknown compounds in environmental samples are difficult to identify using standard mass spectrometric methods. Fourier transform mass spectrometry (FTMS) has revolutionized how environmental analyses are performed. With its unsurpassed mass accuracy, high resolution and sensitivity, researchers now have a tool for difficult and complex environmental analyses. Two features of FTMS are responsible for changing the face of how complex analyses are accomplished. The first is the ability to determine, quickly and with high mass accuracy, the presence of unknown chemical residues in samples. For years, the field has been limited by mass spectrometric methods that required knowing in advance what the compounds of interest were. Secondly, by utilizing the high resolution capabilities coupled with the low detection limits of FTMS, analysts can also dilute the sample sufficiently to minimize the ionization changes from varied matrices. PMID:26784175

  11. Radiation physics and shielding codes and analyses applied to design-assist and safety analyses of CANDUR and ACRTM reactors

    International Nuclear Information System (INIS)

    Aydogdu, K.; Boss, C. R.

    2006-01-01

    heavily on experience and engineering judgement, consistent with the ALARA philosophy. Special care is taken to ensure that the best estimate dose rates are used to the extent possible when applying ALARA. Provisions for safeguards equipment are made throughout the fuel-handling route in CANDU and ACR reactors. For example, the fuel bundle counters rely on the decay gammas from the fission products in spent-fuel bundles to record the number of fuel movements. The International Atomic Energy Agency (IAEA) Safeguards system for CANDU and ACR reactors is based on item (fuel bundle) accounting. It involves a combination of IAEA inspection with containment and surveillance, and continuous unattended monitoring. The spent fuel bundle counter monitors spent fuel bundles as they are transferred from the fuelling machine to the spent fuel bay. The shielding and dose-rate analysis need to be carried out so that the bundle counter functions properly. This paper includes two codes used in criticality safety analyses. Criticality safety is a unique phenomenon and codes that address criticality issues will demand specific validations. However, it is recognised that some of the codes used in radiation physics will also be used in criticality safety assessments. (authors)

  12. Analysing the operative experience of basic surgical trainees in Ireland using a web-based logbook

    LENUS (Irish Health Repository)

    Lonergan, Peter E

    2011-09-25

    Abstract Background There is concern about the adequacy of operative exposure in surgical training programmes, in the context of changing work practices. We aimed to quantify the operative exposure of all trainees on the National Basic Surgical Training (BST) programme in Ireland and compare the results with arbitrary training targets. Methods Retrospective analysis of data obtained from a web-based logbook (http://www.elogbook.org) for all general surgery and orthopaedic training posts between July 2007 and June 2009. Results 104 trainees recorded 23,918 operations between two 6-month general surgery posts. The most common general surgery operation performed was simple skin excision with trainees performing an average of 19.7 (± 9.9) over the 2-year training programme. Trainees most frequently assisted with cholecystectomy with an average of 16.0 (± 11.0) per trainee. Comparison of trainee operative experience to arbitrary training targets found that 2-38% of trainees achieved the targets for 9 emergency index operations and 24-90% of trainees achieved the targets for 8 index elective operations. 72 trainees also completed a 6-month post in orthopaedics and recorded 7,551 operations. The most common orthopaedic operation that trainees performed was removal of metal, with an average of 2.90 (± 3.27) per trainee. The most common orthopaedic operation that trainees assisted with was total hip replacement, with an average of 10.46 (± 6.21) per trainee. Conclusions A centralised web-based logbook provides valuable data to analyse training programme performance. Analysis of logbooks raises concerns about operative experience at junior trainee level. The provision of adequate operative exposure for trainees should be a key performance indicator for training programmes.
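The per-trainee figures quoted in this abstract (e.g. 19.7 ± 9.9) are mean ± standard deviation summaries computed over logbook counts. A minimal sketch of that computation, using made-up counts rather than the study's data:

```python
# Mean ± sample standard deviation of per-trainee operation counts.
# The counts below are illustrative, not the study's logbook records.
from statistics import mean, stdev

skin_excisions_per_trainee = [12, 25, 19, 31, 8, 22, 17, 24]

avg = mean(skin_excisions_per_trainee)
sd = stdev(skin_excisions_per_trainee)  # sample (n-1) standard deviation
print(f"{avg:.1f} (± {sd:.1f}) per trainee")
```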

  13. Northern Marshall Islands Radiological Survey: a quality-control program for a radiochemical analyses

    International Nuclear Information System (INIS)

    Jennings, C.D.; Mount, M.E.

    1983-08-01

    More than 16,000 radiochemical analyses were performed on about 5400 samples of soils, vegetation, animals, fish, invertebrates, and water to establish amounts of 90Sr, 137Cs, 241Am, and plutonium isotopes in the Northern Marshall Islands. Three laboratories were contracted by Lawrence Livermore National Laboratory to perform the radiochemical analyses: Environmental Analysis Laboratory (EAL), Richmond, California; Eberline Instrument Corporation (EIC), Albuquerque, New Mexico; and Laboratory of Radiation Ecology (LRE), University of Washington, Seattle, Washington. The analytical precision and accuracy were monitored by regularly including duplicate samples and natural matrix standards in each group of about 100 samples analyzed. Based on the duplicates and standards, over 83% of the radiochemical analyses in this survey were acceptable - 97% of the analyses by EAL, 45% of the analyses by EIC, and 98% of the analyses by LRE.

  14. Implementation of analyses based on social media data for marketing purposes in academic and scientific organizations in practice – opportunities and limitations

    Directory of Open Access Journals (Sweden)

    Magdalena Grabarczyk-Tokaj

    2013-12-01

    Full Text Available The article focuses on the practical use of analyses based on data collected in social media for institutions' communication and marketing purposes. The subject is discussed from the perspective of Digital Darwinism: the situation in which technologies and new means of communication develop significantly faster than the knowledge and digital skills of the organizations eager to implement those solutions. To diminish the negative consequences of Digital Darwinism, institutions can broaden their knowledge with analyses of data from cyberspace to optimize operations, and can engage in ongoing dialogue and cooperation with prosumers to face dynamic changes in trends, technologies and society. Information acquired from user-generated content in social media can be employed as a guideline in planning, running and evaluating communication and marketing activities. The article presents examples of tools and solutions that can be implemented in practice to support actions taken by institutions.

  15. A Visualization Tool to Analyse Usage of Web-Based Interventions: The Example of Positive Online Weight Reduction (POWeR)

    Science.gov (United States)

    Smith, Emily; Bradbury, Katherine; Morrison, Leanne; Dennison, Laura; Michaelides, Danius; Yardley, Lucy

    2015-01-01

    Background Attrition is a significant problem in Web-based interventions. Consequently, this research aims to identify the relation between Web usage and benefit from such interventions. A visualization tool has been developed that enables researchers to more easily examine large datasets on intervention usage that can be difficult to make sense of using traditional descriptive or statistical techniques alone. Objective This paper demonstrates how the visualization tool was used to explore patterns in participants’ use of a Web-based weight management intervention, termed "positive online weight reduction (POWeR)." We also demonstrate how the visualization tool can be used to perform subsequent statistical analyses of the association between usage patterns, participant characteristics, and intervention outcome. Methods The visualization tool was used to analyze data from 132 participants who had accessed at least one session of the POWeR intervention. Results There was a drop in usage of optional sessions after participants had accessed the initial, core POWeR sessions, but many users nevertheless continued to complete goal and weight reviews. The POWeR tools relating to the food diary and steps diary were reused most often. Differences in participant characteristics and usage of other intervention components were identified between participants who did and did not choose to access optional POWeR sessions (in addition to the initial core sessions) or reuse the food and steps diaries. Reuse of the steps diary and the getting support tools was associated with greater weight loss. Conclusions The visualization tool provided a quick and efficient method for exploring patterns of Web usage, which enabled further analyses of whether different usage patterns were associated with participant characteristics or differences in intervention outcome. Further usage of visualization techniques is recommended to (1) make sense of large datasets more quickly and efficiently; (2

  16. A Review of Sparsity-Based Methods for Analysing Radar Returns from Helicopter Rotor Blades

    Science.gov (United States)

    2016-09-01

    performance study of these algorithms in the particular problem of analysing backscatter signals from rotating blades. The report is organised as follows...provide further insight into the behaviour of the techniques. Here, the algorithms for MP, OMP, CGP, gOMP and ROMP terminate when 10 atoms are

  17. Phylogenomic analyses data of the avian phylogenomics project

    DEFF Research Database (Denmark)

    Jarvis, Erich D; Mirarab, Siavash; Aberer, Andre J

    2015-01-01

    BACKGROUND: Determining the evolutionary relationships among the major lineages of extant birds has been one of the biggest challenges in systematic biology. To address this challenge, we assembled or collected the genomes of 48 avian species spanning most orders of birds, including all Neognathae...... and two of the five Palaeognathae orders. We used these genomes to construct a genome-scale avian phylogenetic tree and perform comparative genomic analyses. FINDINGS: Here we present the datasets associated with the phylogenomic analyses, which include sequence alignment files consisting of nucleotides......ML algorithm or when using statistical binning with the coalescence-based MP-EST algorithm (which we refer to as MP-EST*). Other data sets, such as the coding sequence of some exons, revealed other properties of genome evolution, namely convergence. CONCLUSIONS: The Avian Phylogenomics Project is the largest...

  18. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

    performance difficult. Likewise, a demonstration of the magnitude of conservatisms in the dose estimates that result from conservative inputs is difficult to determine. To respond to these issues, the DOE explored the significance of uncertainties and the magnitude of conservatisms in the SSPA Volumes 1 and 2 (BSC 2001 [DIRS 155950]; BSC 2001 [DIRS 154659]). The three main goals of this report are: (1) To briefly summarize and consolidate the discussion of much of the work that has been done over the past few years to evaluate, clarify, and improve the representation of uncertainties in the TSPA and performance projections for a potential repository. This report does not contain any new analyses of those uncertainties, but it summarizes in one place the main findings of that work. (2) To develop a strategy for how uncertainties may be handled in the TSPA and supporting analyses and models to support a License Application, should the site be recommended. It should be noted that the strategy outlined in this report is based on current information available to DOE. The strategy may be modified pending receipt of additional pertinent information, such as the Yucca Mountain Review Plan. (3) To discuss issues related to communication about uncertainties, and propose some approaches the DOE may use in the future to improve how it communicates uncertainty in its models and performance assessments to decision-makers and to technical audiences

  19. Variability Abstractions: Trading Precision for Speed in Family-Based Analyses

    DEFF Research Database (Denmark)

    Dimovski, Aleksandar; Brabrand, Claus; Wasowski, Andrzej

    2015-01-01

    Family-based (lifted) data-flow analysis for Software Product Lines (SPLs) is capable of analyzing all valid products (variants) without generating any of them explicitly. It takes as input only the common code base, which encodes all variants of a SPL, and produces analysis results corresponding...

  20. Handbook of methods for risk-based analyses of technical specifications

    International Nuclear Information System (INIS)

    Samanta, P.K.; Kim, I.S.; Mankamo, T.; Vesely, W.E.

    1994-12-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements are based on deterministic analysis and engineering judgments. Experiences with plant operation indicate that some elements of the requirements are unnecessarily restrictive, while a few may not be conducive to safety. The US Nuclear Regulatory Commission (USNRC) Office of Research has sponsored research to develop systematic risk-based methods to improve various aspects of TS requirements. This handbook summarizes these risk-based methods. The scope of the handbook includes reliability and risk-based methods for evaluating allowed outage times (AOTs), scheduled or preventive maintenance, action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), and management of plant configurations resulting from outages of systems, or components. For each topic, the handbook summarizes analytic methods with data needs, outlines the insights to be gained, lists additional references, and gives examples of evaluations

  1. Handbook of methods for risk-based analyses of technical specifications

    Energy Technology Data Exchange (ETDEWEB)

    Samanta, P.K.; Kim, I.S. [Brookhaven National Lab., Upton, NY (United States); Mankamo, T. [Avaplan Oy, Espoo (Finland); Vesely, W.E. [Science Applications International Corp., Dublin, OH (United States)

    1994-12-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements are based on deterministic analysis and engineering judgments. Experiences with plant operation indicate that some elements of the requirements are unnecessarily restrictive, while a few may not be conducive to safety. The US Nuclear Regulatory Commission (USNRC) Office of Research has sponsored research to develop systematic risk-based methods to improve various aspects of TS requirements. This handbook summarizes these risk-based methods. The scope of the handbook includes reliability and risk-based methods for evaluating allowed outage times (AOTs), scheduled or preventive maintenance, action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), and management of plant configurations resulting from outages of systems, or components. For each topic, the handbook summarizes analytic methods with data needs, outlines the insights to be gained, lists additional references, and gives examples of evaluations.

  2. Sorption analyses in materials science: selected oxides

    International Nuclear Information System (INIS)

    Fuller, E.L. Jr.; Condon, J.B.; Eager, M.H.; Jones, L.L.

    1981-01-01

Physical adsorption studies have been shown to be extremely valuable in studying the chemistry and structure of dispersed materials. Many processes rely on access to the large amount of surface made available by the high degree of dispersion. Conversely, there are many applications where consolidation of the dispersed solids is required. Several systems (silica gel, alumina catalysts, mineralogic alumino-silicates, and yttrium oxide plasters) have been studied to show the type and amount of chemical and structural information that can be obtained. Some review of current theories is given and additional concepts are developed based on statistical and thermodynamic arguments. The results are applied to sorption data to show that detailed sorption analyses are extremely useful and can provide valuable information that is difficult to obtain by any other means. Considerable emphasis has been placed on data analyses and interpretation of a nonclassical nature to show the potential of such studies, which is often neither recognized nor utilized.

  3. Analyses of containment structures with corrosion damage

    International Nuclear Information System (INIS)

    Cherry, J.L.

    1997-01-01

Corrosion damage that has been found in a number of nuclear power plant containment structures can degrade the pressure capacity of the vessel. This has prompted concerns regarding the capacity of corroded containments to withstand accident loadings. To address these concerns, finite element analyses have been performed for a typical PWR Ice Condenser containment structure. Using ABAQUS, the pressure capacity was calculated for a typical vessel with no corrosion damage. Multiple analyses were then performed with the location of the corrosion and the amount of corrosion varied in each analysis. Using a strain-based failure criterion, a "lower bound", "best estimate", and "upper bound" failure level was predicted for each case. These limits were established by: determining the amount of variability that exists in material properties of typical containments, estimating the amount of uncertainty associated with the level of modeling detail and modeling assumptions, and estimating the effect of corrosion on the material properties.

  4. Identification among morphologically similar Argyreia (Convolvulaceae) based on leaf anatomy and phenetic analyses.

    Science.gov (United States)

    Traiperm, Paweena; Chow, Janene; Nopun, Possathorn; Staples, G; Swangpol, Sasivimon C

    2017-12-01

    The genus Argyreia Lour. is one of the species-rich Asian genera in the family Convolvulaceae. Several species complexes were recognized in which taxon delimitation was imprecise, especially when examining herbarium materials without fully developed open flowers. The main goal of this study is to investigate and describe leaf anatomy for some morphologically similar Argyreia using epidermal peeling, leaf and petiole transverse sections, and scanning electron microscopy. Phenetic analyses including cluster analysis and principal component analysis were used to investigate the similarity of these morpho-types. Anatomical differences observed between the morpho-types include epidermal cell walls and the trichome types on the leaf epidermis. Additional differences in the leaf and petiole transverse sections include the epidermal cell shape of the adaxial leaf blade, the leaf margins, and the petiole transverse sectional outline. The phenogram from cluster analysis using the UPGMA method represented four groups with an R value of 0.87. Moreover, the important quantitative and qualitative leaf anatomical traits of the four groups were confirmed by the principal component analysis of the first two components. The results from phenetic analyses confirmed the anatomical differentiation between the morpho-types. Leaf anatomical features regarded as particularly informative for morpho-type differentiation can be used to supplement macro morphological identification.
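The clustering step described above, UPGMA with a cophenetic fit statistic, can be sketched with SciPy. The specimens, traits, and values below are invented for illustration, not the paper's data:

```python
# Hedged sketch: UPGMA clustering of quantitative anatomical traits with
# a cophenetic correlation coefficient (the "R value" reported in such
# phenetic studies). Data are synthetic, not the Argyreia measurements.
import numpy as np
from scipy.cluster.hierarchy import linkage, cophenet, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
# 12 hypothetical specimens x 4 quantitative traits, three specimens
# around each of four well-separated "morpho-type" centroids
centroids = np.array([[0, 0, 0, 0], [5, 0, 0, 0],
                      [0, 5, 0, 0], [0, 0, 5, 0]], dtype=float)
X = np.vstack([c + rng.normal(0, 0.5, size=(3, 4)) for c in centroids])

d = pdist(X)                          # pairwise Euclidean distances
Z = linkage(d, method="average")      # "average" linkage is UPGMA
r, _ = cophenet(Z, d)                 # cophenetic correlation coefficient
groups = fcluster(Z, t=4, criterion="maxclust")
print(f"cophenetic R = {r:.2f}, recovered groups = {len(set(groups))}")
```

A cophenetic R near 1 indicates the dendrogram faithfully preserves the original trait distances, which is the justification for reading group structure off the phenogram.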

  5. IDEA: Interactive Display for Evolutionary Analyses.

    Science.gov (United States)

    Egan, Amy; Mahurkar, Anup; Crabtree, Jonathan; Badger, Jonathan H; Carlton, Jane M; Silva, Joana C

    2008-12-08

    The availability of complete genomic sequences for hundreds of organisms promises to make obtaining genome-wide estimates of substitution rates, selective constraints and other molecular evolution variables of interest an increasingly important approach to addressing broad evolutionary questions. Two of the programs most widely used for this purpose are codeml and baseml, parts of the PAML (Phylogenetic Analysis by Maximum Likelihood) suite. A significant drawback of these programs is their lack of a graphical user interface, which can limit their user base and considerably reduce their efficiency. We have developed IDEA (Interactive Display for Evolutionary Analyses), an intuitive graphical input and output interface which interacts with PHYLIP for phylogeny reconstruction and with codeml and baseml for molecular evolution analyses. IDEA's graphical input and visualization interfaces eliminate the need to edit and parse text input and output files, reducing the likelihood of errors and improving processing time. Further, its interactive output display gives the user immediate access to results. Finally, IDEA can process data in parallel on a local machine or computing grid, allowing genome-wide analyses to be completed quickly. IDEA provides a graphical user interface that allows the user to follow a codeml or baseml analysis from parameter input through to the exploration of results. Novel options streamline the analysis process, and post-analysis visualization of phylogenies, evolutionary rates and selective constraint along protein sequences simplifies the interpretation of results. The integration of these functions into a single tool eliminates the need for lengthy data handling and parsing, significantly expediting access to global patterns in the data.

  6. Shock Transmission Analyses of a Simplified Frigate Compartment Using LS-DYNA

    National Research Council Canada - National Science Library

    Trouwborst, W

    1999-01-01

    This report gives results as obtained with finite element analyses using the explicit finite element program LS-DYNA for a longitudinal slice of a frigate's compartment loaded with a shock pulse based...

  7. Analyses of the influencing factors of soil microbial functional gene diversity in tropical rainforest based on GeoChip 5.0.

    Science.gov (United States)

    Cong, Jing; Liu, Xueduan; Lu, Hui; Xu, Han; Li, Yide; Deng, Ye; Li, Diqiang; Zhang, Yuguang

    2015-09-01

To examine soil microbial functional gene diversity and its causative factors in tropical rainforests, we profiled soils with GeoChip 5.0, a microarray-based metagenomic tool. We found high microbial functional gene diversity and distinct soil microbial metabolic potentials for biogeochemical processes in the tropical rainforest. Soil available nitrogen was the factor most strongly associated with soil microbial functional gene structure. Here we mainly describe in detail the experimental design, the data processing, and the soil biogeochemical analyses attached to the study, published in BMC Microbiology in 2015; the raw data have been deposited in NCBI's Gene Expression Omnibus (accession number GSE69171).

  8. Free Vibration Analyses of FGM Thin Plates by Isogeometric Analysis Based on Classical Plate Theory and Physical Neutral Surface

    Directory of Open Access Journals (Sweden)

    Shuohui Yin

    2013-01-01

The isogeometric analysis (IGA) with nonuniform rational B-splines (NURBS) based on the classical plate theory (CPT) is developed for free vibration analyses of functionally graded material (FGM) thin plates. The objective of this work is to provide an efficient and accurate numerical simulation approach for nonhomogeneous thin plates and shells. Higher-order basis functions can be easily obtained in IGA, so the formulation of the CPT based on IGA can be simplified. For FGM thin plates, the material property gradient in the thickness direction is unsymmetrical about the midplane, so the effects of midplane displacements cannot be ignored, whereas the CPT neglects them. To eliminate the effects of midplane displacements without introducing new unknown variables, the physical neutral surface is introduced into the CPT. The approximation of the deflection field and the geometric description are performed using the NURBS basis functions. Compared with the first-order shear deformation theory, the present method has lower memory consumption and higher efficiency. Several numerical results show that the present method yields highly accurate solutions.
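For readers unfamiliar with the NURBS basis that IGA builds on, here is a minimal sketch using SciPy's B-spline evaluator. The quadratic order, knot vector, and weights are invented for illustration and are not the paper's discretization:

```python
# Hedged sketch: a NURBS basis is a weighted, rationalized B-spline basis.
# We evaluate the B-spline functions and form the rational (NURBS) basis,
# checking the partition-of-unity property that IGA relies on.
import numpy as np
from scipy.interpolate import BSpline

k = 2                                   # quadratic basis, C^1 across knots
knots = np.array([0, 0, 0, 0.5, 1, 1, 1.0])
n_basis = len(knots) - k - 1            # 4 basis functions
weights = np.array([1.0, 0.8, 1.2, 1.0])

x = np.linspace(0.0, 1.0 - 1e-9, 200)   # stop short of the right endpoint
N = np.empty((n_basis, x.size))
for i in range(n_basis):
    c = np.zeros(n_basis)
    c[i] = 1.0                          # pick out the i-th B-spline
    N[i] = BSpline(knots, c, k)(x)

R = weights[:, None] * N / (weights @ N)  # rational (NURBS) basis
print("partition of unity holds:", bool(np.allclose(R.sum(axis=0), 1.0)))
```

With all weights equal to 1 the rational basis reduces to the plain B-spline basis; unequal weights are what let NURBS represent conics exactly.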

  9. Safety analyses for high-temperature reactors

    International Nuclear Information System (INIS)

    Mueller, A.

    1978-01-01

The safety evaluation of HTRs may be based on the three methods presented here: the licensing procedure, probabilistic risk analysis, and damage extent analysis. Thereby all safety aspects of the HTR - from normal operation to extreme (hypothetical) accidents - are covered. The analyses within the licensing procedure of the HTR-1160 have shown that, for normal operation and for the design basis accidents, the radiation exposures remain clearly below the maximum permissible levels prescribed by the radiation protection ordinance, so that no real hazard for the population will arise from them. (orig./RW) [de]

  10. Exploitation of FTA cartridges for the sampling, long-term storage, and DNA-based analyses of plant-parasitic nematodes.

    Science.gov (United States)

    Marek, Martin; Zouhar, Miloslav; Douda, Ondřej; Maňasová, Marie; Ryšánek, Pavel

    2014-03-01

    The use of DNA-based analyses in molecular plant nematology research has dramatically increased over recent decades. Therefore, the development and adaptation of simple, robust, and cost-effective DNA purification procedures are required to address these contemporary challenges. The solid-phase-based approach developed by Flinders Technology Associates (FTA) has been shown to be a powerful technology for the preparation of DNA from different biological materials, including blood, saliva, plant tissues, and various human and plant microbial pathogens. In this work, we demonstrate, for the first time, that this FTA-based technology is a valuable, low-cost, and time-saving approach for the sampling, long-term archiving, and molecular analysis of plant-parasitic nematodes. Despite the complex structure and anatomical organization of the multicellular bodies of nematodes, we report the successful and reliable DNA-based analysis of nematode high-copy and low-copy genes using the FTA technology. This was achieved by applying nematodes to the FTA cards either in the form of a suspension of individuals, as intact or pestle-crushed nematodes, or by the direct mechanical printing of nematode-infested plant tissues. We further demonstrate that the FTA method is also suitable for the so-called "one-nematode-assay", in which the target DNA is typically analyzed from a single individual nematode. More surprisingly, a time-course experiment showed that nematode DNA can be detected specifically in the FTA-captured samples many years after initial sampling occurs. Collectively, our data clearly demonstrate the applicability and the robustness of this FTA-based approach for molecular research and diagnostics concerning phytonematodes; this research includes economically important species such as the stem nematode (Ditylenchus dipsaci), the sugar beet nematode (Heterodera schachtii), and the Northern root-knot nematode (Meloidogyne hapla).

  11. The occurrence of Toxocara malaysiensis in cats in China, confirmed by sequence-based analyses of ribosomal DNA.

    Science.gov (United States)

    Li, Ming-Wei; Zhu, Xing-Quan; Gasser, Robin B; Lin, Rui-Qing; Sani, Rehana A; Lun, Zhao-Rong; Jacobs, Dennis E

    2006-10-01

    Non-isotopic polymerase chain reaction (PCR)-based single-strand conformation polymorphism and sequence analyses of the second internal transcribed spacer (ITS-2) of nuclear ribosomal DNA (rDNA) were utilized to genetically characterise ascaridoids from dogs and cats from China by comparison with those from other countries. The study showed that Toxocara canis, Toxocara cati, and Toxascaris leonina from China were genetically the same as those from other geographical origins. Specimens from cats from Guangzhou, China, which were morphologically consistent with Toxocara malaysiensis, were the same genetically as those from Malaysia, with the exception of a polymorphism in the ITS-2 but no unequivocal sequence difference. This is the first report of T. malaysiensis in cats outside of Malaysia (from where it was originally described), supporting the proposal that this species has a broader geographical distribution. The molecular approach employed provides a powerful tool for elucidating the biology, epidemiology, and zoonotic significance of T. malaysiensis.

  12. Fourier Transform Mass Spectrometry: The Transformation of Modern Environmental Analyses

    Directory of Open Access Journals (Sweden)

    Lucy Lim

    2016-01-01

Unknown compounds in environmental samples are difficult to identify using standard mass spectrometric methods. Fourier transform mass spectrometry (FTMS) has revolutionized how environmental analyses are performed. With its unsurpassed mass accuracy, high resolution, and sensitivity, researchers now have a tool for difficult and complex environmental analyses. Two features of FTMS are responsible for changing how complex analyses are accomplished. The first is the ability to determine quickly, and with high mass accuracy, the presence of unknown chemical residues in samples. For years, the field was limited by mass spectrometric methods that depended on knowing in advance what the compounds of interest were. Second, by exploiting the high resolution coupled with the low detection limits of FTMS, analysts can dilute the sample sufficiently to minimize ionization changes from varied matrices.
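The first feature, identifying unknowns by exact mass, can be illustrated with a toy formula-matching sketch. The measured m/z and candidate formulas below are invented; only the monoisotopic element masses are standard values:

```python
# Hedged sketch: sub-ppm mass accuracy discriminates between candidate
# elemental compositions that nominal-mass instruments cannot separate.
# Monoisotopic masses in u; element values are standard constants.
MASS = {"C": 12.0, "H": 1.0078250319, "N": 14.0030740052, "O": 15.9949146221}

def mono_mass(formula: dict) -> float:
    """Monoisotopic mass of a composition given as element -> count."""
    return sum(MASS[el] * n for el, n in formula.items())

def ppm_error(measured: float, theoretical: float) -> float:
    return (measured - theoretical) / theoretical * 1e6

# Two candidate compositions for a made-up measured mass of 285.0875 u
candidates = {
    "C15H13N2O4": {"C": 15, "H": 13, "N": 2, "O": 4},
    "C12H17N2O6": {"C": 12, "H": 17, "N": 2, "O": 6},
}
measured = 285.0875
for name, comp in candidates.items():
    err = ppm_error(measured, mono_mass(comp))
    print(f"{name}: {mono_mass(comp):.4f} u, {err:+.1f} ppm")
```

Only the first candidate falls within a typical FTMS tolerance of a few ppm; the second is off by tens of ppm and can be rejected outright.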

  13. Komparativ analyse - Scandinavian Airlines & Norwegian Air Shuttle

    OpenAIRE

    Kallesen, Martin Nystrup; Singh, Ravi Pal; Boesen, Nana Wiaberg

    2017-01-01

The project is based on a consideration of how companies the size of Scandinavian Airlines or Norwegian Air Shuttle use their finances and how they see their external environment. This led us to research the relationship between the companies and their finances as well as their external environment, and how they differ in both. To do this we have utilised a range of methods to analyse the companies, including PESTEL, SWOT, TOWS, DCF, risk analysis, sensitivity analysis, Porter's ...

  14. SVM models for analysing the headstreams of mine water inrush

    Energy Technology Data Exchange (ETDEWEB)

    Yan Zhi-gang; Du Pei-jun; Guo Da-zhi [China University of Science and Technology, Xuzhou (China). School of Environmental Science and Spatial Informatics

    2007-08-15

The support vector machine (SVM) model was introduced to analyse the headstream of water inrush in a coal mine. The SVM model, based on a hydrogeochemical method, was constructed for recognising two kinds of headstreams, and the H-SVMs model was constructed for recognising multi-headstreams. The SVM method was applied to analyse the conditions of two mixed headstreams, and the value of the SVM decision function was investigated as a means of denoting hydrogeochemical abnormality. The experimental results show that the SVM is based on a strict mathematical theory and has a simple structure and a good overall performance. Moreover, the parameter W in the decision function can describe the weights of the discrimination indices of the headstream of water inrush. The value of the decision function can denote hydrogeochemical abnormality, which is significant in the prevention of water inrush in a coal mine. 9 refs., 1 fig., 7 tabs.
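A hedged sketch of the two-headstream case, using synthetic hydrogeochemical indices rather than the authors' mine data, shows how the decision-function value flags mixed samples:

```python
# Hedged sketch: a linear SVM separating two hypothetical headstreams by
# three invented hydrogeochemical indices. The learned weight vector
# plays the role of the parameter W in the decision function.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
limestone = rng.normal([2.0, 0.5, 1.0], 0.2, size=(40, 3))
sandstone = rng.normal([0.5, 2.0, 1.0], 0.2, size=(40, 3))
X = np.vstack([limestone, sandstone])
y = np.array([0] * 40 + [1] * 40)

clf = SVC(kernel="linear").fit(X, y)

# A decision value near zero flags an ambiguous sample, i.e. a likely
# mixture of the two headstreams (hydrogeochemical abnormality).
mixed = 0.5 * (limestone.mean(axis=0) + sandstone.mean(axis=0))
print("weights W:", clf.coef_[0])
print("decision value for a 50/50 mixture:", clf.decision_function([mixed])[0])
```

Samples drawn cleanly from either headstream sit far from the separating hyperplane, while the artificial mixture lands near the zero level set of the decision function.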

  15. SOCR Analyses: Implementation and Demonstration of a New Graphical Statistics Educational Toolkit

    Directory of Open Access Journals (Sweden)

    Annie Chu

    2009-04-01

The web-based, Java-written SOCR (Statistical Online Computational Resource) tools have been utilized in many undergraduate and graduate level statistics courses for seven years now (Dinov 2006; Dinov et al. 2008b), and it has been shown that these resources can successfully improve students' learning (Dinov et al. 2008b). First published online in 2005, SOCR Analyses is a relatively new component that concentrates on data modeling for both parametric and non-parametric data analyses with graphical model diagnostics. One of the main purposes of SOCR Analyses is to facilitate statistical learning for high school and undergraduate students. Together with the already implemented SOCR Distributions and Experiments, SOCR Analyses and Charts fulfill the rest of a standard statistics curriculum. Currently, there are four core components of SOCR Analyses. Linear models included in SOCR Analyses are simple linear regression, multiple linear regression, and one-way and two-way ANOVA. Tests for sample comparisons include the t-test in the parametric category; examples in the non-parametric category are the Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, Kolmogorov-Smirnov test, and Fligner-Killeen test. Hypothesis testing models include the contingency table, Friedman's test, and Fisher's exact test. The last component of Analyses is a utility for computing sample sizes for the normal distribution. In this article, we present the design framework, computational implementation, and utilization of SOCR Analyses.
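The non-parametric battery listed above can also be exercised outside SOCR; the sketch below runs the same tests with `scipy.stats` on synthetic samples (this is not SOCR's Java implementation, and Friedman's test is applied to independent columns here purely to demonstrate the API):

```python
# Hedged sketch: the non-parametric tests named in the abstract, run via
# scipy.stats on three synthetic samples with shifted means.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
a = rng.normal(0.0, 1.0, 50)
b = rng.normal(0.5, 1.0, 50)
c = rng.normal(1.0, 1.0, 50)

results = {
    "Wilcoxon rank-sum": stats.ranksums(a, b),
    "Kruskal-Wallis": stats.kruskal(a, b, c),
    "Friedman": stats.friedmanchisquare(a, b, c),
    "Kolmogorov-Smirnov": stats.ks_2samp(a, c),
    "Fligner-Killeen": stats.fligner(a, b, c),
}
for name, res in results.items():
    print(f"{name}: statistic={res.statistic:.3f}, p={res.pvalue:.4f}")
```

With the means shifted by a full standard deviation across 50 observations per group, the location tests come out strongly significant while the Fligner-Killeen variance test (equal variances by construction) need not.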

  16. Cost consequences due to reduced ulcer healing times - analyses based on the Swedish Registry of Ulcer Treatment.

    Science.gov (United States)

    Öien, Rut F; Forssell, Henrik; Ragnarson Tennvall, Gunnel

    2016-10-01

Resource use and costs for topical treatment of hard-to-heal ulcers based on data from the Swedish Registry of Ulcer Treatment (RUT) were analysed in patients recorded in RUT as having healed between 2009 and 2012, in order to estimate potential cost savings from reductions in frequency of dressing changes and healing times. RUT is used to capture areas of improvement in ulcer care and to enable structured wound management by registering patients with hard-to-heal leg, foot and pressure ulcers. Patients included in the registry are treated in primary care, community care, private care, and inpatient hospital care. Cost calculations were based on resource use data on healing time and frequency of dressing changes in Swedish patients with hard-to-heal ulcers who healed between 2009 and 2012. Per-patient treatment costs decreased from SEK38 223 in 2009 to SEK20 496 in 2012, mainly because of shorter healing times. Frequency of dressing changes was essentially the same during these years, varying from 1.4 to 1.6 per week. The total healing time was reduced by 38%. Treatment costs for the management of hard-to-heal ulcers can be reduced with well-developed treatment strategies resulting in shortened healing times as shown in RUT.
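The reported figures yield the relative per-patient saving directly; a one-line arithmetic check (the SEK figures and the 38% healing-time reduction are from the abstract, the interpretation is ours):

```python
# Hedged sketch: relative saving implied by the abstract's reported
# per-patient costs (SEK), alongside the reported healing-time cut.
cost_2009, cost_2012 = 38223, 20496
saving = 1 - cost_2012 / cost_2009
print(f"relative per-patient saving 2009 -> 2012: {saving:.1%}")
# ~46% saving vs. a 38% shorter healing time: with dressing-change
# frequency essentially flat at 1.4-1.6 per week, the remainder of the
# drop must come from other cost components.
```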

  17. EEG analyses with SOBI.

    Energy Technology Data Exchange (ETDEWEB)

    Glickman, Matthew R.; Tang, Akaysha (University of New Mexico, Albuquerque, NM)

    2009-02-01

    The motivating vision behind Sandia's MENTOR/PAL LDRD project has been that of systems which use real-time psychophysiological data to support and enhance human performance, both individually and of groups. Relevant and significant psychophysiological data being a necessary prerequisite to such systems, this LDRD has focused on identifying and refining such signals. The project has focused in particular on EEG (electroencephalogram) data as a promising candidate signal because it (potentially) provides a broad window on brain activity with relatively low cost and logistical constraints. We report here on two analyses performed on EEG data collected in this project using the SOBI (Second Order Blind Identification) algorithm to identify two independent sources of brain activity: one in the frontal lobe and one in the occipital. The first study looks at directional influences between the two components, while the second study looks at inferring gender based upon the frontal component.
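SOBI itself jointly diagonalizes covariance matrices at many time lags; the single-lag AMUSE variant below is a common simplified stand-in, shown here on a synthetic two-channel mixture rather than the project's EEG data, to illustrate how second-order temporal statistics separate a slow source from a fast one:

```python
# Hedged sketch: AMUSE-style second-order source separation (a one-lag
# simplification of SOBI). Two deterministic sources with different
# temporal structure are mixed, whitened, and recovered.
import numpy as np

t = np.arange(5000)
s1 = np.sin(2 * np.pi * t / 100)                    # slow rhythm
s2 = np.sign(np.sin(2 * np.pi * (t + 0.5) / 22))    # faster square wave
S = np.vstack([s1, s2])
A = np.array([[1.0, 0.6], [0.4, 1.0]])              # unknown mixing matrix
X = A @ S                                           # observed "channels"

# 1) whiten the observations
X = X - X.mean(axis=1, keepdims=True)
C0 = X @ X.T / X.shape[1]
d, E = np.linalg.eigh(C0)
W = E @ np.diag(d ** -0.5) @ E.T
Z = W @ X

# 2) eigendecompose a symmetrized time-lagged covariance; sources with
# distinct lagged autocorrelations get distinct eigenvalues
tau = 5
C_tau = Z[:, :-tau] @ Z[:, tau:].T / (Z.shape[1] - tau)
C_tau = 0.5 * (C_tau + C_tau.T)
_, V = np.linalg.eigh(C_tau)
sources = V.T @ Z

# each recovered source should correlate strongly with one original
corr = np.abs(np.corrcoef(np.vstack([sources, S]))[:2, 2:])
print(np.round(corr, 2))
```

Full SOBI improves on this by diagonalizing many lags at once, which is what makes it robust for noisy, many-channel EEG.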

  18. Teleseism-based Relative Time Corrections for Modern Analyses of Digitized Analog Seismograms

    Science.gov (United States)

    Lee, T. A.; Ishii, M.

    2017-12-01

With modern-day instruments and seismic networks timed by GPS systems, synchronization of data streams is all but a foregone conclusion. However, during the analog era, when each station had its own clock, comparing data timing from different stations was a far more daunting prospect. Today, with recently developed methods by which analog data can be digitized, the ability to accurately reconcile the timings of two separate stations would open decades' worth of data to modern analyses. For example, one exciting application would be using noise interferometry with digitized analog data to investigate changing structural features (on a volcano, for example) over a much longer timescale than was previously possible. With this in mind, we introduce a new approach to synchronizing time between stations based on teleseismic arrivals. P-wave arrivals are identified at stations for pairs of earthquakes from the digital and analog eras that have nearly identical distances, locations, and depths. Assuming accurate timing of the modern data, relative time corrections between a pair of stations can then be inferred for the analog data. This method for time correction depends upon the analog stations having modern equivalents, both with sufficiently long durations of operation to allow for the recording of usable teleseismic events. The Hawaii Volcano Observatory (HVO) network is an especially ideal environment for this, as it not only has a large and well-preserved collection of analog seismograms, but also a long operating history (1912 - present), with many of the older stations having modern equivalents. As such, the scope of this project is to calculate and apply relative time corrections to analog data from two HVO stations, HILB (1919-present) and UWE (1928-present) (HILB is now part of the Pacific Tsunami network). A further application of this method could be the investigation of the effects of relative clock drift, that is, the determining factor for how
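The relative correction described above reduces to estimating the lag that best aligns two records of the same P arrival, which cross-correlation gives directly. The traces, sampling, and clock offset below are synthetic, invented for illustration:

```python
# Hedged sketch: recovering a relative clock offset between an accurately
# timed "modern" trace and a mistimed "analog" trace of the same arrival
# via the peak of the full cross-correlation. Synthetic data only.
import numpy as np

rng = np.random.default_rng(4)
n, true_shift = 2000, 37      # trace length (samples); offset to recover

# a Gabor-like P wavelet: Gaussian envelope times a sine
i = np.arange(200)
wavelet = np.exp(-0.5 * ((i - 100) / 12.0) ** 2) * np.sin(2 * np.pi * i / 25)

modern = np.zeros(n)
modern[600:800] += wavelet                            # accurately timed
analog = np.zeros(n)
analog[600 + true_shift:800 + true_shift] += wavelet  # clock runs late
analog += rng.normal(0, 0.05, n)                      # digitization noise

xc = np.correlate(analog, modern, mode="full")
lag = int(xc.argmax()) - (n - 1)   # positive lag => analog clock is late
print("estimated clock offset (samples):", lag)
```

In practice the two stations' records of paired digital-era and analog-era events would be aligned the same way, and the residual lag becomes the relative time correction applied to the analog data.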

  19. Chemical analyses of rocks, minerals, and detritus, Yucca Mountain--Preliminary report, special report No. 11

    International Nuclear Information System (INIS)

    Hill, C.A.; Livingston, D.E.

    1993-09-01

This chemical analysis study is part of the research program of the Yucca Mountain Project intended to provide the State of Nevada with a detailed assessment of the geology and geochemistry of Yucca Mountain and adjacent regions. This report is preliminary in the sense that more chemical analyses may be needed in the future and also in the sense that these chemical analyses should be considered as a small part of a much larger geological data base. The interpretations discussed herein may be modified as that larger data base is examined and established. All of the chemical analyses performed to date are shown in Table 1. There are three parts to this table: (1) trace element analyses on rocks (limestone and tuff) and minerals (calcite/opal), (2) rare earth analyses on rocks (tuff) and minerals (calcite/opal), and (3) major element analyses + CO2 on rocks (tuff) and detritus sand. In this report, for each of the three parts of the table, the data and its possible significance will be discussed first, then some overall conclusions will be made, and finally some recommendations for future work will be offered.

  20. Review of Ontario Hydro Pickering 'A' and Bruce 'A' nuclear generating stations' accident analyses

    International Nuclear Information System (INIS)

    Serdula, K.J.

    1988-01-01

Deterministic safety analyses for the Pickering 'A' and Bruce 'A' nuclear generating stations were reviewed. The methodology used in the evaluation and assessment was based on the concept of 'N' critical parameters defining an N-dimensional safety parameter space. The reviewed accident analyses were evaluated and assessed based on their demonstrated safety coverage for credible values and trajectories of the critical parameters within this N-dimensional safety parameter space. The reported assessment did not consider the probability of occurrence of events. The reviewed analyses were extensive for the potential occurrence of accidents under normal steady-state operating conditions, and demonstrated an adequate assurance of safety for the analyzed conditions. However, even for these reactor conditions, items have been identified for review and/or further study that would provide greater assurance of safety in the event of an accident. Accident analyses based on a plant in a normal transient operating state, or in an off-normal condition but within the allowable operating envelope, are not as extensive. Improvements in demonstrations and/or justifications of safety upon the potential occurrence of accidents would provide further assurance of the adequacy of safety under these conditions. Some events under these conditions have not been analyzed because of their judged low probability; however, accident analyses in this area should be considered. Recommendations are presented relating to these items; it is also recommended that further study is needed of the Pickering 'A' special safety systems.

  1. A Fourier transform infrared trace gas and isotope analyser for atmospheric applications

    Directory of Open Access Journals (Sweden)

    D. W. T. Griffith

    2012-10-01

Concern in recent decades about human impacts on Earth's climate has led to the need for improved and expanded measurement capabilities of greenhouse gases in the atmosphere. In this paper we describe in detail an in situ trace gas analyser based on Fourier transform infrared (FTIR) spectroscopy that is capable of simultaneous and continuous measurements of carbon dioxide (CO2), methane (CH4), carbon monoxide (CO), nitrous oxide (N2O) and 13C in CO2 in air with high precision. High accuracy is established by reference to measurements of standard reference gases. Stable water isotopes can also be measured in undried airstreams. The analyser is automated and allows unattended operation with minimal operator intervention. Precision and accuracy meet and exceed the compatibility targets set by the World Meteorological Organisation – Global Atmosphere Watch for baseline measurements in the unpolluted troposphere for all species except 13C in CO2.

    The analyser is mobile and well suited to fixed sites, tower measurements, mobile platforms and campaign-based measurements. The isotopic specificity of the optically-based technique and analysis allows its application in isotopic tracer experiments, for example in tracing variations of 13C in CO2 and 15N in N2O. We review a number of applications illustrating use of the analyser in clean air monitoring, micrometeorological flux and tower measurements, mobile measurements on a train, and soil flux chamber measurements.

  2. Neutronic analyses and tools development efforts in the European DEMO programme

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, U., E-mail: ulrich.fischer@kit.edu [Association KIT-Euratom, Karlsruhe Institute of Technology (KIT), Karlsruhe (Germany); Bachmann, C. [European Fusion Development Agreement (EFDA), Garching (Germany); Bienkowska, B. [Association IPPLM-Euratom, IPPLM Warsaw/INP Krakow (Poland); Catalan, J.P. [Universidad Nacional de Educación a Distancia (UNED), Madrid (Spain); Drozdowicz, K.; Dworak, D. [Association IPPLM-Euratom, IPPLM Warsaw/INP Krakow (Poland); Leichtle, D. [Association KIT-Euratom, Karlsruhe Institute of Technology (KIT), Karlsruhe (Germany); Fusion for Energy (F4E), Barcelona (Spain); Lengar, I. [MESCS-JSI, Ljubljana (Slovenia); Jaboulay, J.-C. [CEA, DEN, Saclay, DM2S, SERMA, F-91191 Gif-sur-Yvette (France); Lu, L. [Association KIT-Euratom, Karlsruhe Institute of Technology (KIT), Karlsruhe (Germany); Moro, F. [Associazione ENEA-Euratom, ENEA Fusion Division, Frascati (Italy); Mota, F. [Centro de Investigaciones Energéticas, Medioambientales y Tecnológicas (CIEMAT), Madrid (Spain); Sanz, J. [Universidad Nacional de Educación a Distancia (UNED), Madrid (Spain); Szieberth, M. [Budapest University of Technology and Economics (BME), Budapest (Hungary); Palermo, I. [Centro de Investigaciones Energéticas, Medioambientales y Tecnológicas (CIEMAT), Madrid (Spain); Pampin, R. [Fusion for Energy (F4E), Barcelona (Spain); Porton, M. [Euratom/CCFE Fusion Association, Culham Science Centre for Fusion Energy (CCFE), Culham (United Kingdom); Pereslavtsev, P. [Association KIT-Euratom, Karlsruhe Institute of Technology (KIT), Karlsruhe (Germany); Ogando, F. [Universidad Nacional de Educación a Distancia (UNED), Madrid (Spain); Rovni, I. [Budapest University of Technology and Economics (BME), Budapest (Hungary); and others

    2014-10-15

    Highlights: •Evaluation of neutronic tools for application to DEMO nuclear analyses. •Generation of a DEMO model for nuclear analyses based on MC calculations. •Nuclear analyses of the DEMO reactor equipped with a HCLL-type blanket. -- Abstract: The European Fusion Development Agreement (EFDA) recently launched a programme on Power Plant Physics and Technology (PPPT) with the aim to develop a conceptual design of a fusion demonstration reactor (DEMO) addressing key technology and physics issues. A dedicated part of the PPPT programme is devoted to the neutronics which, among others, has to define and verify requirements and boundary conditions for the DEMO systems. The quality of the provided data depends on the capabilities and the reliability of the computational tools. Accordingly, the PPPT activities in the area of neutronics include both DEMO nuclear analyses and development efforts on neutronic tools including their verification and validation. This paper reports on first neutronics studies performed for DEMO, and on the evaluation and further development of neutronic tools.

  3. Neutronic analyses and tools development efforts in the European DEMO programme

    International Nuclear Information System (INIS)

    Fischer, U.; Bachmann, C.; Bienkowska, B.; Catalan, J.P.; Drozdowicz, K.; Dworak, D.; Leichtle, D.; Lengar, I.; Jaboulay, J.-C.; Lu, L.; Moro, F.; Mota, F.; Sanz, J.; Szieberth, M.; Palermo, I.; Pampin, R.; Porton, M.; Pereslavtsev, P.; Ogando, F.; Rovni, I.

    2014-01-01

Highlights: •Evaluation of neutronic tools for application to DEMO nuclear analyses. •Generation of a DEMO model for nuclear analyses based on MC calculations. •Nuclear analyses of the DEMO reactor equipped with a HCLL-type blanket. -- Abstract: The European Fusion Development Agreement (EFDA) recently launched a programme on Power Plant Physics and Technology (PPPT) with the aim to develop a conceptual design of a fusion demonstration reactor (DEMO) addressing key technology and physics issues. A dedicated part of the PPPT programme is devoted to the neutronics which, among others, has to define and verify requirements and boundary conditions for the DEMO systems. The quality of the provided data depends on the capabilities and the reliability of the computational tools. Accordingly, the PPPT activities in the area of neutronics include both DEMO nuclear analyses and development efforts on neutronic tools including their verification and validation. This paper reports on first neutronics studies performed for DEMO, and on the evaluation and further development of neutronic tools.

  4. A systematic review of the quality and impact of anxiety disorder meta-analyses.

    Science.gov (United States)

    Ipser, Jonathan C; Stein, Dan J

    2009-08-01

    Meta-analyses are seen as representing the pinnacle of a hierarchy of evidence used to inform clinical practice. Therefore, the potential importance of differences in the rigor with which they are conducted and reported warrants consideration. In this review, we use standardized instruments to describe the scientific and reporting quality of meta-analyses of randomized controlled trials of the treatment of anxiety disorders. We also use traditional and novel metrics of article impact to assess the influence of meta-analyses across a range of research fields in the anxiety disorders. Overall, although the meta-analyses that we examined had some flaws, their quality of reporting was generally acceptable. Neither the scientific nor reporting quality of the meta-analyses was predicted by any of the impact metrics. The finding that treatment meta-analyses were cited less frequently than quantitative reviews of studies in current "hot spots" of research (ie, genetics, imaging) points to the multifactorial nature of citation patterns. A list of the meta-analyses included in this review is available on an evidence-based website of anxiety and trauma-related disorders.

  5. Bayesian uncertainty analyses of probabilistic risk models

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1989-01-01

Applications of Bayesian principles to uncertainty analyses are discussed in the paper. A short review of the most important uncertainties and their causes is provided. An application of the principle of maximum entropy to the determination of Bayesian prior distributions is described. An approach based on so-called probabilistic structures is presented in order to develop a method for the quantitative evaluation of modelling uncertainties. The method is applied to a small example case. Ideas for application areas for the proposed method are discussed
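The maximum-entropy construction of priors mentioned in this abstract can be illustrated with a minimal sketch (not taken from the paper): over a finite set of outcomes, with no constraint beyond normalization, the uniform distribution maximizes Shannon entropy and is therefore the maximum-entropy prior.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum(p_i * ln p_i), skipping zero terms."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Three candidate priors over four outcomes (each sums to 1).
candidates = {
    "uniform": [0.25, 0.25, 0.25, 0.25],
    "skewed":  [0.70, 0.10, 0.10, 0.10],
    "peaked":  [0.97, 0.01, 0.01, 0.01],
}

entropies = {name: shannon_entropy(p) for name, p in candidates.items()}

# With only normalization known, the maximum-entropy prior is the uniform one,
# whose entropy equals ln(4) for four outcomes.
maxent_prior = max(entropies, key=entropies.get)
```

Adding constraints (e.g. a known mean) shifts the maximizer away from uniform, which is how informative maximum-entropy priors are obtained.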

  6. Analysing Test-Takers’ Views on a Computer-Based Speaking Test

    Directory of Open Access Journals (Sweden)

    Marian Amengual-Pizarro

    2017-11-01

    Full Text Available This study examines test-takers’ views on a computer-delivered speaking test in order to investigate the aspects they consider most relevant in technology-based oral assessment, and to explore the main advantages and disadvantages computer-based tests may offer as compared to face-to-face speaking tests. A small-scale open questionnaire was administered to 80 test-takers who took the APTIS speaking test at the Universidad de Alcalá in April 2016. Results reveal that examinees believe computer-based tests provide a valid measure of oral competence in English and are considered to be an adequate method for the assessment of speaking. Interestingly, the data suggest that personal characteristics of test-takers seem to play a key role in deciding upon the most suitable and reliable delivery mode.

  7. Groupe d'analyses et politiques économiques de Tunisie (GAPET ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Tunisia's recent political transformation has provided an opportunity to promote debate on key economic and development issues. This project will support the creation of an independent economic policy group: Groupe d'analyses et politiques économiques de Tunisie. The group will promote evidence-based policy and ...

  8. Swiss-Slovak cooperation program: a training strategy for safety analyses

    International Nuclear Information System (INIS)

    Husarcek, J.

    2000-01-01

During the 1996-1999 period, a new training strategy for safety analyses was implemented at the Slovak Nuclear Regulatory Authority (UJD) within the Swiss-Slovak cooperation programme in nuclear safety (SWISSLOVAK). The SWISSLOVAK project involved the recruitment, training, and integration of the newly established team into UJD's organizational structure. The training strategy consisted primarily of the following two elements: a) Probabilistic Safety Analysis (PSA) applications (regulatory review and technical evaluation of Level-1/Level-2 PSAs; PSA-based operational events analysis, PSA applications to assessment of Technical Specifications; and PSA-based hardware and/or procedure modifications) and b) Deterministic accident analyses (analysis of accidents and regulatory review of licensee Safety Analysis Reports; analysis of severe accidents/radiological releases and the potential impact of the containment and engineered safety systems, including the development of technical bases for emergency response planning; and application of deterministic methods for evaluation of accident management strategies/procedure modifications). The paper discusses the specific aspects of the training strategy performed at UJD in both the probabilistic and deterministic areas. The integration of the team into UJD's organizational structure is described and examples of contributions of the team to UJD's statutory responsibilities are provided. (author)

  9. Performance analyses of naval ships based on engineering level of simulation at the initial design stage

    Directory of Open Access Journals (Sweden)

    Dong-Hoon Jeong

    2017-07-01

Full Text Available Naval ships are assigned many and varied missions. Their performance is critical for mission success, and depends on the specifications of the components. This is why performance analyses of naval ships are required at the initial design stage. Since the design and construction of naval ships take a very long time and incur a huge cost, Modeling and Simulation (M&S) is an effective method for performance analyses. Thus in this study, a simulation core is proposed to analyze the performance of naval ships considering their specifications. This simulation core can perform engineering-level simulations, considering mathematical models for naval ships such as maneuvering equations and passive sonar equations. Also, the simulation models of the simulation core follow the Discrete EVent system Specification (DEVS) and Discrete Time System Specification (DTSS) formalisms, so that simulations can progress over discrete events and discrete times. In addition, applying the DEVS and DTSS formalisms makes the structure of the simulation models flexible and reusable. To verify its applicability, the simulation core was applied to simulations for the performance analyses of a submarine in an Anti-SUrface Warfare (ASUW) mission. These simulations were composed of two scenarios. The first scenario, submarine diving, carried out maneuvering performance analysis by analyzing the pitch angle variation and depth variation of the submarine over time. The second scenario, submarine detection, carried out detection performance analysis by analyzing how well the sonar of the submarine resolves adjacent targets. The results of these simulations confirm that the simulation core of this study can be applied to the performance analyses of naval ships considering their specifications.
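One of the mathematical models named in this abstract, the passive sonar equation, can be sketched in its common textbook form (all parameter values below are hypothetical, not taken from the article): signal excess SE = SL - TL - (NL - DI) - DT, with spherical-spreading transmission loss TL = 20*log10(r).

```python
import math

def signal_excess(SL, NL, DI, DT, range_m):
    """Textbook passive sonar equation: SE = SL - TL - (NL - DI) - DT,
    with spherical-spreading transmission loss TL = 20*log10(range)."""
    TL = 20 * math.log10(range_m)
    return SL - TL - (NL - DI) - DT

# Hypothetical figures: source level 140 dB, noise level 60 dB,
# directivity index 15 dB, detection threshold 10 dB.
# A contact is detectable while the signal excess stays non-negative.
for r in (1_000, 10_000, 100_000):
    se = signal_excess(SL=140, NL=60, DI=15, DT=10, range_m=r)
    print(f"range {r:>7} m: signal excess {se:+.1f} dB")
```

In a discrete-event setting such as DEVS, a check like this would run at each detection event to decide whether the sonar resolves a contact at the current range.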

  10. PALEO-CHANNELS OF SINGKAWANG WATERS WEST KALIMANTAN AND ITS RELATION TO THE OCCURRENCES OF SUB-SEABOTTOM GOLD PLACERS BASED ON STRATA BOX SEISMIC RECORD ANALYSES

    Directory of Open Access Journals (Sweden)

    Hananto Kurnio

    2017-07-01

Full Text Available Strata box seismic records were used to analyze sub-seabottom paleochannels in Singkawang Waters, West Kalimantan. Based on the analyses, the distribution and patterns of the paleochannels can be identified. The paleochannel in the northern part of the study area is interpreted as a continuation of Recent coastal rivers; in the southern part, the pattern radiates around the cone-shaped morphology of the islands, especially Kabung and Lemukutan Islands. The paleochannels of the study area belong to the northwest Sunda Shelf systems that terminate in the South China Sea. A sequence stratigraphy study was carried out to better understand the sedimentary sequences in the paleochannels; this study is also capable of identifying placer deposits within the channels. The offshore area of Singkawang fulfills the criteria for gold placer accumulation: the existence of primary gold sources in the Sintang Intrusive, intense chemical and physical weathering to liberate gold grains from their source rocks, gravity transportation involving water media, and stable bedrock and surface conditions. Chemical and physical weathering processes from the Oligocene to the Recent, approximately 36 million years, may have produced accumulations of gold placer on the seafloor. Based on grain size analyses, the study area consists of 43.4% sand, 54.3% silt and 2.3% clay. Petrographic examination of the sample shows about 0.2% gold grains.

  11. IDEA: Interactive Display for Evolutionary Analyses

    Directory of Open Access Journals (Sweden)

    Carlton Jane M

    2008-12-01

Full Text Available Abstract Background The availability of complete genomic sequences for hundreds of organisms promises to make obtaining genome-wide estimates of substitution rates, selective constraints and other molecular evolution variables of interest an increasingly important approach to addressing broad evolutionary questions. Two of the programs most widely used for this purpose are codeml and baseml, parts of the PAML (Phylogenetic Analysis by Maximum Likelihood) suite. A significant drawback of these programs is their lack of a graphical user interface, which can limit their user base and considerably reduce their efficiency. Results We have developed IDEA (Interactive Display for Evolutionary Analyses), an intuitive graphical input and output interface which interacts with PHYLIP for phylogeny reconstruction and with codeml and baseml for molecular evolution analyses. IDEA's graphical input and visualization interfaces eliminate the need to edit and parse text input and output files, reducing the likelihood of errors and improving processing time. Further, its interactive output display gives the user immediate access to results. Finally, IDEA can process data in parallel on a local machine or computing grid, allowing genome-wide analyses to be completed quickly. Conclusion IDEA provides a graphical user interface that allows the user to follow a codeml or baseml analysis from parameter input through to the exploration of results. Novel options streamline the analysis process, and post-analysis visualization of phylogenies, evolutionary rates and selective constraint along protein sequences simplifies the interpretation of results. The integration of these functions into a single tool eliminates the need for lengthy data handling and parsing, significantly expediting access to global patterns in the data.

  12. A protein relational database and protein family knowledge bases to facilitate structure-based design analyses.

    Science.gov (United States)

    Mobilio, Dominick; Walker, Gary; Brooijmans, Natasja; Nilakantan, Ramaswamy; Denny, R Aldrin; Dejoannis, Jason; Feyfant, Eric; Kowticwar, Rupesh K; Mankala, Jyoti; Palli, Satish; Punyamantula, Sairam; Tatipally, Maneesh; John, Reji K; Humblet, Christine

    2010-08-01

    The Protein Data Bank is the most comprehensive source of experimental macromolecular structures. It can, however, be difficult at times to locate relevant structures with the Protein Data Bank search interface. This is particularly true when searching for complexes containing specific interactions between protein and ligand atoms. Moreover, searching within a family of proteins can be tedious. For example, one cannot search for some conserved residue as residue numbers vary across structures. We describe herein three databases, Protein Relational Database, Kinase Knowledge Base, and Matrix Metalloproteinase Knowledge Base, containing protein structures from the Protein Data Bank. In Protein Relational Database, atom-atom distances between protein and ligand have been precalculated allowing for millisecond retrieval based on atom identity and distance constraints. Ring centroids, centroid-centroid and centroid-atom distances and angles have also been included permitting queries for pi-stacking interactions and other structural motifs involving rings. Other geometric features can be searched through the inclusion of residue pair and triplet distances. In Kinase Knowledge Base and Matrix Metalloproteinase Knowledge Base, the catalytic domains have been aligned into common residue numbering schemes. Thus, by searching across Protein Relational Database and Kinase Knowledge Base, one can easily retrieve structures wherein, for example, a ligand of interest is making contact with the gatekeeper residue.
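The precalculation idea described here, storing protein-ligand atom-pair distances so that retrieval reduces to filtering on atom identity and a distance constraint, can be sketched as follows (the atom names and coordinates are invented for illustration; the actual Protein Relational Database schema is not given in the abstract):

```python
import math
from itertools import product

def dist(a, b):
    """Euclidean distance between two 3D coordinates."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Hypothetical coordinates: protein atoms and ligand atoms of one complex.
protein_atoms = [("ASP25:OD1", (1.0, 0.0, 0.0)), ("GLY27:N", (4.0, 0.0, 0.0))]
ligand_atoms  = [("LIG:N1", (2.5, 0.0, 0.0)), ("LIG:C7", (9.0, 0.0, 0.0))]

# Precalculation step: compute every protein-ligand atom-pair distance once,
# so later queries are pure lookups/filters rather than geometry computations.
pair_table = [(p_id, l_id, dist(p_xyz, l_xyz))
              for (p_id, p_xyz), (l_id, l_xyz) in product(protein_atoms, ligand_atoms)]

def query(table, max_dist, protein_substr=""):
    """Retrieve pairs by atom identity and a distance constraint."""
    return [(p, l, d) for p, l, d in table
            if d <= max_dist and protein_substr in p]

hits = query(pair_table, max_dist=2.0, protein_substr="ASP25")
```

In a relational database the same filter becomes an indexed WHERE clause over the precomputed table, which is what makes millisecond retrieval possible.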

  13. Analyses of the influencing factors of soil microbial functional gene diversity in tropical rainforest based on GeoChip 5.0

    Directory of Open Access Journals (Sweden)

    Jing Cong

    2015-09-01

Full Text Available To examine soil microbial functional gene diversity and its causative factors in tropical rainforest, we profiled it using a microarray-based metagenomic tool named GeoChip 5.0. We found high microbial functional gene diversity and distinct soil microbial metabolic potential for biogeochemical processes in the tropical rainforest. Soil available nitrogen was the factor most strongly associated with soil microbial functional gene structure. Here, we mainly describe in detail the experimental design, the data processing, and the soil biogeochemical analyses attached to the study, which was published in BMC Microbiology in 2015; the raw data have been deposited in NCBI's Gene Expression Omnibus (accession number GSE69171).

  14. Ongoing Analyses of Rocket Based Combined Cycle Engines by the Applied Fluid Dynamics Analysis Group at Marshall Space Flight Center

    Science.gov (United States)

    Ruf, Joseph H.; Holt, James B.; Canabal, Francisco

    2001-01-01

    This paper presents the status of analyses on three Rocket Based Combined Cycle (RBCC) configurations underway in the Applied Fluid Dynamics Analysis Group (TD64). TD64 is performing computational fluid dynamics (CFD) analysis on a Penn State RBCC test rig, the proposed Draco axisymmetric RBCC engine and the Trailblazer engine. The intent of the analysis on the Penn State test rig is to benchmark the Finite Difference Navier Stokes (FDNS) code for ejector mode fluid dynamics. The Draco analysis was a trade study to determine the ejector mode performance as a function of three engine design variables. The Trailblazer analysis is to evaluate the nozzle performance in scramjet mode. Results to date of each analysis are presented.

  15. VIPRE modeling of VVER-1000 reactor core for DNB analyses

    Energy Technology Data Exchange (ETDEWEB)

    Sung, Y.; Nguyen, Q. [Westinghouse Electric Corporation, Pittsburgh, PA (United States); Cizek, J. [Nuclear Research Institute, Prague, (Czech Republic)

    1995-09-01

    Based on the one-pass modeling approach, the hot channels and the VVER-1000 reactor core can be modeled in 30 channels for DNB analyses using the VIPRE-01/MOD02 (VIPRE) code (VIPRE is owned by Electric Power Research Institute, Palo Alto, California). The VIPRE one-pass model does not compromise any accuracy in the hot channel local fluid conditions. Extensive qualifications include sensitivity studies of radial noding and crossflow parameters and comparisons with the results from THINC and CALOPEA subchannel codes. The qualifications confirm that the VIPRE code with the Westinghouse modeling method provides good computational performance and accuracy for VVER-1000 DNB analyses.

  16. Using Microsoft Office Excel 2007 to conduct generalized matching analyses.

    Science.gov (United States)

    Reed, Derek D

    2009-01-01

    The generalized matching equation is a robust and empirically supported means of analyzing relations between reinforcement and behavior. Unfortunately, no simple task analysis is available to behavior analysts interested in using the matching equation to evaluate data in clinical or applied settings. This technical article presents a task analysis for the use of Microsoft Excel to analyze and plot the generalized matching equation. Using a data-based case example and a step-by-step guide for completing the analysis, these instructions are intended to promote the use of quantitative analyses by researchers with little to no experience in quantitative analyses or the matching law.
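The generalized matching equation referred to above is usually written in logarithmic form, log(B1/B2) = a*log(R1/R2) + log(b), where a is sensitivity and b is bias. The article's Excel task analysis is not reproduced here, but a minimal least-squares sketch of the same fit (with fabricated session data) might look like:

```python
import math

def fit_generalized_matching(behavior_ratios, reinforcement_ratios):
    """Least-squares fit of log(B1/B2) = a*log(R1/R2) + log(b),
    returning sensitivity a and bias b."""
    xs = [math.log10(r) for r in reinforcement_ratios]
    ys = [math.log10(b) for b in behavior_ratios]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    log_b = my - a * mx
    return a, 10 ** log_b

# Hypothetical session data: reinforcement ratios R1/R2 and behavior ratios
# B1/B2 generated from a = 0.8 (undermatching) and b = 1.0 (no bias).
R = [0.25, 0.5, 1.0, 2.0, 4.0]
B = [r ** 0.8 for r in R]
a, b = fit_generalized_matching(B, R)
```

A slope a below 1 indicates undermatching and b far from 1 indicates bias, which is exactly what the Excel plot in the task analysis is meant to reveal.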

  17. USING MICROSOFT OFFICE EXCEL® 2007 TO CONDUCT GENERALIZED MATCHING ANALYSES

    Science.gov (United States)

    Reed, Derek D

    2009-01-01

    The generalized matching equation is a robust and empirically supported means of analyzing relations between reinforcement and behavior. Unfortunately, no simple task analysis is available to behavior analysts interested in using the matching equation to evaluate data in clinical or applied settings. This technical article presents a task analysis for the use of Microsoft Excel to analyze and plot the generalized matching equation. Using a data-based case example and a step-by-step guide for completing the analysis, these instructions are intended to promote the use of quantitative analyses by researchers with little to no experience in quantitative analyses or the matching law. PMID:20514196

  18. Weight analyses and nitrogen balance assay in rats fed extruded ...

    African Journals Online (AJOL)

Weight analyses and nitrogen balance assay in adult rats fed raw and extruded African breadfruit (Treculia africana) based diets were carried out using response surface methodology in a central composite design. Process variables were feed composition (40 - 100 % African breadfruit, 0 - 5 % corn and 0 - 55 % soybean, ...

  19. Use of results of microbiological analyses for risk-based control of Listeria monocytogenes in marinated broiler legs.

    Science.gov (United States)

    Aarnisalo, Kaarina; Vihavainen, Elina; Rantala, Leila; Maijala, Riitta; Suihko, Maija-Liisa; Hielm, Sebastian; Tuominen, Pirkko; Ranta, Jukka; Raaska, Laura

    2008-02-10

Microbial risk assessment provides a means of estimating consumer risks associated with food products. The methods can also be applied at the plant level. In this study, results of microbiological analyses were used to develop a robust single plant level risk assessment. Furthermore, the prevalence and numbers of Listeria monocytogenes in marinated broiler legs in Finland were estimated. These estimates were based on information on the prevalence, numbers and genotypes of L. monocytogenes in 186 marinated broiler legs from 41 retail stores. The products were from three main Finnish producers, which produce 90% of all marinated broiler legs sold in Finland. The prevalence and numbers of L. monocytogenes were estimated by Monte Carlo simulation using WinBUGS, but the model is applicable to any software featuring standard probability distributions. The estimated mean annual number of L. monocytogenes-positive broiler legs sold in Finland was 7.2×10^6, with a 95% credible interval (CI) of 6.7×10^6-7.7×10^6. That would be 34% ± 1% of the marinated broiler legs sold in Finland. The mean number of L. monocytogenes in marinated broiler legs estimated at the sell-by-date was 2 CFU/g, with a 95% CI of 0-14 CFU/g. Producer-specific L. monocytogenes strains were recovered from the products throughout the year, which emphasizes the importance of characterizing the isolates and identifying strains that may cause problems as part of risk assessment studies. As the levels of L. monocytogenes were low, the risk of acquiring listeriosis from these products proved to be insignificant. Consequently there was no need for a thorough national level risk assessment. However, an approach using worst-case and average point estimates was applied to produce an example of single producer level risk assessment based on limited data. This assessment also indicated that the risk from these products was low. The risk-based approach presented in this work can provide estimation of public health risk
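The WinBUGS model itself is not given in the abstract; a minimal sketch of the same Monte Carlo idea, using a conjugate Beta posterior for prevalence (the counts below are illustrative, chosen to match the reported ~34% positive rate among 186 sampled legs), might look like:

```python
import random

random.seed(1)

# Assumed data (illustrative): 63 of 186 sampled marinated broiler legs
# positive, roughly the 34% prevalence reported in the abstract.
positives, n = 63, 186

# With a uniform Beta(1, 1) prior, the posterior for prevalence is
# Beta(positives + 1, n - positives + 1); draw Monte Carlo samples from it.
draws = sorted(random.betavariate(positives + 1, n - positives + 1)
               for _ in range(20_000))

mean = sum(draws) / len(draws)
# Equal-tailed 95% credible interval from the sorted draws.
ci_low  = draws[int(0.025 * len(draws))]
ci_high = draws[int(0.975 * len(draws))]
```

Scaling such prevalence draws by annual sales volumes is what yields interval estimates like the 6.7×10^6-7.7×10^6 positive legs per year quoted in the abstract.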

  20. Analyses of karyotypes and comparative physical locations of the ...

    African Journals Online (AJOL)

    The frequencies of signal detection of the marker, RG556 and the BAC clone, 44B4, were 8.0 and 41.3% in O. sativa, while 9.0 and 42.3% in O. officinalis, respectively. Based on a comparative RFLP map of a wild rice, O. officinalis and O. sativa, comparative analyses of karyotypes of O. officinalis were demonstrated firstly ...

  1. Design factors analyses of second-loop PRHRS

    Directory of Open Access Journals (Sweden)

    ZHANG Hongyan

    2017-05-01

Full Text Available In order to study the operating characteristics of a second-loop Passive Residual Heat Removal System (PRHRS), the transient thermal analysis code RELAP5 is used to build simulation models of the main coolant system and the second-loop PRHRS. Transient calculations and comparative analyses under station blackout accident and one-side feed water line break accident conditions are conducted for three critical design factors of the second-loop PRHRS: design capacity, emergency makeup tank, and isolation valve opening speed. The impacts of the discussed design factors on the operating characteristics of the second-loop PRHRS are summarized based on the calculations and analyses. The analysis results indicate that system safety and cooling rate should be taken into consideration in designing the PRHRS's capacity; that water injection from the emergency makeup tank to the steam generator can benefit system cooling in the event of an accident; and that system startup performance can be improved by reducing the opening speed of the isolation valve. The results can provide references for the design of the second-loop PRHRS in nuclear power plants.

  2. Process of Integrating Screening and Detailed Risk-based Modeling Analyses to Ensure Consistent and Scientifically Defensible Results

    International Nuclear Information System (INIS)

    Buck, John W.; McDonald, John P.; Taira, Randal Y.

    2002-01-01

    To support cleanup and closure of these tanks, modeling is performed to understand and predict potential impacts to human health and the environment. Pacific Northwest National Laboratory developed a screening tool for the United States Department of Energy, Office of River Protection that estimates the long-term human health risk, from a strategic planning perspective, posed by potential tank releases to the environment. This tool is being conditioned to more detailed model analyses to ensure consistency between studies and to provide scientific defensibility. Once the conditioning is complete, the system will be used to screen alternative cleanup and closure strategies. The integration of screening and detailed models provides consistent analyses, efficiencies in resources, and positive feedback between the various modeling groups. This approach of conditioning a screening methodology to more detailed analyses provides decision-makers with timely and defensible information and increases confidence in the results on the part of clients, regulators, and stakeholders

  3. Peak-flow frequency analyses and results based on data through water year 2011 for selected streamflow-gaging stations in or near Montana: Chapter C in Montana StreamStats

    Science.gov (United States)

    Sando, Steven K.; McCarthy, Peter M.; Dutton, DeAnn M.

    2016-04-05

Chapter C of this Scientific Investigations Report documents results from a study by the U.S. Geological Survey, in cooperation with the Montana Department of Transportation and the Montana Department of Natural Resources, to provide an update of statewide peak-flow frequency analyses and results for Montana. The purpose of this report chapter is to present peak-flow frequency analyses and results for 725 streamflow-gaging stations in or near Montana based on data through water year 2011. The 725 streamflow-gaging stations included in this study represent nearly all streamflow-gaging stations in Montana (plus some from adjacent states or Canadian Provinces) that have at least 10 years of peak-flow records through water year 2011. For 29 of the 725 streamflow-gaging stations, peak-flow frequency analyses and results are reported for both unregulated and regulated conditions. Thus, peak-flow frequency analyses and results are reported for a total of 754 analyses. Estimates of peak-flow magnitudes for 66.7-, 50-, 42.9-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities are reported. These annual exceedance probabilities correspond to 1.5-, 2-, 2.33-, 5-, 10-, 25-, 50-, 100-, 200-, and 500-year recurrence intervals.
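The correspondence the report draws between annual exceedance probabilities and recurrence intervals is simply T = 1/p; for the probabilities listed in the abstract:

```python
# Recurrence interval is the reciprocal of the annual exceedance
# probability: T (years) = 1 / p, with p expressed as a fraction.
aep_percent = [66.7, 50, 42.9, 20, 10, 4, 2, 1, 0.5, 0.2]

recurrence_years = [round(100 / p, 2) for p in aep_percent]
# e.g. a 1-percent AEP flood is the "100-year" flood.
```
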

  4. Analysing Leontiev Tube Capabilities in the Space-based Plants

    Directory of Open Access Journals (Sweden)

    N. L. Shchegolev

    2017-01-01

Full Text Available The paper presents a review of publications dedicated to the gas-dynamic temperature stratification device (the Leontiev tube) and shows the main factors affecting its efficiency. It describes an experimental installation used to obtain data on the value of energy separation in air, demonstrating the operability of this device. The assumption that there is an optimal relationship between the flow velocities in the subsonic and supersonic channels of the gas-dynamic temperature stratification device is experimentally confirmed. The paper analyses possible ways to raise the efficiency of power plants of various basing (including space basing) and shows that the current mainstream approach to increasing their operating efficiency is to complicate design solutions. A scheme of a closed gas-turbine space-based plant operating on a mixture of inert gases (a helium-xenon mixture) is proposed. It differs from the simplest variants in lacking a cooler-radiator and in integrating a gas-dynamic temperature stratification device and a heat compressor. Based on the equations of one-dimensional gas dynamics, it is shown that the operating capability of this scheme is determined by whether the total pressure can be restored while heat is removed in the heat compressor. An exploratory study of creating a heat compressor is performed, showing that when operating on gases with a Prandtl number close to 1, the total pressure does not increase. The heat compressor is operable when it operates on gases with a low Prandtl number (a helium-xenon mixture) at high supersonic velocities and with a longitudinal pressure gradient available. It is shown that there is a region of low Prandtl numbers (Pr < 0.3) in which, with a longitudinal pressure gradient available in supersonic flows of a viscous gas, the total pressure can be restored.

  5. Building-related symptoms among U.S. office workers and risk factors for moisture and contamination: Preliminary analyses of U.S. EPA BASE Data

    Energy Technology Data Exchange (ETDEWEB)

    Mendell, Mark J.; Cozen, Myrna

    2002-09-01

The authors assessed relationships between health symptoms in office workers and risk factors related to moisture and contamination, using data collected from a representative sample of U.S. office buildings in the U.S. EPA BASE study. Methods: Analyses assessed associations between three types of weekly, work-related symptoms (lower respiratory, mucous membrane, and neurologic) and risk factors for moisture or contamination in these office buildings. Multivariate logistic regression models were used to estimate the strength of associations for these risk factors as odds ratios (ORs), adjusted for personal-level potential confounding variables related to demographics, health, job, and workspace. A number of risk factors were significantly associated (i.e., 95% confidence limits excluded 1.0) with small to moderate increases in one or more symptom outcomes. Significantly elevated ORs for mucous membrane symptoms were associated with the following risk factors: presence of a humidification system in good condition versus none (OR = 1.4); air handler inspection annually versus daily (OR = 1.6); current water damage in the building (OR = 1.2); and less than daily vacuuming in the study space (OR = 1.2). Significantly elevated ORs for lower respiratory symptoms were associated with: air handler inspection annually versus daily (OR = 2.0); air handler inspection less than daily but at least semi-annually (OR = 1.6); less than daily cleaning of offices (OR = 1.7); and less than daily vacuuming of the study space (OR = 1.4). Only two statistically significant risk factors for neurologic symptoms were identified: presence of any humidification system versus none (OR = 1.3); and less than daily vacuuming of the study space (OR = 1.3). Dirty cooling coils, dirty or poorly draining drain pans, and standing water near outdoor air intakes, evaluated by inspection, were not identified as risk factors in these analyses, despite predictions based on previous findings elsewhere, except that very
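The study's estimates come from multivariate logistic regression models; as a simpler, hedged illustration of how an odds ratio and its 95% confidence limits behave (the 2x2 counts below are invented, not BASE data), a crude OR with a Woolf interval can be computed directly:

```python
import math

def odds_ratio_ci(exposed_cases, exposed_noncases,
                  unexposed_cases, unexposed_noncases):
    """Crude odds ratio with a Woolf (log-based) 95% confidence interval."""
    a, b, c, d = exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
    low = math.exp(math.log(or_) - 1.96 * se_log)
    high = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, low, high

# Hypothetical counts: symptomatic vs asymptomatic workers in buildings
# with vs without current water damage.
or_, low, high = odds_ratio_ci(120, 280, 100, 300)

# The association is "significant" in the abstract's sense
# only when the 95% CI excludes 1.0.
significant = not (low <= 1.0 <= high)
```

Here the OR is about 1.29 but the interval straddles 1.0, so the association would not be reported as significant; adjusted ORs from the full regression model behave analogously.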

  6. Special analyses reveal coke-deposit structure

    International Nuclear Information System (INIS)

    Albright, L.F.

    1988-01-01

A scanning electron microscope (SEM) and an energy dispersive X-ray analyzer (EDAX) have been used to obtain information that clarifies the three mechanisms of coke formation in ethylene furnaces, and to analyze the metal condition at the exit of the furnace. The results can be used to examine furnace operations and develop improved ethylene plant practices. In this first of four articles on the analyses of coke and metal samples, the coking mechanisms and coke deposits in a section of tube from an actual ethylene furnace (Furnace A) at a plant on the Texas Gulf Coast are discussed. The second article in the series will analyze the condition of the tube metal in the same furnace. To show how coke deposition and metal condition depend on the operating parameters of an ethylene furnace, the third article in the series will show the coke deposition in a Texas Gulf Coast furnace tube (Furnace B) that operated at shorter residence time. The fourth article discusses the metal condition in that furnace. Some recommendations, based on the analyses and findings, are offered in the fourth article that could help extend the life of ethylene furnace tubes and also improve overall ethylene plant operations

  7. Economic evaluation of algae biodiesel based on meta-analyses

    Science.gov (United States)

    Zhang, Yongli; Liu, Xiaowei; White, Mark A.; Colosi, Lisa M.

    2017-08-01

The objective of this study is to elucidate the economic viability of algae-to-energy systems at a large scale, by developing a meta-analysis of five previously published economic evaluations of systems producing algae biodiesel. Data from the original studies were harmonised into a standardised framework using financial and technical assumptions. Results suggest that the selling price of algae biodiesel under the base case would be $5.00-10.31/gal, higher than the selected benchmarks: $3.77/gal for petroleum diesel, and $4.21/gal for commercial biodiesel (B100) from conventional vegetable oil or animal fat. However, the projected selling price of algal biodiesel ($2.76-4.92/gal), following anticipated improvements, would be competitive. A scenario-based sensitivity analysis reveals that the price of algae biodiesel is most sensitive to algae biomass productivity, algae oil content, and algae cultivation cost. This indicates that improvements in the yield, quality, and cost of algae feedstock could be the key factors in making algae-derived biodiesel economically viable.
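The harmonised cost model from the meta-analysis is not reproduced in the abstract; a toy one-at-a-time sensitivity sketch (the price formula and every number below are hypothetical) illustrates how factors such as productivity, oil content, and cultivation cost can be ranked by the price swing they cause:

```python
def price(cultivation=20.0, productivity=25.0, oil=0.25, conversion=1.0):
    """Toy break-even price model (all values hypothetical): the feedstock
    term scales with cultivation cost and inversely with biomass
    productivity and oil content; conversion cost is added on top."""
    return cultivation / (productivity * oil) + conversion

base = price()

# One-at-a-time sensitivity: perturb each factor by +/-20% and
# record the resulting swing in the computed price.
defaults = {"cultivation": 20.0, "productivity": 25.0, "oil": 0.25}
swings = {}
for name, value in defaults.items():
    hi = price(**{**defaults, name: value * 1.2})
    lo = price(**{**defaults, name: value * 0.8})
    swings[name] = abs(hi - lo)
```

Because productivity and oil content enter the denominator, equal percentage perturbations of them produce larger swings than the same perturbation of a numerator cost, mirroring the ranking reported in the abstract.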

  8. Japanese standard method for safety evaluation using best estimate code based on uncertainty and scaling analyses with statistical approach

    International Nuclear Information System (INIS)

    Mizokami, Shinya; Hotta, Akitoshi; Kudo, Yoshiro; Yonehara, Tadashi; Watada, Masayuki; Sakaba, Hiroshi

    2009-01-01

    Current licensing practice in Japan consists of using conservative boundary and initial conditions (BIC), assumptions and analytical codes. The safety analyses for licensing purposes are inherently deterministic. Therefore, conservative BIC and assumptions, such as single failure, must be employed for the analyses. However, using conservative analytical codes is not considered essential. The standard committee of the Atomic Energy Society of Japan (AESJ) drew up the standard for using best estimate codes for safety analyses in 2008, after three years of discussions reflecting recent domestic and international findings. (author)

  9. PALSfit3: A software package for analysing positron lifetime spectra

    DEFF Research Database (Denmark)

    Kirkegaard, Peter; Olsen, Jens V.; Eldrup, Morten Mostgaard

    The present report describes a Windows based computer program called PALSfit3. The purpose of the program is to carry out analyses of spectra that have been measured by positron annihilation lifetime spectroscopy (PALS). PALSfit3 is based on the well-tested PATFIT and PALSfit programs, which hav...... in a text window. PALSfit3 is verified on Windows XP and Windows 7, 8 and 10. The PALSfit3 software can be acquired from the Technical University of Denmark (http://PALSfit.dk)...

  10. Conformational determination of [Leu]enkephalin based on theoretical and experimental VA and VCD spectral analyses

    DEFF Research Database (Denmark)

    Abdali, Salim; Jalkanen, Karl J.; Cao, X.

    2004-01-01

    Conformational determination of [Leu]enkephalin in DMSO-d6 is carried out using VA and VCD spectral analyses. Conformational energies, vibrational frequencies and VA and VCD intensities are calculated using DFT at B3LYP/6-31G* level of theory. Comparison between the measured spectra...

  11. Comparison of elastic and inelastic analyses

    International Nuclear Information System (INIS)

    Ammerman, D.J.; Heinstein, M.W.; Wellman, G.W.

    1992-01-01

    The use of inelastic analysis methods instead of the traditional elastic analysis methods in the design of radioactive material (RAM) transport packagings leads to a better understanding of the response of the package to mechanical loadings. Thus, a better assessment can be made of the containment, thermal protection, and shielding integrity of the package after a structural accident event. A more accurate prediction of the package response can lead to enhanced safety and also allow for a more efficient use of materials, possibly leading to a package with higher capacity or lower weight. This paper discusses the advantages and disadvantages of using inelastic analysis in the design of RAM shipping packages. The use of inelastic analysis presents several problems to the package designer. When using inelastic analysis, the entire nonlinear response of the material must be known, including the effects of temperature changes and strain rate. Another problem is that there is currently no acceptance criterion for this type of analysis approved by regulatory agencies. Inelastic analysis acceptance criteria based on failure stress, failure strain, or plastic energy density could be developed. For both elastic and inelastic analyses it is also important to include other sources of stress in the analyses, such as fabrication stresses, thermal stresses, stresses from bolt preloading, and contact stresses at material interfaces. Offsetting these added difficulties is the improved knowledge of the package behavior. This allows for incorporation of a more uniform margin of safety, which can result in weight savings and a higher level of confidence in the post-accident configuration of the package. In this paper, comparisons between elastic and inelastic analyses are made for a simple ring structure and for a package to transport a large quantity of RAM by rail (rail cask) with lead gamma shielding, to illustrate the differences in the two analysis techniques.
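One of the candidate acceptance criteria named above, plastic energy density, can be sketched numerically: integrate the area under the stress-strain curve and subtract the recoverable elastic part. The curve, modulus and helper names below are hypothetical illustrations, not data or methods from the paper.

```python
# Illustrative computation of plastic strain energy density from a
# stress-strain curve (hypothetical data, not from the paper).
# Total strain energy density is the area under the curve; subtracting
# the recoverable elastic part sigma_f^2 / (2E) leaves the plastic part.

def trapezoid(xs, ys):
    """Area under a piecewise-linear curve by the trapezoidal rule."""
    return sum((xs[i + 1] - xs[i]) * (ys[i + 1] + ys[i]) / 2.0
               for i in range(len(xs) - 1))

def plastic_energy_density(strain, stress, youngs_modulus):
    total = trapezoid(strain, stress)
    elastic = stress[-1] ** 2 / (2.0 * youngs_modulus)
    return total - elastic

# Hypothetical stainless-steel-like curve: stress in MPa vs strain
strain = [0.0, 0.001, 0.01, 0.05, 0.10]
stress = [0.0, 200.0, 250.0, 400.0, 500.0]
E = 200_000.0  # MPa

w_p = plastic_energy_density(strain, stress, E)  # MPa * strain = MJ/m^3
print(f"plastic energy density ~ {w_p:.2f} MJ/m^3")
```

A design check against such a criterion would then compare `w_p` with an allowable value established for the packaging material.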

  12. DMINDA: an integrated web server for DNA motif identification and analyses.

    Science.gov (United States)

    Ma, Qin; Zhang, Hanyuan; Mao, Xizeng; Zhou, Chuan; Liu, Bingqiang; Chen, Xin; Xu, Ying

    2014-07-01

    DMINDA (DNA motif identification and analyses) is an integrated web server for DNA motif identification and analyses, which is accessible at http://csbl.bmb.uga.edu/DMINDA/. This web site is freely available to all users and there is no login requirement. This server provides a suite of cis-regulatory motif analysis functions on DNA sequences, which are important to elucidation of the mechanisms of transcriptional regulation: (i) de novo motif finding for a given set of promoter sequences along with statistical scores for the predicted motifs derived based on information extracted from a control set, (ii) scanning motif instances of a query motif in provided genomic sequences, (iii) motif comparison and clustering of identified motifs, and (iv) co-occurrence analyses of query motifs in given promoter sequences. The server is powered by a backend computer cluster with over 150 computing nodes, and is particularly useful for motif prediction and analyses in prokaryotic genomes. We believe that DMINDA, as a new and comprehensive web server for cis-regulatory motif finding and analyses, will benefit the genomic research community in general and prokaryotic genome researchers in particular. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
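As an illustration of the motif-scanning function (ii) above, a minimal position weight matrix (PWM) scan might look like the following sketch. The counts, sequence, threshold and function names are invented for illustration; they are not DMINDA's implementation.

```python
# Minimal motif-scanning sketch: slide a PWM along a DNA sequence and
# report hits whose log-odds score exceeds a threshold.
import math

def pwm_from_counts(counts, pseudo=1.0, background=0.25):
    """Convert per-position base counts into log-odds scores."""
    pwm = []
    for col in counts:
        total = sum(col.values()) + 4 * pseudo
        pwm.append({b: math.log2((col[b] + pseudo) / total / background)
                    for b in "ACGT"})
    return pwm

def scan(sequence, pwm, threshold):
    """Return (position, score) pairs where the PWM score >= threshold."""
    width = len(pwm)
    hits = []
    for i in range(len(sequence) - width + 1):
        score = sum(pwm[j][sequence[i + j]] for j in range(width))
        if score >= threshold:
            hits.append((i, round(score, 2)))
    return hits

# Hypothetical 3-bp motif "AGC" from 8 aligned sites
counts = [{"A": 8, "C": 0, "G": 0, "T": 0},
          {"A": 0, "C": 0, "G": 8, "T": 0},
          {"A": 0, "C": 8, "G": 0, "T": 0}]
pwm = pwm_from_counts(counts)
print(scan("TTAGCAAAGCTT", pwm, threshold=3.0))  # hits at positions 2 and 7
```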

  13. Thermal analyses. Information on the expected baking process; Thermische analyses. Informatie over een te verwachten bakgedrag

    Energy Technology Data Exchange (ETDEWEB)

    Van Wijck, H. [Stichting Technisch Centrum voor de Keramische Industrie TCKI, Velp (Netherlands)

    2009-09-01

    The shaping process and the drying process for architectural ceramic products and pottery partly determine the characteristics of the final product, but the largest changes occur during the firing process. An overview is given of the different thermal analyses and of how the information from these analyses can predict the firing behaviour to be expected in practice. (mk)

  14. Engineering analyses of ITER divertor diagnostic rack design

    Energy Technology Data Exchange (ETDEWEB)

    Modestov, Victor S., E-mail: modestov@compmechlab.com [St Petersburg State Polytechnical University, 195251 St Petersburg, 29 Polytechnicheskaya (Russian Federation); Nemov, Alexander S.; Borovkov, Aleksey I.; Buslakov, Igor V.; Lukin, Aleksey V. [St Petersburg State Polytechnical University, 195251 St Petersburg, 29 Polytechnicheskaya (Russian Federation); Kochergin, Mikhail M.; Mukhin, Eugene E.; Litvinov, Andrey E.; Koval, Alexandr N. [Ioffe Physico-Technical Institute, 194021 St Petersburg, 26 Polytechnicheskaya (Russian Federation); Andrew, Philip [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France)

    2013-10-15

    Highlights: • The approach developed earlier has been used for the assessment of the new design of the DTS racks and neutron shield units. • Results of the most critical EM and seismic analyses indicate that the introduced changes significantly improved the system behaviour under these loads. • However, further research is required to finalize the design and check that it meets all structural, thermal, seismic, EM and fatigue requirements. -- Abstract: The divertor port racks used as a support structure for the divertor Thomson scattering equipment have been carefully analyzed for consistency with electromagnetic and seismic loads. The foregoing simulations show that it is these analyses that expose the critical challenges associated with the structural design. Based on the results for the reference structure [2], a modified design of the diagnostic racks is proposed and updated simulation results are given. The results represent a significant improvement over the previous reference layout, and the design work will continue towards finalization.

  15. Energy and exergy analyses of electrolytic hydrogen production

    Energy Technology Data Exchange (ETDEWEB)

    Rosen, M A [Ryerson Polytechnic Univ., Toronto, ON (Canada). Dept. of Mechanical Engineering

    1995-07-01

    The thermodynamic performance of a water-electrolysis process for producing hydrogen, based on current-technology equipment, is investigated. Both energy and exergy analyses are used. Three cases are considered, in which the principal driving energy inputs are (i) electricity, (ii) the high-temperature heat used to generate the electricity, and (iii) the heat source used to produce the high-temperature heat. The nature of the heat source (e.g., fossil fuel, nuclear fuel, solar energy) is left as general as possible. The analyses indicate that, when the main driving input is the hypothetical heat source, the principal thermodynamic losses are associated with water splitting, electricity generation and heat production; the losses are mainly due to the irreversibilities associated with converting a heat source to heat, and with heat transfer across large temperature differences. The losses associated with the waste heat in the used cooling water, because of its low quality, are not as significant as energy analysis indicates. (Author)
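The distinction between energy and exergy efficiency for electrolysis can be illustrated with a back-of-the-envelope sketch. All figures below (heating value, chemical exergy, electricity input) are assumed round numbers, not the paper's data.

```python
# Back-of-the-envelope energy and exergy efficiencies for water
# electrolysis driven directly by electricity (case (i) above).
# Energy efficiency compares the hydrogen heating value with the
# electricity input; exergy efficiency uses hydrogen's chemical exergy.

HHV_H2 = 286.0      # kJ/mol, higher heating value of hydrogen
EXERGY_H2 = 236.0   # kJ/mol, standard chemical exergy of hydrogen
ELEC_IN = 350.0     # kJ/mol H2, assumed electricity input per mole

def energy_efficiency(elec_in):
    return HHV_H2 / elec_in

def exergy_efficiency(elec_in):
    # Electricity is pure exergy, so the input needs no quality factor.
    return EXERGY_H2 / elec_in

print(f"energy efficiency: {energy_efficiency(ELEC_IN):.1%}")
print(f"exergy efficiency: {exergy_efficiency(ELEC_IN):.1%}")
```

When the driving input is heat rather than electricity, the input exergy would be discounted by a Carnot factor, which is why the heat-driven cases in the paper show much larger losses.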

  16. Structural changes in Parkinson's disease. Voxel-based morphometry and diffusion tensor imaging analyses based on 123I-MIBG uptake

    International Nuclear Information System (INIS)

    Kikuchi, Kazufumi; Hiwatashi, Akio; Togao, Osamu; Yamashita, Koji; Somehara, Ryo; Kamei, Ryotaro; Baba, Shingo; Honda, Hiroshi; Yamaguchi, Hiroo; Kira, Jun-ichi

    2017-01-01

    Patients with Parkinson's disease (PD) may exhibit symptoms of sympathetic dysfunction that can be measured using 123I-metaiodobenzylguanidine (MIBG) myocardial scintigraphy. We investigated the relationship between microstructural brain changes and 123I-MIBG uptake in patients with PD using voxel-based morphometry (VBM) and diffusion tensor imaging (DTI) analyses. This retrospective study included 24 patients with PD who underwent 3-T magnetic resonance imaging and 123I-MIBG scintigraphy. They were divided into two groups: 12 MIBG-positive and 12 MIBG-negative cases (10 men and 14 women; age range: 60-81 years, corrected for gender and age). The heart/mediastinum count (H/M) ratio was calculated on anterior planar 123I-MIBG images obtained 4 h post-injection. VBM and DTI were performed to detect structural differences between these two groups. Patients with a low H/M ratio had significantly reduced brain volume at the right inferior frontal gyrus (uncorrected p < 0.0001, K > 90). Patients with low H/M ratios also exhibited significantly lower fractional anisotropy than those with high H/M ratios (p < 0.05) at the left anterior thalamic radiation, the left inferior fronto-occipital fasciculus, the left superior longitudinal fasciculus, and the left uncinate fasciculus. VBM and DTI may reveal microstructural changes related to the degree of 123I-MIBG uptake in patients with PD. (orig.)

  17. Improving correlations between MODIS aerosol optical thickness and ground-based PM 2.5 observations through 3D spatial analyses

    Science.gov (United States)

    Hutchison, Keith D.; Faruqui, Shazia J.; Smith, Solar

    The Center for Space Research (CSR) continues to focus on developing methods to improve correlations between satellite-based aerosol optical thickness (AOT) values and ground-based air pollution observations made at continuous ambient monitoring sites (CAMS) operated by the Texas Commission on Environmental Quality (TCEQ). Strong correlations and improved understanding of the relationships between satellite and ground observations are needed to formulate reliable real-time predictions of air quality using data accessed from the Moderate Resolution Imaging Spectroradiometer (MODIS) at the CSR direct-broadcast ground station. In this paper, improvements in these correlations are demonstrated first as a result of the evolution of the MODIS retrieval algorithms. Further improvement is then shown using procedures that compensate for differences in horizontal spatial scales between the nominal 10-km MODIS AOT products and CAMS point measurements. Finally, airborne light detection and ranging (lidar) observations, collected during the Texas Air Quality Study of 2000, are used to examine aerosol profile concentrations, which may vary greatly between aerosol classes as a result of the sources, chemical composition, and meteorological conditions that govern transport processes. Further improvement in correlations is demonstrated with this limited dataset using insights into aerosol profile information inferred from the vertical motion vectors in a trajectory-based forecast model. Analyses are ongoing to verify these procedures on a variety of aerosol classes using data collected by the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) lidar.
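The spatial-matching step described above, averaging AOT pixels around a ground site before correlating them with point measurements, can be sketched as follows. Coordinates, AOT values, PM2.5 readings and the window size are all invented for illustration.

```python
# Sketch of spatial matching: average AOT pixels that fall within a
# window around a ground site, then correlate with PM2.5 readings.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def mean_aot_near(site, pixels, radius_km):
    """Average AOT of pixels within radius_km of the site (flat-earth
    approximation, adequate for a ~10 km window)."""
    vals = [aot for (x, y, aot) in pixels
            if math.hypot(x - site[0], y - site[1]) <= radius_km]
    return sum(vals) / len(vals)

# (x_km, y_km, AOT) pixels per day and collocated PM2.5, hypothetical
pixels_by_day = [
    [(0, 0, 0.10), (5, 5, 0.14), (40, 0, 0.90)],
    [(0, 0, 0.25), (5, 5, 0.27), (40, 0, 0.10)],
    [(0, 0, 0.40), (5, 5, 0.44), (40, 0, 0.50)],
]
pm25 = [8.0, 17.0, 30.0]  # ug/m^3 at the ground site
site = (0.0, 0.0)

aot = [mean_aot_near(site, day, radius_km=10.0) for day in pixels_by_day]
print(f"r = {pearson(aot, pm25):.3f}")
```

Excluding the distant pixel at (40, 0) is what the spatial window accomplishes; including it would noticeably degrade the correlation in this toy example.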

  18. Exergy and energy analyses of two different types of PCM based thermal management systems for space air conditioning applications

    International Nuclear Information System (INIS)

    Tyagi, V.V.; Pandey, A.K.; Buddhi, D.; Tyagi, S.K.

    2013-01-01

    Highlights: ► Calcium chloride hexahydrate (CaCl2·6H2O) was used as the PCM in this study. ► Two different encapsulated systems (HDPE-based panels and balls) were designed. ► The results for CaCl2·6H2O are very attractive for space air conditioning. ► Energy and exergy analyses for space cooling applications. - Abstract: This communication presents an experimental study of PCM-based thermal management systems for space heating and cooling applications using energy and exergy analysis. Two different types of PCM-based thermal management systems (TMS-I and TMS-II) using calcium chloride hexahydrate as the heat carrier have been designed, fabricated and studied for space heating and cooling applications in a typical climatic zone in India. In the first experimental arrangement the charging of the PCM was carried out with an air conditioning system, while discharging was carried out using an electric heater, for both thermal management systems. In the second arrangement the charging of the PCM was carried out by solar energy and the discharging by circulating the cooler ambient air during the night. In the first experiment, TMS-I was found to be more effective than TMS-II, while the reverse was found in the second experiment, for both the charging and discharging processes and for both the energetic and the exergetic performances.

  19. Hydrogen Analyses in the EPR

    International Nuclear Information System (INIS)

    Worapittayaporn, S.; Eyink, J.; Movahed, M.

    2008-01-01

    In severe accidents with core melting large amounts of hydrogen may be released into the containment. The EPR provides a combustible gas control system to prevent hydrogen combustion modes with the potential to challenge the containment integrity due to excessive pressure and temperature loads. This paper outlines the approach for the verification of the effectiveness and efficiency of this system. Specifically, the justification is a multi-step approach. It involves the deployment of integral codes, lumped parameter containment codes and CFD codes and the use of the sigma criterion, which provides the link to the broad experimental data base for flame acceleration (FA) and deflagration to detonation transition (DDT). The procedure is illustrated with an example. The performed analyses show that hydrogen combustion at any time does not lead to pressure or temperature loads that threaten the containment integrity of the EPR. (authors)
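The sigma criterion mentioned above compares the combustion expansion ratio with a mixture-dependent critical value; only if the ratio exceeds the critical value is flame acceleration considered possible. A toy check might look like the following, with all numbers assumed placeholders rather than EPR data.

```python
# Illustration of the sigma criterion: flame acceleration (FA) is
# considered possible when the expansion ratio sigma (unburned to
# burned gas density) exceeds a critical value sigma*.

def expansion_ratio(rho_unburned, rho_burned):
    return rho_unburned / rho_burned

def flame_acceleration_possible(sigma, sigma_critical):
    return sigma > sigma_critical

rho_u = 1.10       # kg/m^3, unburned mixture density (assumed)
rho_b = 0.22       # kg/m^3, burned gas density (assumed)
sigma_star = 3.75  # assumed critical value for this mixture

sigma = expansion_ratio(rho_u, rho_b)
print(f"sigma = {sigma:.2f}, FA possible: "
      f"{flame_acceleration_possible(sigma, sigma_star)}")
```

In practice the critical value itself depends on mixture composition and temperature, which is why the criterion is anchored to the broad experimental database cited in the abstract.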

  20. Combined analyses of costs, market value and eco-costs in circular business models : eco-efficient value creation in remanufacturing

    NARCIS (Netherlands)

    Vogtländer, J.G.; Scheepens, A.E.; Bocken, N.M.P.; Peck, D.P.

    2017-01-01

    Eco-efficient Value Creation is a method to analyse innovative product and service design together with circular business strategies. The method is based on combined analyses of the costs, market value (perceived customer value) and eco-costs. This provides a prevention-based single indicator for

  1. Failure probability analyses for PWSCC in Ni-based alloy welds

    International Nuclear Information System (INIS)

    Udagawa, Makoto; Katsuyama, Jinya; Onizawa, Kunio; Li, Yinsheng

    2015-01-01

    A number of cracks due to primary water stress corrosion cracking (PWSCC) in pressurized water reactors and Ni-based alloy stress corrosion cracking (NiSCC) in boiling water reactors have been detected around Ni-based alloy welds. The causes of crack initiation and growth due to stress corrosion cracking include weld residual stress, operating stress, the materials, and the environment. We have developed the analysis code PASCAL-NP for calculating the failure probability and assessing the structural integrity of cracked components on the basis of probabilistic fracture mechanics (PFM), considering PWSCC and NiSCC. This PFM analysis code has functions for calculating the incubation time of PWSCC and NiSCC crack initiation, evaluating crack growth behavior considering certain crack location and orientation patterns, and evaluating failure behavior near Ni-based alloy welds due to PWSCC and NiSCC in a probabilistic manner. Herein, actual plants affected by PWSCC have been analyzed using PASCAL-NP. Failure probabilities calculated by PASCAL-NP are in reasonable agreement with the detection data. Furthermore, useful knowledge related to leakage due to PWSCC was obtained through parametric studies using this code.
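The probabilistic fracture mechanics idea can be illustrated with a toy Monte Carlo sketch: sample an initiation time and a growth rate, propagate the crack, and count how many sampled welds are through-wall by a given operating time. This is not PASCAL-NP's model; all distributions and parameters below are invented.

```python
# Toy Monte Carlo failure-probability estimate in the spirit of PFM
# codes: sampled initiation time + sampled growth rate -> crack depth
# at the operating time -> fraction of samples exceeding the wall.
import random

def failure_probability(n_samples, t_op, wall_mm, seed=1):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        t_init = rng.weibullvariate(15.0, 2.0)   # years to initiation
        rate = rng.lognormvariate(-0.5, 0.5)     # mm/year growth rate
        depth = max(0.0, t_op - t_init) * rate   # crack depth at t_op
        if depth >= wall_mm:
            failures += 1
    return failures / n_samples

p = failure_probability(n_samples=50_000, t_op=40.0, wall_mm=30.0)
print(f"estimated failure probability at 40 years: {p:.4f}")
```

Real PFM codes replace the toy growth law with validated PWSCC/NiSCC crack growth correlations, treat crack location and orientation explicitly, and account for in-service inspections.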

  2. Contesting Citizenship: Comparative Analyses

    DEFF Research Database (Denmark)

    Siim, Birte; Squires, Judith

    2007-01-01

    importance of particularized experiences and multiple inequality agendas). These developments shape the way citizenship is both practiced and analysed. Mapping neat citizenship models onto distinct nation-states and evaluating these in relation to formal equality is no longer an adequate approach....... Comparative citizenship analyses need to be considered in relation to multiple inequalities and their intersections and to multiple governance and trans-national organising. This, in turn, suggests that comparative citizenship analysis needs to consider new spaces in which struggles for equal citizenship occur...

  3. Gamma spectrometric analyses of environmental samples at PINSTECH

    International Nuclear Information System (INIS)

    Faruq, M.U.; Parveen, N.; Ahmed, B.; Aziz, A.

    1979-01-01

    Gamma spectrometric analyses of air and other environmental samples from PINSTECH were carried out. Air particulate samples were analyzed by a Ge(Li) detector on a computer-based multichannel analyzer. Other environmental samples were analyzed by a NaI(Tl) scintillation detector spectrometer and a multichannel analyzer with manual analysis. The concentration of radionuclides in the media was determined and the sources of their production were identified. The age of the fallout was estimated from the ratios of the fission products. (authors)
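Estimating fallout age from a fission-product activity ratio, as mentioned above, follows from first-order decay: if two nuclides are produced in a known initial ratio R0, the measured ratio decays as R(t) = R0 exp(-(λ1 - λ2) t). The nuclide pair, half-lives and ratios below are hypothetical illustrations.

```python
# Fallout age from the ratio of two fission products with different
# half-lives: t = ln(R0 / R) / (lambda_1 - lambda_2).
import math

def decay_constant(half_life_days):
    return math.log(2) / half_life_days

def fallout_age_days(r0, r_measured, half_life_1, half_life_2):
    lam1 = decay_constant(half_life_1)
    lam2 = decay_constant(half_life_2)
    return math.log(r0 / r_measured) / (lam1 - lam2)

# A short-lived / long-lived pair with hypothetical half-lives and
# an assumed initial activity ratio of 1.0
age = fallout_age_days(r0=1.0, r_measured=0.25,
                       half_life_1=8.0, half_life_2=285.0)
print(f"estimated age: {age:.1f} days")
```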

  4. Optimization of a Centrifugal Boiler Circulating Pump's Casing Based on CFD and FEM Analyses

    OpenAIRE

    Zhigang Zuo; Shuhong Liu; Yizhang Fan; Yulin Wu

    2014-01-01

    It is important to evaluate the economic efficiency of boiler circulating pumps in the manufacturing process from the manufacturers' point of view. The possibility of optimizing the pump casing with respect to structural pressure integrity and hydraulic performance was discussed. CFD analyses of pump models with different pump casing sizes were first carried out for the hydraulic performance evaluation. The effects of the working temperature and the sealing ring on the hydraulic efficiency were...

  5. Comparison of Numerical Analyses with a Static Load Test of a Continuous Flight Auger Pile

    Directory of Open Access Journals (Sweden)

    Hoľko Michal

    2014-12-01

    Full Text Available The article deals with numerical analyses of a Continuous Flight Auger (CFA) pile. The analyses include a comparison of calculated and measured load-settlement curves as well as a comparison of the load distribution over the pile's length. The numerical analyses were executed using two types of software, Ansys and Plaxis, both based on FEM calculations. The two programs differ in the way they create numerical models, model the interface between the pile and the soil, and in the constitutive material models they use. The analyses have been prepared in the form of a parametric study, in which the method of modelling the interface and the material models of the soil are compared and analysed.

  6. Fully plastic crack opening analyses of complex-cracked pipes for Ramberg-Osgood materials

    International Nuclear Information System (INIS)

    Jeong, Jae Uk; Choi, Jae Boong; Huh, Nam Su; Kim, Yun Jae

    2016-01-01

    Plastic influence functions for calculating the fully plastic crack opening displacement (COD) of complex-cracked pipes were newly proposed, based on systematic 3-dimensional (3-D) elastic-plastic finite element (FE) analyses using the Ramberg-Osgood (R-O) relation, in which global bending moment, axial tension and internal pressure are considered separately as loading conditions. Crack opening analyses were then performed based on the GE/EPRI concept using the new plastic influence functions for complex-cracked pipes made of SA376 TP304 stainless steel, and the predicted CODs were compared with FE results based on the deformation plasticity theory of tensile material behavior. From the comparison, confidence in the proposed fully plastic crack opening solutions for complex-cracked pipes was gained. The proposed engineering scheme for COD estimation using the new plastic influence functions can therefore be utilized to estimate the leak rate of a complex-cracked pipe for an R-O material.
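For reference, the Ramberg-Osgood relation used for the tensile material behavior has the standard uniaxial form (the paper's specific constants are not reproduced here):

```latex
% Ramberg-Osgood uniaxial stress-strain relation
\frac{\varepsilon}{\varepsilon_0}
  = \frac{\sigma}{\sigma_0}
  + \alpha \left( \frac{\sigma}{\sigma_0} \right)^{n}
```

where \sigma_0 is a reference (yield) stress, \varepsilon_0 = \sigma_0 / E, \alpha is a dimensionless constant, and n is the strain-hardening exponent. In the GE/EPRI scheme, the fully plastic COD scales with the second (power-law) term, which is what the tabulated plastic influence functions multiply.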

  7. Application of insights from the IREP analyses to the IREP procedures guide

    International Nuclear Information System (INIS)

    Carlson, D.D.; Murphy, J.A.; Young, J.

    1982-01-01

    One of the objectives of the Interim Reliability Evaluation Program (IREP) was to prepare a set of procedures, based on experience gained in the study, for use in future IREP-type analyses. The current analyses used a set of procedures and, over the course of the program, a concerted effort was made to develop insights that could improve these procedures. Insights have been gained into the organization and content of the procedures guide, into the performance and management of an IREP analysis, and into the methods to be used in the analysis.

  8. Stress analyses of ITER toroidal field coils under fault conditions

    International Nuclear Information System (INIS)

    Jong, C.T.J.

    1990-02-01

    The International Thermonuclear Experimental Reactor (ITER) is intended as an experimental thermonuclear tokamak reactor for testing the basic physics, performance and technologies essential to future fusion reactors. The ITER design will be based on extensive new design work, supported by new physical and technological results, and on the great body of experience built up over several years from previous national and international reactor studies. Conversely, the ITER design process should provide the fusion community with valuable insights into what key areas need further development or clarification as we move forward towards practical fusion power. As part of the design process of the ITER toroidal field coils the mechanical behaviour of the magnetic system under fault conditions has to be analysed in more detail. This paper describes the work carried out to create a detailed finite element model of two toroidal field coils as well as some results of linear elastic analyses with fault conditions. The analyses have been performed with the finite element code ANSYS. (author). 5 refs.; 8 figs.; 2 tabs

  9. The interrelation between hypothyroidism and glaucoma: a critical review and meta-analyses.

    Science.gov (United States)

    Thvilum, Marianne; Brandt, Frans; Brix, Thomas Heiberg; Hegedüs, Laszlo

    2017-12-01

    Data on the association between hypothyroidism and glaucoma are conflicting. We sought to shed light on this by conducting a critical review and meta-analyses. The meta-analyses were conducted in adherence with the widely accepted MOOSE guidelines. Using the Medical Subject Heading (MeSH) terms hypothyroidism, myxoedema and glaucoma or intraocular pressure, case-control studies, cohort studies and cross-sectional studies were identified (PubMed) and reviewed. Using meta-analysis, the relative risk (RR) of coexistence of glaucoma and hypothyroidism was calculated. Based on the literature search, thirteen studies fulfilled the inclusion criteria and could be categorized into two groups based on the exposure. The designs of the studies varied considerably, and there was heterogeneity related to lack of power, weak phenotype classifications and length of follow-up. Eight studies had glaucoma (5757 patients) as exposure and hypothyroidism as outcome. Among these, we found a non-significantly increased risk of hypothyroidism associated with glaucoma (RR 1.65; 95% confidence interval [CI]: 0.97-2.82). Based on five studies (168 006 patients) with hypothyroidism as exposure and glaucoma as outcome, we found the risk of glaucoma to be significantly increased (RR 1.33; 95% CI: 1.13-1.58). Based on these meta-analyses, hypothyroidism is associated with an increased risk of subsequent glaucoma, whereas the reverse association (an increased risk of hypothyroidism following glaucoma) did not reach significance. However, larger scale studies with better phenotype classification, longer follow-up and taking comorbidity and other biases into consideration are needed to address a potential causal relationship. © 2017 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
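Pooled RRs like those quoted above come from standard inverse-variance meta-analysis: convert each study's RR and confidence interval to a log scale, weight by inverse variance, and back-transform. The sketch below uses a fixed-effect model and hypothetical study data, not the paper's.

```python
# Minimal fixed-effect (inverse-variance) pooling of relative risks.
import math

def pool_rr(studies):
    """studies: list of (rr, ci_low, ci_high); returns (RR, lo, hi)."""
    weights, logs = [], []
    for rr, lo, hi in studies:
        # Recover the standard error of log(RR) from the 95% CI width
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        weights.append(1.0 / se ** 2)
        logs.append(math.log(rr))
    w_sum = sum(weights)
    pooled_log = sum(w * l for w, l in zip(weights, logs)) / w_sum
    pooled_se = math.sqrt(1.0 / w_sum)
    return (math.exp(pooled_log),
            math.exp(pooled_log - 1.96 * pooled_se),
            math.exp(pooled_log + 1.96 * pooled_se))

# Hypothetical per-study RRs with 95% CIs
studies = [(1.20, 0.90, 1.60), (1.45, 1.10, 1.91), (1.30, 1.05, 1.61)]
rr, lo, hi = pool_rr(studies)
print(f"pooled RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

With the between-study heterogeneity the abstract notes, a random-effects model (e.g. DerSimonian-Laird, which inflates the variances by an estimated tau-squared) would usually be preferred over this fixed-effect sketch.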

  10. A Derivation of Source-based Kinetics Equation with Time Dependent Fission Kernel for Reactor Transient Analyses

    International Nuclear Information System (INIS)

    Kim, Song Hyun; Woo, Myeong Hyun; Shin, Chang Ho; Pyeon, Cheol Ho

    2015-01-01

    In this study, a new balance equation to overcome the problems generated by the previous methods is proposed, using a source-based balance equation; a simple problem is then analyzed with the proposed method. A source-based balance equation with a time dependent fission kernel was derived to simplify the kinetics equation. To analyze partial variations of reactor characteristics, two representative methods were introduced in previous studies: (1) the quasi-statics method and (2) the multipoint technique. The main idea of the quasi-statics method is to use a low-order approximation for large integration times. To realize the quasi-statics method, the time dependent flux is first separated into shape and amplitude functions, and the shape function is calculated. The method has good accuracy; however, it can be computationally expensive because the shape function must be fully recalculated to obtain accurate results. To improve the calculation efficiency, the multipoint method was proposed. The multipoint method is based on the classic kinetics equation, using a Green's function to analyze the flight probability from region r' to r. These previous methods have been used for reactor kinetics analysis; however, they have some limitations. First, three group variables (r_g, E_g, t_g) must be considered to solve the time dependent balance equation, which strongly limits the application to large-system problems with good accuracy. Second, energy-group neutrons must be used to analyze reactor kinetics problems. In a time dependent problem, the neutron energy distribution can change over time; this affects the group cross sections and can therefore degrade accuracy. Third, the neutrons in a space-time region continually affect other space-time regions, which is not properly accounted for in the previous methods. Using birth history of the neutron sources
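The shape/amplitude separation at the heart of the quasi-statics method is conventionally written as follows (standard textbook form, not reproduced from this paper):

```latex
% Factorization of the time-dependent flux into amplitude and shape
\phi(\mathbf{r}, E, t) = A(t)\, \psi(\mathbf{r}, E, t),
\qquad
\frac{d}{dt} \int \frac{w(\mathbf{r}, E)}{v(E)}\,
  \psi(\mathbf{r}, E, t)\, d\mathbf{r}\, dE = 0
```

where A(t) is the amplitude function, \psi the slowly varying shape function, v(E) the neutron speed, and w a weight function (often the adjoint flux); the integral constraint makes the factorization unique. The cost the abstract refers to comes from having to recompute \psi whenever the shape changes appreciably.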

  11. Combining Conversation Analysis and Nexus Analysis to analyse sociomaterial and affective practices

    DEFF Research Database (Denmark)

    Raudaskoski, Pirkko Liisa

    2016-01-01

    of resemiotization (Iedema 2000). Within organization and design studies, materiality has become a focus in the increasingly popular sociomaterial approach to everyday practices (e.g. Orlikowski 2007). Some sociomaterial scholars (e.g. Sørensen 2013) analyse ethnographic data either as evidence for the sociomaterial....... The analytical effort is to get to the senses and sensations, which are regarded as the opposite of sense-making. In my presentation, I go through some of my own analyses from various institutional interactions to show how CA-based multimodal analyses of local interactional (or intra-actional) trajectories combined...... configuration, also as an interdisciplinary offer of an analytic package that might help sociomaterial researchers of practices come even closer to the situation at hand as an assemblage out of which materials, humans and experiences emerge....

  12. Evaluating the Accuracy of Results for Teacher Implemented Trial-Based Functional Analyses.

    Science.gov (United States)

    Rispoli, Mandy; Ninci, Jennifer; Burke, Mack D; Zaini, Samar; Hatton, Heather; Sanchez, Lisa

    2015-09-01

    Trial-based functional analysis (TBFA) allows for the systematic and experimental assessment of challenging behavior in applied settings. The purpose of this study was to evaluate a professional development package focused on training three Head Start teachers to conduct TBFAs with fidelity during ongoing classroom routines. To assess the accuracy of the TBFA results, the effects of a function-based intervention derived from the TBFA were compared with the effects of a non-function-based intervention. Data were collected on child challenging behavior and appropriate communication. An A-B-A-C-D design was utilized in which A represented baseline, B and C consisted of either function-based or non-function-based interventions counterbalanced across participants, and D represented teacher implementation of the most effective intervention. Results showed that the function-based intervention produced greater decreases in challenging behavior and greater increases in appropriate communication than the non-function-based intervention for all three children. © The Author(s) 2015.

  13. Functional Analysis in Public Schools: A Summary of 90 Functional Analyses

    Science.gov (United States)

    Mueller, Michael M.; Nkosi, Ajamu; Hine, Jeffrey F.

    2011-01-01

    Several review and epidemiological studies have been conducted over recent years to inform behavior analysts of functional analysis outcomes. None to date have closely examined demographic and clinical data for functional analyses conducted exclusively in public school settings. The current paper presents a data-based summary of 90 functional…

  14. Teacher Interviews, Student Interviews, and Classroom Observations in Combinatorics: Four Analyses

    Science.gov (United States)

    Caddle, Mary C.

    2012-01-01

    This research consists of teacher interviews, student interviews, and classroom observations, all based around the mathematical content area of combinatorics. Combinatorics is a part of discrete mathematics concerning the ordering and grouping of distinct elements. The data are used in four separate analyses. The first provides evidence that…

  15. Grid Mapping for Spatial Pattern Analyses of Recurrent Urban Traffic Congestion Based on Taxi GPS Sensing Data

    Directory of Open Access Journals (Sweden)

    Yang Liu

    2017-03-01

Full Text Available Traffic congestion is one of the most serious problems impacting urban transportation efficiency, especially in big cities. Identifying traffic congestion locations and their recurring patterns is a prerequisite for urban transportation managers to take proper countermeasures for mitigating congestion. In this study, historical GPS sensing data from about 12,000 taxis serving as floating cars in Beijing were used for pattern analyses of recurrent traffic congestion based on the grid mapping method. Using ArcGIS software, 2D and 3D maps of road network congestion were generated for traffic congestion pattern visualization. The study results identified three types of traffic congestion patterns, namely: point type, stemming from insufficient capacity at nodes of the road network; line type, caused by high traffic demand or bottleneck issues in road segments; and region type, resulting from multiple high-demand expressways merging and connecting to each other. The study illustrated that the proposed method is effective for discovering traffic congestion locations and patterns, and helpful for decision makers in taking corresponding traffic engineering countermeasures to relieve urban traffic congestion.
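The core of the grid-mapping step described in this record can be sketched in a few lines: snap each GPS record to a fixed-size grid cell and count low-speed (congested) records per cell. The grid origin, cell size, and speed threshold below are illustrative assumptions, not values from the study.

```python
# Sketch of the grid-mapping idea: snap each GPS record to a
# fixed-size grid cell and count low-speed records per cell.
# Origin, cell size, and speed threshold are invented here.

def grid_cell(lon, lat, origin=(116.0, 39.6), cell_deg=0.01):
    """Map a GPS coordinate to integer (row, col) grid indices."""
    col = int((lon - origin[0]) / cell_deg)
    row = int((lat - origin[1]) / cell_deg)
    return (row, col)

def congestion_counts(records, speed_threshold=10.0):
    """Count records below the speed threshold (km/h) in each cell."""
    counts = {}
    for lon, lat, speed in records:
        if speed < speed_threshold:
            cell = grid_cell(lon, lat)
            counts[cell] = counts.get(cell, 0) + 1
    return counts

records = [
    (116.391, 39.907, 6.2),   # slow -> counted as congested
    (116.392, 39.908, 8.0),   # slow, falls in the same cell
    (116.450, 39.950, 45.0),  # free flow, ignored
]
print(congestion_counts(records))  # → {(30, 39): 2}
```

Cells with high counts would correspond to the "point type" pattern; chains of adjacent hot cells along a road would indicate the "line type".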

  16. Post-test analyses of the REVISA benchmark based on a creep test at 1100 degrees Celsius performed on a notched tube

    International Nuclear Information System (INIS)

    Fischer, M.; Bernard, A.; Bhandari, S.

    2001-01-01

In the Euratom 4th Framework Programme of the European Commission, the REVISA Project deals with Reactor Vessel Integrity under Severe Accidents. One of its tasks consists in the experimental validation of the models developed in the project. To do this, a benchmark was designed in which the participants use their models to test the results against an experiment. The experiment, called RUPTHER 15, was conducted by the coordinating organisation, CEA (Commissariat a l'Energie Atomique) in France. It is a 'delayed fracture' test on a notched tube. The thermal loading is an axial gradient with a temperature of about 1130 C in the mid-part; internal pressure is maintained at 0.8 MPa. This paper presents the results of Finite Element calculations performed by Framatome-ANP using the SYSTUS code. Two types of analyses were made: one based on the 'time hardening' Norton-Bailey creep law, and the other based on the coupled creep/damage Lemaitre-Chaboche model. The purpose of this paper is in particular to show the influence of temperature on the simulation results. At the high temperatures dealt with here, slight errors in the temperature measurements can lead to very large differences in the deformation behaviour. (authors)
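The 'time hardening' Norton-Bailey law named in this record expresses creep strain as eps = A * sigma^n * t^m. A minimal sketch follows; the material constants are invented placeholders, not values fitted to the RUPTHER 15 experiment.

```python
# Minimal sketch of the 'time hardening' Norton-Bailey creep law:
# creep strain eps = A * sigma**n * t**m. The constants A, n, m
# below are placeholders, not RUPTHER 15 material data.

def norton_bailey_strain(sigma, t, A=1e-20, n=5.0, m=0.5):
    """Creep strain for stress sigma (MPa) after time t (s)."""
    return A * sigma ** n * t ** m

# One hour at 50 MPa with the placeholder constants:
print(norton_bailey_strain(50.0, 3600.0))
```

The strong sensitivity to temperature noted in the abstract enters through the constants A, n, and m, which in practice are themselves temperature-dependent.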

  17. Linkage and related analyses of Barrett's esophagus and its associated adenocarcinomas.

    Science.gov (United States)

    Sun, Xiangqing; Elston, Robert; Falk, Gary W; Grady, William M; Faulx, Ashley; Mittal, Sumeet K; Canto, Marcia I; Shaheen, Nicholas J; Wang, Jean S; Iyer, Prasad G; Abrams, Julian A; Willis, Joseph E; Guda, Kishore; Markowitz, Sanford; Barnholtz-Sloan, Jill S; Chandar, Apoorva; Brock, Wendy; Chak, Amitabh

    2016-07-01

    Familial aggregation and segregation analysis studies have provided evidence of a genetic basis for esophageal adenocarcinoma (EAC) and its premalignant precursor, Barrett's esophagus (BE). We aim to demonstrate the utility of linkage analysis to identify the genomic regions that might contain the genetic variants that predispose individuals to this complex trait (BE and EAC). We genotyped 144 individuals in 42 multiplex pedigrees chosen from 1000 singly ascertained BE/EAC pedigrees, and performed both model-based and model-free linkage analyses, using S.A.G.E. and other software. Segregation models were fitted, from the data on both the 42 pedigrees and the 1000 pedigrees, to determine parameters for performing model-based linkage analysis. Model-based and model-free linkage analyses were conducted in two sets of pedigrees: the 42 pedigrees and a subset of 18 pedigrees with female affected members that are expected to be more genetically homogeneous. Genome-wide associations were also tested in these families. Linkage analyses on the 42 pedigrees identified several regions consistently suggestive of linkage by different linkage analysis methods on chromosomes 2q31, 12q23, and 4p14. A linkage on 15q26 is the only consistent linkage region identified in the 18 female-affected pedigrees, in which the linkage signal is higher than in the 42 pedigrees. Other tentative linkage signals are also reported. Our linkage study of BE/EAC pedigrees identified linkage regions on chromosomes 2, 4, 12, and 15, with some reported associations located within our linkage peaks. Our linkage results can help prioritize association tests to delineate the genetic determinants underlying susceptibility to BE and EAC.

  18. Limitations of Species Delimitation Based on Phylogenetic Analyses: A Case Study in the Hypogymnia hypotrypa Group (Parmeliaceae, Ascomycota).

    Directory of Open Access Journals (Sweden)

    Xinli Wei

Full Text Available Delimiting species boundaries among closely related lineages often requires a range of independent data sets and analytical approaches. As in other organismal groups, robust species circumscriptions in fungi are increasingly investigated within an empirical framework. Here we attempt to delimit species boundaries in a closely related clade of lichen-forming fungi endemic to Asia, the Hypogymnia hypotrypa group (Parmeliaceae). In the current classification, the Hypogymnia hypotrypa group includes two species: H. hypotrypa and H. flavida, which are separated based on distinctive reproductive modes, the former producing soredia, which are absent in the latter. We reexamined the relationship between these two species using phenotypic characters and molecular sequence data (ITS, GPD, and MCM7 sequences) to address species boundaries in this group. In addition to morphological investigations, we used Bayesian clustering to identify potential genetic groups in the H. hypotrypa/H. flavida clade. We also used a variety of empirical, sequence-based species delimitation approaches, including the "Automatic Barcode Gap Discovery" (ABGD), the Poisson tree process model (PTP), the General Mixed Yule Coalescent (GMYC), and the multispecies coalescent approach BPP. Different species delimitation scenarios were compared using Bayes factor delimitation analysis, in addition to comparisons of pairwise genetic distances and pairwise fixation indices (FST). The majority of the species delimitation analyses implemented in this study failed to support H. hypotrypa and H. flavida as distinct lineages, as did the Bayesian clustering analysis. However, strong support for the evolutionary independence of H. hypotrypa and H. flavida was inferred using BPP and further supported by Bayes factor delimitation. In spite of rigorous morphological comparisons and a wide range of sequence-based approaches to delimit species, species boundaries in the H. hypotrypa group remain uncertain.

  19. Morphological analyses suggest a new taxonomic circumscription for Hymenaea courbaril L. (Leguminosae, Caesalpinioideae).

    Science.gov (United States)

    Souza, Isys Mascarenhas; Funch, Ligia Silveira; de Queiroz, Luciano Paganucci

    2014-01-01

    Hymenaea is a genus of the Resin-producing Clade of the tribe Detarieae (Leguminosae: Caesalpinioideae) with 14 species. Hymenaea courbaril is the most widespread species of the genus, ranging from southern Mexico to southeastern Brazil. As currently circumscribed, Hymenaea courbaril is a polytypic species with six varieties: var. altissima, var. courbaril, var. longifolia, var. stilbocarpa, var. subsessilis, and var. villosa. These varieties are distinguishable mostly by traits related to leaflet shape and indumentation, and calyx indumentation. We carried out morphometric analyses of 14 quantitative (continuous) leaf characters in order to assess the taxonomy of Hymenaea courbaril under the Unified Species Concept framework. Cluster analysis used the Unweighted Pair Group Method with Arithmetic Mean (UPGMA) based on Bray-Curtis dissimilarity matrices. Principal Component Analyses (PCA) were carried out based on the same morphometric matrix. Two sets of Analyses of Similarity and Non Parametric Multivariate Analysis of Variance were carried out to evaluate statistical support (1) for the major groups recovered using UPGMA and PCA, and (2) for the varieties. All analyses recovered three major groups coincident with (1) var. altissima, (2) var. longifolia, and (3) all other varieties. These results, together with geographical and habitat information, were taken as evidence of three separate metapopulation lineages recognized here as three distinct species. Nomenclatural adjustments, including reclassifying formerly misapplied types, are proposed.

  20. Morphological analyses suggest a new taxonomic circumscription for Hymenaea courbaril L. (Leguminosae, Caesalpinioideae)

    Directory of Open Access Journals (Sweden)

    Isys Souza

    2014-06-01

Full Text Available Hymenaea is a genus of the Resin-producing Clade of the tribe Detarieae (Leguminosae: Caesalpinioideae) with 14 species. Hymenaea courbaril is the most widespread species of the genus, ranging from southern Mexico to southeastern Brazil. As currently circumscribed, H. courbaril is a polytypic species with six varieties: var. altissima, var. courbaril, var. longifolia, var. stilbocarpa, var. subsessilis, and var. villosa. These varieties are distinguishable mostly by traits related to leaflet shape and indumentation, and calyx indumentation. We carried out morphometric analyses of 14 quantitative (continuous) leaf characters in order to assess the taxonomy of H. courbaril under the Unified Species Concept framework. Cluster analysis used the Unweighted Pair Group Method with Arithmetic Mean (UPGMA) based on Bray-Curtis dissimilarity matrices. Principal Component Analyses (PCA) were carried out based on the same morphometric matrix. Two sets of Analyses of Similarity and Non-Parametric Multivariate Analysis of Variance were carried out to evaluate statistical support (1) for the major groups recovered using UPGMA and PCA, and (2) for the varieties. All analyses recovered three major groups coincident with (1) var. altissima, (2) var. longifolia, and (3) all other varieties. These results, together with geographical and habitat information, were taken as evidence of three separate metapopulation lineages recognized here as three distinct species. Nomenclatural adjustments, including reclassifying formerly misapplied types, are proposed.
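The clustering pipeline these two records describe, Bray-Curtis dissimilarities followed by UPGMA (average-linkage) clustering, can be sketched as follows. The specimen measurements are invented for illustration, and a real analysis would use a dedicated package rather than this toy merge loop.

```python
# Toy sketch of Bray-Curtis + UPGMA clustering on morphometric data.
# Specimen measurements are invented; real studies use many more
# characters and proper hierarchical-clustering software.

def bray_curtis(u, v):
    """Bray-Curtis dissimilarity between two measurement vectors."""
    num = sum(abs(a - b) for a, b in zip(u, v))
    den = sum(a + b for a, b in zip(u, v))
    return num / den

def upgma(points):
    """Return the merge order of an average-linkage clustering."""
    clusters = [[i] for i in range(len(points))]
    merges = []
    while len(clusters) > 1:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # Average pairwise dissimilarity between the clusters.
                d = sum(bray_curtis(points[a], points[b])
                        for a in clusters[i] for b in clusters[j])
                d /= len(clusters[i]) * len(clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        d, i, j = best
        merges.append((clusters[i], clusters[j], round(d, 3)))
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return merges

# Three hypothetical specimens, two leaf measurements each;
# specimens 0 and 1 are similar, specimen 2 is distinct.
specimens = [(10.0, 4.0), (11.0, 4.2), (18.0, 7.5)]
for left, right, dist in upgma(specimens):
    print(left, right, dist)
```

With these numbers the first merge joins specimens 0 and 1 at a low dissimilarity, and specimen 2 joins last, the same kind of grouping structure the study used to recognize separate lineages.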

  1. Analysing and Comparing Encodability Criteria

    Directory of Open Access Journals (Sweden)

    Kirstin Peters

    2015-08-01

Full Text Available Encodings, or proofs of their absence, are the main way to compare process calculi. To analyse the quality of encodings and to rule out trivial or meaningless ones, they are augmented with quality criteria. There exist many different criteria, and different variants of those criteria, for reasoning in different settings. This leads to incomparable results. Moreover, it is not always clear whether the criteria used to obtain a result in a particular setting do indeed fit that setting. We show how to formally reason about and compare encodability criteria by mapping them onto requirements on a relation between source and target terms that is induced by the encoding function. In particular, we analyse the common criteria full abstraction, operational correspondence, divergence reflection, success sensitiveness, and respect of barbs; e.g. we analyse the exact nature of the simulation relation (coupled simulation versus bisimulation) that is induced by different variants of operational correspondence. This way we reduce the problem of analysing or comparing encodability criteria to the better understood problem of comparing relations on processes.

  2. Design premises for a KBS-3V repository based on results from the safety assessment SR-Can and some subsequent analyses

    Energy Technology Data Exchange (ETDEWEB)

    2009-11-15

    deterioration over the assessment period. The basic approach for prescribing such margins is to consider whether the design assessed in SR-Can Main report was sufficient to result in safety. In case this design would imply too strict requirements, and in cases the SR-Can design was judged inadequate or not sufficiently analysed in the SR-Can report, some additional analyses have been undertaken to provide a better basis for setting the design premises. The resulting design premises constitute design constraints, which, if all fulfilled, form a good basis for demonstrating repository safety, according to the analyses in SR-Can and subsequent analyses. Some of the design premises may be modified in future stages of SKB's programme, as a result of analyses based on more detailed site data and a more developed understanding of processes of importance for long-term safety. Furthermore, a different balance between design requirements may result in the same level of safety. This report presents one technically reasonable balance, whereas future development and evaluations may result in other balances being deemed as more optimal. It should also be noted that in developing the reference design, the production reports should give credible evidence that the final product after construction and quality control fulfils the specifications of the reference design. To cover uncertainties in production and quality control that may be difficult to quantify in detail at the present design stage, the developer of the reference design need usually consider a margin to the conditions that would verify the design premises, but whether there is a need for such margins lies outside the scope of the current document. The term 'withstand' is used in this document in descriptions of load cases on repository components. The statement that a component withstands a particular load means that it upholds its related safety function when exposed to the load in question. For example, if the

  3. Design premises for a KBS-3V repository based on results from the safety assessment SR-Can and some subsequent analyses

    International Nuclear Information System (INIS)

    2009-11-01

deterioration over the assessment period. The basic approach for prescribing such margins is to consider whether the design assessed in SR-Can Main report was sufficient to result in safety. In case this design would imply too strict requirements, and in cases the SR-Can design was judged inadequate or not sufficiently analysed in the SR-Can report, some additional analyses have been undertaken to provide a better basis for setting the design premises. The resulting design premises constitute design constraints, which, if all fulfilled, form a good basis for demonstrating repository safety, according to the analyses in SR-Can and subsequent analyses. Some of the design premises may be modified in future stages of SKB's programme, as a result of analyses based on more detailed site data and a more developed understanding of processes of importance for long-term safety. Furthermore, a different balance between design requirements may result in the same level of safety. This report presents one technically reasonable balance, whereas future development and evaluations may result in other balances being deemed as more optimal. It should also be noted that in developing the reference design, the production reports should give credible evidence that the final product after construction and quality control fulfils the specifications of the reference design. To cover uncertainties in production and quality control that may be difficult to quantify in detail at the present design stage, the developer of the reference design need usually consider a margin to the conditions that would verify the design premises, but whether there is a need for such margins lies outside the scope of the current document. The term 'withstand' is used in this document in descriptions of load cases on repository components. The statement that a component withstands a particular load means that it upholds its related safety function when exposed to the load in question. For example, if the canister is said to

  4. ATHENA/INTRA analyses for ITER, NSSR-2

    International Nuclear Information System (INIS)

    Shen, Kecheng; Eriksson, John; Sjoeberg, A.

    1999-02-01

The present report is a summary report including thermal-hydraulic analyses made at Studsvik Eco and Safety AB for the ITER NSSR-2 safety documentation. The objective of the analyses was to reveal the safety characteristics of various heat transfer systems at specified operating conditions and to indicate the conditions for which there were obvious risks of jeopardising the structural integrity of the coolant systems. In the latter case also some analyses were made to indicate conceivable mitigating measures for maintaining the integrity. The analyses were primarily concerned with the First Wall and Divertor heat transfer systems. Several enveloping transients were analysed with associated specific flow and heat load boundary conditions. The analyses were performed with the ATHENA and INTRA codes.

  5. ATHENA/INTRA analyses for ITER, NSSR-2

    Energy Technology Data Exchange (ETDEWEB)

    Shen, Kecheng; Eriksson, John; Sjoeberg, A

    1999-02-01

The present report is a summary report including thermal-hydraulic analyses made at Studsvik Eco and Safety AB for the ITER NSSR-2 safety documentation. The objective of the analyses was to reveal the safety characteristics of various heat transfer systems at specified operating conditions and to indicate the conditions for which there were obvious risks of jeopardising the structural integrity of the coolant systems. In the latter case also some analyses were made to indicate conceivable mitigating measures for maintaining the integrity. The analyses were primarily concerned with the First Wall and Divertor heat transfer systems. Several enveloping transients were analysed with associated specific flow and heat load boundary conditions. The analyses were performed with the ATHENA and INTRA codes. (8 refs, 14 figs, 15 tabs)

  6. Reporting the results of meta-analyses: a plea for incorporating clinical relevance referring to an example.

    Science.gov (United States)

    Bartels, Ronald H M A; Donk, Roland D; Verhagen, Wim I M; Hosman, Allard J F; Verbeek, André L M

    2017-11-01

The results of meta-analyses are frequently reported, but understanding and interpreting them is difficult for both clinicians and patients. Statistical significance is presented without reference to values that imply clinical relevance. This study aimed to use the minimal clinically important difference (MCID) to rate the clinical relevance of a meta-analysis. It is a review of meta-analyses relating to a specific topic, the clinical results of cervical arthroplasty, with the MCID as outcome measure. As an example, we performed an extensive literature search in PubMed and Embase through August 9, 2016, for meta-analyses of the clinical outcome of cervical arthroplasty compared with that of anterior cervical discectomy with fusion in cases of cervical degenerative disease. We evaluated the analyses for statistical significance and their relation to the MCID, which was defined based on results in similar patient groups and a similar disease entity reported in the literature. We identified 21 meta-analyses, only one of which referred to the MCID; however, the researchers used an inappropriate measurement scale and, therefore, an incorrect MCID. The majority of the articles we reviewed drew conclusions based on statistical differences instead of clinical relevance. We recommend introducing the concept of the MCID when reporting the results of a meta-analysis, as well as mentioning the explicit scale of the analyzed measurement. Copyright © 2017 Elsevier Inc. All rights reserved.
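The paper's central distinction between statistical significance and clinical relevance reduces to two separate checks on the pooled effect, sketched here with invented numbers:

```python
# Toy illustration of the record's core point: a pooled effect can
# be statistically significant yet smaller than the MCID (or vice
# versa). All numbers are invented for illustration.

def clinically_relevant(effect, ci_low, mcid):
    """Return (statistically significant, clinically relevant)."""
    significant = ci_low > 0.0   # 95% CI excludes zero
    relevant = effect >= mcid    # effect size reaches the MCID
    return significant, relevant

# Pooled difference of 2 points (95% CI lower bound 0.5) judged
# against an MCID of 5 points on the same scale:
print(clinically_relevant(2.0, 0.5, 5.0))  # → (True, False)
```

This is exactly the case the authors warn about: the result is "significant" but, on the scale the MCID is defined on, too small to matter to patients.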

  7. Continuous Covariate Imbalance and Conditional Power for Clinical Trial Interim Analyses

    Science.gov (United States)

    Ciolino, Jody D.; Martin, Renee' H.; Zhao, Wenle; Jauch, Edward C.; Hill, Michael D.; Palesch, Yuko Y.

    2014-01-01

    Oftentimes valid statistical analyses for clinical trials involve adjustment for known influential covariates, regardless of imbalance observed in these covariates at baseline across treatment groups. Thus, it must be the case that valid interim analyses also properly adjust for these covariates. There are situations, however, in which covariate adjustment is not possible, not planned, or simply carries less merit as it makes inferences less generalizable and less intuitive. In this case, covariate imbalance between treatment groups can have a substantial effect on both interim and final primary outcome analyses. This paper illustrates the effect of influential continuous baseline covariate imbalance on unadjusted conditional power (CP), and thus, on trial decisions based on futility stopping bounds. The robustness of the relationship is illustrated for normal, skewed, and bimodal continuous baseline covariates that are related to a normally distributed primary outcome. Results suggest that unadjusted CP calculations in the presence of influential covariate imbalance require careful interpretation and evaluation. PMID:24607294
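Unadjusted conditional power under the "current trend" assumption has a standard closed form in the Brownian-motion approximation for group-sequential trials. The sketch below uses illustrative interim values and ignores the covariate-imbalance effects the paper studies.

```python
# Hedged sketch of unadjusted conditional power (CP) under the
# "current trend" assumption, using the standard Brownian-motion
# approximation. Interim values are illustrative only.
from math import erf, sqrt

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def conditional_power(z_interim, info_frac, z_crit=1.96):
    """CP at information fraction t, assuming the observed drift
    (z_interim / sqrt(t)) continues to the end of the trial."""
    t = info_frac
    return phi((z_interim / sqrt(t) - z_crit) / sqrt(1.0 - t))

# Interim Z = 1.0 at half the information: CP is roughly 0.22,
# just above common futility bounds of 10-20%.
print(round(conditional_power(1.0, 0.5), 3))
```

The paper's point is that when an influential baseline covariate is imbalanced, the unadjusted interim Z (and hence this CP value) can be misleading, so futility decisions based on it need careful interpretation.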

  8. Understanding ageing in older Australians: The contribution of the Dynamic Analyses to Optimise Ageing (DYNOPTA) project to the evidence base and policy

    Science.gov (United States)

    Anstey, Kaarin J; Bielak, Allison AM; Birrell, Carole L; Browning, Colette J; Burns, Richard A; Byles, Julie; Kiley, Kim M; Nepal, Binod; Ross, Lesley A; Steel, David; Windsor, Timothy D

    2014-01-01

Aim To describe the Dynamic Analyses to Optimise Ageing (DYNOPTA) project and illustrate its contributions to understanding ageing through innovative methodology and investigations of outcomes based on the project themes. DYNOPTA provides a platform and technical expertise that may be used to combine other national and international datasets. Method The DYNOPTA project has pooled and harmonized data from nine Australian longitudinal studies to create the largest available longitudinal dataset (N = 50,652) on ageing in Australia. Results A range of findings has resulted from the study to date, including methodological advances, prevalence rates of disease and disability, and the mapping of trajectories of ageing with and without increasing morbidity. DYNOPTA also forms the basis of a microsimulation model that will provide projections of future costs of disease and disability for the baby boomer cohort. Conclusion DYNOPTA contributes significantly to the Australian evidence base on ageing, informing key social and health policy domains. PMID:22032767

  9. Relationships between Mathematics Teacher Preparation and Graduates' Analyses of Classroom Teaching

    Science.gov (United States)

    Hiebert, James; Berk, Dawn; Miller, Emily

    2017-01-01

    The purpose of this longitudinal study was to investigate the relationships between mathematics teacher preparation and graduates' analyses of classroom teaching. Fifty-three graduates from an elementary teacher preparation program completed 4 video-based, analysis-of-teaching tasks in the semester before graduation and then in each of the 3…

  10. Hospitable Gestures in the University Lecture: Analysing Derrida's Pedagogy

    Science.gov (United States)

    Ruitenberg, Claudia

    2014-01-01

    Based on archival research, this article analyses the pedagogical gestures in Derrida's (largely unpublished) lectures on hospitality (1995/96), with particular attention to the enactment of hospitality in these gestures. The motivation for this analysis is twofold. First, since the large-group university lecture has been widely critiqued as…

  11. Age and gender effects on normal regional cerebral blood flow studied using two different voxel-based statistical analyses

    International Nuclear Information System (INIS)

    Pirson, A.S.; George, J.; Krug, B.; Vander Borght, T.; Van Laere, K.; Jamart, J.; D'Asseler, Y.; Minoshima, S.

    2009-01-01

Fully automated analysis programs are increasingly applied to aid the reading of regional cerebral blood flow (rCBF) SPECT studies, and are increasingly based on comparison of the patient study with a normal database. In this study, we evaluate the ability of Three-Dimensional Stereotactic Surface Projection (3D-SSP) to isolate effects of age and gender in a previously studied normal population. The results were also compared with those obtained using Statistical Parametric Mapping (SPM99). Methods Eighty-nine 99mTc-ECD SPECT studies performed in carefully screened healthy volunteers (46 females, 43 males; age 20-81 years) were analysed using 3D-SSP. A multivariate analysis based on the general linear model was performed with regions as intra-subject factor, gender as inter-subject factor and age as covariate. Results Both age and gender had a significant interaction effect with regional tracer uptake. An age-related decline (p < 0.001) was found in the anterior cingulate gyrus, left frontal association cortex and left insula. Bilateral occipital association and left primary visual cortical uptake showed a significant relative increase with age (p < 0.001). Concerning the gender effect, women showed higher uptake (p < 0.01) in the parietal and right sensorimotor cortices. An age-by-gender interaction (p < 0.01) was found only in the left medial frontal cortex. The results were consistent with those obtained with SPM99. Conclusion 3D-SSP analysis of normal rCBF variability is consistent with the literature and other automated voxel-based techniques, which highlight the effects of both age and gender. (authors)

  12. Using uncertainty and sensitivity analyses in socioecological agent-based models to improve their analytical performance and policy relevance.

    Science.gov (United States)

    Ligmann-Zielinska, Arika; Kramer, Daniel B; Spence Cheruvelil, Kendra; Soranno, Patricia A

    2014-01-01

    Agent-based models (ABMs) have been widely used to study socioecological systems. They are useful for studying such systems because of their ability to incorporate micro-level behaviors among interacting agents, and to understand emergent phenomena due to these interactions. However, ABMs are inherently stochastic and require proper handling of uncertainty. We propose a simulation framework based on quantitative uncertainty and sensitivity analyses to build parsimonious ABMs that serve two purposes: exploration of the outcome space to simulate low-probability but high-consequence events that may have significant policy implications, and explanation of model behavior to describe the system with higher accuracy. The proposed framework is applied to the problem of modeling farmland conservation resulting in land use change. We employ output variance decomposition based on quasi-random sampling of the input space and perform three computational experiments. First, we perform uncertainty analysis to improve model legitimacy, where the distribution of results informs us about the expected value that can be validated against independent data, and provides information on the variance around this mean as well as the extreme results. In our last two computational experiments, we employ sensitivity analysis to produce two simpler versions of the ABM. First, input space is reduced only to inputs that produced the variance of the initial ABM, resulting in a model with output distribution similar to the initial model. Second, we refine the value of the most influential input, producing a model that maintains the mean of the output of initial ABM but with less spread. These simplifications can be used to 1) efficiently explore model outcomes, including outliers that may be important considerations in the design of robust policies, and 2) conduct explanatory analysis that exposes the smallest number of inputs influencing the steady state of the modeled system.

  13. Using uncertainty and sensitivity analyses in socioecological agent-based models to improve their analytical performance and policy relevance.

    Directory of Open Access Journals (Sweden)

    Arika Ligmann-Zielinska

Full Text Available Agent-based models (ABMs) have been widely used to study socioecological systems. They are useful for studying such systems because of their ability to incorporate micro-level behaviors among interacting agents, and to understand emergent phenomena due to these interactions. However, ABMs are inherently stochastic and require proper handling of uncertainty. We propose a simulation framework based on quantitative uncertainty and sensitivity analyses to build parsimonious ABMs that serve two purposes: exploration of the outcome space to simulate low-probability but high-consequence events that may have significant policy implications, and explanation of model behavior to describe the system with higher accuracy. The proposed framework is applied to the problem of modeling farmland conservation resulting in land use change. We employ output variance decomposition based on quasi-random sampling of the input space and perform three computational experiments. First, we perform uncertainty analysis to improve model legitimacy, where the distribution of results informs us about the expected value that can be validated against independent data, and provides information on the variance around this mean as well as the extreme results. In our last two computational experiments, we employ sensitivity analysis to produce two simpler versions of the ABM. First, input space is reduced only to inputs that produced the variance of the initial ABM, resulting in a model with output distribution similar to the initial model. Second, we refine the value of the most influential input, producing a model that maintains the mean of the output of initial ABM but with less spread. These simplifications can be used to 1) efficiently explore model outcomes, including outliers that may be important considerations in the design of robust policies, and 2) conduct explanatory analysis that exposes the smallest number of inputs influencing the steady state of the modeled system.
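The output variance decomposition these two records describe can be illustrated with a first-order Sobol index on a toy additive model. This pick-freeze Monte Carlo sketch uses plain pseudo-random sampling rather than the quasi-random scheme of the paper, and a two-input linear function rather than an ABM.

```python
# Pick-freeze Monte Carlo sketch of a first-order Sobol sensitivity
# index on a toy model y = 3*x1 + x2 (x1, x2 ~ U(0,1) independent).
# This only illustrates variance decomposition; the paper's ABM and
# quasi-random sampling scheme are far richer.
import random

def model(x1, x2):
    """Toy additive model; x1 should dominate the output variance."""
    return 3.0 * x1 + 1.0 * x2

def first_order_index(which, n=50_000, seed=1):
    """Estimate the first-order Sobol index of input `which`."""
    rng = random.Random(seed)
    a = [(rng.random(), rng.random()) for _ in range(n)]
    b = [(rng.random(), rng.random()) for _ in range(n)]
    ya = [model(*p) for p in a]
    yab = []
    for pa, pb in zip(a, b):
        mixed = list(pb)
        mixed[which] = pa[which]   # freeze input `which`, resample the rest
        yab.append(model(*mixed))
    mean = sum(ya) / n
    var = sum((y - mean) ** 2 for y in ya) / n
    num = sum(y1 * y2 for y1, y2 in zip(ya, yab)) / n - mean * mean
    return num / var

# Analytic values for this model: S1 = 9/10, S2 = 1/10.
print(round(first_order_index(0), 2), round(first_order_index(1), 2))
```

Reducing the input space to the inputs with large indices (here, x1) is the same simplification step the authors apply to their ABM.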

  14. Identifying null meta-analyses that are ripe for updating

    Directory of Open Access Journals (Sweden)

    Fang Manchun

    2003-07-01

Full Text Available Abstract Background As an increasingly large number of meta-analyses are published, quantitative methods are needed to help clinicians and systematic review teams determine when meta-analyses are not up to date. Methods We propose new methods for determining when non-significant meta-analytic results might be overturned, based on a prediction of the number of participants required in new studies. To guide decision making, we introduce the "new participant ratio", the ratio of the actual number of participants in new studies to the predicted number required to obtain statistical significance. A simulation study was conducted to study the performance of our methods, and a real meta-analysis provides further evidence. Results In our three simulation configurations, our diagnostic test for determining whether a meta-analysis is out of date had sensitivities of 55%, 62%, and 49%, with corresponding specificities of 85%, 80%, and 90%, respectively. Conclusions Simulations suggest that our methods are able to detect out-of-date meta-analyses. These quick and approximate methods show promise for use by systematic review teams to help decide whether to commit the considerable resources required to update a meta-analysis. Further investigation and evaluation of the methods is required before they can be recommended for general use.
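The "new participant ratio" idea can be sketched under a simple fixed-effect model: predict the extra inverse-variance weight needed for the pooled z-statistic to reach 1.96, convert that weight into participants, and divide the actual number of new participants by the prediction. The variance model var ≈ 4/n (a textbook standardized-mean-difference approximation) and all numbers below are assumptions, not the paper's exact method.

```python
# Hedged sketch of the "new participant ratio" under a fixed-effect
# model with per-study variance ~ 4/n (SMD approximation). This is
# an illustration, not the paper's exact procedure.

def predicted_new_participants(effect, pooled_variance, z_crit=1.96):
    """Participants needed in new studies to reach significance,
    assuming the pooled effect estimate stays the same."""
    target_weight = (z_crit / effect) ** 2   # total 1/variance required
    current_weight = 1.0 / pooled_variance
    extra_weight = max(0.0, target_weight - current_weight)
    return 4.0 * extra_weight                # invert var ~ 4/n

def new_participant_ratio(actual_new, effect, pooled_variance):
    needed = predicted_new_participants(effect, pooled_variance)
    return actual_new / needed if needed > 0 else float("inf")

# Pooled SMD of 0.2 with variance 0.02 (z ~ 1.41, not significant):
needed = predicted_new_participants(0.2, 0.02)
print(round(needed))                                   # → 184
print(round(new_participant_ratio(300, 0.2, 0.02), 2))  # → 1.63
```

A ratio above 1 (as here) signals that enough new participants have accumulated that the non-significant result could plausibly be overturned, i.e. the meta-analysis is ripe for updating.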

  15. The moral economy of austerity: analysing UK welfare reform.

    Science.gov (United States)

    Morris, Lydia

    2016-03-01

    This paper notes the contemporary emergence of 'morality' in both sociological argument and political rhetoric, and analyses its significance in relation to ongoing UK welfare reforms. It revisits the idea of 'moral economy' and identifies two strands in its contemporary application; that all economies depend on an internal moral schema, and that some external moral evaluation is desirable. UK welfare reform is analysed as an example of the former, with reference to three distinct orientations advanced in the work of Freeden (1996), Laclau (2014), and Lockwood (1996). In this light, the paper then considers challenges to the reform agenda, drawn from third sector and other public sources. It outlines the forms of argument present in these challenges, based respectively on rationality, legality, and morality, which together provide a basis for evaluation of the welfare reforms and for an alternative 'moral economy'. © London School of Economics and Political Science 2016.

  16. Optimisation of recovery protocols for double-base smokeless powder residues analysed by total vaporisation (TV) SPME/GC-MS.

    Science.gov (United States)

    Sauzier, Georgina; Bors, Dana; Ash, Jordan; Goodpaster, John V; Lewis, Simon W

    2016-09-01

    The investigation of explosive events requires appropriate evidential protocols to recover and preserve residues from the scene. In this study, a central composite design was used to determine statistically validated optimum recovery parameters for double-base smokeless powder residues on steel, analysed using total vaporisation (TV) SPME/GC-MS. It was found that maximum recovery was obtained using isopropanol-wetted swabs stored under refrigerated conditions, then extracted into acetone for 15 min on the same day as sample collection. These parameters were applied to the recovery of post-blast residues deposited on steel witness surfaces following a PVC pipe bomb detonation, resulting in detection of all target components across the majority of samples. Higher overall recoveries were obtained from plates facing the sides of the device, consistent with the point of first failure occurring in the pipe body as observed in previous studies. The methodology employed here may be readily applied to a variety of other explosive compounds, and thus assist in establishing 'best practice' procedures for explosive investigations. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Embodied energy and emergy analyses of a concentrating solar power (CSP) system

    International Nuclear Information System (INIS)

    Zhang Meimei; Wang Zhifeng; Xu Chao; Jiang Hui

    2012-01-01

    Although concentrating solar power (CSP) technology has been projected as one of the most promising candidates to replace conventional power plants burning fossil fuels, the potential advantages and disadvantages of the CSP technology have not been thoroughly evaluated. To better understand the performance of the CSP technology, this paper presents an ecological accounting framework based on embodied energy and emergy analyses methods. The analyses are performed for the 1.5 MW Dahan solar tower power plant in Beijing, China and different evaluation indices used in the embodied energy and emergy analyses are employed to evaluate the plant performance. Our analysis of the CSP plant is compared with six Italian power plants with different energy sources and an American PV plant, which demonstrates the superiority of the CSP technology. - Highlights: ► Embodied energy and emergy analyses are employed to evaluate the first solar tower power plant in China. ► Different evaluation indices are quantitatively analyzed to show the advantages of CSP technology. ► This analysis provides insights for making energy policy and investment decisions about CSP technology.

  18. Structural changes in Parkinson's disease. Voxel-based morphometry and diffusion tensor imaging analyses based on {sup 123}I-MIBG uptake

    Energy Technology Data Exchange (ETDEWEB)

    Kikuchi, Kazufumi; Hiwatashi, Akio; Togao, Osamu; Yamashita, Koji; Somehara, Ryo; Kamei, Ryotaro; Baba, Shingo; Honda, Hiroshi [Kyushu University, Department of Clinical Radiology, Graduate School of Medical Sciences, Fukuoka (Japan); Yamaguchi, Hiroo; Kira, Jun-ichi [Kyushu University, Department of Neurology, Graduate School of Medical Sciences, Fukuoka (Japan)

    2017-12-15

    Patients with Parkinson's disease (PD) may exhibit symptoms of sympathetic dysfunction that can be measured using {sup 123}I-metaiodobenzylguanidine (MIBG) myocardial scintigraphy. We investigated the relationship between microstructural brain changes and {sup 123}I-MIBG uptake in patients with PD using voxel-based morphometry (VBM) and diffusion tensor imaging (DTI) analyses. This retrospective study included 24 patients with PD who underwent 3 T magnetic resonance imaging and {sup 123}I-MIBG scintigraphy. They were divided into two groups: 12 MIBG-positive and 12 MIBG-negative cases (10 men and 14 women; age range: 60-81 years, corrected for gender and age). The heart/mediastinum count (H/M) ratio was calculated on anterior planar {sup 123}I-MIBG images obtained 4 h post-injection. VBM and DTI were performed to detect structural differences between these two groups. Patients with low H/M ratio had significantly reduced brain volume at the right inferior frontal gyrus (uncorrected p < 0.0001, K > 90). Patients with low H/M ratios also exhibited significantly lower fractional anisotropy than those with high H/M ratios (p < 0.05) at the left anterior thalamic radiation, the left inferior fronto-occipital fasciculus, the left superior longitudinal fasciculus, and the left uncinate fasciculus. VBM and DTI may reveal microstructural changes related to the degree of {sup 123}I-MIBG uptake in patients with PD. (orig.)

  19. Risico-analyse brandstofpontons

    NARCIS (Netherlands)

    Uijt de Haag P; Post J; LSO

    2001-01-01

    To determine the risks posed by fuel pontoons in a marina, a generic risk analysis was carried out. A reference system was defined, consisting of a concrete fuel pontoon with a relatively large capacity and throughput. It is assumed that the pontoon is located in a

  20. Use of probabilistic safety analyses in severe accident management

    International Nuclear Information System (INIS)

    Neogy, P.; Lehner, J.

    1991-01-01

    An important consideration in the development and assessment of severe accident management strategies is that while the strategies are often built on the knowledge base of Probabilistic Safety Analyses (PSA), they must be interpretable and meaningful in terms of the control room indicators. In the following, the relationships between PSA and severe accident management are explored using ex-vessel accident management at a PWR ice-condenser plant as an example. 2 refs., 1 fig., 3 tabs

  1. Development of SI-traceable C-peptide certified reference material NMIJ CRM 6901-a using isotope-dilution mass spectrometry-based amino acid analyses.

    Science.gov (United States)

    Kinumi, Tomoya; Goto, Mari; Eyama, Sakae; Kato, Megumi; Kasama, Takeshi; Takatsu, Akiko

    2012-07-01

    A certified reference material (CRM) is a higher-order calibration material used to enable a traceable analysis. This paper describes the development of a C-peptide CRM (NMIJ CRM 6901-a) by the National Metrology Institute of Japan using two independent methods for amino acid analysis based on isotope-dilution mass spectrometry. C-peptide is a 31-mer peptide that is utilized for the evaluation of β-cell function in the pancreas in clinical testing. This CRM is a lyophilized synthetic peptide having the human C-peptide sequence, and contains deamidated and pyroglutamylated forms of C-peptide. By adding (1.00 ± 0.01) g of water to the vial containing the CRM, the C-peptide solution in 10 mM phosphate-buffered saline (pH 6.6) is reconstituted. We assigned two certified values that represent the concentrations of total C-peptide (mixture of C-peptide, deamidated C-peptide, and pyroglutamylated C-peptide) and C-peptide. The certified concentration of total C-peptide was determined by two amino acid analyses using pre-column derivatization liquid chromatography-mass spectrometry and hydrophilic chromatography-mass spectrometry following acid hydrolysis. The certified concentration of C-peptide was determined by multiplying the concentration of total C-peptide by the ratio of the relative area of C-peptide to that of the total C-peptide measured by liquid chromatography. The certified value of C-peptide (80.7 ± 5.0) mg/L represents the concentration of the specific entity of C-peptide; on the other hand, the certified value of total C-peptide, (81.7 ± 5.1) mg/L, can be used for analyses that do not differentiate deamidated and pyroglutamylated C-peptide from C-peptide itself, such as amino acid analyses and immunochemical assays.
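    The certified-value arithmetic described above reduces to one multiplication. The sketch below uses the certified concentrations from the abstract, with the relative chromatographic area back-calculated for illustration; the actual measurement and its uncertainty propagation are omitted.

    ```python
    # Illustrative numbers from the abstract; the relative area is
    # back-calculated here rather than measured.
    total_c_peptide = 81.7          # mg/L, certified total C-peptide (both ID-MS analyses)
    relative_area = 80.7 / 81.7     # LC peak area of intact C-peptide / total

    c_peptide = total_c_peptide * relative_area
    print(round(c_peptide, 1))      # 80.7 mg/L, the certified C-peptide value
    ```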

  2. Analyser Framework to Verify Software Components

    Directory of Open Access Journals (Sweden)

    Rolf Andreas Rasenack

    2009-01-01

    Full Text Available Today, it is important for software companies to build software systems within a short time frame, to reduce costs and to maintain a good market position. Well-organized, systematic development approaches are therefore required. Reusing well-tested software components can be an effective way to develop software applications. The reuse of software components is less expensive and less time-consuming than development from scratch. It is dangerous, however, to assume that software components can be combined without any problems. Individual components may be well tested, but problems still occur when they are composed, and most of these problems arise from interaction and communication. To avoid such errors, a framework has to be developed for analysing software components and determining the compatibility of corresponding components. The promising approach discussed here presents a novel technique for analysing software components by applying an Abstract Syntax Language Tree (ASLT). A supportive environment is designed that checks the compatibility of black-box software components. This article addresses the question of how coupled software components can be verified using an analyser framework, and describes the role of the ASLT. Black-box software components and the Abstract Syntax Language Tree form the basis of the proposed framework and are discussed here to provide the background knowledge. The practical implementation of this framework is discussed and results are shown using a test environment.

  3. Sample preparation in foodomic analyses.

    Science.gov (United States)

    Martinović, Tamara; Šrajer Gajdošik, Martina; Josić, Djuro

    2018-04-16

    Representative sampling and adequate sample preparation are key factors for successful performance of further steps in foodomic analyses, as well as for correct data interpretation. Incorrect sampling and improper sample preparation can be sources of severe bias in foodomic analyses. It is well known that neither wrong sampling nor wrong sample treatment can be corrected afterwards. These facts, frequently neglected in the past, are now taken into consideration, and the progress in sampling and sample preparation in foodomics is reviewed here. We report the use of highly sophisticated instruments for both high-performance and high-throughput analyses, as well as miniaturization and the use of laboratory robotics in metabolomics, proteomics, peptidomics and genomics. This article is protected by copyright. All rights reserved.

  4. Safety and sensitivity analyses of a generic geologic disposal system for high-level radioactive waste

    International Nuclear Information System (INIS)

    Kimura, Hideo; Takahashi, Tomoyuki; Shima, Shigeki; Matsuzuru, Hideo

    1994-11-01

    This report describes safety and sensitivity analyses of a generic geologic disposal system for HLW, using the GSRW code and an automated sensitivity analysis methodology based on differential algebra. The exposure scenario considered here is based on a normal evolution scenario which excludes events attributable to probabilistic alterations in the environment. The results of the sensitivity analyses indicate that parameters related to the homogeneous rock surrounding the disposal facility have higher sensitivities to the output analyzed here than those of the fractured zone and the engineered barriers. The sensitivity analysis methodology provides technical information which might form the basis for optimizing the design of the disposal facility. Safety analyses were performed on the reference disposal system, which involves HLW in amounts corresponding to 16,000 MTU of spent fuel. The individual dose equivalent due to the exposure pathway of ingesting drinking water was calculated using both conservative and realistic values of the geochemical parameters. In both cases, the committed dose equivalent evaluated here is of the order of 10{sup -7} Sv, and thus geologic disposal of HLW may be feasible if the disposal conditions assumed here remain unchanged throughout the periods assessed. (author)

  5. Safety analyses of the nuclear-powered ship Mutsu with RETRAN

    International Nuclear Information System (INIS)

    Naruko, Y.; Ishida, T.; Tanaka, Y.; Futamura, Y.

    1982-01-01

    To provide a quantitative basis for the safety evaluation of the N.S. Mutsu, a number of safety analyses were performed in the course of reexamination. With respect to operational transient analyses, the RETRAN computer code was used to predict plant performance on the basis of postulated transient scenarios. The COBRA-IV computer code was also used to obtain a value of the minimum DNBR for each transient, which is necessary to predict detailed thermal-hydraulic performance in the core region of the reactor. The present paper deals with the following three operational transients, which were calculated as a part of the safety analyses: a complete loss of load without reactor scram; an excessive load increase incident, which is followed by a 30 percent stepwise load increase in the steam dump flow; and an accidental depressurization of the primary system, which is followed by a sudden full opening of the pressurizer spray valve. A Mutsu two-loop RETRAN model and simulation results are described. When the results were compared with those of land-based PWRs, the characteristic features of the Mutsu reactor were presented, and the safety of the plant under the operational transient conditions was confirmed

  6. Mediation Analyses in the Real World

    DEFF Research Database (Denmark)

    Lange, Theis; Starkopf, Liis

    2016-01-01

    The paper by Nguyen et al.1 published in this issue of Epidemiology presents a comparison of the recently suggested inverse odds ratio approach for addressing mediation and a more conventional Baron and Kenny-inspired method. Interestingly, the comparison is not done through a discussion of restr… it simultaneously ensures that the comparison is based on properties which matter in actual applications, and makes the comparison accessible for a broader audience. In a wider context, the choice to stay close to real-life problems mirrors a general trend within the literature on mediation analysis, namely to put… applications using the inverse odds ratio approach, as it simply has not had enough time to move from theoretical concept to published applied paper, we do expect to be able to judge the willingness of authors and journals to employ the causal inference-based approach to mediation analyses. Our hope…

  7. A review of experiments and computer analyses on RIAs

    International Nuclear Information System (INIS)

    Jernkvist, L.O.; Massih, A.R.; In de Betou, J.

    2010-01-01

    Reactivity initiated accidents (RIAs) are nuclear reactor accidents that involve an unwanted increase in fission rate and reactor power. Reactivity initiated accidents in power reactors may occur as a result of reactor control system failures, control element ejections or events caused by rapid changes in temperature or pressure of the coolant/moderator. Our current understanding of reactivity initiated accidents and their consequences is based largely on three sources of information: 1) best-estimate computer analyses of the reactor response to postulated accident scenarios, 2) pulse-irradiation tests on instrumented fuel rodlets, carried out in research reactors, 3) out-of-pile separate effect tests, targeted to explore key phenomena under RIA conditions. In recent years, we have reviewed, compiled and analysed these three categories of data. The result is a state-of-the-art report on fuel behaviour under RIA conditions, which is currently being published by the OECD Nuclear Energy Agency. The purpose of this paper is to give a brief summary of this report

  8. Shinguards effective in preventing lower leg injuries in football: Population-based trend analyses over 25 years.

    Science.gov (United States)

    Vriend, Ingrid; Valkenberg, Huib; Schoots, Wim; Goudswaard, Gert Jan; van der Meulen, Wout J; Backx, Frank J G

    2015-09-01

    The majority of football injuries are caused by trauma to the lower extremities. Shinguards are considered an important measure in preventing lower leg impact abrasions, contusions and fractures. Given these benefits, Fédération Internationale de Football Association introduced the shinguard law in 1990, which made wearing shinguards during matches mandatory. This study evaluated the effect of the introduction of the shinguard law for amateur players in the Netherlands in the 1999/2000 football season on the incidence of lower leg injuries. Time trend analyses were performed on injury data covering 25 years of continuous registration (1986-2010). Data were retrieved from a system that records all emergency department treatments in a random, representative sample of Dutch hospitals. All injuries sustained in football by patients aged 6-65 years were included, except for injuries of the Achilles tendon and Weber fractures. Time trends were analysed with multiple regression analyses; a model was fitted consisting of multiple straight lines, each representing a 5-year period. Patients were predominantly males (92%) and treated for fractures (48%) or abrasions/contusions (52%) to the lower leg. The incidence of lower leg football injuries decreased significantly following the introduction of the shinguard law (1996-2000: -20%; 2001-2005: -25%), whereas the incidence of all other football injuries did not. This effect was more prominent at weekends/match days. No gender differences were found. The results show a significant preventive effect of the shinguard law, underlining the relevance of rule changes as a preventive measure and of wearing shinguards during both matches and training sessions. Copyright © 2014 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
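    The trend model described above (one straight line per 5-year period) can be illustrated with stdlib tools. The incidence numbers below are hypothetical, not the Dutch registry data; the point is only the per-segment slope fit.

    ```python
    import statistics

    # Hypothetical annual incidence per 10,000 players, with a drop
    # after the 1999/2000 shinguard law (illustrative numbers only).
    years = list(range(1996, 2006))
    incidence = [9.8, 9.6, 9.5, 9.4, 8.1, 7.8, 7.5, 7.2, 6.9, 6.6]

    def segment_trend(period):
        """Ordinary least-squares slope for one 5-year segment of the series."""
        xs = [y for y in years if period[0] <= y <= period[1]]
        ys = [incidence[years.index(y)] for y in xs]
        return statistics.linear_regression(xs, ys).slope

    pre = segment_trend((1996, 2000))    # slope before the law took full effect
    post = segment_trend((2001, 2005))   # slope after
    print(round(pre, 2), round(post, 2))
    ```

    Comparing segment slopes (and their confidence intervals, omitted here) is what lets the authors attribute the decline to the period around the rule change rather than to a single continuous trend.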

  9. Analysing relations between specific and total liking scores

    DEFF Research Database (Denmark)

    Menichelli, Elena; Kraggerud, Hilde; Olsen, Nina Veflen

    2013-01-01

    The objective of this article is to present a new statistical approach for the study of consumer liking. Total liking data are extended by incorporating liking for specific sensory properties. The approach combines different analyses for the purpose of investigating the most important aspects of liking and indicating which products are similarly or differently perceived by which consumers. A method based on the differences between total liking and the specific liking variables is proposed for studying both relative differences among products and individual consumer differences. Segmentation is also tested out in order to distinguish consumers with the strongest differences in their liking values. The approach is illustrated by a case study based on cheese data, in which consumers were asked to evaluate their total liking, the liking for texture and the liking for odour/taste.

  10. A Derivation of Source-based Kinetics Equation with Time Dependent Fission Kernel for Reactor Transient Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Song Hyun; Woo, Myeong Hyun; Shin, Chang Ho [Hanyang University, Seoul (Korea, Republic of); Pyeon, Cheol Ho [Kyoto University, Osaka (Japan)

    2015-10-15

    In this study, a new balance equation is proposed to overcome the problems generated by the previous methods, using a source-based balance equation, and a simple problem is analyzed with the proposed method. A source-based balance equation with a time-dependent fission kernel was derived to simplify the kinetics equation. To analyze partial variations of reactor characteristics, two representative methods were introduced in previous studies: (1) the quasi-statics method and (2) the multipoint technique. The main idea of the quasi-statics method is to use a low-order approximation for large integration times. To realize the quasi-statics method, the time-dependent flux is first separated into shape and amplitude functions, and the shape function is calculated. The method has good accuracy; however, it can be computationally expensive because the shape function must be fully recalculated to obtain accurate results. To improve the calculation efficiency, the multipoint method was proposed. The multipoint method is based on the classic kinetics equation, using a Green's function to analyze the flight probability from region r' to r. These previous methods have been used for reactor kinetics analysis; however, they have some limitations. First, three group variables (r{sub g}, E{sub g}, t{sub g}) should be considered to solve the time-dependent balance equation, which severely limits the applicability to large-system problems with good accuracy. Second, energy-group neutrons must be used to analyze reactor kinetics problems; in a time-dependent problem, the neutron energy distribution can change over time, which alters the group cross sections and can therefore cause accuracy problems. Third, the neutrons in one space-time region continually affect other space-time regions; however, this is not properly considered in the previous methods. Using birth history of the
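    The amplitude-function half of the quasi-statics factorization mentioned above can be illustrated with the textbook point-kinetics reduction (one delayed-neutron group), which is a standard simplification and not the paper's source-based formulation; all parameter values are illustrative.

    ```python
    # Point-kinetics amplitude equations with one delayed-neutron group,
    # integrated by explicit Euler. Illustrative parameters, not plant data.
    beta, lam, Lambda = 0.0065, 0.08, 1e-4   # delayed fraction, decay const (1/s), generation time (s)
    rho = 0.001                              # step reactivity insertion (below prompt critical)

    A = 1.0                                  # amplitude function, normalised
    C = beta / (lam * Lambda)                # precursor level in equilibrium with A = 1

    dt = 1e-5
    for _ in range(int(1.0 / dt)):           # integrate 1 s of transient
        dA = ((rho - beta) / Lambda) * A + lam * C
        dC = (beta / Lambda) * A - lam * C
        A, C = A + dt * dA, C + dt * dC

    print(round(A, 2))    # prompt jump to ~beta/(beta - rho), then slow rise
    ```

    In a quasi-statics scheme this cheap amplitude integration runs at every time step, while the expensive shape-function recalculation is done only occasionally, which is exactly the cost trade-off the abstract describes.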

  11. PCR and RFLP analyses based on the ribosomal protein operon

    Science.gov (United States)

    Differentiation and classification of phytoplasmas have been primarily based on the highly conserved 16S rRNA gene. RFLP analysis of 16S rRNA gene sequences has identified 31 16S rRNA (16Sr) groups and more than 100 16Sr subgroups. Classification of phytoplasma strains can, however, become more refin...

  12. Detection and analyse of hazardous roads in rural areas

    DEFF Research Database (Denmark)

    Sørensen, Michael

    2003-01-01

    For the last 5-10 years the notion of "grey roads" (hazardous roads) has appeared in Danish traffic safety work, and improvement of these roads has become a very important part of the traffic safety work in many countries. The problem is that the notion has never been clearly defined, and therefore there are no unambiguous methods to point out and analyse "grey roads". In this article, based on a Ph.D. project, a method for detecting "grey roads" is introduced.

  13. Development of generic soil profiles and soil data development for SSI analyses

    Energy Technology Data Exchange (ETDEWEB)

    Parker, Josh, E-mail: jparker@nuscalepower.com [NuScale Power, 1000 NE Circle Boulevard, Suite 10310, Corvallis, OR 97330 (United States); Khan, Mohsin; Rajagopal, Raj [ARES Corporation, 1990N California Boulevard, Suite 500, Walnut Creek, CA 94596 (United States); Groome, John [NuScale Power, 1000 NE Circle Boulevard, Suite 10310, Corvallis, OR 97330 (United States)

    2014-04-01

    This paper presents the approach to developing generic soil profiles for the design of the reactor building for the small modular reactor (SMR) nuclear power plant developed by NuScale Power. The reactor building is a deeply embedded structure. In order to perform soil-structure interaction (SSI) analyses, generic soil profiles are required to be defined for the standardized Nuclear Power Plant (NPP) designs submitted to the United States Nuclear Regulatory Commission (NRC) in a design control document (DCD). The development of generic soil profiles is based on utilization of information on generic soil profiles from the new standardized nuclear power plant designs already submitted to the NRC for license certification. Eleven generic soil profiles have been recommended, and those profiles cover a wide range of parameters such as soil depth, shear wave velocity, unit weight, Poisson's ratio, water table, and depth to rock strata. The soil profiles are developed for a range of shear wave velocities between bounds of 1000 fps and 8000 fps as inferred from NRC Standard Review Plan (NUREG 0800) Sections 3.7.1 and 3.7.2. To account for soil degradation due to seismic events, the strain-compatible soil properties are based on the EPRI generic soil degradation curves. In addition, one-dimensional soil dynamic response analyses were performed to study the soil layer input motions for performing the SSI analyses.

  14. Stuttering, induced fluency, and natural fluency: a hierarchical series of activation likelihood estimation meta-analyses.

    Science.gov (United States)

    Budde, Kristin S; Barron, Daniel S; Fox, Peter T

    2014-12-01

    Developmental stuttering is a speech disorder most likely due to a heritable form of developmental dysmyelination impairing the function of the speech-motor system. Speech-induced brain-activation patterns in persons who stutter (PWS) are anomalous in various ways; the consistency of these aberrant patterns is a matter of ongoing debate. Here, we present a hierarchical series of coordinate-based meta-analyses addressing this issue. Two tiers of meta-analyses were performed on a 17-paper dataset (202 PWS; 167 fluent controls). Four large-scale (top-tier) meta-analyses were performed, two for each subject group (PWS and controls). These analyses robustly confirmed the regional effects previously postulated as "neural signatures of stuttering" (Brown, Ingham, Ingham, Laird, & Fox, 2005) and extended this designation to additional regions. Two smaller-scale (lower-tier) meta-analyses refined the interpretation of the large-scale analyses: (1) a between-group contrast targeting differences between PWS and controls (stuttering trait); and (2) a within-group contrast (PWS only) of stuttering with induced fluency (stuttering state). Copyright © 2014 Elsevier Inc. All rights reserved.

  15. The Use of Statistical Process Control Tools for Analysing Financial Statements

    Directory of Open Access Journals (Sweden)

    Niezgoda Janusz

    2017-06-01

    Full Text Available This article presents the proposed application of one type of the modified Shewhart control charts in the monitoring of changes in the aggregated level of financial ratios. The control chart x̅ has been used as the basis of analysis. The examined variable from the sample in the mentioned chart is the arithmetic mean. The author proposes to substitute it with a synthetic measure that is determined and based on the selected ratios. As the ratios mentioned above are expressed in different units and are of different characters, the author applies standardisation. The results of selected comparative analyses have been presented for both bankrupts and non-bankrupts. They indicate the possibility of using control charts as an auxiliary tool in financial analyses.
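    A minimal sketch of the chart construction as described: standardise each ratio series, average the z-scores into a synthetic measure per period, then place Shewhart-style 3-sigma limits around it. The ratios, the equal-weight aggregation, and the sign convention are illustrative assumptions, not the author's exact scheme.

    ```python
    import statistics

    def standardize(values):
        """Z-score standardisation so ratios in different units are comparable."""
        mu, sd = statistics.fmean(values), statistics.stdev(values)
        return [(v - mu) / sd for v in values]

    def synthetic_measure(ratio_series):
        """One aggregated value per period: the mean of the standardised ratios
        (a simple equal-weight choice; other weightings are possible)."""
        standardized = [standardize(series) for series in ratio_series]
        return [statistics.fmean(col) for col in zip(*standardized)]

    def control_limits(measure, sigmas=3):
        """Shewhart-style centre line and control limits for the measure."""
        centre, sd = statistics.fmean(measure), statistics.stdev(measure)
        return centre - sigmas * sd, centre, centre + sigmas * sd

    # Hypothetical quarterly ratios for one firm (illustrative numbers)
    ratios = [
        [1.8, 1.7, 1.9, 1.6, 1.2, 0.9],         # current ratio
        [0.08, 0.07, 0.09, 0.06, 0.02, -0.01],  # return on assets
        [0.45, 0.47, 0.44, 0.50, 0.62, 0.71],   # debt ratio
    ]
    ratios[2] = [-v for v in ratios[2]]   # higher debt is worse: flip the sign

    m = synthetic_measure(ratios)
    lcl, centre, ucl = control_limits(m)
    print([round(x, 2) for x in m])       # declining measure as the ratios worsen
    ```

    A point falling below the lower control limit would flag a period in which the firm's aggregated financial condition deteriorated beyond common-cause variation.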

  16. Analyse of Maintenance Cost in ST

    CERN Document Server

    Jenssen, B W

    2001-01-01

    An analysis has been carried out in ST concerning the total costs for the division. Even though the target was the maintenance costs in ST, the overall global budget has been analysed, since there is a close relation between investment & consolidation and the required level of maintenance. The purpose of the analysis was to focus on maintenance cost in ST as a ratio of total maintenance costs over the replacement value of the equipment, and to make some comparisons with other industries and laboratories. Families of equipment have been defined and their corresponding ratios calculated. This first approach gives us some "quantitative" measurements. This analysis should be combined with performance indicators (more "qualitative" measurements) that tell us how well we are performing. This will help us defend our budget and set better priorities, and we will satisfy the requirements of our external auditors.

  17. A Simple, Reliable Precision Time Analyser

    Energy Technology Data Exchange (ETDEWEB)

    Joshi, B. V.; Nargundkar, V. R.; Subbarao, K.; Kamath, M. S.; Eligar, S. K. [Atomic Energy Establishment Trombay, Bombay (India)

    1966-06-15

    A 30-channel time analyser is described. The time analyser was designed and built for pulsed neutron research but can be applied to other uses. Most of the logic is performed by means of ferrite memory core and transistor switching circuits. This leads to great versatility, low power consumption, extreme reliability and low cost. The analyser described provides channel widths from 10 {mu}s to 10 ms; arbitrarily wider channels are easily obtainable. It can handle counting rates up to 2000 counts/min in each channel with less than 1% dead time loss. There is a provision for an initial delay equal to 100 channel widths. An input pulse de-randomizer unit using tunnel diodes ensures exactly equal channel widths. A brief description of the principles involved in core switching circuitry is given. The core-transistor transfer loop is compared with the usual core-diode loops and is shown to be more versatile and better adapted to the making of a time analyser. The circuits derived from the basic loop are described. These include the scale of ten, the frequency dividers and the delay generator. The current drivers developed for driving the cores are described. The crystal-controlled clock which controls the width of the time channels and synchronizes the operation of the various circuits is described. The detector pulse de-randomizer unit using tunnel diodes is described. The scheme of the time analyser is then described, showing how the various circuits can be integrated together to form a versatile time analyser. (author)

  18. Random error in cardiovascular meta-analyses

    DEFF Research Database (Denmark)

    Albalawi, Zaina; McAlister, Finlay A; Thorlund, Kristian

    2013-01-01

    BACKGROUND: Cochrane reviews are viewed as the gold standard in meta-analyses given their efforts to identify and limit systematic error which could cause spurious conclusions. The potential for random error to cause spurious conclusions in meta-analyses is less well appreciated. METHODS: We exam...

  19. Diversity of primary care systems analysed.

    NARCIS (Netherlands)

    Kringos, D.; Boerma, W.; Bourgueil, Y.; Cartier, T.; Dedeu, T.; Hasvold, T.; Hutchinson, A.; Lember, M.; Oleszczyk, M.; Pavlick, D.R.

    2015-01-01

    This chapter analyses differences between countries and explains why countries differ regarding the structure and process of primary care. The components of primary care strength that are used in the analyses are health policy-making, workforce development and in the care process itself (see Fig.

  20. Sediment Characteristics of Mergui Basin, Andaman Sea based on Multi-proxy Analyses

    Directory of Open Access Journals (Sweden)

    Rina Zuraida

    2018-02-01

    Full Text Available This paper presents the characteristics of sediment from core BS-36 (6°55.85’ S and 96°7.48’ E, 1147.1 m water depth that was acquired in the Mergui Basin, Andaman Sea. The analyses involved megascopic description, core scanning by multi-sensor core logger, and carbonate content measurement. The purpose of this study is to determine the physical and chemical characteristics of the sediment to infer the depositional environment. The results show that this core can be divided into 5 lithologic units that represent various environmental conditions. The bottom part, Units V and IV, was inferred to have been deposited under suboxic to anoxic bottom conditions combined with high productivity and low precipitation. Unit III was deposited during high precipitation under oxic conditions due to ocean ventilation. The upper part, Units II and I, formed during higher precipitation, higher carbonate production and suboxic to anoxic conditions. Keywords: sediment characteristics, Mergui Basin, Andaman Sea, suboxic, anoxic, oxic, carbonate content

  1. Atmospheric Mining in the Outer Solar System: Outer Planet Orbital Transfer and Lander Analyses

    Science.gov (United States)

    Palaszewski, Bryan

    2016-01-01

    Atmospheric mining in the outer solar system has been investigated as a means of fuel production for high energy propulsion and power. Fusion fuels such as Helium 3 (3He) and deuterium can be wrested from the atmospheres of Uranus and Neptune and either returned to Earth or used in-situ for energy production. Helium 3 and deuterium were the primary gases of interest, with hydrogen being the primary propellant for nuclear thermal solid core and gas core rocket-based atmospheric flight. A series of analyses were undertaken to investigate resource capturing aspects of atmospheric mining in the outer solar system. This included the gas capturing rate, storage options, and different methods of direct use of the captured gases. While capturing 3He, large amounts of hydrogen and 4He are produced. Analyses of orbital transfer vehicles (OTVs), landers, and the issues with in-situ resource utilization (ISRU) mining factories are included. Preliminary observations are presented on near-optimal selections of moon base orbital locations, OTV power levels, and OTV and lander rendezvous points. For analyses of round trip OTV flights from Uranus to Miranda or Titania, a 10-megawatt-electric (MWe) OTV power level and a 200-metric-ton (MT) lander payload were selected based on a relatively short OTV trip time and minimization of the number of lander flights. A similar optimum power level is suggested for OTVs flying from low orbit around Neptune to Thalassa or Triton. Several moon base sites at Uranus and Neptune and the OTV requirements to support them are also addressed.

  2. Simulation of the Impact of New Aircraft-and Satellite-based Ocean Surface Wind Measurements on Wind Analyses and Numerical Forecasts

    Science.gov (United States)

    Miller, Timothy; Atlas, Robert; Black, Peter; Chen, Shuyi; Jones, Linwood; Ruf, Chris; Uhlhorn, Eric; Gamache, John; Amarin, Ruba; El-Nimri, Salem

    2010-01-01

    The Hurricane Imaging Radiometer (HIRAD) is a new airborne microwave remote sensor for hurricane observations that is currently under development by NASA Marshall Space Flight Center, NOAA Hurricane Research Division, the University of Central Florida and the University of Michigan. HIRAD is being designed to enhance the real-time airborne ocean surface wind observation capabilities of NOAA and USAF Weather Squadron hurricane hunter aircraft currently using the operational airborne Stepped Frequency Microwave Radiometer (SFMR). Unlike SFMR, which measures wind speed and rain rate along the ground track directly beneath the aircraft, HIRAD will provide images of the surface wind and rain field over a wide swath (approximately 3× the aircraft altitude). The present paper describes a set of Observing System Simulation Experiments (OSSEs) in which measurements from the new instrument as well as those from existing instruments (air, surface, and space-based) are simulated from the output of a detailed numerical model, and those results are used to construct H*Wind analyses, a product of the Hurricane Research Division of NOAA's Atlantic Oceanographic and Meteorological Laboratory. Evaluations will be presented on the impact of the HIRAD instrument on H*Wind analyses, both in terms of adding it to the full suite of current measurements, as well as using it to replace instrument(s) that may not be functioning at the future time the HIRAD instrument is implemented. Also shown will be preliminary results of numerical weather prediction OSSEs in which the impact of the addition of HIRAD observations to the initial state on numerical forecasts of the hurricane intensity and structure is assessed.

  3. Approaches for cytogenetic and molecular analyses of small flow-sorted cell populations from childhood leukemia bone marrow samples

    DEFF Research Database (Denmark)

    Obro, Nina Friesgaard; Madsen, Hans O.; Ryder, Lars Peter

    2011-01-01

    defined cell populations with subsequent analyses of leukemia-associated cytogenetic and molecular markers. The approaches described here optimize the use of the same tube of unfixed, antibody-stained BM cells for flow-sorting of small cell populations and subsequent exploratory FISH and PCR-based analyses....

  4. Hedysarum L. (Fabaceae: Hedysareae) Is Not Monophyletic - Evidence from Phylogenetic Analyses Based on Five Nuclear and Five Plastid Sequences.

    Science.gov (United States)

    Liu, Pei-Liang; Wen, Jun; Duan, Lei; Arslan, Emine; Ertuğrul, Kuddisi; Chang, Zhao-Yang

    2017-01-01

    The legume family (Fabaceae) exhibits a high level of species diversity and evolutionary success worldwide. Previous phylogenetic studies of the genus Hedysarum L. (Fabaceae: Hedysareae) showed that the nuclear and the plastid topologies might be incongruent, and the systematic position of the Hedysarum sect. Stracheya clade was uncertain. In this study, phylogenetic relationships of Hedysarum were investigated based on the nuclear ITS, ETS, PGDH, SQD1, TRPT and the plastid psbA-trnH, trnC-petN, trnL-trnF, trnS-trnG, petN-psbM sequences. Both nuclear and plastid data support two major lineages in Hedysarum: the Hedysarum s.s. clade and the Sartoria clade. In the nuclear tree, Hedysarum is biphyletic with the Hedysarum s.s. clade sister to the Corethrodendron + Eversmannia + Greuteria + Onobrychis clade (the CEGO clade), whereas the Sartoria clade is sister to the genus Taverniera DC. In the plastid tree, Hedysarum is monophyletic and sister to Taverniera. The incongruent position of the Hedysarum s.s. clade between the nuclear and plastid trees may be best explained by a chloroplast capture hypothesis via introgression. The Hedysarum sect. Stracheya clade is resolved as sister to the H. sect. Hedysarum clade in both nuclear and plastid trees, and our analyses support merging Stracheya into Hedysarum. Based on our new evidence from multiple sequences, Hedysarum is not monophyletic, and its generic delimitation needs to be reconsidered.

  5. UAV-based detection and spatial analyses of periglacial landforms on Demay Point (King George Island, South Shetland Islands, Antarctica)

    Science.gov (United States)

    Dąbski, Maciej; Zmarz, Anna; Pabjanek, Piotr; Korczak-Abshire, Małgorzata; Karsznia, Izabela; Chwedorzewska, Katarzyna J.

    2017-08-01

    High-resolution aerial images allow detailed analyses of periglacial landforms, which is of particular importance in light of climate change and resulting changes in active layer thickness. The aim of this study is to show the possibilities of using UAV-based photography to perform spatial analysis of periglacial landforms on the Demay Point peninsula, King George Island, and hence to supplement previous geomorphological studies of the South Shetland Islands. Photogrammetric flights were performed using a PW-ZOOM fixed-wing unmanned aerial vehicle. Digital elevation models (DEM) and maps of slope and contour lines were prepared in ESRI ArcGIS 10.3 with the Spatial Analyst extension, and three-dimensional visualizations in ESRI ArcScene 10.3 software. Careful interpretation of the orthophoto and DEM allowed us to vectorize polygons of landforms, such as (i) solifluction landforms (solifluction sheets, tongues, and lobes); (ii) scarps, taluses, and a protalus rampart; (iii) patterned ground (hummocks, sorted circles, stripes, nets and labyrinths, and nonsorted nets and stripes); (iv) coastal landforms (cliffs and beaches); (v) landslides and mud flows; and (vi) stone fields and bedrock outcrops. We conclude that geomorphological studies based on commonly accessible aerial and satellite images can underestimate the spatial extent of periglacial landforms and result in incomplete inventories. The PW-ZOOM UAV is well suited to gather detailed geomorphological data and can be used in spatial analysis of periglacial landforms in the Western Antarctic Peninsula region.

  6. Methodology for processing pressure traces used as inputs for combustion analyses in diesel engines

    International Nuclear Information System (INIS)

    Rašić, Davor; Vihar, Rok; Baškovič, Urban Žvar; Katrašnik, Tomaž

    2017-01-01

    This study proposes a novel methodology for designing an optimum equiripple finite impulse response (FIR) filter for processing in-cylinder pressure traces of a diesel internal combustion engine, which serve as inputs for high-precision combustion analyses. The proposed automated workflow is based on an innovative approach to determining the transition band frequencies and optimum filter order. The methodology is based on discrete Fourier transform analysis, which is the first step to estimate the location of the pass-band and stop-band frequencies. The second step uses short-time Fourier transform analysis to refine the aforementioned frequencies. These pass-band and stop-band frequencies are further used to determine the most appropriate FIR filter order. The most widely used existing methods for estimating the FIR filter order are not effective in suppressing the oscillations in the rate-of-heat-release (ROHR) trace, thus hindering the accuracy of combustion analyses. To address this problem, an innovative method for determining the order of an FIR filter is proposed in this study. This method is based on the minimization of the integral of normalized signal-to-noise differences between the stop-band frequency and the Nyquist frequency. The developed filters were validated using spectral analysis and calculation of the ROHR. The validation results showed that the filters designed using the proposed method were superior to those designed using the existing methods for all analyzed cases. Highlights: • Pressure traces of a diesel engine were processed by finite impulse response (FIR) filters of different orders • Transition band frequencies were determined with an innovative method based on discrete Fourier transform and short-time Fourier transform • Spectral analyses showed deficiencies of existing methods in determining the FIR filter order • A new method of determining the FIR filter order for processing pressure traces was
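
    The overall workflow — inspect the spectrum, place a cutoff between the useful combustion content and the higher-frequency oscillation, then apply a linear-phase FIR filter — can be illustrated with a minimal windowed-sinc low-pass sketch. The paper's equiripple design and its order-selection criterion are not reproduced here, and all signal parameters below are invented for illustration.

```python
import numpy as np

def lowpass_fir(cutoff_norm, numtaps):
    """Windowed-sinc low-pass FIR; cutoff_norm in cycles/sample (0..0.5)."""
    n = np.arange(numtaps) - (numtaps - 1) / 2.0
    h = 2.0 * cutoff_norm * np.sinc(2.0 * cutoff_norm * n)  # ideal impulse response
    h *= np.hamming(numtaps)                                # taper to reduce ripple
    return h / h.sum()                                      # unity gain at DC

# Invented "pressure trace": slow combustion content plus a fast oscillation.
fs = 1000.0                                         # Hz, assumed sampling rate
t = np.arange(0.0, 1.0, 1.0 / fs)
signal = np.sin(2 * np.pi * 5 * t)                  # 5 Hz useful content
noisy = signal + 0.5 * np.sin(2 * np.pi * 200 * t)  # 200 Hz oscillation

# A DFT of `noisy` shows peaks near 5 Hz and 200 Hz; placing the cutoff
# between them (here 50 Hz) suppresses the oscillation.
h = lowpass_fir(50.0 / fs, numtaps=101)
filtered = np.convolve(noisy, h, mode="same")  # symmetric taps: no net delay
```

    Because the taps are symmetric, the filter is linear-phase, so the filtered trace stays time-aligned with the crank-angle axis — a prerequisite for a meaningful ROHR calculation.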

  7. Novel citation-based search method for scientific literature: application to meta-analyses

    NARCIS (Netherlands)

    Janssens, A.C.J.W.; Gwinn, M.

    2015-01-01

    Background: Finding eligible studies for meta-analysis and systematic reviews relies on keyword-based searching as the gold standard, despite its inefficiency. Searching based on direct citations is not sufficiently comprehensive. We propose a novel strategy that ranks articles on their degree of

  8. Probabilistic and Nonprobabilistic Sensitivity Analyses of Uncertain Parameters

    Directory of Open Access Journals (Sweden)

    Sheng-En Fang

    2014-01-01

    Full Text Available Parameter sensitivity analyses have been widely applied to industrial problems for evaluating parameter significance, effects on responses, uncertainty influence, and so forth. In the interest of simple implementation and computational efficiency, this study has developed two sensitivity analysis methods corresponding to situations with or without sufficient probability information. The probabilistic method is established with the aid of a stochastic response surface, and a mathematical derivation proves that the coefficients of the first-order terms embody the parameter main effects on the response. Simultaneously, a nonprobabilistic interval-analysis-based method is proposed for circumstances in which the parameter probability distributions are unknown. The two methods have been verified against a numerical beam example, with their accuracy compared to that of a traditional variance-based method. The analysis results demonstrate the reliability and accuracy of the developed methods; their suitability for different situations is also discussed.
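
    The core idea — that the first-order coefficients of a fitted response surface act as parameter main effects — can be sketched with ordinary least squares on a hypothetical two-parameter model (not the paper's stochastic response surface or its beam example):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model: the response depends strongly on x1 and weakly on x2
# (the coefficients 3.0 and 0.2 are invented for illustration).
def response(x1, x2):
    return 3.0 * x1 + 0.2 * x2

n = 200
X = rng.uniform(-1.0, 1.0, size=(n, 2))   # sampled parameter values
y = response(X[:, 0], X[:, 1])

# Fit a first-order response surface y ~ b0 + b1*x1 + b2*x2 by least squares.
A = np.column_stack([np.ones(n), X])
(b0, b1, b2), *_ = np.linalg.lstsq(A, y, rcond=None)
# |b1| >> |b2| flags x1 as the parameter the response is most sensitive to.
```

    Ranking parameters by the magnitude of these first-order coefficients gives the main-effect ordering directly, without the repeated model evaluations a variance-based method requires.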

  9. Analysing trade-offs between milk, feed and manure production on Dutch dairy farms

    NARCIS (Netherlands)

    Samson, Sabrina; Gardebroek, C.; Jongeneel, R.A.

    2017-01-01

    The abolition of milk quota fuels environmental concerns in the Netherlands. A microeconomic model is developed to analyse the technical relations between milk, roughage and manure production. Production functions for milk, feed and roughage are estimated based on milk quota and manure constraints.

  10. Improved phylogenetic analyses corroborate a plausible position of Martialis heureka in the ant tree of life.

    Directory of Open Access Journals (Sweden)

    Patrick Kück

    Full Text Available Martialinae are pale, eyeless and probably hypogaeic predatory ants. Morphological character sets suggest a close relationship to the ant subfamily Leptanillinae. Recent analyses based on molecular sequence data suggest that Martialinae are the sister group to all extant ants. However, by comparing molecular studies and different reconstruction methods, the position of Martialinae remains ambiguous. While this sister group relationship was well supported by Bayesian partitioned analyses, Maximum Likelihood approaches could not unequivocally resolve the position of Martialinae. By re-analysing a previously published molecular data set, we show that the Maximum Likelihood approach is highly appropriate to resolve deep ant relationships, especially between Leptanillinae, Martialinae and the remaining ant subfamilies. Based on improved alignments, alignment masking, and tree reconstructions with a sufficient number of bootstrap replicates, our results strongly reject a placement of Martialinae at the first split within the ant tree of life. Instead, we suggest that Leptanillinae are a sister group to all other extant ant subfamilies, whereas Martialinae branch off as a second lineage. This assumption is backed by approximately unbiased (AU) tests, additional Bayesian analyses and split networks. Our results demonstrate clear effects of improved alignment approaches, alignment masking and data partitioning. We hope that our study illustrates the importance of thorough, comprehensible phylogenetic analyses using the example of ant relationships.

  11. SeeSway - A free web-based system for analysing and exploring standing balance data.

    Science.gov (United States)

    Clark, Ross A; Pua, Yong-Hao

    2018-06-01

    Computerised posturography can be used to assess standing balance, and can predict poor functional outcomes in many clinical populations. A key limitation is the disparate signal filtering and analysis techniques, with many methods requiring custom computer programs. This paper discusses the creation of a freely available web-based software program, SeeSway (www.rehabtools.org/seesway), which was designed to provide powerful tools for pre-processing, analysing and visualising standing balance data in an easy-to-use, platform-independent website. SeeSway links an interactive web platform with file upload capability to software systems including LabVIEW, Matlab, Python and R to perform the data filtering, analysis and visualisation of standing balance data. Input data can consist of any signal that comprises an anterior-posterior and medial-lateral coordinate trace such as center of pressure or mass displacement. This allows it to be used with systems including criterion-reference commercial force platforms and three-dimensional motion analysis, smartphones, accelerometers and low-cost technology such as the Nintendo Wii Balance Board and Microsoft Kinect. Filtering options include Butterworth, weighted and unweighted moving average, and discrete wavelet transforms. Analysis methods include standard techniques such as path length, amplitude, and root mean square in addition to less common but potentially promising methods such as sample entropy, detrended fluctuation analysis and multiresolution wavelet analysis. These data are visualised using scalograms, which chart the change in frequency content over time, scatterplots and standard line charts. This provides the user with a detailed understanding of their results, and how their different pre-processing and analysis method selections affect their findings. An example of the data analysis techniques is provided in the paper, with graphical representation of how advanced analysis methods can better discriminate
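
    Two of the standard measures listed above, path length and root mean square, are straightforward to compute from a centre-of-pressure trace. A minimal sketch with hypothetical coordinates (not SeeSway's implementation):

```python
import math

def path_length(ap, ml):
    """Total distance travelled by the centre-of-pressure trace
    (same units as the input coordinates)."""
    return sum(math.hypot(ap[i + 1] - ap[i], ml[i + 1] - ml[i])
               for i in range(len(ap) - 1))

def rms(xs):
    """Root-mean-square sway amplitude about the mean position."""
    mean = sum(xs) / len(xs)
    return math.sqrt(sum((x - mean) ** 2 for x in xs) / len(xs))

# Hypothetical 4-sample trace (cm): anterior-posterior and medial-lateral.
ap = [0.0, 0.3, 0.1, -0.2]
ml = [0.0, 0.0, 0.4, 0.1]
```

    In practice the trace would first be filtered (e.g. with one of the Butterworth or moving-average options the paper describes) before these measures are computed.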

  12. Kidney function changes with aging in adults: comparison between cross-sectional and longitudinal data analyses in renal function assessment.

    Science.gov (United States)

    Chung, Sang M; Lee, David J; Hand, Austin; Young, Philip; Vaidyanathan, Jayabharathi; Sahajwalla, Chandrahas

    2015-12-01

    The study evaluated whether the renal function decline rate per year with age in adults varies based on two primary statistical analyses: cross-section (CS), using one observation per subject, and longitudinal (LT), using multiple observations per subject over time. A total of 16628 records (3946 subjects; age range 30-92 years) of creatinine clearance and relevant demographic data were used. On average, four samples per subject were collected for up to 2364 days (mean: 793 days). A simple linear regression and random coefficient models were selected for CS and LT analyses, respectively. The renal function decline rates per year were 1.33 and 0.95 ml/min/year for CS and LT analyses, respectively, and were slower when the repeated individual measurements were considered. The study confirms that rates are different based on statistical analyses, and that a statistically robust longitudinal model with a proper sampling design provides reliable individual as well as population estimates of the renal function decline rates per year with age in adults. In conclusion, our findings indicated that one should be cautious in interpreting the renal function decline rate with aging information because its estimation was highly dependent on the statistical analyses. From our analyses, a population longitudinal analysis (e.g. random coefficient model) is recommended if individualization is critical, such as a dose adjustment based on renal function during a chronic therapy. Copyright © 2015 John Wiley & Sons, Ltd.
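
    The divergence between the two estimates can be illustrated with a synthetic cohort (invented numbers, not the study's data): a pooled cross-sectional fit confounds between-subject cohort differences with within-subject decline, while averaging per-subject slopes (used here as a crude stand-in for a random coefficient model) recovers the true within-subject rate.

```python
import numpy as np

# Synthetic cohort: every subject truly declines by 1.0 mL/min per year,
# but subjects enrolled at older ages also start from lower baselines
# (an invented cohort effect of -1.5 per year of age).
enrol_ages = np.arange(30.0, 80.0, 5.0)     # one subject per enrolment age
follow_up = np.array([0.0, 1.0, 2.0, 3.0])  # years of repeated sampling

ages, values, subject_slopes = [], [], []
for a0 in enrol_ages:
    baseline = 120.0 - 1.5 * a0             # between-subject cohort effect
    a = a0 + follow_up
    v = baseline - 1.0 * follow_up          # true within-subject decline
    ages.extend(a)
    values.extend(v)
    subject_slopes.append(np.polyfit(a, v, 1)[0])  # per-subject fit (LT)

cs_slope = np.polyfit(ages, values, 1)[0]   # one pooled cross-sectional fit
lt_slope = float(np.mean(subject_slopes))   # longitudinal-style estimate
```

    Here `lt_slope` recovers the true -1.0 mL/min/year, while `cs_slope` is pulled toward the -1.5 cohort gradient, mirroring the direction of the CS/LT gap the study reports.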

  13. Detailed semantic analyses of human error incidents occurring at domestic nuclear power plants to fiscal year 2000

    International Nuclear Information System (INIS)

    Tsuge, Tadashi; Hirotsu, Yuko; Takano, Kenichi; Ebisu, Mitsuhiro; Tsumura, Joji

    2003-01-01

    Analysing and evaluating observed cases of human error incidents, with emphasis on the human factors and behaviour involved, is essential for preventing their recurrence. CRIEPI has been conducting detailed and structured analyses, based on J-HPES, of all incidents reported during the last 35 years, from the beginning of operation of the first Tokai nuclear power plant to fiscal year 2000, in which a total of 212 human error cases were identified. The results of these analyses have been stored in the J-HPES database. This report summarizes the semantic analyses of all case studies stored in that database, carried out to grasp the practical, concrete content and trends of the more frequently observed human errors (termed trigger actions here), causal factors and preventive measures. The semantic analyses were executed by classifying all items into categories of essentially the same meaning using the KJ method. Typical results are as follows: (1) Trigger actions could be classified into operation or maintenance categories. 'Operational timing errors' and 'operational quantitative errors' were the major trigger actions in operation, together accounting for about 20% of all actions; among maintenance trigger actions, 'maintenance quantitative errors' were the most frequent, accounting for a quarter of all actions. (2) Causal factors: 'human internal status' factors were the most common, in particular 'improper persistence' and 'lack of knowledge'. (3) Preventive measures: the most frequent measures were job-management changes, i.e. procedural (software) improvements, which accounted for 70% to 80%. For operation, software improvements were implemented on 'organization and work practices' and 'individual consciousness'; for maintenance, improvements were implemented on 'organization and work practices'. (author)

  14. Possible future HERA analyses

    International Nuclear Information System (INIS)

    Geiser, Achim

    2015-12-01

    A variety of possible future analyses of HERA data in the context of the HERA data preservation programme is collected, motivated, and commented. The focus is placed on possible future analyses of the existing ep collider data and their physics scope. Comparisons to the original scope of the HERA programme are made, and cross references to topics also covered by other participants of the workshop are given. This includes topics on QCD, proton structure, diffraction, jets, hadronic final states, heavy flavours, electroweak physics, and the application of related theory and phenomenology topics like NNLO QCD calculations, low-x related models, nonperturbative QCD aspects, and electroweak radiative corrections. Synergies with other collider programmes are also addressed. In summary, the range of physics topics which can still be uniquely covered using the existing data is very broad and of considerable physics interest, often matching the interest of results from colliders currently in operation. Due to well-established data and MC sets, calibrations, and analysis procedures, the manpower and expertise needed for a particular analysis is often very much smaller than that needed for an ongoing experiment. Since centrally funded manpower to carry out such analyses is no longer available, this contribution not only targets experienced self-funded experimentalists, but also theorists and master-level students who might wish to carry out such an analysis.

  15. Cooper's Taxonomy of Literature Reviews Applied to Meta-Analyses in Educational Achievement.

    Science.gov (United States)

    Sipe, Theresa Ann; Stallings, William M.

    H. M. Cooper (1988) has developed a taxonomy that classified literature reviews based on six characteristics: (1) focus of attention; (2) goal of the synthesis; (3) perspective on the literature; (4) coverage of the literature; (5) organization of the perspective; and (6) intended audience. One hundred and three meta-analyses identified from the…

  16. Analysing Culture and Interculture in Saudi EFL Textbooks: A Corpus Linguistic Approach

    Science.gov (United States)

    Almujaiwel, Sultan

    2018-01-01

    This paper combines corpus processing tools to investigate the cultural elements of Saudi education of English as a foreign language (EFL). The latest Saudi EFL textbooks (2016 onwards) are available in researchable PDF formats. This helps process them through corpus search software tools. The method adopted is based on analysing 20 cultural…

  17. Detection of T790M, the acquired resistance EGFR mutation, by tumor biopsy versus noninvasive blood-based analyses

    Science.gov (United States)

    Sundaresan, Tilak K.; Sequist, Lecia V.; Heymach, John V.; Riely, Gregory J.; Jänne, Pasi A.; Koch, Walter H.; Sullivan, James P.; Fox, Douglas B.; Maher, Robert; Muzikansky, Alona; Webb, Andrew; Tran, Hai T.; Giri, Uma; Fleisher, Martin; Yu, Helena A.; Wei, Wen; Johnson, Bruce E.; Barber, Thomas A.; Walsh, John R.; Engelman, Jeffrey A.; Stott, Shannon L.; Kapur, Ravi; Maheswaran, Shyamala; Toner, Mehmet

    2015-01-01

    Purpose: The T790M gatekeeper mutation in the Epidermal Growth Factor Receptor (EGFR) is acquired by some EGFR-mutant non-small cell lung cancers (NSCLC) as they become resistant to selective tyrosine kinase inhibitors (TKIs). As third generation EGFR TKIs that overcome T790M-associated resistance become available, noninvasive approaches to T790M detection will become critical to guide management. Experimental Design: As part of a multi-institutional Stand-Up-To-Cancer collaboration, we performed an exploratory analysis of 40 patients with EGFR-mutant tumors progressing on EGFR TKI therapy. We compared the T790M genotype from tumor biopsies with analysis of simultaneously collected circulating tumor cells (CTC) and circulating tumor DNA (ctDNA). Results: T790M genotypes were successfully obtained in 30 (75%) tumor biopsies, 28 (70%) CTC samples and 32 (80%) ctDNA samples. The resistance-associated mutation was detected in 47–50% of patients using each of the genotyping assays, with concordance among them ranging from 57–74%. While CTC- and ctDNA-based genotyping were each unsuccessful in 20–30% of cases, the two assays together enabled genotyping in all patients with an available blood sample, and they identified the T790M mutation in 14 (35%) patients in whom the concurrent biopsy was negative or indeterminate. Conclusion: Discordant genotypes between tumor biopsy and blood-based analyses may result from technological differences, as well as sampling different tumor cell populations. The use of complementary approaches may provide the most complete assessment of each patient’s cancer, which should be validated in predicting response to T790M-targeted inhibitors. PMID:26446944

  18. The Network of Counterparty Risk: Analysing Correlations in OTC Derivatives.

    Science.gov (United States)

    Nanumyan, Vahan; Garas, Antonios; Schweitzer, Frank

    2015-01-01

    Counterparty risk denotes the risk that a party defaults in a bilateral contract. This risk not only depends on the two parties involved, but also on the risk from various other contracts each of these parties holds. In rather informal markets, such as the OTC (over-the-counter) derivative market, institutions only report their aggregated quarterly risk exposure, but no details about their counterparties. Hence, little is known about the diversification of counterparty risk. In this paper, we reconstruct the weighted and time-dependent network of counterparty risk in the OTC derivatives market of the United States between 1998 and 2012. To proxy unknown bilateral exposures, we first study the co-occurrence patterns of institutions based on their quarterly activity and ranking in the official report. The network obtained this way is further analysed by a weighted k-core decomposition, to reveal a core-periphery structure. This allows us to compare the activity-based ranking with a topology-based ranking, to identify the most important institutions and their mutual dependencies. We also analyse correlations in these activities, to show strong similarities in the behavior of the core institutions. Our analysis clearly demonstrates the clustering of counterparty risk in a small set of about a dozen US banks. This not only increases the default risk of the central institutions, but also the default risk of peripheral institutions which have contracts with the central ones. Hence, all institutions indirectly have to bear (part of) the counterparty risk of all others, which needs to be better reflected in the price of OTC derivatives.
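
    The k-core decomposition used to expose the core-periphery structure can be sketched for an unweighted graph (the paper uses a weighted variant) by repeatedly peeling off the lowest-degree node; the toy market below, with a densely connected "core" and attached peripheral counterparties, is invented for illustration.

```python
def core_numbers(adj):
    """Core number of each node via min-degree peeling.
    adj: dict node -> set of neighbours (undirected, no self-loops).
    A node's core number is the largest k such that it survives in the
    k-core, the maximal subgraph where every node has degree >= k."""
    adj = {u: set(vs) for u, vs in adj.items()}  # work on a copy
    core, k = {}, 0
    while adj:
        u = min(adj, key=lambda n: len(adj[n]))  # lowest remaining degree
        k = max(k, len(adj[u]))                  # peeling level so far
        core[u] = k
        for v in adj.pop(u):                     # remove u from the graph
            adj[v].discard(u)
    return core

# Toy market: a 4-clique "core" (a, b, c, d) with one peripheral node e.
graph = {
    "a": {"b", "c", "d"}, "b": {"a", "c", "d"},
    "c": {"a", "b", "d"}, "d": {"a", "b", "c", "e"},
    "e": {"d"},
}
cores = core_numbers(graph)  # clique members get core number 3, e gets 1
```

    Ranking institutions by core number then gives the topology-based ranking the paper compares against the activity-based one.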

  19. Long-term uranium supply-demand analyses

    International Nuclear Information System (INIS)

    1986-12-01

    It is the intention of this study to investigate the long-term uranium supply-demand situation using a number of supply- and demand-related assumptions. For supply, these assumptions, as used in the Resources and Production Projection (RAPP) model, include country economic development status and consequent lead times for exploration and development, uranium development status, country infrastructure, and uranium resources including the Reasonably Assured (RAR), Estimated Additional Categories I and II (EAR-I and EAR-II) and Speculative Resource categories. The demand assumptions were based on the 'pure' reactor strategies developed by the NEA Working Party on Nuclear Fuel Cycle Requirements for the 1986 OECD (NEA)/IAEA report 'Nuclear Energy and its Fuel Cycle: Prospects to 2025'. In addition, for this study a mixed-strategy case was computed using the averages of the plutonium (Pu) burning LWR high and the improved LWR low cases. It is understandable that such a long-term analysis cannot present hard facts, but it can show which variables may in fact influence the long-term supply-demand situation. It is hoped that the results of this study will provide valuable information for planners in the uranium supply and demand fields. Periodic re-analyses with updated databases will be needed from time to time

  20. LOCA, LOFA and LOVA analyses pertaining to NET/ITER safety design guidance

    International Nuclear Information System (INIS)

    Ebert, E.; Raeder, J.

    1991-01-01

    The analyses presented pertain to loss of coolant accidents (LOCA), loss of coolant flow accidents (LOFA) and loss of vacuum accidents (LOVA). These types of accidents may jeopardise components and plasma vessel integrity and cause radioactivity mobilisation. The analyses reviewed have been performed under the assumption that the plasma facing components are protected by a carbon based armour. Accidental temperatures and pressure transients are quantified, the possibility of reaction products combustion is investigated and worst case accidental public doses are assessed. On this basis, design recommendations are given and design features such as low plasma facing components armour temperatures (on almost the entire surface) and inert gas adjacent to the vacuum vessel have been implemented. (orig.)

  1. Accelerated safety analyses - structural analyses Phase I - structural sensitivity evaluation of single- and double-shell waste storage tanks

    International Nuclear Information System (INIS)

    Becker, D.L.

    1994-11-01

    Accelerated Safety Analyses - Phase I (ASA-Phase I) have been conducted to assess the appropriateness of existing tank farm operational controls and/or limits as now stipulated in the Operational Safety Requirements (OSRs) and Operating Specification Documents, and to establish a technical basis for the waste tank operating safety envelope. Structural sensitivity analyses were performed to assess the response of the different waste tank configurations to variations in loading conditions, uncertainties in loading parameters, and uncertainties in material characteristics. Extensive documentation of the sensitivity analyses conducted and results obtained is provided in the detailed ASA-Phase I report, Structural Sensitivity Evaluation of Single- and Double-Shell Waste Tanks for Accelerated Safety Analysis - Phase I. This document provides a summary of the accelerated safety analyses sensitivity evaluations and the resulting findings.

  2. Multicentre evaluation of the new ORTHO VISION® analyser.

    Science.gov (United States)

    Lazarova, E; Scott, Y; van den Bos, A; Wantzin, P; Atugonza, R; Solkar, S; Carpio, N

    2017-10-01

    Implementation of fully automated analysers has become a crucial step in securing the blood bank; it reduces human errors, allows standardisation and improves turnaround time (TAT). We aimed to evaluate the ease of use and the efficiency of the ORTHO VISION® Analyser (VISION) in comparison with the ORTHO AutoVue® Innova System (AutoVue) in six different laboratories. After initial training and system configuration, VISION was used in parallel with AutoVue following the daily workload, both instruments being based on ORTHO BioVue® System column agglutination technology. Each participating laboratory provided data and scored the training, system configuration, quality control, maintenance and system efficiency. A total of 1049 individual samples were run: 266 forward and reverse groupings and antibody screens with 10 urgent samples, 473 ABD forward groupings and antibody screens with 22 urgent samples, 160 ABD forward groupings, 42 antibody screens and a series of 108 specific case profiles. The VISION instrument was faster than the AutoVue, with a mean test time of 27.9 min compared with 36 min; across the test-type comparisons, the TAT obtained from VISION was shorter than that from AutoVue. Moreover, VISION analysed urgent STAT samples faster. Regarding ease of use, VISION was intuitive and user friendly. VISION is a robust, reproducible system performing most types of analytical determination needed for pre-transfusion testing today, thus accommodating a wide range of clinical needs. VISION brings appreciated new features that could further secure blood transfusions. © 2017 The Authors. Transfusion Medicine published by John Wiley & Sons Ltd on behalf of British Blood Transfusion Society.

  3. Seismic response analyses for reactor facilities at Savannah River

    International Nuclear Information System (INIS)

    Miller, C.A.; Costantino, C.J.; Xu, J.

    1991-01-01

    The reactor facilities at the Savannah River Plant (SRP) were designed during the 1950s. The original seismic criterion defined the input ground motion as 0.1 g, with UBC (Uniform Building Code) provisions used to evaluate structural seismic loads. Later ground-motion criteria have defined the free-field seismic motion with a 0.2 g ZPA (zero period acceleration) and various spectral shapes. The spectral shapes have included the Housner spectrum, a site-specific spectrum, and the US NRC (Nuclear Regulatory Commission) Reg. Guide 1.60 shape. The development of these free-field seismic criteria is discussed in the paper. The more recent seismic analyses have been of the following types: fixed-base response spectra, frequency-independent lumped-parameter soil/structure interaction (SSI), frequency-dependent lumped-parameter SSI, and current state-of-the-art analyses using computer codes such as SASSI. The results from these computations consist of structural loads and floor response spectra (used for piping and equipment qualification). These results are compared in the paper and the methods used to validate them are discussed. 14 refs., 11 figs
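    The spectral shapes mentioned in this record (Housner, site-specific, Reg. Guide 1.60) are catalogues of peak single-degree-of-freedom oscillator response versus period. As a generic illustration only (not the SRP analyses themselves), a response spectrum can be computed from a ground-acceleration record with average-acceleration Newmark time stepping:

```python
import numpy as np

def response_spectrum(accel, dt, periods, zeta=0.05):
    """Pseudo-acceleration response spectrum of a ground-motion record.

    Each ordinate is the peak response of a damped single-degree-of-freedom
    oscillator (unit mass), integrated with the average-acceleration Newmark
    method (gamma = 1/2, beta = 1/4, unconditionally stable).
    """
    sa = []
    for T in periods:
        wn = 2 * np.pi / T
        k, c = wn**2, 2 * zeta * wn            # stiffness, damping (unit mass)
        u = v = 0.0
        a = -accel[0] - c * v - k * u          # initial oscillator acceleration
        peak = 0.0
        for ag in accel[1:]:
            keff = k + 2 * c / dt + 4 / dt**2
            peff = (-ag + (4 / dt**2) * u + (4 / dt) * v + a
                    + c * (2 / dt * u + v))
            un = peff / keff                   # displacement at next step
            vn = 2 * (un - u) / dt - v
            an = 4 * (un - u) / dt**2 - 4 * v / dt - a
            u, v, a = un, vn, an
            peak = max(peak, abs(u))
        sa.append(wn**2 * peak)                # pseudo-acceleration ordinate
    return np.array(sa)
```

Evaluated over a suite of periods, this reproduces the role the Housner or Reg. Guide 1.60 shapes play in design: peak oscillator demand as a function of period.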

  4. 76 FR 46268 - Notice of Availability of Pest Risk Analyses for the Importation of Fresh Pitaya and Pomegranates...

    Science.gov (United States)

    2011-08-02

    ...] Notice of Availability of Pest Risk Analyses for the Importation of Fresh Pitaya and Pomegranates From.... ACTION: Notice. SUMMARY: We are advising the public that we have prepared pest risk analyses that... for approving the importation of commodities that, based on the findings of a pest- risk analysis, can...

  5. Precise Chemical Analyses of Planetary Surfaces

    Science.gov (United States)

    Kring, David; Schweitzer, Jeffrey; Meyer, Charles; Trombka, Jacob; Freund, Friedemann; Economou, Thanasis; Yen, Albert; Kim, Soon Sam; Treiman, Allan H.; Blake, David; hide

    1996-01-01

    We identify the chemical elements and element ratios that should be analyzed to address many of the issues identified by the Committee on Planetary and Lunar Exploration (COMPLEX). We determined that most of these issues require two sensitive instruments to analyze the necessary complement of elements. In addition, it is useful in many cases to use one instrument to analyze the outermost planetary surface (e.g. to determine weathering effects), while a second is used to analyze a subsurface volume of material (e.g., to determine the composition of unaltered planetary surface material). This dual approach to chemical analyses will also facilitate the calibration of orbital and/or Earth-based spectral observations of the planetary body. We determined that in many cases the scientific issues defined by COMPLEX can only be fully addressed with combined packages of instruments that would supplement the chemical data with mineralogic or visual information.

  6. Hydrometeorological and statistical analyses of heavy rainfall in Midwestern USA

    Science.gov (United States)

    Thorndahl, S.; Smith, J. A.; Krajewski, W. F.

    2012-04-01

    During the last two decades, the mid-western states of the United States of America have been repeatedly afflicted by heavy, flood-producing rainfall. Several of these storms appear to share hydrometeorological properties in terms of pattern, track, evolution, life cycle, clustering, etc., which raises the question of whether general characteristics of the space-time structure of these heavy storms can be derived. This is important for understanding hydrometeorological features, e.g. how storms evolve and with what frequency extreme storms can be expected to occur. In the literature, most studies of extreme rainfall are based on point measurements (rain gauges). However, with high-resolution, quality-controlled radar observation periods now exceeding two decades, long-term spatio-temporal statistical analyses of extremes are possible. This makes it possible to link return periods to distributed rainfall estimates and to study the precipitation structures that cause floods. Statistical frequency analyses of rainfall based on radar observations, however, introduce challenges in converting radar reflectivity to "true" rainfall that do not arise in traditional analyses of rain gauge data. It is, for example, difficult to distinguish reflectivity from high-intensity rain from reflectivity from other hydrometeors such as hail, especially with the single-polarization radars used in this study. Furthermore, reflectivity from the bright band (melting layer) should be discarded and anomalous propagation corrected in order to produce valid statistics of extreme radar rainfall. Other challenges include combining observations from several radars into one mosaic, bias correction against rain gauges, range correction, Z-R relationships, etc.
The present study analyses radar rainfall observations from 1996 to 2011, based on the American NEXRAD network of radars, over an area covering parts of Iowa, Wisconsin, Illinois, and
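    One of the challenges named in this abstract, converting reflectivity to rainfall, is commonly handled with a power-law Z-R relationship plus a reflectivity cap as a crude guard against hail when only single-polarization data exist. A minimal sketch (the Marshall-Palmer coefficients a = 200, b = 1.6 and the 53 dBZ cap are one conventional choice, not the study's actual values):

```python
import numpy as np

def reflectivity_to_rain_rate(dbz, a=200.0, b=1.6, hail_cap_dbz=53.0):
    """Convert radar reflectivity (dBZ) to rain rate (mm/h).

    Reflectivity is capped at `hail_cap_dbz` first, then the power-law
    relationship Z = a * R**b is inverted for the rain rate R.
    """
    dbz = np.minimum(np.asarray(dbz, dtype=float), hail_cap_dbz)
    z = 10.0 ** (dbz / 10.0)        # dBZ -> linear reflectivity factor Z
    return (z / a) ** (1.0 / b)     # invert Z = a * R**b
```

With these coefficients, 23 dBZ corresponds to roughly 1 mm/h; in practice a and b are tuned per climate region and bias-corrected against rain gauges, as the abstract notes.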

  7. Ain't necessarily so: review and critique of recent meta-analyses of behavioral medicine interventions in health psychology.

    Science.gov (United States)

    Coyne, James C; Thombs, Brett D; Hagedoorn, Mariet

    2010-03-01

    We examined four meta-analyses of behavioral interventions for adults (Dixon, Keefe, Scipio, Perri, & Abernethy, 2007; Hoffman, Papas, Chatkoff, & Kerns, 2007; Irwin, Cole, & Nicassio, 2006; and Jacobsen, Donovan, Vadaparampil, & Small, 2007) that have appeared in the Evidence Based Treatment Reviews section of Health Psychology. Narrative review. We applied the following criteria to each meta-analysis: (1) whether each meta-analysis was described accurately, adequately, and transparently in the article; (2) whether there was an adequate attempt to deal with the methodological quality of the original trials; (3) the extent to which the meta-analysis depended on small, underpowered studies; and (4) the extent to which the meta-analysis provided valid and useful evidence-based recommendations. Across the four meta-analyses, we identified substantial problems with the transparency and completeness with which these meta-analyses were reported, as well as a dependence on small, underpowered trials of generally poor quality. Results of our exercise raise questions about the clinical validity and utility of the conclusions of these meta-analyses. Results should serve as a wake-up call to prospective authors, reviewers, and end-users of meta-analyses now appearing in the literature. Copyright 2010 APA, all rights reserved.

  8. The Galaxy platform for accessible, reproducible and collaborative biomedical analyses: 2018 update.

    Science.gov (United States)

    Afgan, Enis; Baker, Dannon; Batut, Bérénice; van den Beek, Marius; Bouvier, Dave; Cech, Martin; Chilton, John; Clements, Dave; Coraor, Nate; Grüning, Björn A; Guerler, Aysam; Hillman-Jackson, Jennifer; Hiltemann, Saskia; Jalili, Vahid; Rasche, Helena; Soranzo, Nicola; Goecks, Jeremy; Taylor, James; Nekrutenko, Anton; Blankenberg, Daniel

    2018-05-22

    Galaxy (homepage: https://galaxyproject.org, main public server: https://usegalaxy.org) is a web-based scientific analysis platform used by tens of thousands of scientists across the world to analyze large biomedical datasets such as those found in genomics, proteomics, metabolomics and imaging. Started in 2005, Galaxy continues to focus on three key challenges of data-driven biomedical science: making analyses accessible to all researchers, ensuring analyses are completely reproducible, and making it simple to communicate analyses so that they can be reused and extended. During the last two years, the Galaxy team and the open-source community around Galaxy have made substantial improvements to Galaxy's core framework, user interface, tools, and training materials. Framework and user interface improvements now enable Galaxy to be used for analyzing tens of thousands of datasets, and >5500 tools are now available from the Galaxy ToolShed. The Galaxy community has led an effort to create numerous high-quality tutorials focused on common types of genomic analyses. The Galaxy developer and user communities continue to grow and be integral to Galaxy's development. The number of Galaxy public servers, developers contributing to the Galaxy framework and its tools, and users of the main Galaxy server have all increased substantially.

  9. Differentiation of Toxocara canis and Toxocara cati based on PCR-RFLP analyses of rDNA-ITS and mitochondrial cox1 and nad1 regions.

    Science.gov (United States)

    Mikaeili, Fattaneh; Mathis, Alexander; Deplazes, Peter; Mirhendi, Hossein; Barazesh, Afshin; Ebrahimi, Sepideh; Kia, Eshrat Beigom

    2017-09-26

    The definitive genetic identification of Toxocara species is currently based on PCR/sequencing. The objectives of the present study were to design and conduct an in silico polymerase chain reaction-restriction fragment length polymorphism (PCR-RFLP) method for identification of Toxocara species. In silico analyses using the DNASIS and NEBcutter software were performed with rDNA internal transcribed spacer (ITS), and mitochondrial cox1 and nad1, sequences obtained in our previous studies, along with relevant sequences deposited in GenBank. Consequently, RFLP profiles were designed, and all isolates of T. canis and T. cati collected from dogs and cats in different geographical areas of Iran were investigated with the RFLP method using some of the identified suitable enzymes. The in silico analyses predicted that, on the cox1 gene, only the MboII enzyme is appropriate for PCR-RFLP to reliably distinguish the two species. No suitable enzyme for PCR-RFLP on the nad1 gene was identified that yields the same pattern for all isolates of a species. DNASIS software showed that there are 241 suitable restriction enzymes for the differentiation of T. canis from T. cati based on ITS sequences. The RsaI, MvaI and SalI enzymes were selected to evaluate the reliability of the in silico PCR-RFLP. The sizes of the restriction fragments obtained by PCR-RFLP of all samples consistently matched the expected RFLP patterns. The ITS sequences are usually conserved, and the PCR-RFLP approach targeting the ITS sequence is recommended for the molecular differentiation of Toxocara species; it can provide a reliable tool for identification, particularly at the larval and egg stages.
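    The in silico RFLP screening described here amounts to predicting, for each candidate enzyme, the fragment-length pattern a digest would produce and checking that the two species give distinct patterns. A toy sketch of a single-enzyme digest of a linear sequence (the sequences below are invented, not real Toxocara ITS data; for simplicity the cut is placed at the start of each recognition site, whereas real enzymes cut at a fixed offset, which shifts fragment lengths by a constant but not the number of fragments):

```python
def digest_fragments(seq, site):
    """Fragment lengths from a single-enzyme digest of a linear sequence,
    cutting at the start of each non-overlapping recognition site."""
    cuts, i = [], seq.find(site)
    while i != -1:
        cuts.append(i)
        i = seq.find(site, i + len(site))
    bounds = [0] + cuts + [len(seq)]
    return [bounds[j + 1] - bounds[j] for j in range(len(bounds) - 1)]

# Toy marker sequences: RsaI recognizes GTAC.
canis_like = "ATGCGTACGGATTTGTACCA"   # two GTAC sites -> three fragments
cati_like  = "ATGCGGACGGATTTGGACCA"   # no GTAC site  -> single fragment
print(digest_fragments(canis_like, "GTAC"))
print(digest_fragments(cati_like, "GTAC"))
```

Two isolates are distinguishable by the enzyme exactly when their fragment-length lists differ, which is the criterion the DNASIS/NEBcutter screening automates across all candidate enzymes.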

  10. Molecular Characterization of Five Potyviruses Infecting Korean Sweet Potatoes Based on Analyses of Complete Genome Sequences

    Directory of Open Access Journals (Sweden)

    Hae-Ryun Kwak

    2015-12-01

    Sweet potatoes (Ipomoea batatas L.) are grown extensively in tropical and temperate regions and are important food crops worldwide. In Korea, potyviruses, including Sweet potato feathery mottle virus (SPFMV), Sweet potato virus C (SPVC), Sweet potato virus G (SPVG), Sweet potato virus 2 (SPV2), and Sweet potato latent virus (SPLV), have been detected in sweet potato fields at a high (~95%) incidence. In the present work, complete genome sequences of 18 isolates, representing the five potyviruses mentioned above, were compared with previously reported genome sequences. The complete genomes consisted of 10,081 to 10,830 nucleotides, excluding the poly-A tails. Their genomic organizations were typical of the genus Potyvirus, including one large open reading frame coding for a putative polyprotein. Based on phylogenetic analyses and sequence comparisons, the Korean SPFMV isolates belonged to the strains RC and O with >98% nucleotide sequence identity. Korean SPVC isolates had 99% identity to the Japanese isolate SPVC-Bungo and 70% identity to the SPFMV isolates. The Korean SPVG isolates showed 99% identity to the three previously reported SPVG isolates. Korean SPV2 isolates had 97% identity to the SPV2 GWB-2 isolate from the USA. Korean SPLV isolates had a relatively low (88%) nucleotide sequence identity with the Taiwanese SPLV-TW isolates, and they were phylogenetically distantly related to SPFMV isolates. Recombination analysis revealed that possible recombination events occurred in the P1, HC-Pro and NIa-NIb regions of SPFMV and SPLV isolates, and these regions were identified as hotspots for recombination in the sweet potato potyviruses.
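    The percent-identity figures quoted in this record are computed over pairwise alignments of the genome sequences. A minimal sketch, assuming the two sequences are already aligned to equal length (the fragments below are toy examples, not real potyvirus data):

```python
def percent_identity(a, b):
    """Percent identity over an existing pairwise alignment (gaps as '-').
    Columns where both sequences have a gap are ignored."""
    if len(a) != len(b):
        raise ValueError("sequences must be pre-aligned to equal length")
    pairs = [(x, y) for x, y in zip(a, b) if not (x == "-" and y == "-")]
    matches = sum(x == y for x, y in pairs)
    return 100.0 * matches / len(pairs)

# Toy aligned fragments: one gap column mismatch out of nine.
print(percent_identity("ATGCC-GTA", "ATGCCAGTA"))
```

In practice the alignment itself (and the handling of gap and ambiguity characters) dominates the result, so identity thresholds such as the >98% used for strain assignment are only meaningful relative to a stated alignment method.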

  11. Integrated Waste Treatment Unit (IWTU) Input Coal Analyses and Off-Gass Filter (OGF) Content Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Jantzen, Carol M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Missimer, David M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Guenther, Chris P. [National Energy Technology Lab. (NETL), Morgantown, WV (United States); Shekhawat, Dushyant [National Energy Technology Lab. (NETL), Morgantown, WV (United States); VanEssendelft, Dirk T. [National Energy Technology Lab. (NETL), Morgantown, WV (United States); Means, Nicholas C. [AECOM Technology Corp., Oak Ridge, TN (United States)

    2015-04-23

    A full engineering-scale Fluidized Bed Steam Reformer (FBSR) system is being used at the Idaho Nuclear Technology and Engineering Center (INTEC) to stabilize acidic Low Activity Waste (LAW) known as Sodium Bearing Waste (SBW). The INTEC facility, known as the Integrated Waste Treatment Unit (IWTU), underwent an Operational Readiness Review (ORR) and a Technology Readiness Assessment (TRA) in March 2014. The IWTU began non-radioactive simulant processing in late 2014, and by January 2015 it had processed 62,000 gallons of simulant. The facility is currently in a planned outage for inspection of the equipment and will resume processing simulated waste feed before commencing to process 900,000 gallons of radioactive SBW. The SBW acidic waste will be made into a granular FBSR product (carbonate based) for disposal in the Waste Isolation Pilot Plant (WIPP). In the FBSR process, calcined coal is used to create a CO2 fugacity that forces the waste species to convert to carbonate species. The quality of the coal, which is a feed input, is important because the reactivity, moisture, and volatiles (C, H, N, O, and S) in the coal impact the reactions and the control of the mineralizing process in the primary steam reforming vessel, the Denitration and Mineralizing Reformer (DMR). Too much moisture in the coal can require that additional coal be used; however, since moisture in the coal is only a small fraction of the moisture from the fluidizing steam, this can be self-correcting. If the coal reactivity or heating value is too low, the coal feed rate needs to be adjusted to achieve the desired heat generation. With too little coal, autothermal heat generation in the DMR cannot be sustained and/or the carbon dioxide fugacity will be too low to create the desired carbonate mineral species. With too much coal, excess S and hydroxide species can form.
Excess sulfur from coal that (1) is too rich in sulfur or (2) is overfed can promote wall scale and contribute to corrosion

  12. Toxicity testing and chemical analyses of recycled fibre-based paper for food contact

    DEFF Research Database (Denmark)

    Binderup, Mona-Lise; Pedersen, Gitte Alsing; Vinggaard, Anne

    2002-01-01

    of different qualities as food-contact materials and to perform a preliminary evaluation of their suitability from a safety point of view, and, second, to evaluate the use of different in vitro toxicity tests for screening of paper and board. Paper produced from three different categories of recycled fibres (B...... of the paper products were extracted with either 99% ethanol or water. Potential migrants in the extracts were identified and semiquantified by GC-1R-MS or GC-HRMS. In parallel with the chemical analyses, a battery of four different in vitro toxicity tests with different endpoints was applied to the same...... was less cytotoxic than the extracts prepared from paper made from recycled fibres, and the extract prepared from C was the most cytotoxic. None of the extracts showed mutagenic activity. No conclusion about the oestrogenic activity could be made, because all extracts were cytotoxic to the test organism (yeast...

  13. Non-localization and localization ROC analyses using clinically based scoring

    Science.gov (United States)

    Paquerault, Sophie; Samuelson, Frank W.; Myers, Kyle J.; Smith, Robert C.

    2009-02-01

    We are investigating the potential for differences in study conclusions when assessing the estimated impact of a computer-aided detection (CAD) system on readers' performance. The data utilized in this investigation were derived from a multi-reader multi-case observer study involving one hundred mammographic background images to which fixed-size and fixed-intensity Gaussian signals were added, generating low- and high-intensity signal sets. The study setting allowed CAD assessment in two situations: when CAD sensitivity was (1) superior or (2) inferior to that of the average reader. Seven readers were asked to review each set in the unaided and CAD-aided reading modes and to mark and rate their findings. Using these data, we studied the effect on study conclusions of three clinically based receiver operating characteristic (ROC) scoring definitions, which included both location-specific and non-location-specific rules. The results showed agreement in the estimated impact of CAD on overall reader performance. In the study setting where CAD sensitivity is superior to the average reader, the mean difference in AUC between the CAD-aided and unaided reads was 0.049 (95% CI: -0.027, 0.130) for the image scoring definition based on non-location-specific rules, and 0.104 (95% CI: 0.036, 0.174) and 0.090 (95% CI: 0.031, 0.155) for the image scoring definitions based on location-specific rules. The increases in AUC were statistically significant for the location-specific scoring definitions. It was further observed that the variance of these estimates was reduced when using the location-specific scoring definitions compared with the non-location-specific scoring definition. In the study setting where CAD sensitivity is equivalent or inferior to the average reader, the mean differences in AUC were slightly above 0.01 for all image scoring definitions; these increases were not statistically significant for any of the scoring definitions.
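    The AUC differences reported in this record are differences in empirical area under the ROC curve between reading modes. The empirical AUC of one reading mode can be computed from confidence ratings as a Mann-Whitney statistic, i.e. the probability that a randomly chosen signal-present case is rated higher than a signal-absent one. A minimal sketch (the ratings are invented, not the study's data):

```python
def auc_from_ratings(signal_scores, noise_scores):
    """Empirical AUC via the Mann-Whitney U statistic: the fraction of
    (signal, noise) pairs where the signal case is rated higher, with
    ties counting one half."""
    wins = ties = 0
    for s in signal_scores:
        for n in noise_scores:
            if s > n:
                wins += 1
            elif s == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(signal_scores) * len(noise_scores))

# Hypothetical 1-5 confidence ratings, unaided vs CAD-aided reads:
unaided = auc_from_ratings([3, 4, 4, 5, 2], [1, 2, 3, 1, 2])
aided   = auc_from_ratings([4, 4, 5, 5, 3], [1, 2, 3, 1, 2])
print(round(aided - unaided, 3))
```

The location-specific scoring rules in the study change which marks count as signal-present and signal-absent events before this statistic is computed, which is why they can shift both the AUC estimate and its variance.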

  14. Evaluation of fracture mechanics analyses used in RPV integrity assessment regarding brittle fracture

    International Nuclear Information System (INIS)

    Moinereau, D.; Faidy, C.; Valeta, M.P.; Bhandari, S.; Guichard, D.

    1997-01-01

    In recent years, Electricite de France has conducted experimental and numerical research programmes to evaluate the fracture mechanics analyses used in nuclear reactor pressure vessel structural integrity assessment with regard to the risk of brittle fracture. These programmes included cleavage fracture tests on large-scale cladded specimens containing subclad flaws, with their interpretation by 2D and 3D numerical computations, and validation of finite element codes for pressurized thermal shock analyses. Four cladded specimens made of ferritic steel A508 C13 with stainless steel cladding, and containing shallow subclad flaws, were tested in four-point bending at very low temperature in order to obtain cleavage failure. In each case the specimen failed in the base metal by cleavage fracture. These tests were interpreted by two-dimensional and three-dimensional finite element computations using different fracture mechanics approaches (elastic analysis with specific plasticity corrections, elastic-plastic analysis, and the local approach to cleavage fracture). The failure of the specimens is conservatively predicted by the different analyses. The comparison between the elastic and elastic-plastic analyses shows the conservatism of the specific plasticity corrections used in French RPV elastic analyses. Numerous finite element calculations have also been performed by EDF, CEA and Framatome in order to compare and validate several fracture mechanics post-processors implemented in the finite element programmes used in pressurized thermal shock analyses. This work includes two-dimensional numerical computations on specimens with different geometries and loadings. The comparisons show rather good agreement on the main results, allowing validation of the finite element codes and their post-processors. (author). 11 refs, 24 figs, 3 tabs
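    The elastic analyses with plasticity corrections mentioned in this record build on linear elastic fracture mechanics, where the basic screen compares an applied stress intensity factor against the material's fracture toughness. A generic sketch of that comparison (K_I = Y * sigma * sqrt(pi * a); the numbers below are illustrative only, not the EDF test values):

```python
import math

def stress_intensity(sigma_mpa, a_m, Y=1.12):
    """Mode-I stress intensity K_I = Y * sigma * sqrt(pi * a), in MPa*sqrt(m).
    Y = 1.12 is the handbook geometry factor for a shallow surface flaw."""
    return Y * sigma_mpa * math.sqrt(math.pi * a_m)

def brittle_fracture_margin(sigma_mpa, a_m, k_ic, Y=1.12):
    """Toughness over applied K_I; > 1 means the elastic screen predicts no
    cleavage initiation (plasticity corrections would refine this)."""
    return k_ic / stress_intensity(sigma_mpa, a_m, Y)

# Illustrative numbers: 300 MPa membrane stress, 10 mm flaw depth,
# 60 MPa*sqrt(m) toughness at low temperature.
print(round(brittle_fracture_margin(300.0, 0.010, 60.0), 2))
```

Elastic-plastic and local-approach analyses, as compared in the record, replace this single-parameter K_I check with J-integral or micromechanical cleavage criteria; the comparison in the paper quantifies how conservative the corrected elastic screen is relative to them.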

  15. Evaluation of fracture mechanics analyses used in RPV integrity assessment regarding brittle fracture

    Energy Technology Data Exchange (ETDEWEB)

    Moinereau, D [Electricite de France, Dept. MTC, Moret-sur-Loing (France); Faidy, C [Electricite de France, SEPTEN, Villeurbanne (France); Valeta, M P [Commisariat a l` Energie Atomique, Dept. DMT, Gif-sur-Yvette (France); Bhandari, S; Guichard, D [Societe Franco-Americaine de Constructions Atomiques (FRAMATOME), 92 - Paris-La-Defense (France)

    1997-09-01

    In recent years, Electricite de France has conducted experimental and numerical research programmes to evaluate the fracture mechanics analyses used in nuclear reactor pressure vessel structural integrity assessment with regard to the risk of brittle fracture. These programmes included cleavage fracture tests on large-scale cladded specimens containing subclad flaws, with their interpretation by 2D and 3D numerical computations, and validation of finite element codes for pressurized thermal shock analyses. Four cladded specimens made of ferritic steel A508 C13 with stainless steel cladding, and containing shallow subclad flaws, were tested in four-point bending at very low temperature in order to obtain cleavage failure. In each case the specimen failed in the base metal by cleavage fracture. These tests were interpreted by two-dimensional and three-dimensional finite element computations using different fracture mechanics approaches (elastic analysis with specific plasticity corrections, elastic-plastic analysis, and the local approach to cleavage fracture). The failure of the specimens is conservatively predicted by the different analyses. The comparison between the elastic and elastic-plastic analyses shows the conservatism of the specific plasticity corrections used in French RPV elastic analyses. Numerous finite element calculations have also been performed by EDF, CEA and Framatome in order to compare and validate several fracture mechanics post-processors implemented in the finite element programmes used in pressurized thermal shock analyses. This work includes two-dimensional numerical computations on specimens with different geometries and loadings. The comparisons show rather good agreement on the main results, allowing validation of the finite element codes and their post-processors. (author). 11 refs, 24 figs, 3 tabs.

  16. The Neural Bases of Difficult Speech Comprehension and Speech Production: Two Activation Likelihood Estimation (ALE) Meta-Analyses

    Science.gov (United States)

    Adank, Patti

    2012-01-01

    The role of speech production mechanisms in difficult speech comprehension is the subject of ongoing debate in speech science. Two Activation Likelihood Estimation (ALE) analyses were conducted on neuroimaging studies investigating difficult speech comprehension or speech production. Meta-analysis 1 included 10 studies contrasting comprehension…

  17. Caisson Movement Caused by Wave Slamming—a Comparison of ABAQUS and FLAC Analyses

    DEFF Research Database (Denmark)

    Andersen, Lars; Burcharth, Hans F.; Andersen, Thomas Lykke

    2010-01-01

    -difference analysis has been performed by means of the commercial code FLAC. Similarly, ABAQUS has been employed for finite-element analyses based on linear as well as quadratic spatial interpolations, assuming fully drained conditions and utilizing an elastic–plastic model for the rubble foundation and the seabed...

  18. Systems reliability analyses and risk analyses for the licencing procedure under atomic law

    International Nuclear Information System (INIS)

    Berning, A.; Spindler, H.

    1983-01-01

    For the licencing procedure under atomic law in accordance with Article 7 AtG, the nuclear power plant as a whole needs to be assessed, and the reliability of systems and plant components that are essential to safety is to be determined with probabilistic methods. This requirement is a consequence of the safety criteria for nuclear power plants issued by the Federal Ministry of the Interior (BMI). Systems reliability studies and risk analyses used in licencing procedures under atomic law are identified. The emphasis is on licencing decisions, mainly for PWR-type reactors. Reactor Safety Commission (RSK) guidelines, examples of reasoning in legal proceedings and arguments put forth by objectors are also dealt with. Correlations between reliability analyses made by experts and licencing decisions are shown by means of examples. (orig./HP) [de

  19. iTRAQ-Based Proteomics Analyses of Sterile/Fertile Anthers from a Thermo-Sensitive Cytoplasmic Male-Sterile Wheat with Aegilops kotschyi Cytoplasm

    Directory of Open Access Journals (Sweden)

    Gaoming Zhang

    2018-05-01

    A "two-line hybrid system" was developed previously, based on thermo-sensitive cytoplasmic male sterility in Aegilops kotschyi (K-TCMS), which can be used in wheat breeding. The K-TCMS line exhibits complete male sterility and can be used to produce hybrid wheat seeds during the normal wheat-growing season; it propagates via self-pollination at high temperatures. Isobaric tags for relative and absolute quantification (iTRAQ)-based quantitative proteome and bioinformatics analyses of the TCMS line KTM3315A were conducted under different fertility conditions to understand the mechanisms of fertility conversion during the pollen development stages. In total, 4639 proteins were identified; the differentially abundant proteins that increased/decreased in plants with differences in fertility were mainly involved in energy metabolism, starch and sucrose metabolism, phenylpropanoid biosynthesis, and protein synthesis, translation, folding, and degradation. Compared with the sterile condition, many of the proteins related to energy and phenylpropanoid metabolism increased during the anther development stage. Thus, we suggest that energy and phenylpropanoid metabolism pathways are important for fertility conversion in K-TCMS wheat. These findings provide valuable insights into the proteins involved in anther and pollen development, thereby helping to further understand the mechanism of TCMS in wheat.

  20. Techniques for Analysing Problems in Engineering Projects

    DEFF Research Database (Denmark)

    Thorsteinsson, Uffe

    1998-01-01

    Description of how CPM network can be used for analysing complex problems in engineering projects.

  1. Evaluation Of Plutonium Oxide Destructive Chemical Analyses For Validity Of Original 3013 Container Binning

    International Nuclear Information System (INIS)

    Mcclard, J.; Kessinger, G.

    2010-01-01

    The surveillance program for 3013 containers is based, in part, on the separation of containers into various bins related to potential container failure mechanisms. The containers are assigned to bins based on moisture content and pre-storage estimates of content chemistry. While moisture content is measured during the packaging of each container, chemistry estimates are made using a combination of process knowledge, packaging data and prompt gamma analyses to establish the moisture and chloride/fluoride content of the materials. Packages with high moisture and chloride/fluoride contents receive more detailed surveillance than packages with less chloride/fluoride and/or moisture. Moisture verification measurements and chemical analyses performed during the surveillance program provided an opportunity to validate the binning process. Validation results demonstrated that the binning effort was generally successful in placing the containers in the appropriate bin for surveillance and analysis.
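    The binning logic described here, where containers with both high moisture and high chloride/fluoride content receive the most detailed surveillance, can be sketched as a simple threshold rule. The thresholds and bin names below are invented for illustration; the actual 3013 binning criteria come from the DOE surveillance program documents:

```python
def assign_bin(moisture_wt_pct, chloride_wt_pct, fluoride_wt_pct=0.0):
    """Toy surveillance-binning rule: containers exceeding both the
    (hypothetical) moisture and halide thresholds get the most scrutiny."""
    halides = chloride_wt_pct + fluoride_wt_pct
    if moisture_wt_pct >= 0.5 and halides >= 0.5:
        return "high-priority (detailed surveillance)"
    if moisture_wt_pct >= 0.5 or halides >= 0.5:
        return "medium-priority"
    return "low-priority"

print(assign_bin(0.7, 0.9))    # wet and halide-rich
print(assign_bin(0.1, 0.05))   # dry, low halides
```

The validation exercise in the record amounts to re-running such a rule on measured (rather than estimated) moisture and chemistry values and checking that containers land in the same bins.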

  2. European passive plant program preliminary safety analyses to support system design

    International Nuclear Information System (INIS)

    Saiu, Gianfranco; Barucca, Luciana; King, K.J.

    1999-01-01

    In 1994, a group of European utilities, together with Westinghouse and its industrial partner GENESI (an Italian consortium including ANSALDO and FIAT), initiated a program designated EPP (European Passive Plant) to evaluate Westinghouse passive nuclear plant technology for application in Europe. In Phase 1 of the European Passive Plant Program, which was completed in 1996, a 1000 MWe passive plant reference design (EP1000) was established which conforms to the European Utility Requirements (EUR) and is expected to meet the requirements of the European safety authorities. Phase 2 of the program was initiated in 1997 with the objective of developing the Nuclear Island design details and performing supporting analyses to start development of the Safety Case Report (SCR) for submittal to European licensing authorities. The first part of Phase 2, the 'Design Definition' phase (Phase 2A), was completed at the end of 1998, the main efforts being design definition of key systems and structures, development of the Nuclear Island layout, and preliminary safety analyses to support design efforts. Incorporation of the EUR has been a key design requirement for the EP1000 from the beginning of the program. Detailed design solutions to meet the EUR have been defined, and the safety approach has also been developed based on the EUR guidelines. The present paper describes the EP1000 approach to safety analysis and, in particular, to the Design Extension Conditions that, according to the EUR, represent the preferred method for giving consideration to complex sequences and severe accidents at the design stage without including them in the design basis conditions. Preliminary results of some DEC analyses and an overview of the probabilistic safety assessment (PSA) are also presented. (author)

  3. Level II Ergonomic Analyses, Dover AFB, DE

    Science.gov (United States)

    1999-02-01

    IERA-RS-BR-TR-1999-0002, United States Air Force IERA. Level II Ergonomic Analyses, Dover AFB, DE. Andrew Marcotte, Marilyn Joyce, The Joyce... 1.0 INTRODUCTION; 1.1 Purpose of the Level II Ergonomic Analyses; 1.2 Approach; 1.2.1 Initial Shop Selection and Administration of the

  4. Status of science and technology with respect of preparation and evaluation of accident analyses and the use of analysis simulators

    International Nuclear Information System (INIS)

    Pointner, Winfried; Cuesta Morales, Alejandra; Draeger, Peer; Hartung, Juergen; Jakubowski, Zygmunt; Meyer, Gerhard; Palazzo, Simone; Moner, Guim Pallas; Perin, Yann; Pasichnyk, Ihor

    2014-07-01

    The scope of the work was to elaborate the prerequisites for short term accident analyses including recommendations for the application of new methodologies and computational procedures and technical aspects of safety evaluation. The following work packages were performed: Knowledge base for best estimate accident analyses; analytical studies on the PWR plant behavior in case of multiple safety system failures; extension and maintenance of the data base for plant specific analysis simulators.

  5. Using FOSM-Based Data Worth Analyses to Design Geophysical Surveys to Reduce Uncertainty in a Regional Groundwater Model Update

    Science.gov (United States)

    Smith, B. D.; White, J.; Kress, W. H.; Clark, B. R.; Barlow, J.

    2016-12-01

    Hydrogeophysical surveys have become an integral part of understanding the hydrogeological frameworks used in groundwater models. Regional models cover a large area where water well data are, at best, scattered and irregular. Since budgets are finite, priorities must be assigned to select optimal areas for geophysical surveys. For airborne electromagnetic (AEM) geophysical surveys, optimization of mapping depth and line spacing needs to take into account the objectives of the groundwater models. The approach discussed here uses a first-order, second-moment (FOSM) uncertainty analysis, which assumes an approximately linear relation between model parameters and observations. This assumption allows FOSM analyses to be applied to estimate the value of increased parameter knowledge in reducing forecast uncertainty. FOSM is used to facilitate optimization of yet-to-be-completed geophysical surveying to reduce model forecast uncertainty. The main objective of the geophysical surveying is assumed to be estimating the values and spatial variation of hydrologic parameters (e.g., hydraulic conductivity), as well as mapping lower-permeability layers that influence the spatial distribution of recharge flux. The proposed data worth analysis was applied to the Mississippi Embayment Regional Aquifer Study (MERAS), which is being updated. The objective of MERAS is to assess the groundwater availability (status and trends) of the Mississippi embayment aquifer system. The study area covers portions of eight states, including Alabama, Arkansas, Illinois, Kentucky, Louisiana, Mississippi, Missouri, and Tennessee. The active model grid covers approximately 70,000 square miles and incorporates some 6,000 miles of major rivers and over 100,000 water wells. In the FOSM analysis, a dense network of pilot points was used to capture uncertainty in hydraulic conductivity and recharge. To simulate the effect of AEM flight lines, the prior uncertainty for hydraulic conductivity and recharge pilots along potential flight lines was
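    The FOSM forecast-uncertainty calculation described above can be sketched with linear-Bayes algebra: the posterior parameter covariance is the prior covariance conditioned on the candidate observations, and the "worth" of a survey is the resulting drop in forecast variance. This is a minimal illustration, not the MERAS implementation; the Jacobian, prior covariance, observation noise, and forecast sensitivity below are randomly generated stand-ins.

```python
import numpy as np

def fosm_forecast_variance(J, C_p, C_obs, y):
    """Linear (FOSM) posterior forecast variance.

    J     : (n_obs, n_par) Jacobian of observations w.r.t. parameters
    C_p   : (n_par, n_par) prior parameter covariance (pilot points)
    C_obs : (n_obs, n_obs) observation noise covariance
    y     : (n_par,) sensitivity of the forecast to the parameters
    """
    S = J @ C_p @ J.T + C_obs                              # innovation covariance
    C_post = C_p - C_p @ J.T @ np.linalg.solve(S, J @ C_p) # conditioned covariance
    return float(y @ C_post @ y)

rng = np.random.default_rng(0)
n_par, n_obs = 20, 8
J = rng.normal(size=(n_obs, n_par))   # stand-in sensitivities along a flight line
C_p = np.eye(n_par)                   # prior: independent pilot points
C_obs = 0.1 * np.eye(n_obs)
y = rng.normal(size=n_par)

prior_var = float(y @ C_p @ y)
post_var = fosm_forecast_variance(J, C_p, C_obs, y)
# "data worth" of the candidate survey = reduction in forecast uncertainty
worth = prior_var - post_var
print(prior_var, post_var, worth)
```

Candidate flight-line layouts can then be ranked by `worth` before any data are actually collected, which is the point of the data worth analysis.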

  6. Sorption data bases for argillaceous rocks and bentonite for the provisional safety analyses for SGT-E2

    International Nuclear Information System (INIS)

    Baeyens, B.; Thoenen, T.; Bradbury, M. H.; Marques Fernandes, M.

    2014-11-01

    In Stage 1 of the Sectoral Plan for Deep Geological Repositories, four rock types have been identified as being suitable host rocks for a radioactive waste repository, namely, Opalinus Clay for a high-level (HLW) and a low- and intermediate-level (L/ILW) repository, and 'Brauner Dogger', Effingen Member and Helvetic Marls for a L/ILW repository. Sorption data bases (SDBs) for all of these host rocks are required for the provisional safety analyses, including all of the bounding porewater and mineralogical composition combinations. In addition, SDBs are needed for the rock formations lying below Opalinus Clay (lower confining units) and for the bentonite backfill in the HLW repository. In previous work, Bradbury et al. (2010) described a methodology for developing sorption data bases for argillaceous rocks and compacted bentonite. The main factors influencing sorption in such systems are the phyllosilicate mineral content, in particular the 2:1 clay mineral content (illite/smectite/illite-smectite mixed layers), and the water chemistry, which determines the radionuclide species in the aqueous phase. The source sorption data were taken predominantly from measurements on illite (or montmorillonite in the case of bentonite) and converted to the defined conditions in each system considered using a series of so-called conversion factors to take into account differences in mineralogy, in pH and in radionuclide speciation. Finally, a Lab → Field conversion factor was applied to adapt sorption data measured in dispersed systems (batch experiments) to intact rock under in-situ conditions. This methodology to develop sorption data bases has been applied to the selected host rocks, lower confining units and compacted bentonite, taking into account the mineralogical and porewater composition ranges defined. Confidence in the validity and correctness of this methodology has been built up through additional studies: (i) sorption values obtained in the manner

  7. Sorption data bases for argillaceous rocks and bentonite for the provisional safety analyses for SGT-E2

    Energy Technology Data Exchange (ETDEWEB)

    Baeyens, B.; Thoenen, T.; Bradbury, M. H.; Marques Fernandes, M.

    2014-11-15

    In Stage 1 of the Sectoral Plan for Deep Geological Repositories, four rock types have been identified as being suitable host rocks for a radioactive waste repository, namely, Opalinus Clay for a high-level (HLW) and a low- and intermediate-level (L/ILW) repository, and 'Brauner Dogger', Effingen Member and Helvetic Marls for a L/ILW repository. Sorption data bases (SDBs) for all of these host rocks are required for the provisional safety analyses, including all of the bounding porewater and mineralogical composition combinations. In addition, SDBs are needed for the rock formations lying below Opalinus Clay (lower confining units) and for the bentonite backfill in the HLW repository. In previous work, Bradbury et al. (2010) described a methodology for developing sorption data bases for argillaceous rocks and compacted bentonite. The main factors influencing sorption in such systems are the phyllosilicate mineral content, in particular the 2:1 clay mineral content (illite/smectite/illite-smectite mixed layers), and the water chemistry, which determines the radionuclide species in the aqueous phase. The source sorption data were taken predominantly from measurements on illite (or montmorillonite in the case of bentonite) and converted to the defined conditions in each system considered using a series of so-called conversion factors to take into account differences in mineralogy, in pH and in radionuclide speciation. Finally, a Lab → Field conversion factor was applied to adapt sorption data measured in dispersed systems (batch experiments) to intact rock under in-situ conditions. This methodology to develop sorption data bases has been applied to the selected host rocks, lower confining units and compacted bentonite, taking into account the mineralogical and porewater composition ranges defined. Confidence in the validity and correctness of this methodology has been built up through additional studies: (i) sorption values obtained in the manner
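    The conversion-factor methodology described in the records above amounts to a multiplicative chain applied to a source distribution coefficient measured on a pure clay mineral. A minimal sketch follows; the factor names and all numbers are invented for illustration (the actual factors in Bradbury et al. are derived from mineralogy, pH and speciation data).

```python
# Hypothetical illustration of the conversion-factor chain: a source
# distribution coefficient Rd measured on illite is scaled to in-situ
# conditions by multiplying independent correction factors.
def convert_rd(rd_source, cf_mineralogy, cf_ph, cf_speciation, cf_lab_to_field):
    """Return the in-situ distribution coefficient (same units as rd_source)."""
    return rd_source * cf_mineralogy * cf_ph * cf_speciation * cf_lab_to_field

# Example numbers are invented for illustration only.
rd_illite = 50.0        # L/kg, batch measurement on pure illite
rd_field = convert_rd(rd_illite,
                      cf_mineralogy=0.3,    # host rock contains 30 % 2:1 clays
                      cf_ph=1.2,            # sorption slightly higher at rock pH
                      cf_speciation=0.8,    # part of the RN in a weakly sorbing species
                      cf_lab_to_field=0.5)  # dispersed batch -> intact rock
print(rd_field)         # about 7.2 L/kg
```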

  8. 75 FR 52302 - Notice of Availability of Pest Risk Analyses for the Importation of Fresh Celery, Arugula, and...

    Science.gov (United States)

    2010-08-25

    ... Inspection Service [Docket No. APHIS-2010-0074] Notice of Availability of Pest Risk Analyses for the... spinach from Colombia. We are making these pest risk analyses available to the public for review and..., based on the findings of a pest- risk analysis, can be safely imported subject to one or more of the...

  9. Sequencing, Characterization, and Comparative Analyses of the Plastome of Caragana rosea var. rosea

    Directory of Open Access Journals (Sweden)

    Mei Jiang

    2018-05-01

    Full Text Available To exploit the drought-resistant Caragana species, we performed a comparative study of the plastomes of four species: Caragana rosea, C. microphylla, C. kozlowii, and C. korshinskii. The complete plastome sequence of C. rosea was obtained using next-generation DNA sequencing technology. The genome is a circular structure of 133,122 bases and lacks an inverted repeat. It contains 111 unique genes, including 76 protein-coding, 30 tRNA, and four rRNA genes. Repeat analyses identified 239, 244, 258, and 246 simple sequence repeats in C. rosea, C. microphylla, C. kozlowii, and C. korshinskii, respectively. Analyses of sequence divergence found two intergenic regions, trnI-CAU-ycf2 and trnN-GUU-ycf1, exhibiting a high degree of variation. Phylogenetic analyses showed that the four Caragana species belong to a monophyletic clade. Analyses of Ka/Ks ratios revealed that five genes (rpl16, rpl20, rps11, rps7, and ycf1) and several sites have undergone strong positive selection in the Caragana branch. The results lay the foundation for the development of molecular markers and for understanding the evolutionary process underlying drought-resistant characteristics.

  10. Design of the storage location based on the ABC analyses

    Science.gov (United States)

    Jemelka, Milan; Chramcov, Bronislav; Kříž, Pavel

    2016-06-01

    The paper focuses on process efficiency and saving storage costs. Maintaining inventory through a putaway strategy takes personnel time and costs money. The aim is to control inventory in the best way. The ABC classification, based on Vilfredo Pareto's theory, is used for the design of the warehouse layout. The new design of storage locations reduces the travel distance of fork-lift trucks and total costs, and it increases inventory process efficiency. The suggested solutions and the evaluation of the achieved results are described in detail. The proposed solutions were realized in a real warehouse operation.
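    The Pareto-based ABC classification described above can be sketched in a few lines: rank items by pick frequency and split them at cumulative-share cutoffs. The 80 %/95 % cutoffs and the pick counts below are conventional illustrative choices, not taken from the paper.

```python
def abc_classify(items, a_cut=0.8, b_cut=0.95):
    """Classify items by cumulative share of picks (Pareto / ABC analysis).

    items: dict mapping item id -> annual pick frequency (or turnover).
    Returns a dict mapping item id -> 'A' | 'B' | 'C'.
    """
    total = sum(items.values())
    ranked = sorted(items, key=items.get, reverse=True)  # busiest items first
    classes, cum = {}, 0.0
    for item in ranked:
        cum += items[item] / total
        classes[item] = 'A' if cum <= a_cut else ('B' if cum <= b_cut else 'C')
    return classes

picks = {'P1': 500, 'P2': 300, 'P3': 100, 'P4': 60, 'P5': 40}
print(abc_classify(picks))
```

'A' items (the small fraction of SKUs carrying most picks) would then be slotted closest to the dispatch area to cut forklift travel distance.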

  11. Sensitivity analyses of biodiesel thermo-physical properties under diesel engine conditions

    DEFF Research Database (Denmark)

    Cheng, Xinwei; Ng, Hoon Kiat; Gan, Suyin

    2016-01-01

    This reported work investigates the sensitivities of spray and soot developments to the change of thermo-physical properties for coconut and soybean methyl esters, using two-dimensional computational fluid dynamics fuel spray modelling. The choice of test fuels made was due to their contrasting...... saturation-unsaturation compositions. The sensitivity analyses for non-reacting and reacting sprays were carried out against a total of 12 thermo-physical properties, at an ambient temperature of 900 K and density of 22.8 kg/m3. For the sensitivity analyses, all the thermo-physical properties were set...... as the baseline case and each property was individually replaced by that of diesel. The significance of individual thermo-physical property was determined based on the deviations found in predictions such as liquid penetration, ignition delay period and peak soot concentration when compared to those of baseline...

  12. Analyses of integrated aircraft cabin contaminant monitoring network based on Kalman consensus filter.

    Science.gov (United States)

    Wang, Rui; Li, Yanxiao; Sun, Hui; Chen, Zengqiang

    2017-11-01

    Modern civil aircraft use air-ventilated, pressurized cabins subject to limited space. In order to monitor multiple contaminants and overcome the hypersensitivity of a single sensor, the paper constructs an output-corrected integrated sensor configuration using sensors with different measurement principles, after comparing it with two other configurations. This proposed configuration works as a node in the distributed wireless sensor network for contaminant monitoring. The corresponding measurement error models of the integrated sensors are also proposed, using the Kalman consensus filter to estimate states and conduct data fusion in order to regulate the single-sensor measurement results. The paper develops a sufficient proof of Kalman consensus filter stability when considering the system and observation noises, and compares the mean estimation and mean consensus errors between the Kalman consensus filter and the local Kalman filter. Numerical examples show the effectiveness of the algorithm. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
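    The consensus idea described above can be illustrated with a minimal scalar sketch: each node runs a local Kalman update on its own measurement, then nudges its estimate toward its neighbours' estimates. This is a simplified stand-in, not the paper's algorithm; the network topology (fully connected), gains, and noise levels are invented.

```python
import numpy as np

def kcf_step(x_hat, P, z, Q, R, neighbours, eps=0.1):
    """One step of a simplified scalar Kalman consensus filter for one node.

    x_hat, P   : node's prior estimate and variance
    z          : node's own measurement (the state is a constant scalar here)
    Q, R       : process and measurement noise variances
    neighbours : prior estimates of neighbouring nodes
    """
    P = P + Q                          # predict (static state + random-walk noise)
    K = P / (P + R)                    # Kalman gain
    x_new = x_hat + K * (z - x_hat)    # local measurement update
    # consensus term: pull the estimate toward the neighbours' estimates
    x_new += eps * sum(xj - x_hat for xj in neighbours)
    P = (1 - K) * P
    return x_new, P

rng = np.random.default_rng(1)
truth, R, Q = 5.0, 0.5, 1e-4          # e.g. a constant CO2 concentration
n_nodes = 4
x = [0.0] * n_nodes
P = [1.0] * n_nodes
for _ in range(50):
    z = truth + rng.normal(scale=R ** 0.5, size=n_nodes)
    prior = list(x)
    for i in range(n_nodes):
        nbrs = [prior[j] for j in range(n_nodes) if j != i]
        x[i], P[i] = kcf_step(prior[i], P[i], z[i], Q, R, nbrs)
print(x)  # all nodes agree on an estimate near the true value
```

The consensus term is what keeps the node estimates close to one another, so an outlier-prone single sensor is regulated by the rest of the network.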

  13. Analysing Scientific Collaborations of New Zealand Institutions using Scopus Bibliometric Data

    OpenAIRE

    Aref, Samin; Friggens, David; Hendy, Shaun

    2017-01-01

    Scientific collaborations are among the main enablers of development in small national science systems. Although analysing scientific collaborations is a well-established subject in scientometrics, evaluations of scientific collaborations within a country remain speculative with studies based on a limited number of fields or using data too inadequate to be representative of collaborations at a national level. This study represents a unique view on the collaborative aspect of scientific activi...

  14. Sensitivity analyses of fast reactor systems including thorium and uranium

    International Nuclear Information System (INIS)

    Marable, J.H.; Weisbin, C.R.

    1978-01-01

    The Cross Section Evaluation Working Group (CSEWG) has, in conjunction with the development of the fifth version of ENDF/B, assembled new evaluations for 232Th and 233U. It is the purpose of this paper to describe briefly some of the more important features of these evaluations relative to ENDF/B-4, to project the change in reactor performance based upon the newer evaluated files and sensitivity coefficients for interesting design problems, and to indicate preliminary results from ongoing uncertainty analyses

  15. Analyses of liquid-gas two-phase flow in fermentation tanks

    International Nuclear Information System (INIS)

    Toi, Takashi; Serizawa, Akimi; Takahashi, Osamu; Kawara, Zensaku; Gofuku, Akio; Kataoka, Isao.

    1993-01-01

    The understanding of two-phase flow is one of the important problems for both the design and safety analyses of various engineering systems. For example, the flow conditions in beer fermentation tanks influence the quality of production and the productivity of the tank. In this study, a two-dimensional numerical calculation code based on the one-pressure two-fluid model is developed to understand the circulation structure of low-quality liquid-gas two-phase flows induced by a bubble plume in a tank. (author)

  16. X-ray optical analyses with X-Ray Absorption Package (XRAP)

    International Nuclear Information System (INIS)

    Wang, Zhibi; Kuzay, T.M.; Dejus, R.; Grace, T.

    1994-01-01

    This paper presents the X-Ray Absorption Package (XRAP) and the theoretical background for this program. XRAP is a computer code developed for the analysis of optical elements in synchrotron radiation facilities. Two main issues are addressed: (1) generating BM (bending magnet) and ID (insertion device) spectra and calculating their absorption in media, especially in such structural forms as variable-thickness windows/filters and crystals; and (2) providing a finite difference engine for fast but sophisticated thermal and stress analyses of optical elements, such as windows and filters. Radiation cooling, temperature-dependent material properties (such as thermal conductivity and the thermal expansion coefficient), etc., are taken into account in the analyses. For very complex geometry, an interface is provided directly to finite element codes such as ANSYS. Some of the present features built into XRAP include: (1) generation of BM and ID spectra; (2) photon absorption analysis of optical elements including filters, windows, mirrors, etc.; (3) heat transfer and thermal stress analyses of windows and filters and their buckling check; (4) a user-friendly graphical interface based on state-of-the-art GUI and X-window technology, which can be easily ported to other computer platforms; (5) PostScript file output of either black/white or colored graphics for total/absorbed power, temperature, stress, spectra, etc.

  17. Review of accident analyses of RB experimental reactor

    International Nuclear Information System (INIS)

    Pesic, M.

    2003-01-01

    The RB reactor is a uranium fuel heavy water moderated critical assembly that has been put and kept in operation by the VINCA Institute of Nuclear Sciences, Belgrade, Serbia and Montenegro, since April 1958. The first complete Safety Analysis Report of the RB reactor was prepared in 1961/62; yet, the first accident analysis had been made in late 1958 with the aim to examine a power transient and the total equivalent doses received by the staff during the reactivity accident that occurred on October 15, 1958. Since 1960, the RB reactor has been modified a few times. Beside the initial natural uranium metal fuel rods, new types of fuel (TVR-S types of Russian origin) consisting of 2% enriched uranium metal and 80% enriched UO2, dispersed in an aluminum matrix, have been available since 1962 and 1976, respectively. Modifications of the control and safety systems of the reactor were made occasionally. Special reactor cores were designed and constructed using all three types of fuel elements, as well as the coupled fast-thermal ones. The Nuclear Safety Committee of the VINCA Institute, an independent regulatory body, approved for usage all these modifications of the RB reactor on the basis of the Preliminary Safety Analysis Reports, which, beside proposed technical modifications and new regulation rules, included safety analyses of various possible accidents. A special attention was given (and a new safety methodology was proposed) to thorough analyses of the design-based accidents related to the coupled fast-thermal cores that included central zones of the reactor filled by the fuel elements without any moderator. In this paper, an overview of some accidents, methodologies and computation tools used for the accident analyses of the RB reactor is given. (author)

  18. Some Examples of Accident Analyses for RB Reactor

    International Nuclear Information System (INIS)

    Pesic, M.

    2002-01-01

    The RB reactor is a heavy water critical assembly operated in the Vinca Institute of Nuclear Sciences, Belgrade, Yugoslavia, since April 1959. The first Safety Analysis Report of the RB critical assembly was prepared in 1961/62, but the first accident analysis was done in late 1958 with the aim to examine the power transient and the total equivalent doses received by the staff during the reactivity accident that occurred on October 15, 1958. Since 1960, the RB reactor has been modified a few times. Beside the initial natural uranium metal fuel rods, new fuel (TVR-S types) made of 2% enriched uranium metal and 80% enriched UO2 became available in 1962 and 1976, respectively. Also, modifications of the control and safety systems of the reactor were made occasionally. Special reactor cores were created using all three types of fuel elements, among them the coupled fast-thermal ones. The Nuclear Safety Committee of the Vinca Institute, an independent regulatory body, approved all these modifications of the RB reactor for usage. For those decisions of the Committee, Preliminary Safety Analysis Reports were prepared which, beside proposed technical modifications and new regulation rules, included analyses of various possible accidents. Special attention was given, and a new methodology proposed, to thorough analyses of design-based accidents related to the coupled fast-thermal cores, which include reactor central zones filled with fuel elements without moderator. In these accidents, during an assumed flooding of the fast zone by the moderator, a very high reactivity could be inserted into the system at a very high reactivity rate. It was necessary to ensure that the safety system of the reactor had a fast response to this accident and enough (negative) reactivity to shut down the reactor in time. In this paper, a brief overview of some accidents, and of the methodology and computation tools used for the accident analyses at the RB reactor, is given. (author)

  19. Review of accident analyses of RB experimental reactor

    Directory of Open Access Journals (Sweden)

    Pešić Milan P.

    2003-01-01

    Full Text Available The RB reactor is a uranium fuel heavy water moderated critical assembly that has been put and kept in operation by the VINČA Institute of Nuclear Sciences, Belgrade, Serbia and Montenegro, since April 1958. The first complete Safety Analysis Report of the RB reactor was prepared in 1961/62; yet, the first accident analysis had been made in late 1958 with the aim to examine a power transient and the total equivalent doses received by the staff during the reactivity accident that occurred on October 15, 1958. Since 1960, the RB reactor has been modified a few times. Beside the initial natural uranium metal fuel rods, new types of fuel (TVR-S types of Russian origin) consisting of 2% enriched uranium metal and 80% enriched UO2 dispersed in an aluminum matrix have been available since 1962 and 1976, respectively. Modifications of the control and safety systems of the reactor were made occasionally. Special reactor cores were designed and constructed using all three types of fuel elements, as well as the coupled fast-thermal ones. The Nuclear Safety Committee of the VINČA Institute, an independent regulatory body, approved for usage all these modifications of the RB reactor on the basis of the Preliminary Safety Analysis Reports, which, beside proposed technical modifications and new regulation rules, included safety analyses of various possible accidents. A special attention was given (and a new safety methodology was proposed) to thorough analyses of the design-based accidents related to the coupled fast-thermal cores that included central zones of the reactor filled by the fuel elements without any moderator. In this paper, an overview of some accidents, methodologies and computation tools used for the accident analyses of the RB reactor is given.

  20. Improving word coverage using unsupervised morphological analyser

    Indian Academy of Sciences (India)

    To enable a computer to process information in human languages, ... vised morphological analyser (UMA) would learn how to analyse a language just by looking ... result for English, but they did remarkably worse for Finnish and Turkish.

  1. Long-term fertilization alters chemically-separated soil organic carbon pools: Based on stable C isotope analyses

    Science.gov (United States)

    Dou, Xiaolin; He, Ping; Cheng, Xiaoli; Zhou, Wei

    2016-01-01

    Quantification of the dynamics of soil organic carbon (SOC) pools under the influence of long-term fertilization is essential for predicting carbon (C) sequestration. We combined soil chemical fractionation with stable C isotope analyses to investigate the C dynamics of the various SOC pools after 25 years of fertilization. Five types of soil samples (0-20 and 20-40 cm), including the initial level (CK) and four fertilization treatments (inorganic nitrogen fertilizer, IN; balanced inorganic fertilizer, NPK; inorganic fertilizer plus farmyard manure, MNPK; inorganic fertilizer plus corn straw residue, SNPK), were separated into recalcitrant and labile fractions, and the fractions were analysed for C content, C:N ratios, δ13C values, and soil C and N recalcitrance indexes (RIC and RIN). Chemical fractionation showed that long-term MNPK fertilization strongly increased the SOC storage in both soil layers (0-20 cm = 1492.4 g C m⁻² and 20-40 cm = 1770.6 g C m⁻²) because of enhanced recalcitrant C (RC) and labile C (LC). The 25 years of inorganic fertilizer treatment did not increase the SOC storage, mainly because of the offsetting effects of enhanced RC and decreased LC, whereas the absence of clear SOC increases under SNPK fertilization resulted from the fast decay rates of soil C.
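    The recalcitrance index (RIC) mentioned above is commonly defined as the recalcitrant fraction's share of total C; the abstract does not give the exact formula, so the definition and the numbers below are assumptions for illustration only.

```python
def recalcitrance_index(recalcitrant_c, labile_c):
    """Recalcitrance index: percent of total C held in the recalcitrant fraction.

    Note: this is one common definition, assumed here for illustration;
    the paper's exact RIC/RIN formulation is not given in the abstract.
    """
    return 100.0 * recalcitrant_c / (recalcitrant_c + labile_c)

# invented example values (g C per m2) for a 0-20 cm layer
ric = recalcitrance_index(1100.0, 400.0)
print(ric)  # about 73.3 %
```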

  2. Mass separated neutral particle energy analyser

    International Nuclear Information System (INIS)

    Takeuchi, Hiroshi; Matsuda, Toshiaki; Miura, Yukitoshi; Shiho, Makoto; Maeda, Hikosuke; Hashimoto, Kiyoshi; Hayashi, Kazuo.

    1983-09-01

    A mass-separated neutral particle energy analyser which could simultaneously measure hydrogen and deuterium atoms emitted from a tokamak plasma was constructed. The analyser was calibrated for energy and mass separation in the energy range from 0.4 keV to 9 keV. In order to investigate the behavior of deuterons and protons in the JFT-2 tokamak plasma heated with ion cyclotron waves and neutral beam injection, this analyser was installed in the JFT-2 tokamak. It was found that the energy spectrum could be determined with sufficient accuracy. The ion temperature and the ratio of deuteron to proton density obtained from the energy spectrum were in good agreement with the values deduced from the Doppler broadening of the Ti XIV line and the line intensities of Hα and Dα, respectively. (author)

  3. Distinguishing Nonpareil marketing group almond cultivars through multivariate analyses.

    Science.gov (United States)

    Ledbetter, Craig A; Sisterson, Mark S

    2013-09-01

    More than 80% of the world's almonds are grown in California with several dozen almond cultivars available commercially. To facilitate promotion and sale, almond cultivars are categorized into marketing groups based on kernel shape and appearance. Several marketing groups are recognized, with the Nonpareil Marketing Group (NMG) demanding the highest prices. Placement of cultivars into the NMG is historical and no objective standards exist for deciding whether newly developed cultivars belong in the NMG. Principal component analyses (PCA) were used to identify nut and kernel characteristics best separating the 4 NMG cultivars (Nonpareil, Jeffries, Kapareil, and Milow) from a representative of the California Marketing Group (cultivar Carmel) and the Mission Marketing Group (cultivar Padre). In addition, discriminant analyses were used to determine cultivar misclassification rates between and within the marketing groups. All 19 evaluated carpological characters differed significantly among the 6 cultivars and during 2 harvest seasons. A clear distinction of NMG cultivars from representatives of the California and Mission Marketing Groups was evident from a PCA involving the 6 cultivars. Further, NMG kernels were successfully discriminated from kernels representing the California and Mission Marketing Groups with overall kernel misclassification of only 2% using 16 of the 19 evaluated characters. Pellicle luminosity was the most discriminating character, regardless of the character set used in analyses. Results provide an objective classification of NMG almond kernels, clearly distinguishing them from kernels of cultivars representing the California and Mission Marketing Groups. Journal of Food Science © 2013 Institute of Food Technologists® No claim to original US government works.
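    The PCA step described above can be sketched with plain linear algebra: standardize each measured character, then project the samples onto the leading singular vectors. The measurements below are invented stand-ins for the 19 carpological characters (length, width, thickness, mass, pellicle luminosity); only the mechanics are illustrated, not the paper's data.

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project standardized rows of X onto the leading principal components."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each character
    U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
    return Xs @ Vt[:n_components].T             # sample scores

# invented kernel measurements: rows = kernels, cols = length, width,
# thickness, mass, pellicle luminosity (hypothetical group means)
rng = np.random.default_rng(42)
nonpareil = rng.normal([22, 12, 7, 1.2, 60], 0.5, size=(10, 5))
mission = rng.normal([18, 11, 8, 1.0, 45], 0.5, size=(10, 5))
X = np.vstack([nonpareil, mission])
scores = pca_scores(X)
# the two groups separate along PC1 because their means differ on most characters
print(scores[:3])
```

A discriminant analysis would then be fitted in this reduced space to estimate misclassification rates between marketing groups.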

  4. Analyses of the essential oil from Bunium persicum fruit and its antioxidant constituents.

    Science.gov (United States)

    Nickavar, Bahman; Adeli, Abrisham; Nickavar, Azar

    2014-01-01

    This study aimed to analyse and identify the antioxidant constituents of the essential oil of Bunium persicum (Apiaceae) fruit. The essential oil was obtained by hydrodistillation and analysed by GC-FID and GC-MS. The essential oil was tested for antioxidant capacity in DPPH radical scavenging and linoleic acid/β-carotene assays. The TLC-bioautography method based on the DPPH radical assay and GC analyses were carried out to characterize the major antioxidant compounds in the essential oil. GC analyses showed the presence of sixteen compounds, with p-cymene (31.1%), cuminaldehyde (22.2%), and γ-terpinene (11.4%) as the main components of the essential oil. The oil exhibited good radical scavenging [IC50 (DPPH·) = 4.47 (3.96-5.05) mg/mL] and anti-lipid-peroxidation [IC50 (β-carotene bleaching) = 0.22 (0.16-0.31) mg/mL] activities. The TLC tests resulted in the identification of cuminaldehyde, p-cymene-7-ol, and cuminyl acetate as the main constituents of the active oil fraction.

  5. Making systems with mutually exclusive events analysable by standard fault tree analysis tools

    International Nuclear Information System (INIS)

    Vaurio, J.K.

    2001-01-01

    Methods are developed for analysing systems that comprise mutually exclusive events by fault tree techniques that accept only statistically independent basic events. Techniques based on equivalent models and numerical transformations are presented for phased missions and for systems with component-caused system-level common cause failures. Numerical examples illustrate the methods

  6. Aroma profile of Garnacha Tintorera-based sweet wines by chromatographic and sensorial analyses.

    Science.gov (United States)

    Noguerol-Pato, R; González-Álvarez, M; González-Barreiro, C; Cancho-Grande, B; Simal-Gándara, J

    2012-10-15

    The aroma profiles of three Garnacha Tintorera-based wines were studied: a base wine, a naturally sweet wine, and a mixture of the naturally sweet wine with another sweet wine obtained by fortification with spirits. The aroma fingerprint was traced by GC-MS analysis of volatile compounds and by sensorial analysis of odours and tastes. Among the volatile compounds, sotolon (73 μg/L) and acetoin (122 μg/L) were the two main compounds found in the naturally sweet wine. With regard to the odorant series, the most dominant ones for the Garnacha Tintorera base wine were floral, fruity and spicy. The odorant series most markedly affected by off-vine drying of the grapes were floral, caramelized and vegetal-wood. Finally, the odorant series affected by the switch-off of alcoholic fermentation with ethanol 96% (v/v) fit for human consumption, followed by oak barrel aging, were caramelized and vegetal-wood. A partial least squares (PLS-2) analysis was used to detect correlations between sets of sensory data (those obtained with the mouth and with the nose) with the ultimate aim of improving our current understanding of the flavour of Garnacha Tintorera red wines, both base and sweet. Based on the sensory dataset analysis, the descriptors with the highest weight for separating base and sweet wines from Garnacha Tintorera were sweetness, dried fruit and caramel (for sweet wines) vs. bitterness, astringency and geranium (for base wines). Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. Comparison of Numerical Analyses with a Static Load Test of a Continuous Flight Auger Pile

    Science.gov (United States)

    Hoľko, Michal; Stacho, Jakub

    2014-12-01

    The article deals with numerical analyses of a Continuous Flight Auger (CFA) pile. The analyses include a comparison of calculated and measured load-settlement curves as well as a comparison of the load distribution over the pile's length. The numerical analyses were executed using two types of software, i.e., Ansys and Plaxis, which are based on FEM calculations. The two types of software differ from each other in the way they create numerical models, model the interface between the pile and the soil, and use constitutive material models. The analyses were prepared in the form of a parametric study, in which the method of modelling the interface and the material models of the soil are compared and analysed. Our analyses show that both types of software permit the modelling of pile foundations. The Plaxis software offers advanced material models as well as the modelling of the impact of groundwater or overconsolidation. The load-settlement curve calculated using Plaxis matches the results of a static load test with more than 95 % accuracy. In comparison, the load-settlement curve calculated using Ansys yields only an approximate estimate, but the software allows for the common modelling of large structural systems together with a foundation system.

  8. Omeups: an interactive graphics program for analysing collision data

    International Nuclear Information System (INIS)

    Burgess, A.; Mason, H.E.; Tully, J.A.

    1991-01-01

    The aim of the micro-computer program OMEUPS is to provide a simple means of critically assessing and compacting collision strength data for electron impact excitation of positive ions. The program is interactive and allows data to be analysed graphically: it should be of particular interest to astrophysicists as well as to those specialising in atomic physics. The method on which the program is based allows one to interpolate or extrapolate existing data in energy and temperature; store data in compact form without losing significant information; perform Maxwell averaging; and detect printing and computational errors in tabulated data.
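    The Maxwell-averaging step mentioned above turns an energy-dependent collision strength Omega(E) into an effective, thermally averaged collision strength. A sketch using simple trapezoidal quadrature (grid and integrand here are illustrative, not OMEUPS internals):

```python
import numpy as np

# Effective (Maxwell-averaged) collision strength:
#   Upsilon(T) = integral_0^inf Omega * exp(-E/kT) d(E/kT)
# A constant Omega integrates to itself, which is a handy sanity check.
def maxwell_average(omega, x):
    """Trapezoidal estimate of Upsilon on a grid x = E/kT."""
    y = omega * np.exp(-x)
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

x = np.linspace(0.0, 40.0, 4001)          # E/kT grid; tail beyond 40 is negligible
upsilon = maxwell_average(np.ones_like(x), x)
print(round(upsilon, 4))                  # ~1.0 for a constant Omega = 1
```

Truncating the integral at E/kT = 40 discards a tail of order exp(-40), far below the quadrature error.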

  9. A dialogue game for analysing group model building: framing collaborative modelling and its facilitation

    NARCIS (Netherlands)

    Hoppenbrouwers, S.J.B.A.; Rouwette, E.A.J.A.

    2012-01-01

    This paper concerns a specific approach to analysing and structuring operational situations in collaborative modelling. Collaborative modelling is viewed here as 'the goal-driven creation and shaping of models that are based on the principles of rational description and reasoning'. Our long term

  10. Synthesis, Hirshfeld surface analyses and magnetism of a 1D Mn(II ...

    African Journals Online (AJOL)

    A new Mn-based complex of {[Mn(L)2(mi)]·H2O}n (1) (HL = p-hydroxy phenylacetic acid; mi = 1,1'-(1,4-butanediyl)bis(imidazole)), has been synthesized and structurally characterized. Single-crystal X-ray analyses reveal that compound 1 has a dinuclear Mn(II) unit linking by four carboxylate groups. The bridging N-donor ...

  11. Hedysarum L. (Fabaceae: Hedysareae) Is Not Monophyletic – Evidence from Phylogenetic Analyses Based on Five Nuclear and Five Plastid Sequences

    Science.gov (United States)

    Liu, Pei-Liang; Wen, Jun; Duan, Lei; Arslan, Emine; Ertuğrul, Kuddisi; Chang, Zhao-Yang

    2017-01-01

    The legume family (Fabaceae) exhibits a high level of species diversity and evolutionary success worldwide. Previous phylogenetic studies of the genus Hedysarum L. (Fabaceae: Hedysareae) showed that the nuclear and the plastid topologies might be incongruent, and the systematic position of the Hedysarum sect. Stracheya clade was uncertain. In this study, phylogenetic relationships of Hedysarum were investigated based on the nuclear ITS, ETS, PGDH, SQD1, TRPT and the plastid psbA-trnH, trnC-petN, trnL-trnF, trnS-trnG, petN-psbM sequences. Both nuclear and plastid data support two major lineages in Hedysarum: the Hedysarum s.s. clade and the Sartoria clade. In the nuclear tree, Hedysarum is biphyletic with the Hedysarum s.s. clade sister to the Corethrodendron + Eversmannia + Greuteria + Onobrychis clade (the CEGO clade), whereas the Sartoria clade is sister to the genus Taverniera DC. In the plastid tree, Hedysarum is monophyletic and sister to Taverniera. The incongruent position of the Hedysarum s.s. clade between the nuclear and plastid trees may be best explained by a chloroplast capture hypothesis via introgression. The Hedysarum sect. Stracheya clade is resolved as sister to the H. sect. Hedysarum clade in both nuclear and plastid trees, and our analyses support merging Stracheya into Hedysarum. Based on our new evidence from multiple sequences, Hedysarum is not monophyletic, and its generic delimitation needs to be reconsidered. PMID:28122062

  12. Atmospheric radiation environment analyses based-on CCD camera at various mountain altitudes and underground sites

    Directory of Open Access Journals (Sweden)

    Li Cavoli Pierre

    2016-01-01

    Full Text Available The purpose of this paper is to discriminate secondary atmospheric particles and identify muons by measuring the natural radiative environment in atmospheric and underground locations. A CCD camera has been used as a cosmic-ray sensor. The Low Noise Underground Laboratory of Rustrel (LSBB, France) gives access to a unique low-noise scientific environment, deep enough to ensure screening from the neutron and proton radiative components. Analyses of the charge levels induced in the pixels of the CCD camera by radiation events, together with maps of charge events versus hit-pixel position, are presented.
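    The core of such an analysis is flagging pixels whose collected charge stands well above the read-noise floor. A toy sketch on a synthetic frame (the injected "hits", noise levels and the 5-sigma cut are all assumptions, not the paper's calibration):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy CCD frame: Gaussian read noise on a flat baseline, plus a few
# injected hits mimicking charge from cosmic-ray secondaries (ADU).
frame = rng.normal(loc=100.0, scale=3.0, size=(64, 64))
hits = [(10, 12), (33, 40), (50, 7)]
for r, c in hits:
    frame[r, c] += 80.0

# Flag pixels more than 5 sigma above the frame median as charge
# events, then report event charge against hit-pixel position.
baseline = np.median(frame)
threshold = baseline + 5.0 * frame.std()
rows, cols = np.where(frame > threshold)
for r, c in zip(rows, cols):
    print(f"event at ({r}, {c}): {frame[r, c] - baseline:.0f} ADU")
```

With 4096 pixels of 3-ADU noise, the expected maximum noise excursion is around 3.5 sigma, so a 5-sigma cut cleanly separates the injected events.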

  13. Towards interoperable and reproducible QSAR analyses: Exchange of datasets.

    Science.gov (United States)

    Spjuth, Ola; Willighagen, Egon L; Guha, Rajarshi; Eklund, Martin; Wikberg, Jarl Es

    2010-06-30

    QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constrain collaborations and re-use of data. We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusions regarding descriptors by defining them crisply. 
This makes it easy to join, extend, and combine datasets and hence work collectively, but
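    The key idea of the exchange format is that every descriptor reference pins an ontology identifier plus the implementing software and its version, so a dataset's setup can be reproduced exactly. The XML fragment below is illustrative only, written in the spirit of QSAR-ML; the real schema differs in element names and structure.

```python
import xml.etree.ElementTree as ET

# A made-up, QSAR-ML-*like* dataset description (hypothetical schema):
# structures plus versioned, ontology-referenced descriptor entries.
doc = """\
<dataset>
  <structures><structure id="mol1" smiles="CCO"/></structures>
  <descriptor ontology="desc:xlogp" software="cdk" version="1.2.3"/>
  <descriptor ontology="desc:tpsa" software="cdk" version="1.2.3"/>
</dataset>"""

root = ET.fromstring(doc)
refs = [(d.get("ontology"), d.get("software"), d.get("version"))
        for d in root.findall("descriptor")]
print(refs)
```

Because each tuple fixes both the descriptor definition and the software version that computed it, two groups reading this file can regenerate numerically identical descriptor matrices.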

  14. Towards interoperable and reproducible QSAR analyses: Exchange of datasets

    Directory of Open Access Journals (Sweden)

    Spjuth Ola

    2010-06-01

    Full Text Available Abstract Background QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constrain collaborations and re-use of data. Results We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Conclusions Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusions regarding descriptors by defining them crisply. 
This makes it easy to join

  15. Register-based studies of healthcare costs

    DEFF Research Database (Denmark)

    Kruse, Marie; Christiansen, Terkel

    2011-01-01

    Introduction: The aim of this paper is to provide an overview and a few examples of how national registers are used in analyses of healthcare costs in Denmark. Research topics: The paper focuses on health economic analyses based on register data. For the sake of simplicity, the studies are divided...... into three main categories: economic evaluations of healthcare interventions, cost-of-illness analyses, and other analyses such as assessments of healthcare productivity. Conclusion: We examined a number of studies using register-based data on healthcare costs. Use of register-based data renders...

  16. A bead-based western for high-throughput cellular signal transduction analyses

    Science.gov (United States)

    Treindl, Fridolin; Ruprecht, Benjamin; Beiter, Yvonne; Schultz, Silke; Döttinger, Anette; Staebler, Annette; Joos, Thomas O.; Kling, Simon; Poetz, Oliver; Fehm, Tanja; Neubauer, Hans; Kuster, Bernhard; Templin, Markus F.

    2016-01-01

    Dissecting cellular signalling requires the analysis of large numbers of proteins. The DigiWest approach we describe here transfers the western blot to a bead-based microarray platform. By combining gel-based protein separation with immobilization on microspheres, hundreds of replicas of the initial blot are created, thus enabling the comprehensive analysis of limited material, such as cells collected by laser capture microdissection, and extending traditional western blotting to reach proteomic scales. The combination of molecular weight resolution, sensitivity and signal linearity on an automated platform enables the rapid quantification of hundreds of specific proteins and protein modifications in complex samples. This high-throughput western blot approach allowed us to identify and characterize alterations in cellular signal transduction that occur during the development of resistance to the kinase inhibitor Lapatinib, revealing major changes in the activation state of Ephrin-mediated signalling and a central role for p53-controlled processes. PMID:27659302

  17. Comparative Analysis of Upper Ocean Heat Content Variability from Ensemble Operational Ocean Analyses

    Science.gov (United States)

    Xue, Yan; Balmaseda, Magdalena A.; Boyer, Tim; Ferry, Nicolas; Good, Simon; Ishikawa, Ichiro; Rienecker, Michele; Rosati, Tony; Yin, Yonghong; Kumar, Arun

    2012-01-01

    Upper ocean heat content (HC) is one of the key indicators of climate variability on many time-scales extending from seasonal to interannual to long-term climate trends. For example, HC in the tropical Pacific provides information on thermocline anomalies that is critical for the long-lead forecast skill of ENSO. Since HC variability is also associated with SST variability, a better understanding and monitoring of HC variability can help us understand and forecast SST variability associated with ENSO and other modes such as the Indian Ocean Dipole (IOD), Pacific Decadal Oscillation (PDO), Tropical Atlantic Variability (TAV) and Atlantic Multidecadal Oscillation (AMO). An accurate ocean initialization of HC anomalies in coupled climate models could also contribute to skill in decadal climate prediction. Errors and/or uncertainties in the estimation of HC variability can be affected by many factors, including uncertainties in surface forcings, ocean model biases, and deficiencies in data assimilation schemes. Changes in observing systems can also leave an imprint on the estimated variability. The availability of multiple operational ocean analyses (ORAs) that are routinely produced by operational and research centers around the world provides an opportunity to assess uncertainties in HC analyses and to help identify gaps in observing systems as they impact the quality of ORAs and therefore climate model forecasts. A comparison of ORAs also gives an opportunity to identify deficiencies in data assimilation schemes, and can be used as a basis for the development of real-time multi-model ensemble HC monitoring products. The OceanObs09 Conference called for an intercomparison of ORAs and the use of ORAs for global ocean monitoring. As a follow-up, we intercompared HC variations from ten ORAs -- two objective analyses based on in-situ data only and eight model analyses based on ocean data assimilation systems. 
The mean, annual cycle, interannual variability and long-term trend of HC have
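    The basic multi-product statistics behind such an intercomparison (ensemble mean as the best estimate, cross-product spread as an uncertainty proxy) can be sketched with synthetic series; the ten "analyses" below are random draws around a common annual cycle, not real ORA output.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy monthly HC anomalies from 10 hypothetical ocean analyses
# (rows = products, cols = months); units are arbitrary.
months = 120
signal = np.sin(2 * np.pi * np.arange(months) / 12)  # shared annual cycle
oras = signal + 0.2 * rng.standard_normal((10, months))

# Ensemble mean tracks the shared signal; the standard deviation
# across products is a simple proxy for analysis uncertainty.
ens_mean = oras.mean(axis=0)
ens_spread = oras.std(axis=0)
corr = float(np.corrcoef(ens_mean, signal)[0, 1])
spread = float(ens_spread.mean())
print(round(corr, 3), round(spread, 3))
```

Averaging ten products shrinks the noise on the mean by about a factor of sqrt(10), which is why the ensemble mean correlates so tightly with the underlying cycle here.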

  18. Two time-series analyses of the impact of antibiotic consumption and alcohol-based hand disinfection on the incidences of nosocomial methicillin-resistant Staphylococcus aureus infection and Clostridium difficile infection.

    Science.gov (United States)

    Kaier, Klaus; Hagist, Christian; Frank, Uwe; Conrad, Andreas; Meyer, Elisabeth

    2009-04-01

    To determine the impact of antibiotic consumption and alcohol-based hand disinfection on the incidences of nosocomial methicillin-resistant Staphylococcus aureus (MRSA) infection and Clostridium difficile infection (CDI). Two multivariate time-series analyses were performed that used as dependent variables the monthly incidences of nosocomial MRSA infection and CDI at the Freiburg University Medical Center during the period January 2003 through October 2007. The volume of alcohol-based hand rub solution used per month was quantified in liters per 1,000 patient-days. Antibiotic consumption was calculated in terms of the number of defined daily doses per 1,000 patient-days per month. The use of alcohol-based hand rub was found to have a significant impact on the incidence of nosocomial MRSA infection: greater use of hand rub was associated with a lower incidence of nosocomial MRSA infection. Conversely, a higher level of consumption of selected antimicrobial agents was associated with a higher incidence of nosocomial MRSA infection. This relationship held for the use of second-generation cephalosporins (P= .023), third-generation cephalosporins (P= .05), fluoroquinolones (P= .01), and lincosamides (P= .05). The multivariate analysis (R2=0.55) showed that a higher level of consumption of third-generation cephalosporins (P= .008), fluoroquinolones (P= .084), and/or macrolides (P= .007) was associated with a higher incidence of CDI. A correlation with the use of alcohol-based hand rub was not detected. In two multivariate time-series analyses, we were able to show the impact of hand hygiene and antibiotic use on the incidence of nosocomial MRSA infection, but we found no association between hand hygiene and the incidence of CDI.
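    The regression core of such a multivariate time-series analysis (monthly incidence regressed on exposure series) can be sketched on synthetic data; the series, effect sizes and the omission of lag terms below are all assumptions, not the Freiburg data or model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 58  # months, January 2003 through October 2007

# Synthetic monthly series (illustrative only):
hand_rub = 30.0 + 20.0 * rng.random(n)        # litres / 1,000 patient-days
cephalosporins = 55.0 + 15.0 * rng.random(n)  # DDD / 1,000 patient-days
mrsa = (5.0 - 0.05 * hand_rub + 0.04 * cephalosporins
        + 0.3 * rng.standard_normal(n))       # incidence / 1,000 patient-days

# Ordinary least squares of incidence on both exposures; the fitted
# signs reproduce the protective (negative) hand-rub effect and the
# harmful (positive) cephalosporin effect built into the toy data.
X = np.column_stack([np.ones(n), hand_rub, cephalosporins])
coef, *_ = np.linalg.lstsq(X, mrsa, rcond=None)
print(f"hand-rub effect: {coef[1]:+.3f}, cephalosporin effect: {coef[2]:+.3f}")
```

A real transfer-function or ARIMA-style analysis would additionally model autocorrelation and lagged effects; this sketch shows only the sign logic of the multivariate step.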

  19. [Noonan syndrome can be diagnosed clinically and through molecular genetic analyses].

    Science.gov (United States)

    Henningsen, Marie Krab; Jelsig, Anne Marie; Andersen, Helle; Brusgaard, Klaus; Ousager, Lilian Bomme; Hertz, Jens Michael

    2015-08-03

    Noonan syndrome is part of the group of RASopathies caused by germ line mutations in genes involved in the RAS/MAPK pathway. There is substantial phenotypic overlap among the RASopathies. Diagnosis of Noonan syndrome is often based on clinical features including dysmorphic facial features, short stature and congenital heart disease. Rapid advances in sequencing technology have made molecular genetic analyses a helpful tool in diagnosing and distinguishing Noonan syndrome from other RASopathies.

  20. Behavior of underclad cracks in reactor pressure vessels - evaluation of mechanical analyses with tests on cladded mock-ups

    International Nuclear Information System (INIS)

    Moinereau, D.; Rousselier, G.; Bethmont, M.

    1993-01-01

    The innocuity of underclad flaws in reactor pressure vessels must be demonstrated in the French safety analyses, particularly in the case of a severe transient at the end of the pressure vessel lifetime, because of the radiation embrittlement of the vessel material. Safety analyses are usually performed with elastic and elasto-plastic analyses taking into account the effect of the stainless steel cladding. EDF has started a program including experiments on large cladded specimens and their interpretation. The purpose of this program is to evaluate the different methods of fracture analysis used in safety studies. Several specimens made of ferritic steel A508 Cl 3 with stainless steel cladding, containing small artificial defects, were loaded in four-point bending. The experiments were performed at very low temperature to simulate radiation embrittlement and to obtain crack instability by cleavage fracture. Three tests were performed on mock-ups containing a small underclad crack (depth about 5 mm) and a fourth test was performed on one mock-up with a larger crack (depth about 13 mm). In each case, crack instability occurred by cleavage fracture in the base metal, without crack arrest, at a temperature of about -170 deg C. Each test is interpreted using linear elastic analysis and elastic-plastic analysis by two-dimensional finite element computations. The fractures are conservatively predicted: the stress intensity factors deduced from the computations (Kcp or KJ) are always greater than the base metal toughness. The comparison between the elastic analyses (including two plasticity corrections) and the elastic-plastic analyses shows that the elastic analyses are often conservative. The beneficial effect of the cladding in the analyses is also shown: the analyses are too conservative if the cladding effect is not taken into account. (authors). 9 figs., 6 tabs., 10 refs
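    The elastic side of such an assessment reduces to comparing a computed stress-intensity factor against the base-metal toughness. A generic LEFM screening check in that spirit, with the classic surface-crack form K_I = Y * sigma * sqrt(pi * a); every number below is illustrative, not a value from the EDF tests.

```python
import math

# Generic linear-elastic fracture-mechanics check (illustrative only).
sigma = 300.0e6   # Pa, applied bending stress
a = 5.0e-3        # m, crack depth (~5 mm, as for the small underclad cracks)
Y = 1.12          # geometry factor for a shallow surface crack
k_ic = 60.0e6     # Pa*sqrt(m), assumed low-temperature base-metal toughness

k_i = Y * sigma * math.sqrt(math.pi * a)
print(f"K_I = {k_i / 1e6:.1f} MPa*sqrt(m), unstable: {k_i > k_ic}")
```

In the actual study the K values came from 2D finite element computations (Kcp, KJ) rather than a handbook formula, and the conservatism lies in those computed K values exceeding the measured toughness.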

  1. Uptake of systematic reviews and meta-analyses based on individual participant data in clinical practice guidelines: descriptive study

    NARCIS (Netherlands)

    Vale, C.L.; Rydzewska, L.H.; Rovers, M.M.; Emberson, J.R.; Gueyffier, F.; Stewart, L.A.

    2015-01-01

    OBJECTIVE: To establish the extent to which systematic reviews and meta-analyses of individual participant data (IPD) are being used to inform the recommendations included in published clinical guidelines. DESIGN: Descriptive study. SETTING: Database maintained by the Cochrane IPD Meta-analysis

  2. A History of Rotorcraft Comprehensive Analyses

    Science.gov (United States)

    Johnson, Wayne

    2013-01-01

    A history of the development of rotorcraft comprehensive analyses is presented. Comprehensive analyses are digital computer programs that calculate the aeromechanical behavior of the rotor and aircraft, bringing together the most advanced models of the geometry, structure, dynamics, and aerodynamics available in rotary wing technology. The development of the major codes of the last five decades from industry, government, and universities is described. A number of common themes observed in this history are discussed.

  3. Energy and exergy analyses of an integrated solar heat pump system

    International Nuclear Information System (INIS)

    Suleman, F.; Dincer, I.; Agelin-Chaab, M.

    2014-01-01

    An integrated solar and heat pump based system for industrial heating is developed in this study. The system comprises a heat pump cycle for process water heating and solar energy for another industrial heating process. Comprehensive energy and exergy analyses are performed on the system. As expected from the use of green and environmentally friendly energy sources, these analyses yield compelling results. The results show that the energy efficiency of the process is 58% while the exergy efficiency is 75%. The energetic COP of the heat pump cycle is 3.54 whereas its exergy efficiency is 42.5%. Moreover, the energetic COP of the system is 2.97 and the exergy efficiency of the system is 35.7%. In the parametric study, variations such as changing the condenser temperature and pressure also show positive results. - Highlights: • An integrated system using a renewable energy source, applicable in the textile industry, is analysed. • Energy losses and exergy destructions are calculated at all major components. • Energy and exergy efficiencies of all subunits, subsystems and the overall system are determined. • A parametric study shows the effect of environmental and operating conditions on efficiencies. • Solar energy for heating in the textile industry is efficient and environmentally friendly.
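    The distinction between the energetic COP (a plain Q/W ratio) and the exergetic figure (which discounts delivered heat by its Carnot factor) can be shown with a short calculation; the heat duty, work and temperatures below are hypothetical, chosen only so the Q/W ratio matches the paper's 3.54.

```python
# Hypothetical heat-pump figures (illustrative, not the paper's inputs).
q_heating = 354.0      # kW, heat delivered by the condenser
w_compressor = 100.0   # kW, compressor work input

cop_energy = q_heating / w_compressor   # energetic COP = Q / W
print(round(cop_energy, 2))             # 3.54

# The exergetic measure weights the delivered heat by its Carnot
# factor (1 - T0/T): heat near ambient temperature carries little exergy.
t0, t_supply = 298.15, 358.15           # K, ambient and supply temperatures
exergy_of_heat = q_heating * (1.0 - t0 / t_supply)
cop_exergy = exergy_of_heat / w_compressor
print(round(cop_exergy, 3))
```

Because the Carnot factor is well below one for low-grade heat, the exergetic figure is always far smaller than the energetic COP, which is why the paper's 42.5% sits so far below 3.54.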

  4. Ultrastructure of spermatozoa of spider crabs, family Mithracidae (Crustacea, Decapoda, Brachyura): Integrative analyses based on morphological and molecular data.

    Science.gov (United States)

    Assugeni, Camila de O; Magalhães, Tatiana; Bolaños, Juan A; Tudge, Christopher C; Mantelatto, Fernando L; Zara, Fernando J

    2017-12-01

    Recent studies based on morphological and molecular data provide a new perspective concerning taxonomic aspects of the brachyuran family Mithracidae. These studies proposed a series of nominal changes and indicated that the family is actually represented by a different number and representatives of genera than previously thought. Here, we provide a comparative description of the ultrastructure of spermatozoa and spermatophores of some species of Mithracidae in a phylogenetic context. The ultrastructure of the spermatozoa and spermatophore was observed by scanning and transmission electron microscopy. The most informative morphological characters analysed were thickness of the operculum, shape of the perforatorial chamber and shape and thickness of the inner acrosomal zone. As a framework, we used a topology based on a phylogenetic analysis using mitochondrial data obtained here and from previous studies. Our results indicate that closely related species share a series of morphological characteristics of the spermatozoa. A thick operculum, for example, is a feature observed in species of the genera Amphithrax, Teleophrys, and Omalacantha in contrast to the slender operculum observed in Mithraculus and Mithrax. Amphithrax and Teleophrys have a rhomboid perforatorial chamber, while Mithraculus, Mithrax, and Omalacantha show a wider, deltoid morphology. Furthermore, our results are in agreement with recently proposed taxonomic changes including the separation of the genera Mithrax (previously Damithrax), Amphithrax (previously Mithrax) and Mithraculus, and the synonymy of Mithrax caribbaeus with Mithrax hispidus. Overall, the spermiotaxonomy of these species of Mithracidae represent a novel set of data that corroborates the most recent taxonomic revision of the family and can be used in future taxonomic and phylogenetic studies within this family. © 2017 Wiley Periodicals, Inc.

  5. Elastic meson-nucleon partial wave scattering analyses

    International Nuclear Information System (INIS)

    Arndt, R.A.

    1986-01-01

    Comprehensive analyses of π-N elastic scattering data below 1100 MeV (Tlab) and K+p scattering below 3 GeV/c (Plab) are discussed. Also discussed is a package of computer programs and databases (scattering data and solution files) through which users can ''explore'' these interactions in great detail; this package is known by the acronym SAID (for Scattering Analysis Interactive Dialin) and is accessible on VAX backup tapes, or by dialin to the VPI computers. The π-N and K+p interactions are described as seen through the SAID programs. A procedure is described for generating an interpolating array from any of the solutions encoded in SAID; this array can then be used through a Fortran-callable subroutine (supplied as part of SAID) to give excellent amplitude reconstructions over a broad kinematic range.
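    The amplitude reconstruction such a solution file enables is, at heart, a partial-wave sum f(theta) = (1/k) * sum_l (2l+1) * a_l * P_l(cos theta). A sketch with made-up complex partial-wave amplitudes (not a SAID solution):

```python
import numpy as np
from numpy.polynomial import legendre

# Hypothetical spinless partial-wave amplitudes for l = 0, 1, 2.
k = 1.0                                                   # wavenumber, arbitrary units
a_l = np.array([0.3 + 0.1j, 0.2 + 0.05j, 0.05 + 0.01j])

def amplitude(cos_theta):
    """Scattering amplitude via a Legendre series in cos(theta)."""
    coeffs = (2 * np.arange(len(a_l)) + 1) * a_l
    return legendre.legval(cos_theta, coeffs) / k

cos_t = np.linspace(-1.0, 1.0, 5)
dsigma = np.abs(amplitude(cos_t)) ** 2   # differential cross-section
print(np.round(dsigma, 4))
```

Real pion-nucleon amplitudes carry spin (two invariant amplitudes) and isospin structure, so this scalar sum only illustrates the interpolation-and-reconstruction idea.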

  6. Advanced computational tools and methods for nuclear analyses of fusion technology systems

    International Nuclear Information System (INIS)

    Fischer, U.; Chen, Y.; Pereslavtsev, P.; Simakov, S.P.; Tsige-Tamirat, H.; Loughlin, M.; Perel, R.L.; Petrizzi, L.; Tautges, T.J.; Wilson, P.P.H.

    2005-01-01

    An overview is presented of advanced computational tools and methods developed recently for nuclear analyses of Fusion Technology systems such as the experimental device ITER ('International Thermonuclear Experimental Reactor') and the intense neutron source IFMIF ('International Fusion Material Irradiation Facility'). These include Monte Carlo based computational schemes for the calculation of three-dimensional shut-down dose rate distributions, methods, codes and interfaces for the use of CAD geometry models in Monte Carlo transport calculations, algorithms for Monte Carlo based sensitivity/uncertainty calculations, as well as computational techniques and data for IFMIF neutronics and activation calculations. (author)

  7. Interim Basis for PCB Sampling and Analyses

    International Nuclear Information System (INIS)

    BANNING, D.L.

    2001-01-01

    This document was developed as an interim basis for sampling and analysis of polychlorinated biphenyls (PCBs) and will be used until a formal data quality objective (DQO) document is prepared and approved. On August 31, 2000, the Framework Agreement for Management of Polychlorinated Biphenyls (PCBs) in Hanford Tank Waste was signed by the U.S. Department of Energy (DOE), the Environmental Protection Agency (EPA), and the Washington State Department of Ecology (Ecology) (Ecology et al. 2000). This agreement outlines the management of double-shell tank (DST) waste as Toxic Substances Control Act (TSCA) PCB remediation waste based on a risk-based disposal approval option per Title 40 of the Code of Federal Regulations 761.61(c). The agreement calls for ''Quantification of PCBs in DSTs, single shell tanks (SSTs), and incoming waste to ensure that the vitrification plant and other ancillary facilities PCB waste acceptance limits and the requirements of the anticipated risk-based disposal approval are met.'' Waste samples will be analyzed for PCBs to satisfy this requirement. This document describes the DQO process undertaken to assure appropriate data will be collected to support management of PCBs and is presented in a DQO format. The DQO process was implemented in accordance with the U.S. Environmental Protection Agency's QA/G-4, Guidance for the Data Quality Objectives Process (EPA 1994), and the Data Quality Objectives for Sampling and Analyses, HNF-IP-0842/Rev.1A, Vol. IV, Section 4.16 (Banning 1999).

  8. Analyses and optimization of Lee propagation model for LoRa 868 MHz network deployments in urban areas

    Directory of Open Access Journals (Sweden)

    Dobrilović Dalibor

    2017-01-01

    Full Text Available The recent fast expansion of ICT and the rapid appearance of new technologies have raised the importance of fast and accurate planning and deployment of emerging communication technologies, especially wireless ones. This paper analyses the possible use of the Lee propagation model for the planning, design and management of networks based on LoRa 868 MHz technology. LoRa is a wireless technology which can be deployed in various Internet of Things and Smart City scenarios in urban areas. The analyses are based on a comparison of field measurements with model calculations. Besides analysing the usability of the Lee propagation model, a possible optimization of the model is discussed as well. The research results can be used for accurate design, planning and preparation of high-performance wireless resource management for various Internet of Things and Smart City applications in urban areas based on LoRa or similar wireless technologies. The equipment used for the measurements is based on open-source hardware.
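    One common textbook form of the Lee model writes path loss as a reference loss at 1 km plus a per-decade distance slope, minus an environment correction factor. The defaults below are illustrative fit parameters, not calibrated LoRa 868 MHz constants from the paper.

```python
import math

def lee_path_loss_db(d_km, l0_db=129.0, slope_db_per_decade=38.4, f0_db=0.0):
    """Path loss (dB) at distance d_km in a common textbook form of the
    Lee model: L = L0 + gamma * log10(d / d0) - F0, with d0 = 1 km.
    l0_db and the slope are environment-specific fitted parameters;
    the defaults here are assumptions for illustration only."""
    return l0_db + slope_db_per_decade * math.log10(d_km / 1.0) - f0_db

# Doubling the distance adds slope * log10(2) dB of extra loss.
extra = lee_path_loss_db(2.0) - lee_path_loss_db(1.0)
print(round(extra, 2))
```

Optimizing the model against field measurements, as the paper describes, amounts to refitting `l0_db`, the slope and `f0_db` to minimize the residual against the measured received power.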

  9. Boron analyses in the reactor coolant system of French PWR by acid-base titration ([B]) and ICP-MS (10B atomic %): key to NPP safety

    International Nuclear Information System (INIS)

    Jouvet, Fabien; Roux, Sylvie; Carabasse, Stephanie; Felgines, Didier

    2012-09-01

    Boron is widely used by nuclear power plants and especially by EDF Pressurized Water Reactors to ensure the control of the neutron rate in the reactor coolant system and, by this means, of the fission reaction. Boron analysis is thus a major safety factor which enables operators to guarantee the permanent control of the reactor. This paper discusses two kinds of analyses carried out by EDF on boron, recently upgraded to meet new method-validation standards and to enhance measurement quality by reducing uncertainties: acid-base titration of total boron ([B]) and determination of the boron isotopic composition (10B atomic %) by inductively coupled plasma mass spectrometry (ICP-MS). (authors)
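    The arithmetic behind the two measurements is straightforward and can be sketched as follows; all volumes, concentrations and the measured isotope ratio below are made-up example values, not EDF plant data.

```python
# 1) Total boron by acid-base titration: complexing boric acid with
#    mannitol makes it titratable as a monoprotic acid, so moles of
#    NaOH at the endpoint equal moles of boron in the aliquot.
c_naoh = 0.1000     # mol/L titrant
v_naoh = 9.25e-3    # L of NaOH used at the endpoint
v_sample = 10.0e-3  # L of coolant aliquot
m_boron = 10.811    # g/mol
boron_mg_per_l = c_naoh * v_naoh / v_sample * m_boron * 1000.0
print(f"[B] = {boron_mg_per_l:.0f} mg/L")

# 2) 10B atomic % from an ICP-MS isotope-ratio measurement.
ratio_10_11 = 0.2473  # measured 10B/11B count ratio (natural boron ~0.247)
atom_pct_10b = 100.0 * ratio_10_11 / (1.0 + ratio_10_11)
print(f"10B = {atom_pct_10b:.2f} atom %")
```

In practice the ICP-MS ratio would first be corrected for mass bias against a certified isotopic standard; that calibration step is omitted here.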

  10. The Japanese Society of Pathology Guidelines on the handling of pathological tissue samples for genomic research: Standard operating procedures based on empirical analyses.

    Science.gov (United States)

    Kanai, Yae; Nishihara, Hiroshi; Miyagi, Yohei; Tsuruyama, Tatsuhiro; Taguchi, Kenichi; Katoh, Hiroto; Takeuchi, Tomoyo; Gotoh, Masahiro; Kuramoto, Junko; Arai, Eri; Ojima, Hidenori; Shibuya, Ayako; Yoshida, Teruhiko; Akahane, Toshiaki; Kasajima, Rika; Morita, Kei-Ichi; Inazawa, Johji; Sasaki, Takeshi; Fukayama, Masashi; Oda, Yoshinao

    2018-02-01

    Genome research using appropriately collected pathological tissue samples is expected to yield breakthroughs in the development of biomarkers and identification of therapeutic targets for diseases such as cancers. In this connection, the Japanese Society of Pathology (JSP) has developed "The JSP Guidelines on the Handling of Pathological Tissue Samples for Genomic Research" based on an abundance of data from empirical analyses of tissue samples collected and stored under various conditions. Tissue samples should be collected from appropriate sites within surgically resected specimens, without disturbing the features on which pathological diagnosis is based, while avoiding bleeding or necrotic foci. They should be collected as soon as possible after resection: at the latest within about 3 h of storage at 4°C. Preferably, snap-frozen samples should be stored in liquid nitrogen (about -180°C) until use. When intending to use genomic DNA extracted from formalin-fixed paraffin-embedded tissue, 10% neutral buffered formalin should be used. Insufficient fixation and overfixation must both be avoided. We hope that pathologists, clinicians, clinical laboratory technicians and biobank operators will come to master the handling of pathological tissue samples based on the standard operating procedures in these Guidelines to yield results that will assist in the realization of genomic medicine. © 2018 The Authors. Pathology International published by Japanese Society of Pathology and John Wiley & Sons Australia, Ltd.

  11. Importance of neutralization sieve analyses when seeking correlates of HIV-1 vaccine efficacy.

    Science.gov (United States)

    Montefiori, David C

    2014-01-01

    This commentary describes a rationale for the use of breakthrough viruses from clinical trial participants to assess neutralizing antibodies as a correlate of HIV-1 vaccine efficacy. The rationale is based on principles of a genetic sieve analysis, where the 2 analyses may be cooperative for delineating neutralizing antibodies as a mechanistic correlate of protection.

  12. Mitogenomic analyses from ancient DNA

    DEFF Research Database (Denmark)

    Paijmans, Johanna L. A.; Gilbert, Tom; Hofreiter, Michael

    2013-01-01

    The analysis of ancient DNA is playing an increasingly important role in conservation genetic, phylogenetic and population genetic analyses, as it allows incorporating extinct species into DNA sequence trees and adds time depth to population genetics studies. For many years, these types of DNA...... analyses (whether using modern or ancient DNA) were largely restricted to the analysis of short fragments of the mitochondrial genome. However, due to many technological advances during the past decade, a growing number of studies have explored the power of complete mitochondrial genome sequences...... yielded major progress with regard to both the phylogenetic positions of extinct species, as well as resolving population genetics questions in both extinct and extant species....

  13. Risk analyses of nuclear power plants

    International Nuclear Information System (INIS)

    Jehee, J.N.T.; Seebregts, A.J.

    1991-02-01

Probabilistic risk analyses of nuclear power plants are carried out by systematically analysing the possible consequences of a broad spectrum of accident causes. The risk can be expressed as the probabilities of core meltdown, radioactive releases, or harmful effects on the environment. Following the risk policies for chemical installations, as expressed in the mandatory External Safety Reports (EVRs) and in publications such as 'How to deal with risks', probabilistic risk analyses are required for nuclear power plants
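As a minimal illustration of how such probabilistic analyses combine event probabilities, here is a toy fault-tree sketch; the gate structure and all failure probabilities are invented, not taken from the report:

```python
# Toy fault-tree arithmetic: failure probabilities are combined through
# OR gates (any independent event occurs) and AND gates (all occur).
# All numbers below are illustrative assumptions.

def p_or(*ps):
    """Probability that at least one independent event occurs."""
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

def p_and(*ps):
    """Probability that all independent events occur."""
    r = 1.0
    for p in ps:
        r *= p
    return r

# Hypothetical top event: an initiating event AND failure of both
# redundant cooling trains (a train fails if its pump OR valve fails).
train = p_and  # (placeholder comment removed)
train = p_or(1e-2, 5e-3)                 # pump or valve failure
core_damage = p_and(1e-1, train, train)  # initiator and both trains
print(core_damage)
```

The point of the sketch is only the combinatorics: redundancy enters as a squared (AND) term, which is why the top-event probability is orders of magnitude below any single component's failure probability.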

  14. An MDE Approach for Modular Program Analyses

    NARCIS (Netherlands)

    Yildiz, Bugra Mehmet; Bockisch, Christoph; Aksit, Mehmet; Rensink, Arend

    Program analyses are an important tool to check if a system fulfills its specification. A typical implementation strategy for program analyses is to use an imperative, general-purpose language like Java, and access the program to be analyzed through libraries that offer an API for reading, writing

  15. VALUE-BASED MEDICINE AND OPHTHALMOLOGY: AN APPRAISAL OF COST-UTILITY ANALYSES

    Science.gov (United States)

    Brown, Gary C; Brown, Melissa M; Sharma, Sanjay; Brown, Heidi; Smithen, Lindsay; Leeser, David B; Beauchamp, George

    2004-01-01

Purpose: To ascertain the extent to which ophthalmologic interventions have been evaluated in value-based medicine format. Methods: Retrospective literature review. Papers in the healthcare literature utilizing cost-utility analysis were reviewed by researchers at the Center for Value-Based Medicine, Flourtown, Pennsylvania. A literature review of papers addressing the cost-utility analysis of ophthalmologic procedures in the United States over the 12-year period from 1992 to 2003 was undertaken using the National Library of Medicine and EMBASE databases. The cost-utility of ophthalmologic interventions in inflation-adjusted (real) year 2003 US dollars expended per quality-adjusted life-year ($/QALY) was ascertained in all instances. Results: A total of 19 papers were found, covering 25 interventions. The median cost-utility of the ophthalmologic interventions was $5,219/QALY, with a range from $746/QALY to $6.5 million/QALY. Conclusions: The majority of ophthalmologic interventions are especially cost-effective by conventional standards, because of the substantial value they confer to patients with eye diseases for the resources expended. PMID:15747756

  16. En kvantitativ metode til analyse af radio

    Directory of Open Access Journals (Sweden)

    Christine Lejre

    2014-06-01

Full Text Available In the Danish as well as the international radio literature, proposed methods for analysing the radio medium are sparse. This is presumably because radio is difficult to analyse: it is a medium that is not visualized through images or supported by printed text. This article describes a new quantitative method for the analysis of radio that takes particular account of the modality of the radio medium, namely sound structured as a linear progression in time. The method thus accommodates radio both as a time-based medium and as a blind medium. It was developed in connection with a comparative analysis of cultural programmes on P1 and Radio24syv carried out for Danmarks Radio. The article argues that the method is well suited to the analysis not only of radio but also of other media platforms and of different journalistic subject areas.

  17. Treatment of visceral leishmaniasis: model-based analyses on the spread of antimony-resistant L. donovani in Bihar, India.

    Directory of Open Access Journals (Sweden)

    Anette Stauch

Full Text Available BACKGROUND: Pentavalent antimonials have been the mainstay of antileishmanial therapy for decades, but increasing failure rates under antimonial treatment have challenged the further use of these drugs in the Indian subcontinent. Experimental evidence has suggested that parasites which are resistant to antimonials have superior survival skills compared with sensitive ones, even in the absence of antimonial treatment. METHODS AND FINDINGS: We use simulation studies based on a mathematical L. donovani transmission model to identify parameters which can explain why treatment failure rates under antimonial treatment increased up to 65% in Bihar between 1980 and 1997. Model analyses suggest that resistance to treatment alone cannot explain the observed treatment failure rates. We explore two hypotheses referring to an increased fitness of antimony-resistant parasites: the additional fitness is (i) disease-related, by causing more clinical cases (higher pathogenicity) or more severe disease (higher virulence), or (ii) transmission-related, by increasing the transmissibility from sand flies to humans or vice versa. CONCLUSIONS: Both hypotheses can potentially explain the Bihar observations. However, increased transmissibility appears the more plausible explanation because it can operate in the background of asymptomatically transmitted infection, whereas disease-related factors would most probably be observable. Irrespective of the cause of the fitness advantage, parasites with a higher fitness will eventually replace sensitive parasites, even if antimonials are replaced by another drug.
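The replacement argument in the conclusions can be illustrated with a deliberately simplified two-strain competition sketch; the authors' actual model includes sand-fly transmission and is far richer, and all rates below are invented:

```python
# Deliberately simplified two-strain competition sketch (NOT the authors'
# L. donovani transmission model): two parasite strains share one
# susceptible pool, and the antimony-resistant strain is given a higher
# transmissibility. All rates and initial values are invented.

def simulate(beta_s=0.20, beta_r=0.26, gamma=0.10, steps=5000, dt=0.1):
    """Euler steps of dI/dt = beta*I*S - gamma*I for each strain,
    with S = 1 - Is - Ir the shared susceptible fraction."""
    Is, Ir = 0.10, 0.01   # sensitive starts common, resistant rare
    for _ in range(steps):
        S = 1.0 - Is - Ir
        dIs = beta_s * Is * S - gamma * Is
        dIr = beta_r * Ir * S - gamma * Ir
        Is += dt * dIs
        Ir += dt * dIr
    return Is, Ir

Is, Ir = simulate()
# The fitter (more transmissible) strain displaces the sensitive one
# even though it started at a tenth of the sensitive strain's prevalence.
print(f"sensitive={Is:.4f}  resistant={Ir:.4f}")
```

This reproduces only the qualitative point of the abstract: a transmissibility advantage alone is enough for the resistant strain to take over, with no drug pressure in the sketch at all.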

  18. Lagrangian Coherent Structure Analysis of Terminal Winds: Three-Dimensionality, Intramodel Variations, and Flight Analyses

    Directory of Open Access Journals (Sweden)

    Brent Knutson

    2015-01-01

Full Text Available We present a study of three-dimensional Lagrangian coherent structures (LCS) near the Hong Kong International Airport and relate it to previous developments of two-dimensional (2D) LCS analyses. The LCS are contrasted among three independent models and against 2D coherent Doppler light detection and ranging (LIDAR) data. Adding the velocity information perpendicular to the LIDAR scanning cone helps solidify flow structures inferred from previous studies; contrast among the models reveals the intramodel variability; and comparison with flight data evaluates the performance of the models in terms of Lagrangian analyses. We find that, while the three models and the LIDAR recover similar features of the windshear experienced by a landing aircraft (along the landing trajectory), their Lagrangian signatures over the entire domain are quite different: a portion of each numerical model captures certain features resembling the LCS extracted from independent 2D LIDAR analyses based on observations.

  19. Relap5/Mod2.5 analyses of SG primary collector head rupture in WWER-440 reactor

    International Nuclear Information System (INIS)

    Szczurek, J.

    1995-01-01

The paper presents the results of analyses of steam generator (SG) manifold cover rupture performed with RELAP5/MOD2.5 (version provided by RMA, Albuquerque, for PC PPS). The calculations presented are based on the RELAP5 input deck for the WWER-440/213 Bohunice NPP, developed within the framework of IAEA TC Project RER/9/004. The analyses presented are directed toward determining the maximum amount of reactor coolant discharged into the secondary coolant system and the maximum amount of contaminated coolant released to the atmosphere. In all cases considered in the analysis, maximum ECCS injection capacity is assumed. The paper includes only the cases without any operator actions within the time period covered by the analyses. In particular, the primary loop isolation valves are not used for isolating the broken steam generator. Two scenarios are analysed: with and without the SG safety valve stuck open

  20. Relap5/Mod2.5 analyses of SG primary collector head rupture in WWER-440 reactor

    Energy Technology Data Exchange (ETDEWEB)

    Szczurek, J. [Inst. of Atomic Energy, Swierk (Poland)

    1995-12-31

The paper presents the results of analyses of steam generator (SG) manifold cover rupture performed with RELAP5/MOD2.5 (version provided by RMA, Albuquerque, for PC PPS). The calculations presented are based on the RELAP5 input deck for the WWER-440/213 Bohunice NPP, developed within the framework of IAEA TC Project RER/9/004. The analyses presented are directed toward determining the maximum amount of reactor coolant discharged into the secondary coolant system and the maximum amount of contaminated coolant released to the atmosphere. In all cases considered in the analysis, maximum ECCS injection capacity is assumed. The paper includes only the cases without any operator actions within the time period covered by the analyses. In particular, the primary loop isolation valves are not used for isolating the broken steam generator. Two scenarios are analysed: with and without the SG safety valve stuck open. 3 refs.

  2. Use of flow models to analyse loss of coolant accidents

    International Nuclear Information System (INIS)

    Pinet, Bernard

    1978-01-01

This article summarises current work on developing the use of flow models to analyse loss-of-coolant accidents in pressurized-water plants. This work is being done jointly, in the context of the LOCA Technical Committee, by the CEA, EDF and FRAMATOME. The construction of the flow model is closely based on theoretical studies of the two-fluid model. The laws of transfer at the interface and at the wall are tested experimentally. The representativity of the model then has to be checked in experiments involving several elementary physical phenomena

  3. One- and two-dimensional heating analyses of fusion synfuel blankets

    International Nuclear Information System (INIS)

    Tsang, J.S.K.; Lazareth, O.W.; Powell, J.R.

    1979-01-01

Comparisons between one- and two-dimensional neutronics and heating analyses were performed on a Brookhaven-designed fusion reactor blanket featuring synthetic fuel production. In this two-temperature-region blanket design, the structural shell is stainless steel and the interior of the module is a packed bed of high-temperature ceramic material. The low-temperature shell and the high-temperature ceramic interior are separately cooled. Process steam (approx. 1500°C) is then produced in the ceramic core for the production of H2 and H2-based synthetic fuels by a high-temperature electrolysis (HTE) process

  4. Barriers to guideline-compliant psoriasis care: analyses and concepts.

    Science.gov (United States)

    Eissing, L; Radtke, M A; Zander, N; Augustin, M

    2016-04-01

Despite the availability of effective therapeutics and evidence-based treatment guidelines, a substantial proportion of patients with moderate-to-severe psoriasis do not receive appropriate care. This under-provision of health care may cause further worsening of health, marked limitations of the patient's quality of life, and indirect costs for the health care system. In order to provide guideline-compliant care for every psoriasis patient, it is important to identify the barriers obstructing optimal care. Studies have identified various barriers on the physician's and on the patient's side; however, these studies each approached only single barriers, and not all of them in the context of psoriasis. Other publications that describe barriers systematically did not focus on psoriasis either. The objective of this literature review was to identify barriers and facilitators, based on studies analysing quality of care and single barriers, resulting in a comprehensive model of causal factors. Our analyses revealed three categories of barriers: patient-related, physician-related and external factors. On the patient side, we found non-adherence to therapies to be an important barrier, often in close association with psychiatric factors. Barriers on the physician's side are predominantly incomplete knowledge of the guidelines as well as the complexity of psoriasis comorbidity. In some countries, payment for patients with complex disease status is poor, and inconsistent reimbursement regulations potentially interfere with optimal care. The current analysis indicates that most barriers are interdependent. Thus, measures approaching related barriers simultaneously are required. To improve care for psoriasis patients, further studies systematically addressing all potentially relevant barriers in conjunction are needed. © 2015 European Academy of Dermatology and Venereology.

  5. A methodology for eliciting, representing, and analysing stakeholder knowledge for decision making on complex socio-ecological systems: from cognitive maps to agent-based models.

    Science.gov (United States)

    Elsawah, Sondoss; Guillaume, Joseph H A; Filatova, Tatiana; Rook, Josefine; Jakeman, Anthony J

    2015-03-15

    This paper aims to contribute to developing better ways for incorporating essential human elements in decision making processes for modelling of complex socio-ecological systems. It presents a step-wise methodology for integrating perceptions of stakeholders (qualitative) into formal simulation models (quantitative) with the ultimate goal of improving understanding and communication about decision making in complex socio-ecological systems. The methodology integrates cognitive mapping and agent based modelling. It cascades through a sequence of qualitative/soft and numerical methods comprising: (1) Interviews to elicit mental models; (2) Cognitive maps to represent and analyse individual and group mental models; (3) Time-sequence diagrams to chronologically structure the decision making process; (4) All-encompassing conceptual model of decision making, and (5) computational (in this case agent-based) Model. We apply the proposed methodology (labelled ICTAM) in a case study of viticulture irrigation in South Australia. Finally, we use strengths-weakness-opportunities-threats (SWOT) analysis to reflect on the methodology. Results show that the methodology leverages the use of cognitive mapping to capture the richness of decision making and mental models, and provides a combination of divergent and convergent analysis methods leading to the construction of an Agent Based Model. Copyright © 2014 Elsevier Ltd. All rights reserved.
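Step (2) of the methodology, representing and analysing cognitive maps, can be sketched as a signed, weighted influence graph; the concepts, weights, and the simple propagation rule below are all illustrative assumptions inspired by the viticulture irrigation case study, not the authors' implementation:

```python
# Hypothetical sketch of a cognitive map as a signed, weighted digraph:
# each edge encodes how strongly one concept is believed to influence
# another, and a change in one concept is pushed along the edges.
# Concept names and weights are invented for illustration.

weights = {  # (source, target): influence strength in [-1, 1]
    ("water_price", "irrigation_volume"): -0.7,
    ("irrigation_volume", "grape_yield"): 0.8,
    ("grape_yield", "farm_income"): 0.9,
}

def propagate(start, delta, weights):
    """Push a change through an acyclic map along influence edges."""
    effects = {start: delta}
    frontier = [start]
    while frontier:
        node = frontier.pop()
        for (src, dst), w in weights.items():
            if src == node:
                effects[dst] = effects.get(dst, 0.0) + w * effects[node]
                frontier.append(dst)
    return effects

# A unit increase in water price propagates to a predicted income drop.
effects = propagate("water_price", +1.0, weights)
print(effects)
```

Such a map is one natural intermediate between interview transcripts (step 1) and agent rules in the computational model (step 5): the agent's decision logic can be parameterized directly from the elicited edge weights.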

  6. Conserved regulators of nucleolar size revealed by global phenotypic analyses.

    Science.gov (United States)

    Neumüller, Ralph A; Gross, Thomas; Samsonova, Anastasia A; Vinayagam, Arunachalam; Buckner, Michael; Founk, Karen; Hu, Yanhui; Sharifpoor, Sara; Rosebrock, Adam P; Andrews, Brenda; Winston, Fred; Perrimon, Norbert

    2013-08-20

    Regulation of cell growth is a fundamental process in development and disease that integrates a vast array of extra- and intracellular information. A central player in this process is RNA polymerase I (Pol I), which transcribes ribosomal RNA (rRNA) genes in the nucleolus. Rapidly growing cancer cells are characterized by increased Pol I-mediated transcription and, consequently, nucleolar hypertrophy. To map the genetic network underlying the regulation of nucleolar size and of Pol I-mediated transcription, we performed comparative, genome-wide loss-of-function analyses of nucleolar size in Saccharomyces cerevisiae and Drosophila melanogaster coupled with mass spectrometry-based analyses of the ribosomal DNA (rDNA) promoter. With this approach, we identified a set of conserved and nonconserved molecular complexes that control nucleolar size. Furthermore, we characterized a direct role of the histone information regulator (HIR) complex in repressing rRNA transcription in yeast. Our study provides a full-genome, cross-species analysis of a nuclear subcompartment and shows that this approach can identify conserved molecular modules.

  7. Conserved Regulators of Nucleolar Size Revealed by Global Phenotypic Analyses

    Science.gov (United States)

    Neumüller, Ralph A.; Gross, Thomas; Samsonova, Anastasia A.; Vinayagam, Arunachalam; Buckner, Michael; Founk, Karen; Hu, Yanhui; Sharifpoor, Sara; Rosebrock, Adam P.; Andrews, Brenda; Winston, Fred; Perrimon, Norbert

    2014-01-01

    Regulation of cell growth is a fundamental process in development and disease that integrates a vast array of extra- and intracellular information. A central player in this process is RNA polymerase I (Pol I), which transcribes ribosomal RNA (rRNA) genes in the nucleolus. Rapidly growing cancer cells are characterized by increased Pol I–mediated transcription and, consequently, nucleolar hypertrophy. To map the genetic network underlying the regulation of nucleolar size and of Pol I–mediated transcription, we performed comparative, genome-wide loss-of-function analyses of nucleolar size in Saccharomyces cerevisiae and Drosophila melanogaster coupled with mass spectrometry–based analyses of the ribosomal DNA (rDNA) promoter. With this approach, we identified a set of conserved and nonconserved molecular complexes that control nucleolar size. Furthermore, we characterized a direct role of the histone information regulator (HIR) complex in repressing rRNA transcription in yeast. Our study provides a full-genome, cross-species analysis of a nuclear subcompartment and shows that this approach can identify conserved molecular modules. PMID:23962978

  8. Averaging Gone Wrong: Using Time-Aware Analyses to Better Understand Behavior

    OpenAIRE

    Barbosa, Samuel; Cosley, Dan; Sharma, Amit; Cesar-Jr, Roberto M.

    2016-01-01

    Online communities provide a fertile ground for analyzing people's behavior and improving our understanding of social processes. Because both people and communities change over time, we argue that analyses of these communities that take time into account will lead to deeper and more accurate results. Using Reddit as an example, we study the evolution of users based on comment and submission data from 2007 to 2014. Even using one of the simplest temporal differences between users---yearly coho...
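The pitfall the authors describe can be sketched with invented numbers: a site-wide average shifts purely because the cohort mix changes, while every cohort's own behaviour stays constant:

```python
# Toy illustration (invented numbers, not the paper's Reddit data):
# the pooled site-wide average "declines" over time even though each
# user cohort's average is perfectly stable, because newer cohorts
# with different behaviour come to dominate the user mix.

posts = {
    # year: list of (cohort, active users, avg comment length)
    2008: [("2007-cohort", 100, 100)],
    2012: [("2007-cohort", 100, 100), ("2010-cohort", 900, 60)],
}

def site_average(year):
    total = sum(n * avg for _, n, avg in posts[year])
    users = sum(n for _, n, _ in posts[year])
    return total / users

print(site_average(2008))  # 100.0
print(site_average(2012))  # 64.0 -- an apparent decline...
# ...yet within each cohort the average is unchanged (100 and 60).
# A time-aware, cohort-based analysis exposes the composition effect.
```

This is the composition (Simpson-style) effect that naive pooled averaging hides, and it is the kind of artefact the yearly-cohort analysis in the paper is designed to surface.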

  9. Parametric analyses of summative scores may lead to conflicting inferences when comparing groups: A simulation study.

    Science.gov (United States)

    Khan, Asaduzzaman; Chien, Chi-Wen; Bagraith, Karl S

    2015-04-01

    To investigate whether using a parametric statistic in comparing groups leads to different conclusions when using summative scores from rating scales compared with using their corresponding Rasch-based measures. A Monte Carlo simulation study was designed to examine between-group differences in the change scores derived from summative scores from rating scales, and those derived from their corresponding Rasch-based measures, using 1-way analysis of variance. The degree of inconsistency between the 2 scoring approaches (i.e. summative and Rasch-based) was examined, using varying sample sizes, scale difficulties and person ability conditions. This simulation study revealed scaling artefacts that could arise from using summative scores rather than Rasch-based measures for determining the changes between groups. The group differences in the change scores were statistically significant for summative scores under all test conditions and sample size scenarios. However, none of the group differences in the change scores were significant when using the corresponding Rasch-based measures. This study raises questions about the validity of the inference on group differences of summative score changes in parametric analyses. Moreover, it provides a rationale for the use of Rasch-based measures, which can allow valid parametric analyses of rating scale data.
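A one-way ANOVA of the kind applied to the simulated change scores can be sketched with the standard library alone; the group data below are invented and do not reproduce the study's test conditions:

```python
# Minimal one-way ANOVA F statistic (stdlib only), as applied to
# summative change scores in the simulation study. Data are invented.

def one_way_anova_F(groups):
    """F = (between-group MS) / (within-group MS)."""
    k = len(groups)
    N = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / N
    ssb = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ssw = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ssb / (k - 1)) / (ssw / (N - k))

# Hypothetical summative change scores for two groups:
g1 = [3, 4, 5, 4, 3, 5, 4, 4]
g2 = [6, 7, 6, 8, 7, 6, 7, 7]
F = one_way_anova_F([g1, g2])
print(round(F, 2))
```

The study's point is that running exactly this parametric test on raw summative scores versus on the corresponding Rasch-based (interval-scaled) measures can yield contradictory significance verdicts for the same underlying responses.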

  10. Entropy resistance analyses of a two-stream parallel flow heat exchanger with viscous heating

    International Nuclear Information System (INIS)

    Cheng Xue-Tao; Liang Xin-Gang

    2013-01-01

Heat exchangers are widely used in industry, and analyses and optimizations of their performance are important topics. In this paper, we define the concept of entropy resistance based on entropy generation analyses of a one-dimensional heat transfer process. With this concept, a two-stream parallel flow heat exchanger with viscous heating is analyzed and discussed. It is found that minimization of the entropy resistance always leads to the maximum heat transfer rate for the discussed two-stream parallel flow heat exchanger, while minimization of the entropy generation rate, the entropy generation numbers, and the revised entropy generation number does not always do so. (general)
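The entropy generation analysis underlying this concept rests on the standard expression for steady one-dimensional heat transfer between two temperature levels; the paper's specific definition of entropy resistance builds on this and is not reproduced here:

```latex
% Entropy generation rate for steady one-dimensional heat transfer:
% heat flow Q passes from the hot side at T_h to the cold side at T_c.
S_{\mathrm{gen}} \;=\; \frac{Q}{T_c} - \frac{Q}{T_h}
               \;=\; \frac{Q\,(T_h - T_c)}{T_h\,T_c} \;\ge\; 0 .
```

Because $S_{\mathrm{gen}}$ grows with the temperature gap crossed by a given heat flow $Q$, resistance-like quantities built from it penalize transfer across large temperature differences, which is the intuition the paper's comparison of optimization criteria exploits.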

  11. Castor-1C spent fuel storage cask decay heat, heat transfer, and shielding analyses

    International Nuclear Information System (INIS)

    Rector, D.R.; McCann, R.A.; Jenquin, U.P.; Heeb, C.M.; Creer, J.M.; Wheeler, C.L.

    1986-12-01

    This report documents the decay heat, heat transfer, and shielding analyses of the Gesellschaft fuer Nuklear Services (GNS) CASTOR-1C cask used in a spent fuel storage demonstration performed at Preussen Elektra's Wurgassen nuclear power plant. The demonstration was performed between March 1982 and January 1984, and resulted in cask and fuel temperature data and cask exterior surface gamma-ray and neutron radiation dose rate measurements. The purpose of the analyses reported here was to evaluate decay heat, heat transfer, and shielding computer codes. The analyses consisted of (1) performing pre-look predictions (predictions performed before the analysts were provided the test data), (2) comparing ORIGEN2 (decay heat), COBRA-SFS and HYDRA (heat transfer), and QAD and DOT (shielding) results to data, and (3) performing post-test analyses if appropriate. Even though two heat transfer codes were used to predict CASTOR-1C cask test data, no attempt was made to compare the two codes. The codes are being evaluated with other test data (single-assembly data and other cask data), and to compare the codes based on one set of data may be premature and lead to erroneous conclusions

  12. Periodic safety analyses; Les essais periodiques

    Energy Technology Data Exchange (ETDEWEB)

    Gouffon, A; Zermizoglou, R

    1990-12-01

The IAEA Safety Guide 50-SG-S8, devoted to 'Safety Aspects of Foundations of Nuclear Power Plants', indicates that the operator of an NPP should establish a program of inspections of safe operation during construction, start-up and the service life of the plant, in order to obtain the data needed for estimating the lifetime of structures and components. At the same time, the program should ensure that the safety margins remain appropriate. Periodic safety analyses are an important part of this safety inspection program. Periodic safety reporting is a method for testing the whole safety system, or a part of it, against precise criteria. Periodic safety analyses are not meant for the qualification of plant components; separate analyses are devoted to start-up, the qualification of components and materials, and aging. All these analyses are described in this presentation. The last chapter describes the experience obtained for the PWR-900 and PWR-1300 units from 1986 to 1989.

  13. Pratique de l'analyse fonctionelle

    CERN Document Server

    Tassinari, Robert

    1997-01-01

Developing a product or service that is perfectly adapted to the customer's needs and requirements is essential for a company. To leave nothing to chance, a rigorous methodology must be followed: that of functional analysis. This book defines the method precisely, together with its fields of application. It describes the best-performing methods in terms of product design and the pursuit of quality, and introduces the notion of internal functional analysis. A key work for optimizing product design processes within a company. -- Key ideas, from Business Digest

  14. A portable analyser for the measurement of ammonium in marine waters.

    Science.gov (United States)

    Amornthammarong, Natchanon; Zhang, Jia-Zhong; Ortner, Peter B; Stamates, Jack; Shoemaker, Michael; Kindel, Michael W

    2013-03-01

A portable ammonium analyser was developed and used to measure in situ ammonium in the marine environment. The analyser incorporates an improved LED photodiode-based fluorescence detector (LPFD). This system is more sensitive and considerably smaller than previous systems, and incorporates a pre-filtering subsystem enabling measurements in turbid, sediment-laden waters. Over the typical range for ammonium in marine waters (0–10 μM), the response is linear (r² = 0.9930) with a limit of detection (S/N ratio > 3) of 10 nM. The working range for marine waters is 0.05–10 μM. Repeatability is 0.3% (n = 10) at an ammonium level of 2 μM. Results from automated operation in 15-min cycles over 16 days showed good overall precision (RSD = 3%, n = 660). The system was field tested at three shallow South Florida sites. Diurnal cycles, and possibly a tidal influence, were expressed in the observed concentration variability.
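The calibration arithmetic implied above (a linear response plus an S/N = 3 detection-limit criterion) can be sketched as follows; the standard concentrations, fluorescence readings, and blank noise figure are invented, not the paper's data:

```python
# Sketch of fluorescence-detector calibration: fit signal vs ammonium
# concentration by least squares, then take the detection limit as the
# concentration whose signal equals 3x the blank noise (S/N = 3).
# All readings below are invented for illustration.

def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

conc = [0.0, 0.5, 1.0, 2.0, 5.0, 10.0]             # micromolar standards
signal = [2.0, 52.0, 102.0, 202.0, 502.0, 1002.0]  # fluorescence counts
slope, intercept = linear_fit(conc, signal)

blank_sd = 0.4                # assumed standard deviation of the blank
lod = 3 * blank_sd / slope    # S/N = 3 detection limit, in uM
print(slope, round(lod, 3))
```

With these invented numbers the fit is exact (slope 100 counts per µM), so the detection limit reduces to 3σ divided by the calibration slope, the conventional way figures like "LOD = 10 nM" are derived.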

  15. Angular analyses in relativistic quantum mechanics; Analyses angulaires en mecanique quantique relativiste

    Energy Technology Data Exchange (ETDEWEB)

    Moussa, P [Commissariat a l' Energie Atomique, 91 - Saclay (France). Centre d' Etudes Nucleaires

    1968-06-01

This work describes the angular analysis of reactions between particles with spin in a fully relativistic fashion. One-particle states are introduced, following Wigner's method, as representations of the inhomogeneous Lorentz group. In order to perform the angular analyses, the reduction of the product of two representations of the inhomogeneous Lorentz group is studied. Clebsch-Gordan coefficients are computed for the following couplings: l-s coupling, helicity coupling, multipolar coupling, and symmetric coupling for more than two particles. Massless and massive particles are handled simultaneously. Along the way we construct spinorial amplitudes and free fields, and we recall how convergence theorems for angular expansions can be established from analyticity hypotheses. Finally, we substitute these hypotheses for the idea of a 'potential radius', which gives at low energy the usual 'centrifugal barrier' factors. The presence of such factors had never before been deduced from hypotheses compatible with relativistic invariance. (author)

  16. Analyses of MHD instabilities

    International Nuclear Information System (INIS)

    Takeda, Tatsuoki

    1985-01-01

In this article, analyses of the MHD stabilities that govern the global behavior of a fusion plasma are described from the viewpoint of numerical computation. First, we describe the high-accuracy calculation of the MHD equilibrium and then the analysis of linear MHD instability. The former is the basis of the stability analysis, and the latter is closely related to the limiting beta value, which is a very important theoretical issue in tokamak research. To attain a stable tokamak plasma with good confinement properties it is necessary to control or suppress disruptive instabilities. Next, we describe the nonlinear MHD instabilities related to disruption phenomena. Lastly, we describe the vectorization of the MHD codes. The above MHD codes for fusion plasma analyses are relatively simple though very time-consuming, and the parts of the codes that need a lot of CPU time are concentrated in a small portion of the codes; moreover, the codes are usually used by their own developers, which makes it comparatively easy to attain a high performance ratio on a vector processor. (author)

  17. Protocols for 16S rDNA Array Analyses of Microbial Communities by Sequence-Specific Labeling of DNA Probes

    Directory of Open Access Journals (Sweden)

    Knut Rudi

    2003-01-01

Full Text Available Analyses of complex microbial communities are becoming increasingly important. The bottleneck in these analyses, however, lies in the tools available to actually describe the biodiversity. Novel protocols for DNA array-based analyses of microbial communities are presented here. In these protocols, the specificity obtained by sequence-specific labeling of DNA probes is combined with the possibility of detecting several different probes simultaneously by DNA array hybridization. The gene encoding 16S ribosomal RNA was chosen as the target in these analyses. This gene contains both universally conserved regions and regions of relatively high variability. The universally conserved regions are used for PCR amplification primers, while the variable regions are used for the specific probes. Protocols are presented for DNA purification, probe construction, probe labeling, and DNA array hybridizations.
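The primer/probe logic described above can be sketched on a toy alignment; the sequences below are invented. Positions identical across all taxa would serve as universal primer sites, while variable positions would distinguish taxa and serve as specific probe sites:

```python
# Toy version of the 16S probe-design logic: given aligned fragments
# (invented sequences), classify each alignment column as conserved
# (candidate universal primer site) or variable (candidate specific
# probe site).

aligned = {
    "taxonA": "AGGCTTAACGT",
    "taxonB": "AGGCTTTACGT",
    "taxonC": "AGGCTAAACGT",
}

columns = list(zip(*aligned.values()))
conserved = [i for i, col in enumerate(columns) if len(set(col)) == 1]
variable = [i for i, col in enumerate(columns) if len(set(col)) > 1]
print("conserved positions:", conserved)
print("variable positions:", variable)
```

Real primer design adds constraints the sketch ignores (length, melting temperature, degeneracy), but the split into "identical across all taxa" versus "taxon-discriminating" columns is the core of the 16S strategy the abstract describes.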

  18. A new internet-based tool for reporting and analysing patient-reported outcomes and the feasibility of repeated data collection from patients with myeloproliferative neoplasms.

    Science.gov (United States)

    Brochmann, Nana; Zwisler, Ann-Dorthe; Kjerholt, Mette; Flachs, Esben Meulengracht; Hasselbalch, Hans Carl; Andersen, Christen Lykkegaard

    2016-04-01

    An Internet-based tool for reporting and analysing patient-reported outcomes (PROs) has been developed. The tool enables merging PROs with blood test results and allows for computation of treatment responses. Data may be visualized by graphical analysis and may be exported for downstream statistical processing. The aim of this study was to investigate whether patients with myeloproliferative neoplasms (MPNs) were willing and able to use the tool and fill out questionnaires regularly. Participants were recruited from the outpatient clinic at the Department of Haematology, Roskilde University Hospital, Denmark. The validated questionnaires used were the European Organisation for Research and Treatment of Cancer Quality of Life Questionnaire-Core 30, the Myeloproliferative Neoplasm Symptom Assessment Form, the Brief Fatigue Inventory and the Short Form 36 Health Survey. Questionnaires were filled out for ≥ 6 months, online or on paper according to participant preference. Regularity of questionnaire submission was investigated, and participant acceptance was evaluated by focus-group interviews. Of 135 invited patients, 118 (87 %) accepted participation. One hundred and seven participants (91 %) preferred to use the Internet-based tool. Of the 118 enrolled participants, 104 (88 %) submitted PROs regularly for ≥ 6 months. The focus-group interviews revealed that the Internet-based tool was well accepted. The Internet-based approach and regular collection of PROs are well accepted, with a high participation rate, persistency and adherence, in a population of MPN patients. The plasticity of the platform allows for adaptation to patients with other medical conditions.

  19. Selection, rejection and optimisation of pyrolytic graphite (PG) crystal analysers for use on the new IRIS graphite analyser bank

    International Nuclear Information System (INIS)

    Marshall, P.J.; Sivia, D.S.; Adams, M.A.; Telling, M.T.F.

    2000-01-01

    This report discusses design problems encountered in equipping the IRIS high-resolution inelastic spectrometer at the ISIS pulsed neutron source, UK, with a new 4212-piece pyrolytic graphite crystal analyser array. Of the 4212 graphite pieces required, approximately 2500 will be newly purchased PG crystals, with the remainder drawn from the currently installed graphite analysers. The quality of the new analyser pieces with respect to manufacturing specifications is assessed, as is the optimum arrangement of new PG pieces amongst old to avoid degrading the spectrometer's current angular resolution. Techniques employed to meet these criteria include accurate calliper measurements, FORTRAN programming and statistical analysis. (author)

  20. Genomic analyses of modern dog breeds.

    Science.gov (United States)

    Parker, Heidi G

    2012-02-01

    A rose may be a rose by any other name, but when you call a dog a poodle it becomes a very different animal than if you call it a bulldog. Both the poodle and the bulldog are examples of dog breeds of which there are >400 recognized worldwide. Breed creation has played a significant role in shaping the modern dog from the length of his leg to the cadence of his bark. The selection and line-breeding required to maintain a breed has also reshaped the genome of the dog, resulting in a unique genetic pattern for each breed. The breed-based population structure combined with extensive morphologic variation and shared human environments have made the dog a popular model for mapping both simple and complex traits and diseases. In order to obtain the most benefit from the dog as a genetic system, it is necessary to understand the effect structured breeding has had on the genome of the species. That is best achieved by looking at genomic analyses of the breeds, their histories, and their relationships to each other.

  1. Multichannel amplitude analyser for nuclear spectrometry

    International Nuclear Information System (INIS)

    Jankovic, S.; Milovanovic, B.

    2003-01-01

    A multichannel amplitude analyser with 4096 channels was designed. It is based on a fast 12-bit analog-to-digital converter. The intended purpose of the instrument is the recording of nuclear spectra by means of scintillation detectors. The computer link is established through an opto-isolated serial connection cable, thus reducing the instrument's sensitivity to disturbances originating from the digital circuitry. The data displayed on the screen are refreshed every 2.5 seconds. Impulse peak detection is implemented through differentiation of the amplified input signal, while synchronization with the data coming from the converter output is established by taking advantage of the internal 'pipeline' structure of the converter itself. The mode of operation of the built-in microcontroller ensures that no impulses are missed, and a simple logic network prevents the initiation of the amplitude reading sequence for the next impulse in case it appears shortly after its predecessor. The solution proposed here demonstrated good performance at a comparatively low manufacturing cost, and is thus also suitable for educational purposes (author)
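The differentiation-based peak detection the abstract describes can be illustrated with a short sketch. The threshold value and the synthetic pulse train below are invented for illustration, not taken from the instrument:

```python
def find_pulse_peaks(samples, threshold=10):
    """Flag a pulse maximum where the first difference (a discrete
    'differentiation' of the amplified signal) changes sign from
    positive to negative while the sample is above a noise threshold."""
    peaks = []
    for i in range(1, len(samples) - 1):
        rising = samples[i] - samples[i - 1]
        falling = samples[i + 1] - samples[i]
        if rising > 0 and falling <= 0 and samples[i] >= threshold:
            peaks.append(i)
    return peaks

# Synthetic pulse train: two pulses on a flat baseline.
signal = [0, 0, 5, 40, 90, 40, 5, 0, 0, 3, 60, 120, 60, 3, 0]
peaks = find_pulse_peaks(signal)
print(peaks)
```

A real analyser would additionally impose a dead time so that an impulse arriving shortly after its predecessor does not trigger a second amplitude reading, as the abstract notes.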

  2. Identification of provenance rocks based on EPMA analyses of heavy minerals

    Science.gov (United States)

    Shimizu, M.; Sano, N.; Ueki, T.; Yonaga, Y.; Yasue, K. I.; Masakazu, N.

    2017-12-01

    Information on mountain building is significant in the field of geological disposal of high-level radioactive waste, because it affects long-term stability in the groundwater flow system. Provenance analysis is one of the effective approaches for understanding the building process of mountains. Chemical compositions of heavy minerals, as well as their chronological data, can be an index for the identification of provenance rocks. Accurate identification requires the measurement of as many grains as possible. In order to achieve an efficient provenance analysis, we developed a method for quick identification of heavy minerals using an Electron Probe Micro Analyzer (EPMA). In this method, heavy mineral grains extracted from a sample were aligned on a glass slide and mounted in a resin. Concentrations of 28 elements were measured for 300-500 grains per sample using EPMA. To measure as many grains as possible, we prioritized swiftness of measurement over precision, configuring a measurement time of about 3.5 minutes per grain. Identification of heavy minerals was based on their chemical composition. We developed a Microsoft® Excel® spreadsheet encoding the criteria of mineral identification, using a typical range of chemical compositions for each mineral. Grains with anomalous analytical totals (around 110 wt.%) were rejected. The criteria of mineral identification were revised through comparison between mineral identification by optical microscopy and the chemical compositions of grains classified as "unknown minerals". Provenance rocks can be identified based on the abundance ratio of identified minerals. If no significant difference in the abundance ratio is found among source rocks, the chemical composition of specific minerals is used as another index. This method was applied to the sediments of some regions in Japan where provenance rocks had lithological variations but similar formation ages. Consequently, the provenance rocks were identified based on chemical compositions of heavy minerals resistant to
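The spreadsheet criteria described, i.e. one composition window per mineral, amount to a simple range check per grain. The minerals and element windows below are invented placeholders, not the study's actual criteria:

```python
# Invented wt.% windows for illustration only (not the report's criteria).
CRITERIA = {
    "zircon":  {"Zr": (40.0, 55.0), "Si": (10.0, 20.0)},
    "apatite": {"Ca": (35.0, 45.0), "P": (15.0, 22.0)},
}

def identify(grain_wt_pct):
    """Return the first mineral whose composition window contains the grain."""
    for mineral, windows in CRITERIA.items():
        if all(lo <= grain_wt_pct.get(el, 0.0) <= hi
               for el, (lo, hi) in windows.items()):
            return mineral
    return "unknown"  # flagged for re-examination by optical microscopy

print(identify({"Zr": 49.8, "Si": 15.3}))
print(identify({"Fe": 30.0}))
```

Grains falling outside every window land in the "unknown minerals" bin, which is exactly the set the authors cross-check against optical microscopy to refine the criteria.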

  3. Approximate analyses of inelastic effects in pipework

    International Nuclear Information System (INIS)

    Jobson, D.A.

    1983-01-01

    This presentation shows figures concerned with analyses of inelastic effects in pipework, as follows: comparison of experimental and calculated simplified-analysis results for free-end rotation and for circumferential strain; interrupted stress relaxation; regenerated relaxation caused by reversed yield; buckling of a straight pipe under combined bending and torsion; and results of fatigue tests of pipe bends

  4. Advanced handbook for accident analyses of German nuclear power plants; Weiterentwicklung eines Handbuches fuer Stoerfallanalysen deutscher Kernkraftwerke

    Energy Technology Data Exchange (ETDEWEB)

    Kerner, Alexander; Broecker, Annette; Hartung, Juergen; Mayer, Gerhard; Pallas Moner, Guim

    2014-09-15

    The advanced handbook of safety analyses (HSA) comprises a comprehensive electronic collection of knowledge for the preparation and conduct of safety analyses in the area of reactor, plant and containment behaviour, as well as results of existing safety analyses (performed by GRS in the past) with characteristic specifications and further background information. In addition, know-how from the analysis software development and validation process is presented, and relevant rules and regulations with regard to safety demonstration are provided. The HSA comprehensively covers the topic of thermal-hydraulic safety analyses (except natural hazards, man-made hazards and malicious acts) for German pressurized and boiling water reactors in power and non-power operational states. In principle, the structure of the HSA content reflects the analytical approach used in safety analyses and in applying knowledge from safety analyses to technical support services. On the basis of a multilevel preparation of information on the topics ''compilation of safety analyses'', ''compilation of data bases'', ''assessment of safety analyses'', ''performed safety analyses'', ''rules and regulation'' and ''ATHLET-validation'', the HSA addresses users with different backgrounds, allowing them to enter the HSA at different levels. Moreover, the HSA serves as a reference book, which is designed to be future-oriented, freely configurable with regard to content, completely integrated into the GRS internal portal and prepared for use by a growing user group.

  5. A Study for Visual Realism of Designed Pictures on Computer Screens by Investigation and Brain-Wave Analyses.

    Science.gov (United States)

    Wang, Lan-Ting; Lee, Kun-Chou

    2016-08-01

    In this article, the visual realism of designed pictures on computer screens is studied by investigation and brain-wave analyses. Practical electroencephalogram (EEG) measurements are always time-varying and fluctuating, so conventional statistical techniques are not adequate for their analysis. This study proposes a new scheme based on "fingerprinting" to analyze the EEG. Fingerprinting is a technique of probabilistic pattern recognition used in electrical engineering, much like the identification of human fingerprints in a criminal investigation. The goal of this study was to assess whether subjective preference for pictures could be manifested physiologically by EEG fingerprinting analyses. The most important advantage of the fingerprinting technique is that it does not require accurate measurement. Instead, it uses probabilistic classification. Participants' preference for pictures can be assessed using fingerprinting analyses of physiological EEG measurements. © The Author(s) 2016.

  6. 14C-analyses of calcite coatings in open fractures from the Klipperaas study site, Southern Sweden

    International Nuclear Information System (INIS)

    Possnert, G.; Tullborg, E.L.

    1989-11-01

    Carbonate samples from open fractures in crystalline rock from the Klipperaas study site have been analysed for their 14C contents using accelerator mass spectrometry. This technique makes it possible to analyse very small carbonate samples (c. 1 mg C). The analyses show low but varying contents of 14C. However, contamination by CO2 has taken place, affecting small samples more than others. Attempts have been made to quantify the contamination and thus evaluate the analyses of the fracture samples. The low 14C values obtained can be due to: 1. an effective retention of 14C by sorption/fractionation forcing 14C onto the calcite surfaces in the near-surface zone, which means that the 14C contribution to the deeper levels is diminished, or 2. a very shallow penetration depth of surface groundwater. The former is suggested as more probable, based on evaluations of the hydrochemical conditions and the fracture mineral studies. (10 figs., 3 tabs., 9 refs.) (authors)

  7. On the use of uncertainty analyses to test hypotheses regarding deterministic model predictions of environmental processes

    International Nuclear Information System (INIS)

    Gilbert, R.O.; Bittner, E.A.; Essington, E.H.

    1995-01-01

    This paper illustrates the use of Monte Carlo parameter uncertainty and sensitivity analyses to test hypotheses regarding predictions of deterministic models of environmental transport, dose, risk and other phenomena. The methodology is illustrated by testing whether 238Pu is transferred more readily than 239+240Pu from the gastrointestinal (GI) tract of cattle to their tissues (muscle, liver and blood). This illustration is based on a study wherein beef cattle grazed for up to 1064 days on a fenced plutonium (Pu)-contaminated arid site in Area 13 near the Nevada Test Site in the United States. Periodically, cattle were sacrificed and their tissues analyzed for Pu and other radionuclides. Conditional sensitivity analyses of the model predictions were also conducted. These analyses indicated that Pu cattle-tissue concentrations had the largest impact of any model parameter on the pdf of predicted Pu fractional transfers. Issues that arise in conducting uncertainty and sensitivity analyses of deterministic models are discussed. (author)
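The hypothesis-testing idea, propagating parameter distributions through a deterministic model and comparing the resulting predictions, can be sketched as follows. The toy transfer model and all distribution parameters are invented for illustration and bear no relation to the study's values:

```python
import random

random.seed(42)

def transfer_model(intake, gi_fraction):
    """Toy deterministic model: tissue burden = intake * GI absorption fraction."""
    return intake * gi_fraction

def sample_fraction(mean, sd):
    """Draw a GI absorption fraction; negative draws are truncated to zero."""
    return max(0.0, random.gauss(mean, sd))

n = 10_000
f238 = [transfer_model(1.0, sample_fraction(1.0e-4, 3.0e-5)) for _ in range(n)]
f239 = [transfer_model(1.0, sample_fraction(5.0e-5, 3.0e-5)) for _ in range(n)]

# Monte Carlo form of the hypothesis test: how often does the sampled
# 238Pu fractional transfer exceed the sampled 239+240Pu one?
p_greater = sum(a > b for a, b in zip(f238, f239)) / n
print("P(238Pu transfer > 239+240Pu transfer) ~", round(p_greater, 2))
```

If the resulting probability is close to 1 (or 0), the hypothesis of a real difference is supported despite the parameter uncertainty; values near 0.5 indicate the data cannot distinguish the two isotopic forms.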

  8. Treatment algorithm based on the multivariate survival analyses in patients with advanced hepatocellular carcinoma treated with trans-arterial chemoembolization.

    Directory of Open Access Journals (Sweden)

    Hasmukh J Prajapati

    Full Text Available To develop a treatment algorithm from multivariate survival analyses (MVA) in patients with Barcelona Clinic Liver Cancer (BCLC) stage C (advanced) hepatocellular carcinoma (HCC) treated with trans-arterial chemoembolization (TACE). Consecutive unresectable and non-transplantable patients with advanced HCC who received DEB TACE were studied. A total of 238 patients (mean age, 62.4 years) was included in the study. Survival was analyzed according to different parameters from the time of the first DEB TACE. Kaplan-Meier and Cox proportional hazards models were used for survival analysis. The staging system (SS) was constructed from the MVA and named the BCLC C HCC Prognostic (BCHP) staging system. Overall median survival (OS) was 16.2 months. HCC patients with venous thrombosis (VT) of a large vein [main portal vein (PV), right or left PV, hepatic vein, inferior vena cava] (22.7%), with VT of a small vein (segmental/subsegmental PV) (9.7%), and with no VT had OSs of 6.4 months, 20 months and 22.8 months, respectively (p<0.001). On MVA, the significant independent prognostic factors (PFs) of survival were CP class, Eastern Cooperative Oncology Group (ECOG) performance status (PS), single HCC<5 cm, site of VT, metastases, serum creatinine and serum alpha-fetoprotein. Based on these PFs, the BCHP staging system was constructed. The OSs of stages I, II and III were 28.4 months, 11.8 months and 2.4 months, respectively (p<0.001). The treatment plan was proposed according to the different stages. On MVA of patients with advanced HCC treated with TACE, the significant independent PFs of survival were CP class, ECOG PS, single HCC<5 cm or others, site of VT, metastases, serum creatinine and serum alpha-fetoprotein. The new BCHP SS was proposed based on the MVA data to identify the advanced HCC patients suitable for TACE treatment.
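As a minimal sketch of the Kaplan-Meier product-limit estimator named above (the follow-up times below are invented toy data, not the study's patients):

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.
    times: follow-up in months; events: 1 = event observed, 0 = censored."""
    # events at a tied time are processed before censorings (standard convention)
    order = sorted(zip(times, events), key=lambda te: (te[0], -te[1]))
    s, curve, at_risk = 1.0, [], len(order)
    for t, d in order:
        if d:
            s *= (at_risk - 1) / at_risk   # survival drops only at event times
            curve.append((t, s))
        at_risk -= 1
    return curve

# Toy follow-up data in months (0 = censored observation):
times = [2, 4, 6, 6, 8, 12, 16]
events = [1, 1, 0, 1, 1, 0, 1]
curve = kaplan_meier(times, events)
for t, s in curve:
    print(t, round(s, 3))
```

The median OS reported in the abstract is simply the time at which such a curve first drops to 0.5; the Cox proportional hazards step that produces the independent prognostic factors is a separate regression model, not shown here.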

  9. Auto-ignition generated combustion. Pt. 2. Experimental analysis; Verbrennungssteuerung durch Selbstzuendung. T. 2. Experimentelle Analyse

    Energy Technology Data Exchange (ETDEWEB)

    Guibert, P. [Paris-6 Univ. (France). Lab. de Mecanique Physique; Morin, C. [Paris-6 Univ. (France); Mokhtari, S.

    2004-02-01

    The combustion initiation by auto-ignition demonstrates benefits in NO{sub x} reduction and in process stability for both spark-ignited and compression-ignited engines. Based on the thermodynamic particularities of auto-ignition, which were presented in the first part, the characteristics of this process are demonstrated in this second part by experimental analysis. For comparability with similar studies, the analyses were carried out on a two-stroke, loop-scavenged, single-cylinder spark-ignition engine. (orig.)

  10. Trend analyses of transformer problems in the U.S. nuclear power plants

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2009-01-01

    Up to 2007, the authors conducted trend analyses of trouble events related to main generators, emergency diesel generators, breakers and motors, which are more likely to cause problems than other electrical equipment in nuclear power plants. The frequency of trouble events in transformers in domestic nuclear power plants is at present approximately one third of the publicly reported cases in the U.S. However, as the maintenance situation in Japan will become similar to that in the U.S. if the operating period is extended or the maintenance method shifts from preventive maintenance to condition-based maintenance, there is a concern that the frequency of transformer events will also increase in Japan. Thus, trend analyses were conducted on transformer events, which had not previously been subject to such analyses, from among the electrical equipment likely to cause problems. The trend analyses were performed on 23 transformer events which occurred in U.S. nuclear power plants in the five years from 2003 through 2007, among events reported in the Licensee Event Reports (LERs: event reports submitted to the NRC by U.S. nuclear power plants) registered in the nuclear information database of the Institute of Nuclear Safety System, Incorporated (INSS), as well as on 8 events registered in the Nuclear Information Archives (NUCIA) which occurred in domestic nuclear power plants in the same five years. Lessons learned from the trend analyses of the transformer trouble events in the U.S. revealed that for transformers in general, the maintenance management of tap changers is important, while for the main transformers, which are most likely to cause problems, it is vital to prevent the deterioration of insulation and insulating oil. (author)

  11. Towards an Industrial Application of Statistical Uncertainty Analysis Methods to Multi-physical Modelling and Safety Analyses

    International Nuclear Information System (INIS)

    Zhang, Jinzhao; Segurado, Jacobo; Schneidesch, Christophe

    2013-01-01

    Since the 1980s, Tractebel Engineering (TE) has been developing and applying a multi-physical modelling and safety analysis capability, based on a code package consisting of the best-estimate 3D neutronic (PANTHER), system thermal-hydraulic (RELAP5), core sub-channel thermal-hydraulic (COBRA-3C), and fuel thermal-mechanical (FRAPCON/FRAPTRAN) codes. A series of methodologies have been developed to perform and to license the reactor safety analysis and core reload design, based on the deterministic bounding approach. Following the recent trends in research and development as well as in industrial applications, TE has been working since 2010 towards the application of statistical sensitivity and uncertainty analysis methods to multi-physical modelling and licensing safety analyses. In this paper, the TE multi-physical modelling and safety analysis capability is first described, followed by the proposed TE best estimate plus statistical uncertainty analysis method (BESUAM). The chosen statistical sensitivity and uncertainty analysis methods (non-parametric order statistic method or bootstrap) and tool (DAKOTA) are then presented, followed by some preliminary results of their application to FRAPCON/FRAPTRAN simulation of the OECD RIA fuel rod codes benchmark and RELAP5/MOD3.3 simulation of THTF tests. (authors)
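The non-parametric order statistic method mentioned is commonly based on Wilks' formula. Assuming its first-order, one-sided form (an assumption; the paper may use a higher order), the minimum number of code runs needed for a given coverage/confidence tolerance limit can be computed as:

```python
def wilks_runs(coverage=0.95, confidence=0.95):
    """Smallest n such that the largest of n random code runs bounds the
    `coverage` quantile of the output with one-sided `confidence`
    (first-order Wilks formula: 1 - coverage**n >= confidence)."""
    n = 1
    while 1 - coverage ** n < confidence:
        n += 1
    return n

print(wilks_runs())             # the classic 95 %/95 % one-sided case
print(wilks_runs(0.95, 0.99))   # tighter confidence requires more runs
```

This is why best-estimate-plus-uncertainty methodologies so often quote 59 code runs for a 95 %/95 % statement: the result depends only on the two probabilities, not on the number of uncertain input parameters.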

  12. Genes with minimal phylogenetic information are problematic for coalescent analyses when gene tree estimation is biased.

    Science.gov (United States)

    Xi, Zhenxiang; Liu, Liang; Davis, Charles C

    2015-11-01

    The development and application of coalescent methods are undergoing rapid changes. One little-explored area that bears on the application of gene-tree-based coalescent methods to species tree estimation is gene informativeness. Here, we investigate the accuracy of these coalescent methods when genes have minimal phylogenetic information, including the implementation of the multilocus bootstrap approach. Using simulated DNA sequences, we demonstrate that genes with minimal phylogenetic information can produce unreliable gene trees (i.e., high error in gene tree estimation), which may in turn reduce the accuracy of species tree estimation using gene-tree-based coalescent methods. We demonstrate that this problem can be alleviated by sampling more genes, as is commonly done in large-scale phylogenomic analyses. This applies even when these genes are minimally informative. If gene tree estimation is biased, however, gene-tree-based coalescent analyses will produce inconsistent results, which cannot be remedied by increasing the number of genes. In this case, it is not the gene-tree-based coalescent methods that are flawed, but rather the input data (i.e., estimated gene trees). Along these lines, the commonly used program PhyML has a tendency to infer one particular bifurcating topology even when the relationship is best represented as a polytomy. We additionally corroborate these findings by analyzing the 183-locus mammal data set assembled by McCormack et al. (2012) using ultra-conserved elements (UCEs) and flanking DNA. Lastly, we demonstrate that when employing the multilocus bootstrap approach on this 183-locus data set, there is no strong conflict between species trees estimated from concatenation and gene-tree-based coalescent analyses, as has been previously suggested by Gatesy and Springer (2014). Copyright © 2015 Elsevier Inc. All rights reserved.

  13. Integrating and scheduling an open set of static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Mezini, Mira; Kloppenburg, Sven

    2006-01-01

    to keep the set of analyses open. We propose an approach to integrating and scheduling an open set of static analyses which decouples the individual analyses and coordinates the analysis executions such that the overall time and space consumption is minimized. The approach has been implemented...... for the Eclipse IDE and has been used to integrate a wide range of analyses such as finding bug patterns, detecting violations of design guidelines, or type system extensions for Java....

  14. 'Gatekeepers' of Islamic financial circuits: analysing urban geographies of the global Shari'a elite

    OpenAIRE

    Bassens, David; Derudder, Ben; Witlox, Frank

    2012-01-01

    This paper analyses the importance of 'Shari'a scholars' in the Islamic Financial Services (IFS) sector, which has been a growing global practice since the 1970s. Based on Shari'a Law, IFS firms provide banking, finance and insurance respecting faith-based prohibitions on interest, speculation and risk taking. Although IFS firms operate across a variety of scales and involve a range of actors, this paper focuses on the transnational capacities of Shari'a experts employed by IFS firms. These s...

  15. Analyse of the international recommendations on the calculation of absorbed dose in the biota

    International Nuclear Information System (INIS)

    Pereira, Wagner de S.; Py Junior, Delcy de A.; Universidade Federal Fluminense; Kelecom, Alphonse

    2011-01-01

    This paper evaluates the recommendations of the ICRP whose objective is environmental radioprotection. ICRP Publications 26, 60, 91, 103 and 108 were analysed. ICRP-103 defined the concept of the animal and plant of reference (APR) to be used in environmental radioprotection, based on the calculation of absorbed dose to these reference organisms. This view allows a legal framework of environmental protection to be built on an ethical, moral and scientific basis, one more defensible than the purely anthropocentric concept.

  16. A non-equilibrium neutral model for analysing cultural change.

    Science.gov (United States)

    Kandler, Anne; Shennan, Stephen

    2013-08-07

    Neutral evolution is a frequently used model to analyse changes in the frequencies of cultural variants over time. Variants are chosen to be copied according to their relative frequency, and new variants are introduced by a process of random mutation. Here we present a non-equilibrium neutral model which accounts for temporally varying population sizes and mutation rates and makes it possible to analyse the cultural system under consideration at any point in time. This framework gives an indication of whether observed changes in the frequency distributions of a set of cultural variants between two time points are consistent with the random-copying hypothesis. We find that the likelihood of the existence of the observed assemblage at the end of the considered time period (expressed by the probability of the observed number of cultural variants present in the population during the whole period under neutral evolution) is a powerful indicator of departures from neutrality. Further, we study the effects of frequency-dependent selection on the evolutionary trajectories and present a case study of change in the decoration of pottery in early Neolithic Central Europe. Based on the framework developed, we show that neutral evolution is not an adequate description of the observed changes in frequency. Copyright © 2013 Elsevier Ltd. All rights reserved.
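A minimal simulation of the neutral (random-copying) model described, frequency-proportional copying plus innovation at rate mu, might look like the sketch below; the population size, mutation rate and generation count are arbitrary illustrative choices, and a non-equilibrium run as in the paper would simply vary them per generation:

```python
import random

random.seed(1)

def neutral_step(population, mu, counter):
    """One generation of random copying: each individual copies a variant
    in proportion to its current frequency, or innovates a brand-new
    variant with probability mu."""
    new = []
    for _ in population:
        if random.random() < mu:
            counter[0] += 1                        # mint a new variant label
            new.append(counter[0])
        else:
            new.append(random.choice(population))  # frequency-proportional copy
    return new

pop = [0] * 200       # population of 200 individuals, initially one variant
counter = [0]
for _ in range(100):  # size and mu could be varied here per generation
    pop = neutral_step(pop, mu=0.01, counter=counter)

print("distinct variants after 100 generations:", len(set(pop)))
```

Comparing the number of distinct variants maintained in such simulated assemblages against the observed count is the essence of the neutrality test the abstract describes.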

  17. Strength analyses of containment steel liner at the plasticity instability

    International Nuclear Information System (INIS)

    Klyashchitskij, V.I.; Golyakov, V.I.; Kostylev, V.I.; Margolin, B.Z.

    2003-01-01

    The steel liner of an NPP containment plays the important role of a leak-tight barrier preventing possible releases of radioactive substances beyond the boundaries of the reactor building. However, so far, in many cases the assessment of the stress-strain state of a liner having initial imperfections of shape has been made with approximate methods. A new methodology for the analysis of the liner at plasticity instability was developed at the Atomenergoproekt institute in cooperation with specialists from other agencies. The methodology is based on the code 'Termit'. Assessment of the critical strain was made taking into account the possible presence of one or two defects: a construction undercut or a crack-like defect in a weld. On the basis of analyses of the real structure under arbitrary combinations of quasi-static loads, an algorithm was developed for the computation of the liner. (author)

  18. Dynamics of energy systems: Methods of analysing technology change

    Energy Technology Data Exchange (ETDEWEB)

    Neij, Lena

    1999-05-01

    Technology change will have a central role in achieving a sustainable energy system. This calls for methods of analysing the dynamics of energy systems in view of technology change, and of policy instruments for effecting and accelerating technology change. In this thesis, such methods have been developed, applied, and assessed. Two types of methods have been considered: methods of analysing and projecting the dynamics of future technology change, and methods of evaluating policy instruments effecting technology change, i.e. market transformation programmes. Two methods are used for analysing the dynamics of future technology change: vintage models and experience curves. Vintage models, which allow for complex analysis of annual streams of energy and technological investments, are applied to the analysis of the time dynamics of electricity demand for lighting and air distribution in Sweden. The results of the analyses show that the Swedish electricity demand for these purposes could decrease over time, relative to a reference scenario, if policy instruments are used. Experience curves are used to provide insight into the prospects of diffusion of wind turbines and photovoltaic (PV) modules due to cost reduction. The results show potential for considerable cost reduction for wind-generated electricity, which, in turn, could lead to major diffusion of wind turbines. The results also show that major diffusion of PV modules, and a reduction of PV-generated electricity costs down to the level of conventional base-load electricity, will depend on large investments in bringing the costs down (through RD&D, market incentives and investments in niche markets) or on the introduction of new generations of PV modules (e.g. high-efficiency mass-produced thin-film cells). Moreover, a model has been developed for the evaluation of market transformation programmes, i.e. policy instruments that effect technology change and the introduction and commercialisation of energy
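An experience curve models unit cost as falling by a constant factor (the progress ratio) with each doubling of cumulative production, C(x) = C0 (x/x0)^b with b = log2(PR). A sketch with invented numbers, not the thesis's wind or PV data:

```python
import math

def experience_cost(c0, cum0, cum, progress_ratio):
    """Unit cost after cumulative production grows from cum0 to cum, where
    each doubling of cumulative production multiplies cost by
    progress_ratio (0.8 = an '80% curve', i.e. a 20% learning rate)."""
    b = math.log2(progress_ratio)
    return c0 * (cum / cum0) ** b

# Three doublings of cumulative production at an 80% progress ratio:
cost = experience_cost(1000.0, 1.0, 8.0, 0.8)
print(round(cost, 2))
```

Read forward, such a curve answers the question the thesis poses: how much cumulative deployment (and hence investment in niche markets, incentives and RD&D) is needed before a technology's unit cost reaches a target such as conventional base-load electricity.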

  19. Workload analysis of an assembling process

    Science.gov (United States)

    Ghenghea, L. D.

    2015-11-01

    The workload is the most important indicator for managers responsible for industrial technological processes; whether these are automated, mechanized or simply manual, in each case machines or workers will be the focus of workload measurements. The paper deals with workload analyses of a largely manual assembly technology for a roller-bearing assembling process, carried out in a large company with integrated bearing manufacturing processes. In these analyses, the delay sampling technique was used to identify and divide all the bearing assemblers' activities, and to obtain information about how much of the 480-minute working day the workers allot to each activity. The study shows some ways to increase process productivity without supplementary investment and also indicates that process automation could be the solution for gaining maximum productivity.
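The delay (work) sampling technique mentioned scales observed activity tallies to the 480-minute working day; the activity names and tallies below are hypothetical, not the study's measurements:

```python
def allotted_minutes(observations, day_minutes=480):
    """Scale random-observation tallies into minutes allotted per activity."""
    total = sum(observations.values())
    return {activity: day_minutes * n / total
            for activity, n in observations.items()}

# Hypothetical tallies from random observation tours of the assembly line:
tally = {"assembling": 300, "handling": 100, "waiting": 50, "personal": 50}
minutes = allotted_minutes(tally)
for activity, m in minutes.items():
    print(activity, round(m, 1))
```

The appeal of the method is that no continuous timing is required: the proportion of random observations in which a worker is seen performing an activity is an unbiased estimate of the share of the day spent on it.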

  20. Measuring social capital through multivariate analyses for the IQ-SC.

    Science.gov (United States)

    Campos, Ana Cristina Viana; Borges, Carolina Marques; Vargas, Andréa Maria Duarte; Gomes, Viviane Elisangela; Lucas, Simone Dutra; Ferreira e Ferreira, Efigênia

    2015-01-20

    Social capital can be viewed as a societal process that works toward the common good as well as toward the good of the collective, based on trust, reciprocity, and solidarity. Our study aimed to present two multivariate statistical analyses to examine the formation of latent classes of social capital using the IQ-SC and to identify the most important factors in building an indicator of individual social capital. A cross-sectional study was conducted in 2009 among working adolescents supported by a Brazilian NGO. The sample consisted of 363 individuals, and data were collected using the World Bank questionnaire for measuring social capital. First, the participants were grouped by a segmentation analysis using the Two-Step Cluster method, with Euclidean distance and the centroid criterion as the criteria for aggregating answers. Using specific weights for each item, discriminant analysis was used to validate the cluster analysis, in an attempt to maximize the variance among the groups relative to the variance within the clusters. "Community participation" and "trust in one's neighbors" contributed significantly to the development of the model, with two distinct discriminant functions (p < 0.001). The majority of cases (95.0%) and non-cases (93.1%) were correctly classified by discriminant analysis. The two multivariate analyses (segmentation analysis and canonical discriminant analysis), used together, can be considered good choices for measuring social capital.