WorldWideScience

Sample records for analyser based spectromicroscope

  1. Applications of high lateral and energy resolution imaging XPS with a double hemispherical analyser based spectromicroscope

    International Nuclear Information System (INIS)

    Escher, M.; Winkler, K.; Renault, O.; Barrett, N.

    2010-01-01

    The design and applications of an instrument for imaging X-ray photoelectron spectroscopy (XPS) are reviewed. The instrument is based on a photoelectron microscope and a double hemispherical analyser whose symmetric configuration avoids the spherical aberration (α²-term) inherent to standard analysers. The analyser allows high-transmission imaging without sacrificing the lateral and energy resolution of the instrument. The importance of high transmission, especially for highest-resolution imaging XPS with monochromated laboratory X-ray sources, is outlined, and the close interrelation of energy resolution, lateral resolution and analyser transmission is illustrated. Chemical imaging applications using a monochromatic laboratory Al Kα source are shown, with a lateral resolution of 610 nm. Examples of measurements made using synchrotron and laboratory ultraviolet light show the broad field of applications, ranging from imaging of core-level electrons with chemical-shift identification to high-resolution threshold photoelectron emission microscopy (PEEM), work-function imaging and band-structure imaging.

  2. Energy-filtered real- and k-space secondary and energy-loss electron imaging with Dual Emission Electron spectro-Microscope: Cs/Mo(110)

    Energy Technology Data Exchange (ETDEWEB)

    Grzelakowski, Krzysztof P., E-mail: k.grzelakowski@opticon-nanotechnology.com

    2016-05-15

    Since its introduction, the importance of complementary k∥-space (LEED) and real-space (LEEM) information in the investigation of surface-science phenomena has been widely demonstrated over the last five decades. In this paper we report the application of a novel kind of electron spectromicroscope, the Dual Emission Electron spectroMicroscope (DEEM), with two independent electron-optical channels for quasi-simultaneous reciprocal- and real-space imaging, to the investigation of a Cs-covered Mo(110) single crystal using an 800 eV electron beam from an “in-lens” electron gun system developed for sample illumination. With the DEEM spectromicroscope it is possible to observe dynamic, irreversible processes at surfaces in energy-filtered real space and in the corresponding energy-filtered k∥-space quasi-simultaneously in two independent imaging columns. The novel concept of high-energy electron-beam sample illumination in cathode-lens-based microscopes allows chemically selective imaging and analysis under laboratory conditions. - Highlights: • A novel concept of electron sample illumination with an “in-lens” e⁻ gun is realized. • Quasi-simultaneous energy-selective observation of real and k∥-space in EELS mode. • Observation of energy-filtered Auger electron diffraction at Cs atoms on Mo(110). • Energy-loss, Auger and secondary-electron momentum microscopy is realized.

  3. Unsupervised Data Mining in nanoscale X-ray Spectro-Microscopic Study of NdFeB Magnet.

    Science.gov (United States)

    Duan, Xiaoyue; Yang, Feifei; Antono, Erin; Yang, Wenge; Pianetta, Piero; Ermon, Stefano; Mehta, Apurva; Liu, Yijin

    2016-09-29

    Novel developments in X-ray-based spectro-microscopic characterization techniques have increased the rate of acquisition of spatially resolved spectroscopic data by several orders of magnitude over what was possible a few years ago. This accelerated data acquisition, with high spatial resolution at the nanoscale and sensitivity to subtle differences in chemistry and atomic structure, provides a unique opportunity to investigate hierarchically complex and structurally heterogeneous systems found in functional devices and materials systems. However, handling and analyzing the large data volumes generated poses significant challenges. Here we apply an unsupervised data-mining algorithm known as DBSCAN to study a rare-earth-element-based permanent magnet material, Nd2Fe14B. We are able to reduce a large spectro-microscopic dataset of over 300,000 spectra to 3 characteristic spectra, preserving much of the underlying information, which scientists can then analyze quickly and in detail. Our approach can rapidly provide a concise representation of a large and complex dataset to materials scientists and chemists. For example, it shows that the surface of a common Nd2Fe14B magnet is chemically and structurally very different from the bulk, suggesting a surface alteration effect, possibly due to corrosion, which could affect the material's overall properties.
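    The reduction described above — cluster hundreds of thousands of spectra, then keep one representative spectrum per cluster — can be sketched with a minimal, pure-NumPy DBSCAN. The toy "spectra" and the eps/min_samples parameters below are invented for illustration; the actual study worked on far larger experimental data.

```python
import numpy as np

def dbscan(X, eps, min_samples):
    """Minimal DBSCAN: returns cluster ids (>= 0) per point, -1 for noise."""
    n = len(X)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise distances
    neighbors = [np.flatnonzero(d[i] <= eps) for i in range(n)]
    labels = np.full(n, -1)
    cluster = 0
    for i in range(n):
        if labels[i] != -1 or len(neighbors[i]) < min_samples:
            continue                       # already assigned, or not a core point
        labels[i] = cluster                # grow a new cluster from core point i
        queue = list(neighbors[i])
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster
                if len(neighbors[j]) >= min_samples:   # j is itself a core point
                    queue.extend(k for k in neighbors[j] if labels[k] == -1)
        cluster += 1
    return labels

# toy "spectra": three groups of noisy curves sampled at 50 energy points
rng = np.random.default_rng(0)
energy = np.linspace(0, 1, 50)
centers = [np.sin(3 * energy), np.cos(3 * energy), energy ** 2]
X = np.vstack([c + 0.01 * rng.standard_normal((100, 50)) for c in centers])

labels = dbscan(X, eps=0.5, min_samples=5)
reps = [X[labels == c].mean(axis=0) for c in sorted(set(labels) - {-1})]
print(len(X), "spectra reduced to", len(reps), "representatives")
# → 300 spectra reduced to 3 representatives
```

    Averaging within each cluster gives the handful of characteristic spectra a scientist can inspect directly, which is the essence of the data reduction the abstract describes.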

  5. Pathway-based analyses.

    Science.gov (United States)

    Kent, Jack W

    2016-02-03

    New technologies for acquisition of genomic data, while offering unprecedented opportunities for genetic discovery, also impose severe burdens of interpretation and penalties for multiple testing. The Pathway-based Analyses Group of the Genetic Analysis Workshop 19 (GAW19) sought reduction of the multiple-testing burden through various approaches to aggregation of high-dimensional data in pathways informed by prior biological knowledge. Experimental methods tested included the use of "synthetic pathways" (random sets of genes) to estimate the power and false-positive error rate of methods applied to simulated data; data reduction via independent components analysis, single-nucleotide polymorphism (SNP)-SNP interaction, and use of gene sets to estimate genetic similarity; and general assessment of the efficacy of prior biological knowledge to reduce the dimensionality of complex genomic data. The work of this group explored several promising approaches to managing high-dimensional data, with the caveat that these methods are necessarily constrained by the quality of external bioinformatic annotation.
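    The "synthetic pathways" idea — score random gene sets against a null phenotype to check that a pathway test keeps its nominal false-positive rate — can be sketched as follows. All data here are simulated, and the aggregate-burden score with a permutation p-value is an illustrative stand-in, not one of the specific GAW19 methods:

```python
import numpy as np

rng = np.random.default_rng(1)
n_subjects, n_genes = 500, 1000
G = rng.standard_normal((n_subjects, n_genes))  # per-gene burden scores, no true signal
y = rng.standard_normal(n_subjects)             # null phenotype

def pathway_score_test(genes, n_perm=200):
    """Correlate a gene set's aggregate burden with the phenotype;
    p-value from a permutation null."""
    s = G[:, genes].sum(axis=1)
    obs = abs(np.corrcoef(s, y)[0, 1])
    perm = [abs(np.corrcoef(s, rng.permutation(y))[0, 1]) for _ in range(n_perm)]
    return (1 + sum(p >= obs for p in perm)) / (n_perm + 1)

# "synthetic pathways": 100 random gene sets of size 20
pvals = [pathway_score_test(rng.choice(n_genes, 20, replace=False))
         for _ in range(100)]
fpr = np.mean(np.array(pvals) < 0.05)
print(f"empirical false-positive rate at alpha=0.05: {fpr:.2f}")
```

    Under the null, roughly 5% of random pathways should reach p < 0.05; a markedly higher rate would flag an anti-conservative test.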

  6. Spectromicroscope for the PHotoelectron Imaging of Nanostructures with X-rays (SPHINX): performance in biology, medicine and geology

    Energy Technology Data Exchange (ETDEWEB)

    Frazer, B.H.; Girasole, Marco; Wiese, L.M.; Franz, Torsten; De Stasio, G

    2004-05-15

    Several X-ray PhotoElectron Emission spectroMicroscopes (X-PEEMs) exist around the world at this time. We present recent performance and resolution tests of one of them, the Spectromicroscope for PHotoelectron Imaging of Nanostructures with X-rays (SPHINX) X-PEEM, installed at the University of Wisconsin Synchrotron Radiation Center. With this state-of-the-art instrument we demonstrate chemical analysis capabilities on conducting and insulating specimens of diverse interests, and an unprecedented lateral resolution of 10 nm with monochromatic X-rays and 7.2 nm with ultraviolet illumination.

  7. Spectro-microscopic study of the formation of supramolecular networks

    Science.gov (United States)

    Sadowski, Jerzy T.

    2015-03-01

    Metal-organic frameworks (MOFs) are emerging as a new class of materials for CO2 capture. There are many fundamental questions, including the optimum pore size and arrangement of the molecules in the structure to achieve the highest CO2 uptake. As only the surface is of interest for potential applications such as heterogeneous catalysis, nano-templating, and sensing, 2D analogs of MOFs can serve as good model systems. Utilizing the capabilities of LEEM/PEEM for non-destructive interrogation of real-time molecular self-assembly, we investigated supramolecular systems based on carboxylic acid-metal complexes, such as trimesic and mellitic acid, doped with transition metals. Such 2D networks act as host systems for transition-metal phthalocyanines (MPc; M = Fe, Ti, Sc), and the electrostatic interactions of CO2 molecules with transition-metal ions can be tuned by controlling the type of TM ion and the size of the pore in the host network. Understanding directed self-assembly by controlling the molecule-substrate interaction can enable us to engineer the pore size and density, and thus tune the host's chemical activity. Research carried out at the Center for Functional Nanomaterials and National Synchrotron Light Source, Brookhaven National Laboratory, which are supported by the U.S. Department of Energy, Office of Basic Energy Sciences, under Contract No. DE-AC02-98CH10.

  8. PC based uranium enrichment analyser

    International Nuclear Information System (INIS)

    Madan, V.K.; Gopalakrishana, K.R.; Bairi, B.R.

    1991-01-01

    It is important to measure the enrichment of unirradiated nuclear fuel elements during production as a quality control measure. An IBM PC based system has recently been tested for enrichment measurements for Nuclear Fuel Complex (NFC), Hyderabad. As required by NFC, the system has ease of calibration. It is easy to switch the system from measuring the enrichment of fuel elements to pellets, and to automatically store the data and results. The system uses an IBM PC plug-in card to acquire data. The card incorporates programmable interval timers (8253-5). The counter/timer devices are accessed via I/O-mapped I/O. A novel algorithm has been incorporated to make the system more reliable. The application software has been written in BASIC. (author). 9 refs., 1 fig
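    The abstract does not spell out the measurement physics, but gamma-based enrichment analysers of this kind conventionally use the "enrichment-meter" principle: for a thick sample, the net count rate in the 185.7 keV U-235 gamma window is, to first order, linear in % U-235, so a linear calibration against standards suffices. A minimal sketch, with entirely made-up calibration points:

```python
import numpy as np

# hypothetical calibration standards: % U-235 -> net 185.7 keV counts/s
standards = {3.2: 1520.0, 4.0: 1905.0, 5.0: 2370.0}
rates = np.array(list(standards.values()))
enrich = np.array(list(standards.keys()))

# fit the straight calibration line: enrichment = slope * rate + intercept
slope, intercept = np.polyfit(rates, enrich, 1)

def enrichment(count_rate):
    """Infer % U-235 from the net 185.7 keV count rate via the calibration line."""
    return slope * count_rate + intercept

print(f"{enrichment(2000.0):.2f} % U-235")
```

    Recalibration then amounts to refitting the line with fresh standards, which matches the "ease of calibration" requirement mentioned in the abstract.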

  9. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic programs and evaluated using incremental tabled evaluation, a technique for efficiently updating memo tables in response to changes in facts and rules. The approach has been implemented and integrated into the Eclipse IDE. Our measurements show that this technique is effective for automatically...

  11. Analysing organic transistors based on interface approximation

    International Nuclear Information System (INIS)

    Akiyama, Yuto; Mori, Takehiko

    2014-01-01

    Temperature-dependent characteristics of organic transistors are analysed thoroughly using interface approximation. In contrast to amorphous silicon transistors, it is characteristic of organic transistors that the accumulation layer is concentrated on the first monolayer, and it is appropriate to consider interface charge rather than band bending. On the basis of this model, observed characteristics of hexamethylenetetrathiafulvalene (HMTTF) and dibenzotetrathiafulvalene (DBTTF) transistors with various surface treatments are analysed, and the trap distribution is extracted. In turn, starting from a simple exponential distribution, we can reproduce the temperature-dependent transistor characteristics as well as the gate voltage dependence of the activation energy, so we can investigate various aspects of organic transistors self-consistently under the interface approximation. Small deviation from such an ideal transistor operation is discussed assuming the presence of an energetically discrete trap level, which leads to a hump in the transfer characteristics. The contact resistance is estimated by measuring the transfer characteristics up to the linear region

  12. Theoretical analyses of superconductivity in iron based ...

    African Journals Online (AJOL)

    This paper focuses on the theoretical analysis of superconductivity in the iron-based superconductor Ba1−xKxFe2As2. After reviewing the current findings on this system, we suggest that a combined phonon-exciton mechanism gives the right order of superconducting transition temperature (TC) for Ba1−xKxFe2As2. By developing ...

  13. In service monitoring based on fatigue analyses, possibilities and limitations

    International Nuclear Information System (INIS)

    Dittmar, S.; Binder, F.

    2004-01-01

    German LWR reactors are equipped with monitoring systems which are to enable a comparison of real transients with load case catalogues and fatigue catalogues for fatigue analyses. The information accuracy depends on the accuracy of measurements, on the consideration of parameters influencing fatigue (medium, component surface, component size, etc.), and on the accuracy of the load analyses. The contribution attempts a critical evaluation, also in view of the fact that real fatigue damage is often impossible to quantify on the basis of fatigue analyses at a later stage. The effects of considering or neglecting various influencing factors are discussed, as well as the consequences of the scatter of the material characteristics on which the analyses are based. Possible measures to be taken in operational monitoring are derived. (orig.) [de
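    The comparison of recorded transients against a load-case catalogue classically feeds a cumulative usage calculation via Miner's rule, U = Σ nᵢ/Nᵢ, with notional exhaustion at U = 1. A minimal sketch; the transient names and cycle counts below are invented, not from the article:

```python
# Miner's-rule cumulative fatigue usage from a (hypothetical) transient catalogue.
catalogue = {
    # transient type: (cycles experienced n_i, allowable cycles N_i from design curve)
    "cold start":     (120, 2000),
    "hot start":      (450, 10000),
    "load rejection": (8,   400),
}

usage = sum(n / N for n, N in catalogue.values())
print(f"cumulative usage factor: {usage:.3f}")   # prints: cumulative usage factor: 0.125
```

    The article's caveats apply exactly here: the result is only as good as the measured cycle counts nᵢ and the design-curve values Nᵢ, which carry the environmental and size effects mentioned above.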

  14. A bromine-based dichroic X-ray polarization analyser

    CERN Document Server

    Collins, S P; Brown, S D; Thompson, P

    2001-01-01

    We have demonstrated the advantages offered by dichroic X-ray polarization filters for linear polarization analysis, and describe such a device, based on a dibromoalkane/urea inclusion compound. The polarizer has been successfully tested by analysing the polarization of magnetic diffraction from holmium.

  15. Analyser-based phase contrast image reconstruction using geometrical optics

    International Nuclear Information System (INIS)

    Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A

    2007-01-01

    Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 μm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser
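    The rocking-curve fit at the heart of this retrieval can be sketched with SciPy's least-squares fitter. The data below are synthetic and the peak parameters invented, not the ELETTRA measurements; only the symmetric Pearson VII functional form follows the text:

```python
import numpy as np
from scipy.optimize import curve_fit

def pearson_vii(x, A, x0, w, m):
    """Symmetric Pearson VII profile; m = 1 is Lorentzian, m -> inf tends to Gaussian."""
    return A * (1 + ((x - x0) / w) ** 2 * (2 ** (1 / m) - 1)) ** (-m)

# synthetic rocking curve: Pearson VII plus a little noise (illustrative data)
theta = np.linspace(-20, 20, 201)        # angle from the Bragg peak, arb. units
rng = np.random.default_rng(2)
truth = (1.0, 0.5, 4.0, 1.8)             # A, x0, w, m
y = pearson_vii(theta, *truth) + 0.005 * rng.standard_normal(theta.size)

popt, _ = curve_fit(pearson_vii, theta, y, p0=(1.0, 0.0, 5.0, 1.5),
                    bounds=([0, -10, 0.1, 0.5], [10, 10, 20, 10]))
print("fitted (A, x0, w, m):", np.round(popt, 2))
```

    The extra shape parameter m is what lets Pearson VII track the heavy tails of measured rocking curves better than a Gaussian or a purely linear fit, consistent with the ≥10% improvement reported above.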

  18. Quantitative metagenomic analyses based on average genome size normalization

    DEFF Research Database (Denmark)

    Frank, Jeremy Alexander; Sørensen, Søren Johannes

    2011-01-01

    provide not just a census of the community members but direct information on metabolic capabilities and potential interactions among community members. Here we introduce a method for the quantitative characterization and comparison of microbial communities based on the normalization of metagenomic data...... marine sources using both conventional small-subunit (SSU) rRNA gene analyses and our quantitative method to calculate the proportion of genomes in each sample that are capable of a particular metabolic trait. With both environments, to determine what proportion of each community they make up and how......). These analyses demonstrate how genome proportionality compares to SSU rRNA gene relative abundance and how factors such as average genome size and SSU rRNA gene copy number affect sampling probability and therefore both types of community analysis....

  19. Process for carrying out analyses based on concurrent reactions

    Energy Technology Data Exchange (ETDEWEB)

    Glover, J S; Shepherd, B P

    1980-01-03

    The invention concerns a process for carrying out analyses based on concurrent (competing) reactions. A portion of the compound to be analysed is subjected, together with a standard quantity of the same compound in labelled form, to a common reaction with a standard quantity of a reagent, which must be less than the sum of the two portions of the reacting compound. After a forced phase change (precipitation, absorption, etc.), the portions of the labelled reaction compound and the labelled final compound resulting from the competition are separated in a tube (e.g. by centrifuging), and the radioactivity of the two phases in contact is measured separately. The shielded measuring device developed for this purpose, suitable for centrifuge tubes of known dimensions, is also included in the patent claims. The insulin concentration of a defined serum is measured as an example of an application of the method (radioimmunoassay).
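    The readout of such a competitive assay is a standard-curve interpolation: the more unlabelled analyte present, the smaller the bound fraction of the labelled tracer, so an unknown's bound fraction is read back against standards of known concentration. A minimal sketch; the insulin standard-curve numbers below are illustrative only:

```python
import numpy as np

# hypothetical standard curve: analyte concentration vs bound/total activity
conc = np.array([1.0, 3.0, 10.0, 30.0, 100.0])        # e.g. uU/mL insulin
bound_frac = np.array([0.62, 0.48, 0.33, 0.20, 0.11])  # decreases with concentration

def read_concentration(b):
    """Interpolate an unknown's bound fraction on the standard curve
    (log-concentration vs bound fraction is close to linear).
    np.interp needs increasing x, so negate the decreasing bound fractions."""
    return float(np.exp(np.interp(-b, -bound_frac, np.log(conc))))

print(f"{read_concentration(0.40):.1f} uU/mL")
```

    Real assays typically fit a four-parameter logistic curve rather than piecewise interpolation, but the principle — concentration inferred from the partition of radioactivity between phases — is the one the patent describes.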

  20. Unconscious analyses of visual scenes based on feature conjunctions.

    Science.gov (United States)

    Tachibana, Ryosuke; Noguchi, Yasuki

    2015-06-01

    To efficiently process a cluttered scene, the visual system analyzes statistical properties or regularities of visual elements embedded in the scene. It is controversial, however, whether those scene analyses also work for stimuli perceived unconsciously. Here we show that the brain performs unconscious scene analyses not only using a single featural cue (e.g., orientation) but also based on conjunctions of multiple visual features (e.g., combinations of color and orientation information). Subjects foveally viewed a stimulus array (duration: 50 ms) in which 4 types of bars (red-horizontal, red-vertical, green-horizontal, and green-vertical) were intermixed. Although conscious perception of those bars was inhibited by a subsequent mask stimulus, the brain correctly analyzed the information about color, orientation, and color-orientation conjunctions of those invisible bars. The information of those features was then used for the unconscious configuration analysis (statistical processing) of the central bars, which induced a perceptual bias and illusory feature binding in visible stimuli at peripheral locations. While statistical analyses and feature binding are normally 2 key functions of the visual system for constructing coherent percepts of visual scenes, our results show that a high-level analysis combining those 2 functions is correctly performed by unconscious computations in the brain.

  1. Vocational Teachers and Professionalism - A Model Based on Empirical Analyses

    DEFF Research Database (Denmark)

    Duch, Henriette Skjærbæk; Andreasen, Karen E

    Several theorists have developed models to illustrate the processes of adult learning and professional development (e.g. Illeris, Argyris, Engeström; Wahlgren & Aarkrog; Kolb and Wenger). Models can sometimes be criticized...... emphasis on the adult employee, the organization, its surroundings as well as other contextual factors. Our concern is adult vocational teachers attending a pedagogical course and teaching at vocational colleges. The aim of the paper is to discuss different models and develop a model concerning teachers...... at vocational colleges based on empirical data in a specific context, the vocational teacher-training course in Denmark. By offering a basis and concepts for the analysis of practice, such a model is meant to support the development of vocational teachers' professionalism at courses and in organizational contexts...

  2. Operational Satellite-based Surface Oil Analyses (Invited)

    Science.gov (United States)

    Streett, D.; Warren, C.

    2010-12-01

    During the Deepwater Horizon spill, NOAA imagery analysts in the Satellite Analysis Branch (SAB) issued more than 300 near-real-time satellite-based oil spill analyses. These analyses were used by the oil spill response community for planning, issuing surface oil trajectories and tasking assets (e.g., oil containment booms, skimmers, overflights). SAB analysts used both Synthetic Aperture Radar (SAR) and high-resolution visible/near-IR multispectral satellite imagery as well as a variety of ancillary datasets. Satellite imagery used included ENVISAT ASAR (ESA), TerraSAR-X (DLR), Cosmo-Skymed (ASI), ALOS (JAXA), Radarsat (MDA), ENVISAT MERIS (ESA), SPOT (SPOT Image Corp.), Aster (NASA), MODIS (NASA), and AVHRR (NOAA). Ancillary datasets included ocean current information, wind information, locations of natural oil seeps and a variety of in situ oil observations. The analyses were available as jpegs, pdfs, shapefiles and Google KML files, and also on a variety of websites including Geoplatform and ERMA. From the very first analysis, issued just 5 hours after the rig sank, through the final analysis issued in August, the complete archive is still publicly available on the NOAA/NESDIS website: http://www.ssd.noaa.gov/PS/MPS/deepwater.html. SAB personnel also served as the Deepwater Horizon International Disaster Charter Project Manager (at the official request of the USGS). The Project Manager's primary responsibility was to acquire and oversee the processing and dissemination of satellite data generously donated by numerous private companies and nations in support of the oil spill response, including some of the imagery described above. SAB has begun to address a number of goals that will improve our routine oil spill response as well as help assure that we are ready for the next spill of national significance. We hope to (1) secure a steady, abundant and timely stream of suitable satellite imagery even in the absence of large-scale emergencies such as

  3. Conducting Meta-Analyses Based on p Values

    Science.gov (United States)

    van Aert, Robbie C. M.; Wicherts, Jelte M.; van Assen, Marcel A. L. M.

    2016-01-01

    Because of overwhelming evidence of publication bias in psychology, techniques to correct meta-analytic estimates for such bias are greatly needed. The methodology on which the p-uniform and p-curve methods are based has great promise for providing accurate meta-analytic estimates in the presence of publication bias. However, in this article, we show that in some situations, p-curve behaves erratically, whereas p-uniform may yield implausible estimates of negative effect size. Moreover, we show that (and explain why) p-curve and p-uniform result in overestimation of effect size under moderate-to-large heterogeneity and may yield unpredictable bias when researchers employ p-hacking. We offer hands-on recommendations on applying and interpreting results of meta-analyses in general and p-uniform and p-curve in particular. Both methods as well as traditional methods are applied to a meta-analysis on the effect of weight on judgments of importance. We offer guidance for applying p-uniform or p-curve using R and a user-friendly web application for applying p-uniform. PMID:27694466

  4. PC based 8K multichannel analyser for nuclear spectroscopy

    International Nuclear Information System (INIS)

    Jain, S.K.; Gupta, J.D.; Suman Kumari, B.

    1989-01-01

    An IBM-PC based 8K multichannel analyser (MCA) has been developed which incorporates all the features of an advanced system: very high throughput for data acquisition in PHA as well as MCS modes, fast real-time display, extensive display manipulation facilities, various preset controls and concurrent data processing. The compact system hardware consists of a 2-unit wide NIM module and a PC add-on card. Because of the external acquisition hardware, the system, after initial programming by the PC, can acquire data independently, allowing the PC to be switched off. To attain very high throughput, the most desirable feature of an MCA, a dual-port memory architecture has been used. The asymmetric dual-port RAM, housed in the NIM module, offers 24-bit parallel access to the ADC and 8-bit wide access to the PC, which results in fast real-time histogram display on the monitor. The PC emulation software is menu-driven and user-friendly. It integrates a comprehensive set of commonly required application routines for concurrent data processing. After the transfer of know-how to the Electronics Corporation of India Ltd. (ECIL), this system is being produced at ECIL. (author). 5 refs., 4 figs
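    The PHA mode mentioned above reduces to a simple rule: each digitised pulse height increments one of the 8192 histogram channels. A software sketch of that accumulation (the hardware does this in the dual-port RAM; the simulated photopeak position below is arbitrary):

```python
import numpy as np

N_CHANNELS = 8192    # 8K MCA

def accumulate_pha(spectrum, adc_values):
    """PHA mode: each digitised pulse height increments one histogram channel."""
    valid = adc_values[(adc_values >= 0) & (adc_values < N_CHANNELS)]
    np.add.at(spectrum, valid, 1)        # handles repeated channels correctly
    return spectrum

spectrum = np.zeros(N_CHANNELS, dtype=np.int64)
rng = np.random.default_rng(4)
# simulated pulses: a Gaussian photopeak at channel 3000 over a flat background
pulses = np.concatenate([rng.normal(3000, 30, 5000).astype(int),
                         rng.integers(0, N_CHANNELS, 2000)])
accumulate_pha(spectrum, pulses)
print("total counts:", spectrum.sum(), "| peak channel:", spectrum.argmax())
```

    The dual-port design lets the ADC side perform exactly this increment at full speed while the PC side reads the same memory for live display.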

  5. Analyser-based x-ray imaging for biomedical research

    International Nuclear Information System (INIS)

    Suortti, Pekka; Keyriläinen, Jani; Thomlinson, William

    2013-01-01

    Analyser-based imaging (ABI) is one of the several phase-contrast x-ray imaging techniques being pursued at synchrotron radiation facilities. With advancements in compact source technology, there is a possibility that ABI will become a clinical imaging modality. This paper presents the history of ABI as it has developed from its laboratory source to synchrotron imaging. The fundamental physics of phase-contrast imaging is presented both in a general sense and specifically for ABI. The technology is dependent on the use of perfect crystal monochromator optics. The theory of the x-ray optics is developed and presented in a way that will allow optimization of the imaging for specific biomedical systems. The advancement of analytical algorithms to produce separate images of the sample absorption, refraction angle map and small-angle x-ray scattering is detailed. Several detailed applications to biomedical imaging are presented to illustrate the broad range of systems and body sites studied preclinically to date: breast, cartilage and bone, soft tissue and organs. Ultimately, the application of ABI in clinical imaging will depend partly on the availability of compact sources with sufficient x-ray intensity comparable with that of the current synchrotron environment. (paper)

  6. Progress Report on Computational Analyses of Water-Based NSTF

    Energy Technology Data Exchange (ETDEWEB)

    Lv, Q. [Argonne National Lab. (ANL), Argonne, IL (United States); Kraus, A. [Argonne National Lab. (ANL), Argonne, IL (United States); Hu, R. [Argonne National Lab. (ANL), Argonne, IL (United States); Bucknor, M. [Argonne National Lab. (ANL), Argonne, IL (United States); Lisowski, D. [Argonne National Lab. (ANL), Argonne, IL (United States); Nunez, D. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-08-01

    CFD analysis has been focused on important component-level phenomena using STAR-CCM+ to supplement the system analysis of integral system behavior. A notable area of interest was the cavity region. This area is of particular interest for CFD analysis due to the multi-dimensional flow and complex heat transfer (thermal radiation heat transfer and natural convection), which are not simulated directly by RELAP5. CFD simulations allow for the estimation of the boundary heat flux distribution along the riser tubes, which is needed in the RELAP5 simulations. The CFD results can also provide additional data to help establish what level of modeling detail is necessary in RELAP5. It was found that the flow profiles in the cavity region are simpler for the water-based concept than for the air-cooled concept. The local heat flux noticeably increases axially, and is higher in the fins than in the riser tubes. These results were utilized in RELAP5 simulations as boundary conditions, to provide better temperature predictions in the system-level analyses. It was also determined that temperatures were higher in the fins than in the riser tubes, but within design limits for thermal stresses. Higher temperature predictions were identified in the edge fins, in part due to additional thermal radiation from the side cavity walls.

  7. Model-Based Recursive Partitioning for Subgroup Analyses.

    Science.gov (United States)

    Seibold, Heidi; Zeileis, Achim; Hothorn, Torsten

    2016-05-01

    The identification of patient subgroups with differential treatment effects is the first step towards individualised treatments. A current draft guideline by the EMA discusses potentials and problems in subgroup analyses and formulates challenges for the development of appropriate statistical procedures for the data-driven identification of patient subgroups. We introduce model-based recursive partitioning as a procedure for the automated detection of patient subgroups that are identifiable by predictive factors. The method starts with a model for the overall treatment effect as defined for the primary analysis in the study protocol and uses measures for detecting parameter instabilities in this treatment effect. The procedure produces a segmented model with differential treatment parameters corresponding to each patient subgroup. The subgroups are linked to predictive factors by means of a decision tree. The method is applied to the search for subgroups of patients suffering from amyotrophic lateral sclerosis that differ with respect to their response to Riluzole, the only drug currently approved for this disease.
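    The idea can be illustrated with a brute-force sketch: fit one overall treatment-effect model, then search for the covariate split that most improves two subgroup-specific fits. The simulated data and the exhaustive split search below are illustrative stand-ins for the score-based parameter-instability tests of the actual method.

```python
# Toy model-based partitioning: one candidate covariate (age), one split.
import numpy as np

rng = np.random.default_rng(0)
n = 400
age = rng.uniform(20, 80, n)            # candidate partitioning covariate
treat = rng.integers(0, 2, n)           # 0 = control, 1 = treatment
# true treatment effect differs by subgroup: benefit only for age < 50
effect = np.where(age < 50, 2.0, 0.0)
y = 1.0 + effect * treat + rng.normal(0, 1, n)

def fit(y, t):
    """OLS fit of y ~ 1 + t; returns (residual sum of squares, effect)."""
    X = np.column_stack([np.ones_like(t, dtype=float), t])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r), float(beta[1])

overall_sse, overall_effect = fit(y, treat)

# exhaustive search over candidate split points on age
best_sse, best_cut = min(
    (fit(y[age <= c], treat[age <= c])[0] + fit(y[age > c], treat[age > c])[0], c)
    for c in np.quantile(age, np.linspace(0.1, 0.9, 33))
)
print(f"overall effect {overall_effect:.2f}, best split at age {best_cut:.1f}")
```

With a strong subgroup difference, the recovered split lands near the true boundary (age 50), and each side of the tree gets its own treatment parameter.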

  8. Simulation-based Investigations of Electrostatic Beam Energy Analysers

    CERN Document Server

    Pahl, Hannes

    2015-01-01

    An energy analyser is needed to measure the beam energy profile behind the REX-EBIS at ISOLDE. The device should be able to operate with an accuracy of 1 V at voltages up to 30 kV. In order to find a working concept for an electrostatic energy analyser different designs were evaluated with simulations. A spherical device and its design issues are presented. The potential deformation effects of grids at high voltages and their influence on the energy resolution were investigated. First tests were made with a grid-free ring electrode device and show promising results.

  9. Techniques for Scaling Up Analyses Based on Pre-interpretations

    DEFF Research Database (Denmark)

    Gallagher, John Patrick; Henriksen, Kim Steen; Banda, Gourinath

    2005-01-01

    a variety of analyses, both generic (such as mode analysis) and program-specific (with respect to a type describing some particular property of interest). Previous work demonstrated the approach using pre-interpretations over small domains. In this paper we present techniques that allow the method...

  10. X-ray photoelectron emission spectromicroscopic analysis of arborescent lycopsid cell wall composition and Carboniferous coal ball preservation

    Energy Technology Data Exchange (ETDEWEB)

    Boyce, C. Kevin [Department of the Geophysical Sciences, University of Chicago, Chicago, IL 60637 (United States); Abrecht, Mike; Zhou, Dong; Gilbert, P.U.P.A. [Department of Physics, University of Wisconsin, Madison, WI 53706 (United States)

    2010-08-01

    Two alternative processes complicate understanding of the biochemical origins and geochemical alteration of organic matter over geologic time: selective preservation of original biopolymers and in situ generation of new geopolymers. One of the best constrained potential sources of bio- and geochemical information about extinct fossil plants is frequently overlooked. Permineralized anatomically preserved plant fossils allow analysis of individual cell and tissue types that have an original biochemical composition already known from living plants. The original composition of more enigmatic fossils can be constrained by geochemical comparisons to tissues of better understood fossils from the same locality. This strategy is possible using synchrotron-based techniques for submicron-scale imaging with X-rays over a range of frequencies in order to provide information concerning the relative abundance of different organic bonds with X-ray Absorption Near Edge Spectroscopy. In this study, X-ray PhotoElectron Emission spectroMicroscopy (X-PEEM) was used to analyze the tissues of Lepidodendron, one of the lycopsid trees that were canopy dominants of many Pennsylvanian coal swamp forests. Its periderm or bark - the single greatest biomass contributor to many Late Paleozoic coals - is found to have a greater aliphatic content and an overall greater density of organic matter than lignified wood. Because X-PEEM allows simultaneous analysis of organic matter and matrix calcite in fully mineralized fossils, this technique also has great potential for analysis of fossil preservation, including documentation of significant traces of organic matter entrained in the calcite crystal fabric that fills the cell lumens. (author)

  11. A Web-based Tool Combining Different Type Analyses

    DEFF Research Database (Denmark)

    Henriksen, Kim Steen; Gallagher, John Patrick

    2006-01-01

    of both, and they can be goal-dependent or goal-independent. We describe a prototype tool that can be accessed from a web browser, allowing various type analyses to be run. The first goal of the tool is to allow the analysis results to be examined conveniently by clicking on points in the original program...... the minimal "domain model" of the program with respect to the corresponding pre-interpretation, which can give more precise information than the original descriptive type....

  12. Graphic-based musculoskeletal model for biomechanical analyses and animation.

    Science.gov (United States)

    Chao, Edmund Y S

    2003-04-01

    The ability to combine physiology and engineering analyses with computer sciences has opened the door to the possibility of creating the 'Virtual Human' reality. This paper presents a broad foundation for a full-featured biomechanical simulator for the human musculoskeletal system physiology. This simulation technology unites the expertise in biomechanical analysis and graphic modeling to investigate joint and connective tissue mechanics at the structural level and to visualize the results in both static and animated forms together with the model. Adaptable anatomical models including prosthetic implants and fracture fixation devices and a robust computational infrastructure for static, kinematic, kinetic, and stress analyses under varying boundary and loading conditions are incorporated on a common platform, the VIMS (Virtual Interactive Musculoskeletal System). Within this software system, a manageable database containing long bone dimensions, connective tissue material properties and a library of skeletal joint system functional activities and loading conditions are also available and they can easily be modified, updated and expanded. Application software is also available to allow end-users to perform biomechanical analyses interactively. This paper details the design, capabilities, and features of the VIMS development at Johns Hopkins University, an effort possible only through academic and commercial collaborations. Examples using these models and the computational algorithms in a virtual laboratory environment are used to demonstrate the utility of this unique database and simulation technology. This integrated system will impact on medical education, basic research, device development and application, and clinical patient care related to musculoskeletal diseases, trauma, and rehabilitation.

  13. An Apple II -based bidimensional pulse height analyser

    International Nuclear Information System (INIS)

    Bateman, J.E.; Flesher, A.C.; Honeyman, R.N.; Pritchard, T.E.; Price, W.P.R.

    1984-06-01

    The implementation of a pulse height analyser function in an Apple II microcomputer using minimal purpose built hardware is described. Except for a small interface module the system consists of two suites of software, one giving a conventional one dimensional analysis on a span of 1024 channels, and the other a two dimensional analysis on a 128 x 128 image format. Using the recently introduced ACCELERATOR coprocessor card the system performs with a dead time per event of less than 50 μs. Full software facilities are provided for display, storage and processing of the data using standard Applesoft BASIC. (author)

  14. Kernel based eigenvalue-decomposition methods for analysing ham

    DEFF Research Database (Denmark)

    Christiansen, Asger Nyman; Nielsen, Allan Aasbjerg; Møller, Flemming

    2010-01-01

    methods, such as PCA, MAF or MNF. We therefore investigated the applicability of kernel based versions of these transformations. This meant implementing the kernel based methods and developing new theory, since kernel based MAF and MNF are not described in the literature yet. The traditional methods only...... have two factors that are useful for segmentation and none of them can be used to segment the two types of meat. The kernel based methods have a lot of useful factors and they are able to capture the subtle differences in the images. This is illustrated in Figure 1. You can see a comparison of the most...... useful factor of PCA and kernel based PCA respectively in Figure 2. The factor of the kernel based PCA turned out to be able to segment the two types of meat and in general that factor is much more distinct, compared to the traditional factor. After the orthogonal transformation a simple thresholding...
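    For reference, the kernel trick behind these transforms can be sketched for the PCA case: build a kernel matrix, centre it in feature space, and take its top eigenvectors. This is a from-scratch kernel PCA with an RBF kernel on invented ring-shaped data (MAF/MNF lead to analogous generalised eigenvalue problems); it is not the authors' implementation.

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    # RBF (Gaussian) kernel matrix
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    # double-centre the kernel matrix (centring in feature space)
    n = K.shape[0]
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one
    # eigendecomposition, largest eigenvalues first
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # component scores: eigenvectors scaled by sqrt(eigenvalue)
    return vecs * np.sqrt(np.clip(vals, 0, None))

rng = np.random.default_rng(1)
# two concentric rings: linearly inseparable, a classic kernel PCA test case
t = rng.uniform(0, 2 * np.pi, 200)
r = np.repeat([1.0, 3.0], 100)
X = np.column_stack([r * np.cos(t), r * np.sin(t)]) + rng.normal(0, 0.05, (200, 2))
scores = kernel_pca(X, n_components=2, gamma=0.5)
```

The resulting score columns are mutually orthogonal and zero-mean, which is what makes simple thresholding on a single kernel factor possible.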

  15. Sensitivity and uncertainty analyses in aging risk-based prioritizations

    International Nuclear Information System (INIS)

    Hassan, M.; Uryas'ev, S.; Vesely, W.E.

    1993-01-01

    Aging risk evaluations of nuclear power plants using Probabilistic Risk Analyses (PRAs) involve assessments of the impact of aging structures, systems, and components (SSCs) on plant core damage frequency (CDF). These assessments can be used to prioritize the contributors to aging risk reflecting the relative risk potential of the SSCs. Aging prioritizations are important for identifying the SSCs contributing most to plant risk and can provide a systematic basis on which aging risk control and management strategies for a plant can be developed. However, these prioritizations are subject to variabilities arising from uncertainties in data and/or from various modeling assumptions. The objective of this paper is to present an evaluation of the sensitivity of aging prioritizations of active components to uncertainties in aging risk quantifications. Approaches for robust prioritization of SSCs that are less susceptible to these uncertainties are also presented.

  16. Evaluation of an optoacoustic based gas analysing device

    Science.gov (United States)

    Markmann, Janine; Lange, Birgit; Theisen-Kunde, Dirk; Danicke, Veit; Mayorov, Fedor; Eckert, Sebastian; Kettmann, Pascal; Brinkmann, Ralf

    2017-07-01

    The relative occurrence of volatile organic compounds in human respiratory gas is disease-specific (ppb range). A prototype of a gas analysing device using two tuneable laser systems, an OPO-laser (2.5 to 10 μm) and a CO2-laser (9 to 11 μm), and an optoacoustic measurement cell was developed to detect concentrations in the ppb range. The sensitivity and resolution of the system were determined by test gas measurements, measuring ethylene and sulfur hexafluoride with the CO2-laser and butane with the OPO-laser. System sensitivity was found to be 13 ppb for sulfur hexafluoride and 17 ppb for ethylene. Respiratory gas samples of 8 healthy volunteers were investigated by irradiation with 17 laser lines of the CO2-laser. Several of those lines overlap with strong absorption bands of ammonia. As ammonia concentration is known to increase with age, a separation of subjects younger and older than 35 years was attempted. To evaluate the data, the first seven gas samples were used to train a discriminant analysis algorithm. The eighth subject, aged 49 years, was then correctly assigned to the over-35 group.

  17. A Fuzzy Logic Based Method for Analysing Test Results

    Directory of Open Access Journals (Sweden)

    Le Xuan Vinh

    2017-11-01

    Full Text Available Network operators must perform many tasks to ensure smooth operation of the network, such as planning, monitoring, etc. Among those tasks, regular testing of network performance and network errors, and troubleshooting, is very important. Meaningful test results will allow the operators to evaluate network performance, identify any shortcomings and better plan for network upgrades. Due to the diverse and mainly unquantifiable nature of network testing results, there is a need to develop a method for systematically and rigorously analysing these results. In this paper, we present STAM (System Test-result Analysis Method), which employs a bottom-up hierarchical processing approach using fuzzy logic. STAM is capable of combining all test results into a quantitative description of the network performance in terms of network stability, the significance of various network errors, and the performance of each functional block within the network. The validity of this method has been successfully demonstrated in assisting the testing of a VoIP system at the Research Institute of Post and Telecoms in Vietnam. The paper is organized as follows. The first section gives an overview of fuzzy logic theory, the concepts of which will be used in the development of STAM. The next section describes STAM. The last section, demonstrating STAM's capability, presents a success story in which STAM is successfully applied.
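    The bottom-up fuzzy aggregation such a method relies on can be sketched in a few lines: fuzzify raw measurements through membership functions, then combine them with min (AND) and max (OR) operators up the hierarchy. The metrics, membership functions, and rules below are invented for illustration and are not STAM's actual rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b on support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def rate_latency(ms):
    """Fuzzify a latency measurement into linguistic grades."""
    return {
        "good": tri(ms, -1, 0, 60),
        "fair": tri(ms, 40, 80, 120),
        "poor": tri(ms, 100, 200, 10**9),
    }

def overall(latency_ms, loss_pct):
    """Combine criteria Mamdani-style: AND = min, OR = max."""
    lat = rate_latency(latency_ms)
    loss_ok = tri(loss_pct, -1, 0, 2)   # membership of "acceptable loss"
    return {
        "stable": min(lat["good"], loss_ok),
        "degraded": max(lat["fair"], min(lat["good"], 1 - loss_ok)),
    }

verdict = overall(latency_ms=30, loss_pct=0.5)
```

A latency of 30 ms is "good" to degree 0.5 and a 0.5% loss is "acceptable" to degree 0.75, so the network is judged "stable" to degree min(0.5, 0.75) = 0.5.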

  18. Visualizing Confidence in Cluster-Based Ensemble Weather Forecast Analyses.

    Science.gov (United States)

    Kumpf, Alexander; Tost, Bianca; Baumgart, Marlene; Riemer, Michael; Westermann, Rudiger; Rautenhaus, Marc

    2018-01-01

    In meteorology, cluster analysis is frequently used to determine representative trends in ensemble weather predictions in a selected spatio-temporal region, e.g., to reduce a set of ensemble members to simplify and improve their analysis. Identified clusters (i.e., groups of similar members), however, can be very sensitive to small changes of the selected region, so that clustering results can be misleading and bias subsequent analyses. In this article, we, a team of visualization scientists and meteorologists, deliver visual analytics solutions to analyze the sensitivity of clustering results with respect to changes of a selected region. We propose an interactive visual interface that enables simultaneous visualization of a) the variation in composition of identified clusters (i.e., their robustness), b) the variability in cluster membership for individual ensemble members, and c) the uncertainty in the spatial locations of identified trends. We demonstrate that our solution shows meteorologists how representative a clustering result is, and with respect to which changes in the selected region it becomes unstable. Furthermore, our solution helps to identify those ensemble members which stably belong to a given cluster and can thus be considered similar. In a real-world application case we show how our approach is used to analyze the clustering behavior of different regions in a forecast of "Tropical Cyclone Karl", guiding the user towards the cluster robustness information required for subsequent ensemble analysis.
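    One standard way to quantify the cluster robustness discussed here is a best-match Jaccard similarity between the clusterings obtained before and after perturbing the selected region. The sketch below uses made-up ensemble-member sets; the paper's own robustness measure may differ.

```python
def jaccard(a, b):
    """Jaccard similarity of two member sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def cluster_stability(cluster, perturbed_clustering):
    """Best-match Jaccard of one cluster against a perturbed clustering."""
    return max(jaccard(cluster, c) for c in perturbed_clustering)

# ensemble members 0..9 clustered on the original region ...
original = [{0, 1, 2, 3}, {4, 5, 6}, {7, 8, 9}]
# ... and on a slightly shifted region: member 3 changes cluster
perturbed = [{0, 1, 2}, {3, 4, 5, 6}, {7, 8, 9}]

stability = [cluster_stability(c, perturbed) for c in original]
```

A stability of 1.0 marks a cluster unaffected by the region change, while lower values flag clusters whose composition, and thus any trend derived from them, is sensitive to the selection.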

  19. A Cyber-Attack Detection Model Based on Multivariate Analyses

    Science.gov (United States)

    Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi

    In the present paper, we propose a novel cyber-attack detection model that applies two multivariate-analysis methods to the audit data observed on a host machine. The statistical techniques used here are the well-known Hayashi's quantification method IV and cluster analysis. We quantify the observed qualitative audit event sequence via quantification method IV, and collect similar audit event sequences into the same groups based on the cluster analysis. It is shown in simulation experiments that our model can improve cyber-attack detection accuracy in some realistic cases where both normal and attack activities are intermingled.

  20. Analysing co-articulation using frame-based feature trajectories

    CSIR Research Space (South Africa)

    Badenhorst, J

    2010-11-01

    Full Text Available The authors investigate several approaches aimed at a more detailed understanding of co-articulation in spoken utterances. They find that the Euclidean difference between instantaneous frame-based feature values and the mean values of these features...

  1. PCR and RFLP analyses based on the ribosomal protein operon

    Science.gov (United States)

    Differentiation and classification of phytoplasmas have been primarily based on the highly conserved 16Sr RNA gene. RFLP analysis of 16Sr RNA gene sequences has identified 31 16Sr RNA (16Sr) groups and more than 100 16Sr subgroups. Classification of phytoplasma strains can however, become more refin...

  2. Analysing Leontiev Tube Capabilities in the Space-based Plants

    Directory of Open Access Journals (Sweden)

    N. L. Shchegolev

    2017-01-01

    Full Text Available The paper presents a review of publications dedicated to the gas-dynamic temperature stratification device (the Leontiev tube) and shows the main factors affecting its efficiency. It describes an experimental installation used to obtain data on the magnitude of energy separation in air, to prove the operability of this device. The assumption that there is an optimal relationship between the flow velocities in the subsonic and supersonic channels of the gas-dynamic temperature stratification device is experimentally confirmed. The paper analyses possible ways to raise the efficiency of power plants of various basing (including space basing) and shows that the current mainstream route to higher operating efficiency is increasingly complex design solutions. A scheme is proposed for a closed gas-turbine space-based plant operating on a mixture of inert gases (a helium-xenon mixture). It differs from the simplest variants in lacking a cooler-radiator and in integrating a gas-dynamic temperature stratification device and a heat compressor. Based on the equations of one-dimensional gas dynamics, it is shown that the operating capability of this scheme is determined by whether the total pressure can be restored when removing heat in a thermal compressor. An exploratory study of creating a heat compressor was performed, showing that when operating on gases with a Prandtl number close to 1, the total pressure does not increase. The heat compressor is operable when working with gases of low Prandtl number (a helium-xenon mixture) at high supersonic velocities and with a longitudinal pressure gradient available. It is shown that there is a region of low Prandtl numbers (Pr < 0.3) for which, with a longitudinal pressure gradient available in supersonic flows of a viscous gas, the total pressure can be restored.

  3. Design of the storage location based on the ABC analyses

    Science.gov (United States)

    Jemelka, Milan; Chramcov, Bronislav; Kříž, Pavel

    2016-06-01

    The paper focuses on process efficiency and saving storage costs. Maintaining inventory through a putaway strategy takes personnel time and costs money. The aim is to control inventory in the best way. The ABC classification, based on Vilfredo Pareto's theory, is used for the design of the warehouse layout. The new design of storage locations reduces the travel distance of forklifts and total costs, and it increases inventory process efficiency. The suggested solutions and evaluation of achieved results are described in detail. Proposed solutions were realized in real warehouse operation.
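    The ABC putaway logic can be sketched in a few lines: rank items by picking frequency and cut the cumulative share at the customary 80% (class A) and 95% (class B) Pareto thresholds. Item names, frequencies, and thresholds below are illustrative, not the warehouse data from the study.

```python
def abc_classify(picks, a_cut=0.80, b_cut=0.95):
    """Assign each item an A/B/C class by cumulative share of total picks."""
    total = sum(picks.values())
    ranked = sorted(picks, key=picks.get, reverse=True)
    classes, cum = {}, 0.0
    for item in ranked:
        cum += picks[item] / total
        classes[item] = "A" if cum <= a_cut else ("B" if cum <= b_cut else "C")
    return classes

# hypothetical annual pick counts per stock item
picks = {"bolt": 500, "nut": 300, "washer": 120, "clip": 50, "shim": 30}
classes = abc_classify(picks)
```

Class-A items (here "bolt" and "nut") would then be slotted closest to the dispatch area, shortening the most frequent forklift routes.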

  4. Economic evaluation of algae biodiesel based on meta-analyses

    Science.gov (United States)

    Zhang, Yongli; Liu, Xiaowei; White, Mark A.; Colosi, Lisa M.

    2017-08-01

    The objective of this study is to elucidate the economic viability of algae-to-energy systems at a large scale, by developing a meta-analysis of five previously published economic evaluations of systems producing algae biodiesel. Data from original studies were harmonised into a standardised framework using financial and technical assumptions. Results suggest that the selling price of algae biodiesel under the base case would be $5.00-10.31/gal, higher than the selected benchmarks: $3.77/gal for petroleum diesel, and $4.21/gal for commercial biodiesel (B100) from conventional vegetable oil or animal fat. However, the projected selling price of algal biodiesel ($2.76-4.92/gal), following anticipated improvements, would be competitive. A scenario-based sensitivity analysis reveals that the price of algae biodiesel is most sensitive to algae biomass productivity, algae oil content, and algae cultivation cost. This indicates that improvements in the yield, quality, and cost of algae feedstock could be the key factors to make algae-derived biodiesel economically viable.
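    A scenario-based sensitivity analysis of this kind amounts to perturbing one model input at a time and recording the relative change in the break-even price. The cost model and all numbers below are hypothetical placeholders, not the harmonised values from the study.

```python
def breakeven_price(area_cost, biomass_yield, oil_frac, gal_per_ton_oil=290.0):
    """Break-even $/gal = annual cost per hectare / annual fuel gallons per hectare."""
    return area_cost / (biomass_yield * oil_frac * gal_per_ton_oil)

# illustrative base case: $/ha/yr, t biomass/ha/yr, oil mass fraction
base = dict(area_cost=30000.0, biomass_yield=55.0, oil_frac=0.25)
p0 = breakeven_price(**base)

# one-at-a-time sensitivity: relative price change for +20% in each input
sens = {k: breakeven_price(**{**base, k: v * 1.2}) / p0 - 1
        for k, v in base.items()}
```

In this toy model a 20% cost increase raises the price 20%, while a 20% gain in yield or oil content cuts it by about 17%, mirroring the study's finding that productivity, oil content, and cultivation cost dominate.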

  5. Evidence for Endothermy in Pterosaurs Based on Flight Capability Analyses

    Science.gov (United States)

    Jenkins, H. S.; Pratson, L. F.

    2005-12-01

    Previous attempts to constrain flight capability in pterosaurs have relied heavily on the fossil record, using bone articulation and apparent muscle allocation to evaluate flight potential (Frey et al., 1997; Padian, 1983; Bramwell, 1974). However, broad definitions of the physical parameters necessary for flight in pterosaurs remain loosely defined and few systematic approaches to constraining flight capability have been synthesized (Templin, 2000; Padian, 1983). Here we present a new method to assess flight capability in pterosaurs as a function of humerus length and flight velocity. By creating an energy-balance model to evaluate the power required for flight against the power available to the animal, we derive a 'U'-shaped power curve and infer optimal flight speeds and maximal wingspan lengths for the pterosaurs Quetzalcoatlus northropi and Pteranodon ingens. Our model corroborates empirically derived power curves for the modern black-billed magpie (Pica pica) and accurately reproduces the mechanical power curve for modern cockatiels (Nymphicus hollandicus) (Tobalske et al., 2003). When we adjust our model to include an endothermic metabolic rate for pterosaurs, we find a maximal wingspan length of 18 meters for Q. northropi. Model runs using an ectothermic metabolism derive maximal wingspans of 6-8 meters. As estimates based on fossil evidence show total wingspan lengths reaching up to 15 meters for Q. northropi, we conclude that large pterosaurs may have been endothermic and therefore more metabolically similar to birds than to reptiles.
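    The 'U'-shaped power curve in such energy-balance models arises from induced power falling roughly as 1/v while parasite/profile power rises roughly as v³; the optimal (minimum-power) speed sits at the trough. The coefficients below are arbitrary illustrative values, not the authors' fitted parameters.

```python
import numpy as np

def flight_power(v, k1=200.0, k2=0.05):
    """Total mechanical power: induced term k1/v plus parasite term k2*v^3."""
    return k1 / v + k2 * v**3

v = np.linspace(2.0, 20.0, 500)      # airspeed grid, m/s
P = flight_power(v)
v_opt = v[np.argmin(P)]              # minimum-power speed from the grid

# analytically: d/dv (k1/v + k2*v^3) = -k1/v^2 + 3*k2*v^2 = 0
v_analytic = (200.0 / (3 * 0.05)) ** 0.25
```

Comparing the power available from the animal's metabolism against this curve bounds the range of flyable speeds, which is how a metabolic assumption (endothermic vs ectothermic) translates into a maximal wingspan.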

  6. FY01 Supplemental Science and Performance Analyses, Volume 1: Scientific Bases and Analyses, Part 1 and 2

    International Nuclear Information System (INIS)

    Dobson, David

    2001-01-01

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S and ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S and ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S and ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23 013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054] [DIRS 124754]). 
By making the large amount of information developed on Yucca Mountain available in stages, the DOE intends to provide the public and interested parties with time to review the available materials and to formulate comments.

  7. Orbitrap-based mass analyser for in-situ characterization of asteroids: ILMA, Ion Laser Mass Analyser

    Science.gov (United States)

    Briois, C.; Cotti, H.; Thirkell, L.; French Space Orbitrap Consortium [Aradj, K.; Bouabdellah, A.; Boukrara, A.; Carrasco, N.; Chalumeau, G.; Chapelon, O.; Colin, F.; Coll, P.; Engrand, C.; Grand, N.; Kukui, A.; Lebreton, J.-P.; Pennanech, C.; Szopa, C.; Thissen, R.; Vuitton, V.; Zapf, P.]; Makarov, A.

    2014-07-01

    For about a decade now, the boundaries between comets and carbonaceous asteroids have been fading [1,2]. No doubt the Rosetta mission will bring a new wealth of data on the composition of comets. But as promising as it may look, the mass resolving power of the mass spectrometers onboard (so far the best on a space mission) will only partially account for the diversity of chemical structures present. ILMA (Ion-Laser Mass Analyser) is a new-generation high-mass-resolution LDI-MS (Laser Desorption-Ionization Mass Spectrometer) instrument concept using the Orbitrap technique, which has been developed in the frame of the two Marco Polo & Marco Polo-R proposals to the ESA Cosmic Vision programme. Flagged by ESA as an instrument concept of interest for the mission in 2012, it has been under study for a few years in the frame of a Research and Technology (R&T) development programme between 5 French laboratories (LPC2E, IPAG, LATMOS, LISA, CSNSM) [3,4], partly funded by the French Space Agency (CNES). The work is undertaken in close collaboration with the Thermo Fisher Scientific Company, which commercialises Orbitrap-based laboratory instruments. The R&T activities currently concentrate on the core elements of the Orbitrap analyser that must reach a sufficient maturity level to allow design studies of future space instruments. A prototype is under development at LPC2E, and a mass resolution (m/Δm FWHM) of 100,000 has been obtained at m/z = 150 for a background pressure of 10^{-8} mbar. ILMA would be a key instrument to measure the molecular, elemental and isotopic composition of objects such as carbonaceous asteroids, comets, or other bodies devoid of atmosphere such as the surface of an icy satellite, the Moon, or Mercury.

  8. Complementary Exploratory and Confirmatory Factor Analyses of the French WISC-V: Analyses Based on the Standardization Sample.

    Science.gov (United States)

    Lecerf, Thierry; Canivez, Gary L

    2017-12-28

    Interpretation of the French Wechsler Intelligence Scale for Children-Fifth Edition (French WISC-V; Wechsler, 2016a) is based on a 5-factor model including Verbal Comprehension (VC), Visual Spatial (VS), Fluid Reasoning (FR), Working Memory (WM), and Processing Speed (PS). Evidence for the French WISC-V factorial structure was established exclusively through confirmatory factor analyses (CFAs). However, as recommended by Carroll (1995); Reise (2012), and Brown (2015), factorial structure should derive from both exploratory factor analysis (EFA) and CFA. The first goal of this study was to examine the factorial structure of the French WISC-V using EFA. The 15 French WISC-V primary and secondary subtest scaled scores intercorrelation matrix was used, and factor extraction criteria suggested between 1 and 4 factors. To disentangle the contribution of first- and second-order factors, the Schmid and Leiman (1957) orthogonalization transformation (SLT) was applied. Overall, no EFA evidence for 5 factors was found. Results indicated that the g factor accounted for about 67% of the common variance and that the contributions of the first-order factors were weak (3.6 to 11.9%). CFA was used to test numerous alternative models. Results indicated that bifactor models produced better fit to these data than higher-order models. Consistent with previous studies, findings suggested dominance of the general intelligence factor and that users should thus emphasize the Full Scale IQ (FSIQ) when interpreting the French WISC-V. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
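    The Schmid-Leiman transformation used above has a compact matrix form: with first-order loadings L1 (subtests x group factors) and second-order loadings L2 (group factors on g), the g loadings are L1 @ L2 and the residualised group loadings are L1 scaled by sqrt(1 - L2²). The loading values below are invented for illustration, not the French WISC-V estimates.

```python
import numpy as np

L1 = np.array([[0.8, 0.0],          # 4 subtests, 2 first-order factors
               [0.7, 0.0],
               [0.0, 0.6],
               [0.0, 0.7]])
L2 = np.array([0.9, 0.8])           # loadings of the group factors on g

g_loadings = L1 @ L2                        # general-factor loadings
group_loadings = L1 * np.sqrt(1 - L2**2)    # residualised group loadings

# common variance accounted for by g vs the residualised group factors
var_g = float(np.sum(g_loadings**2))
var_group = float(np.sum(group_loadings**2))
```

The decomposition is exact: the common variance of the first-order solution splits into a g part and a residual group part, which is precisely the comparison (67% vs 3.6-11.9%) reported in the abstract.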

  9. Real-time Bacterial Detection by Single Cell Based Sensors Using Synchrotron FTIR Spectromicroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Veiseh, Mandana; Veiseh, Omid; Martin, Michael C.; Bertozzi, Carolyn; Zhang, Miqin

    2005-08-10

    Microarrays of single-macrophage-cell based sensors were developed and demonstrated for real-time bacterium detection by synchrotron FTIR microscopy. The cells were patterned on gold-SiO2 substrates via a surface engineering technique by which the gold electrodes were immobilized with fibronectin to mediate cell adhesion and the silicon oxide background was passivated with PEG to resist protein adsorption and cell adhesion. Cellular morphology and IR spectra of single, double, and triple cells on gold electrodes exposed to lipopolysaccharide (LPS) of different concentrations were compared to reveal the detection capabilities of these biosensors. The single-cell based sensors were found to generate the most significant IR wavenumber variation and thus provide the highest detection sensitivity. Changes in morphology and IR spectrum for single cells exposed to LPS were found to be time- and concentration-dependent and correlated well with each other. FTIR spectra from single cell arrays of gold electrodes with surface areas of 25 μm², 100 μm², and 400 μm² were acquired using both synchrotron and conventional FTIR spectromicroscopes to study the sensitivity of detection. The results indicated that the developed single-cell platform can be used with conventional FTIR spectromicroscopy. This technique provides real-time, label-free, and rapid bacterial detection, and may allow for statistical and high-throughput analyses, and portability.

  10. Advanced exergy-based analyses applied to a system including LNG regasification and electricity generation

    Energy Technology Data Exchange (ETDEWEB)

    Morosuk, Tatiana; Tsatsaronis, George; Boyano, Alicia; Gantiva, Camilo [Technische Univ. Berlin (Germany)

    2012-07-01

    Liquefied natural gas (LNG) will contribute more in the future than in the past to the overall energy supply in the world. The paper discusses the application of advanced exergy-based analyses to a recently developed LNG-based cogeneration system. These analyses include advanced exergetic, advanced exergoeconomic, and advanced exergoenvironmental analyses in which thermodynamic inefficiencies (exergy destruction), costs, and environmental impacts have been split into avoidable and unavoidable parts. With the aid of these analyses, the potentials for improving the thermodynamic efficiency and for reducing the overall cost and the overall environmental impact are revealed. The objectives of this paper are to demonstrate (a) the potential for generating electricity while regasifying LNG and (b) some of the capabilities associated with advanced exergy-based methods. The most important subsystems and components are identified, and suggestions for improving them are made. (orig.)

  11. GIS-based Approaches to Catchment Area Analyses of Mass Transit

    DEFF Research Database (Denmark)

    Andersen, Jonas Lohmann Elkjær; Landex, Alex

    2009-01-01

    Catchment area analyses of stops or stations are used to investigate potential number of travelers to public transportation. These analyses are considered a strong decision tool in the planning process of mass transit especially railroads. Catchment area analyses are GIS-based buffer and overlay...... analyses with different approaches depending on the desired level of detail. A simple but straightforward approach to implement is the Circular Buffer Approach where catchment areas are circular. A more detailed approach is the Service Area Approach where catchment areas are determined by a street network...... search to simulate the actual walking distances. A refinement of the Service Area Approach is to implement additional time resistance in the network search to simulate obstacles in the walking environment. This paper reviews and compares the different GIS-based catchment area approaches, their level...

  12. Meta-Analyses of Human Cell-Based Cardiac Regeneration Therapies

    DEFF Research Database (Denmark)

    Gyöngyösi, Mariann; Wojakowski, Wojciech; Navarese, Eliano P

    2016-01-01

    In contrast to multiple publication-based meta-analyses involving clinical cardiac regeneration therapy in patients with recent myocardial infarction, a recently published meta-analysis based on individual patient data reported no effect of cell therapy on left ventricular function or clinical...

  13. Basing assessment and treatment of problem behavior on behavioral momentum theory: Analyses of behavioral persistence.

    Science.gov (United States)

    Schieltz, Kelly M; Wacker, David P; Ringdahl, Joel E; Berg, Wendy K

    2017-08-01

    The connection, or bridge, between applied and basic behavior analysis has been long-established (Hake, 1982; Mace & Critchfield, 2010). In this article, we describe how clinical decisions can be based more directly on behavioral processes and how basing clinical procedures on behavioral processes can lead to improved clinical outcomes. As a case in point, we describe how applied behavior analyses of maintenance, and specifically the long-term maintenance of treatment effects related to problem behavior, can be adjusted and potentially enhanced by basing treatment on Behavioral Momentum Theory. We provide a brief review of the literature including descriptions of two translational studies that proposed changes in how differential reinforcement of alternative behavior treatments are conducted based on Behavioral Momentum Theory. We then describe current clinical examples of how these translations are continuing to impact the definitions, designs, analyses, and treatment procedures used in our clinical practice. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. PCA-based algorithm for calibration of spectrophotometric analysers of food

    International Nuclear Information System (INIS)

    Morawski, Roman Z; Miekina, Andrzej

    2013-01-01

    Spectrophotometric analysers of food, being instruments for determination of the composition of food products and ingredients, are today of growing importance for the food industry, as well as for food distributors and consumers. Their metrological performance significantly depends on the numerical performance of the available means for spectrophotometric data processing; in particular, the means for calibration of the analysers. In this paper, a new algorithm for this purpose is proposed, viz. an algorithm using principal component analysis (PCA). It is almost as efficient as PLS-based calibration algorithms, but much simpler.
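The generic idea behind PCA-based calibration can be sketched with synthetic spectra (hypothetical dimensions and data; this is an illustration of the technique, not the authors' algorithm): project mean-centered training spectra onto a few principal components and fit a least-squares model from the PC scores to the known analyte concentrations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set: 50 spectra x 200 wavelengths, with the analyte
# concentration encoded along one spectral "signature" direction plus noise.
n, p = 50, 200
conc = rng.uniform(0, 1, n)                 # known reference concentrations
sig = rng.normal(size=p)                    # hypothetical spectral signature
X = np.outer(conc, sig) + 0.05 * rng.normal(size=(n, p))

# PCA via SVD of the mean-centered spectra.
mu = X.mean(axis=0)
Xc = X - mu
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3                                       # number of components retained
scores = Xc @ Vt[:k].T                      # (n, k) PC scores

# Calibration model: concentration ~ PC scores + intercept (least squares).
A = np.hstack([scores, np.ones((n, 1))])
coef, *_ = np.linalg.lstsq(A, conc, rcond=None)

def predict(spectrum):
    """Predict concentration for one spectrum via its PC scores."""
    s = (spectrum - mu) @ Vt[:k].T
    return np.append(s, 1.0) @ coef

pred = np.array([predict(x) for x in X])
print("max abs calibration error:", np.abs(pred - conc).max())
```

Keeping only a few components regularizes the regression, which is why such schemes approach PLS performance while remaining simpler to implement.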

  15. A Server-Client-Based Graphical Development Environment for Physics Analyses (VISPA)

    International Nuclear Information System (INIS)

    Bretz, H-P; Erdmann, M; Fischer, R; Hinzmann, A; Klingebiel, D; Komm, M; Müller, G; Rieger, M; Steffens, J; Steggemann, J; Urban, M; Winchen, T

    2012-01-01

    The Visual Physics Analysis (VISPA) project provides a graphical development environment for data analysis. It addresses the typical development cycle of (re-)designing, executing, and verifying an analysis. We present the new server-client-based web application of the VISPA project to perform physics analyses via a standard internet browser. This enables individual scientists to work with a large variety of devices including touch screens, and teams of scientists to share, develop, and execute analyses on a server via the web interface.

  16. Automatic image-based analyses using a coupled quadtree-SBFEM/SCM approach

    Science.gov (United States)

    Gravenkamp, Hauke; Duczek, Sascha

    2017-10-01

    Quadtree-based domain decomposition algorithms offer an efficient option to create meshes for automatic image-based analyses. Without introducing hanging nodes the scaled boundary finite element method (SBFEM) can directly operate on such meshes by only discretizing the edges of each subdomain. However, the convergence of a numerical method that relies on a quadtree-based geometry approximation is often suboptimal due to the inaccurate representation of the boundary. To overcome this problem a combination of the SBFEM with the spectral cell method (SCM) is proposed. The basic idea is to treat each uncut quadtree cell as an SBFEM polygon, while all cut quadtree cells are computed employing the SCM. This methodology not only reduces the required number of degrees of freedom but also avoids a two-dimensional quadrature in all uncut quadtree cells. Numerical examples including static, harmonic, modal and transient analyses of complex geometries are studied, highlighting the performance of this novel approach.
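The uncut/cut cell classification that drives this coupling can be illustrated with a toy quadtree over a binary image (a hypothetical sketch, not the authors' implementation): cells lying entirely inside or outside the material remain leaves ("uncut", treated as SBFEM polygons in the paper), while mixed cells are subdivided down to a minimum size ("cut", handled by the SCM).

```python
import numpy as np

def quadtree(img, x, y, size, min_size, leaves):
    """Recursively split a square region of a binary image; record
    homogeneous ('uncut') leaves and minimum-size mixed ('cut') leaves."""
    block = img[y:y + size, x:x + size]
    if block.min() == block.max() or size <= min_size:
        kind = "uncut" if block.min() == block.max() else "cut"
        leaves.append((x, y, size, kind))
        return
    h = size // 2
    for dx, dy in ((0, 0), (h, 0), (0, h), (h, h)):
        quadtree(img, x + dx, y + dy, h, min_size, leaves)

# Toy 16x16 "image": a quarter-disc of material in one corner.
n = 16
yy, xx = np.mgrid[0:n, 0:n]
img = (xx**2 + yy**2 < n**2).astype(int)

leaves = []
quadtree(img, 0, 0, n, 2, leaves)
uncut = sum(1 for *_, k in leaves if k == "uncut")
cut = len(leaves) - uncut
print(f"{len(leaves)} leaves: {uncut} uncut, {cut} cut")
```

Only the (typically few) cut cells along the material boundary need the more expensive treatment, which is the source of the efficiency gain described above.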

  17. Engineering design and exergy analyses for combustion gas turbine based power generation system

    International Nuclear Information System (INIS)

    Sue, D.-C.; Chuang, C.-C.

    2004-01-01

    This paper presents the engineering design and theoretical exergetic analyses of combustion gas turbine based power generation systems. Exergy analysis is performed for the power generation systems based on the first and second laws of thermodynamics. The results show that exergy analyses for a steam cycle system predict the plant efficiency more precisely. The plant efficiency for partial load operation is lower than for full load operation. Increasing the pinch points will decrease the combined cycle plant efficiency. The engineering design is based on inlet air cooling and natural gas preheating to increase the net power output and efficiency. To evaluate the energy utilization, one combined cycle unit and one cogeneration system, consisting of gas turbine generators, heat recovery steam generators, and one steam turbine generator with steam extracted for process use, have been analyzed. The analytical results are used for engineering design and component selection.

  18. Coalescent-based genome analyses resolve the early branches of the euarchontoglires.

    Directory of Open Access Journals (Sweden)

    Vikas Kumar

    Despite numerous large-scale phylogenomic studies, certain parts of the mammalian tree are extraordinarily difficult to resolve. We used the coding regions from 19 completely sequenced genomes to study the relationships within the super-clade Euarchontoglires (Primates, Rodentia, Lagomorpha, Dermoptera and Scandentia), because the placement of Scandentia within this clade is controversial. The difficulty in resolving this issue is due to the short time spans between the early divergences of Euarchontoglires, which may cause incongruent gene trees. The conflict in the data can be depicted by network analyses, and the contentious relationships are best reconstructed by coalescent-based analyses. This method is expected to be superior to analyses of concatenated data in reconstructing a species tree from numerous gene trees. The total concatenated dataset used to study the relationships in this group comprises 5,875 protein-coding genes (9,799,170 nucleotides) from all orders except Dermoptera (flying lemurs). Reconstruction of the species tree from 1,006 gene trees using coalescent models placed Scandentia as sister group to the primates, which is in agreement with maximum likelihood analyses of concatenated nucleotide sequence data. Additionally, both analytical approaches favoured the tarsier as sister taxon to Anthropoidea, thus belonging to the Haplorrhine clade. When divergence times are as short as in radiations over periods of a few million years, even genome-scale analyses struggle to resolve phylogenetic relationships. On these short branches, processes such as incomplete lineage sorting and possibly hybridization occur, making it preferable to base phylogenomic analyses on coalescent methods.

  19. THE GOAL OF VALUE-BASED MEDICINE ANALYSES: COMPARABILITY. THE CASE FOR NEOVASCULAR MACULAR DEGENERATION

    Science.gov (United States)

    Brown, Gary C.; Brown, Melissa M.; Brown, Heidi C.; Kindermann, Sylvia; Sharma, Sanjay

    2007-01-01

    Purpose To evaluate the comparability of articles in the peer-reviewed literature assessing the (1) patient value and (2) cost-utility (cost-effectiveness) associated with interventions for neovascular age-related macular degeneration (ARMD). Methods A search was performed in the National Library of Medicine database of 16 million peer-reviewed articles using the key words cost-utility, cost-effectiveness, value, verteporfin, pegaptanib, laser photocoagulation, ranibizumab, and therapy. All articles that used an outcome of quality-adjusted life-years (QALYs) were studied in regard to (1) percent improvement in quality of life, (2) utility methodology, (3) utility respondents, (4) types of costs included (eg, direct healthcare, direct nonhealthcare, indirect), (5) cost bases (eg, Medicare, National Health Service in the United Kingdom), and (6) study cost perspective (eg, government, societal, third-party insurer). To qualify as a value-based medicine analysis, the patient value had to be measured using the outcome of the QALYs conferred by respective interventions. As with value-based medicine analyses, patient-based time tradeoff utility analysis had to be utilized, patient utility respondents were necessary, and direct medical costs were used. Results Among 21 cost-utility analyses performed on interventions for neovascular macular degeneration, 15 (71%) met value-based medicine criteria. The 6 others (29%) were not comparable owing to (1) varying utility methodology, (2) varying utility respondents, (3) differing costs utilized, (4) differing cost bases, and (5) varying study perspectives. Among value-based medicine studies, laser photocoagulation confers a 4.4% value gain (improvement in quality of life) for the treatment of classic subfoveal choroidal neovascularization. 
Intravitreal pegaptanib confers a 5.9% value gain (improvement in quality of life) for classic, minimally classic, and occult subfoveal choroidal neovascularization, and photodynamic therapy

  20. The goal of value-based medicine analyses: comparability. The case for neovascular macular degeneration.

    Science.gov (United States)

    Brown, Gary C; Brown, Melissa M; Brown, Heidi C; Kindermann, Sylvia; Sharma, Sanjay

    2007-01-01

    To evaluate the comparability of articles in the peer-reviewed literature assessing the (1) patient value and (2) cost-utility (cost-effectiveness) associated with interventions for neovascular age-related macular degeneration (ARMD). A search was performed in the National Library of Medicine database of 16 million peer-reviewed articles using the key words cost-utility, cost-effectiveness, value, verteporfin, pegaptanib, laser photocoagulation, ranibizumab, and therapy. All articles that used an outcome of quality-adjusted life-years (QALYs) were studied in regard to (1) percent improvement in quality of life, (2) utility methodology, (3) utility respondents, (4) types of costs included (eg, direct healthcare, direct nonhealthcare, indirect), (5) cost bases (eg, Medicare, National Health Service in the United Kingdom), and (6) study cost perspective (eg, government, societal, third-party insurer). To qualify as a value-based medicine analysis, the patient value had to be measured using the outcome of the QALYs conferred by respective interventions. As with value-based medicine analyses, patient-based time tradeoff utility analysis had to be utilized, patient utility respondents were necessary, and direct medical costs were used. Among 21 cost-utility analyses performed on interventions for neovascular macular degeneration, 15 (71%) met value-based medicine criteria. The 6 others (29%) were not comparable owing to (1) varying utility methodology, (2) varying utility respondents, (3) differing costs utilized, (4) differing cost bases, and (5) varying study perspectives. Among value-based medicine studies, laser photocoagulation confers a 4.4% value gain (improvement in quality of life) for the treatment of classic subfoveal choroidal neovascularization. 
Intravitreal pegaptanib confers a 5.9% value gain (improvement in quality of life) for classic, minimally classic, and occult subfoveal choroidal neovascularization, and photodynamic therapy with verteporfin confers

  1. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    Science.gov (United States)

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    With the upcoming deluge of genome data, the needs for storing and processing large-scale genome data, easy access to biomedical analysis tools, and efficient data sharing and retrieval present significant challenges. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. A Versatile Software Package for Inter-subject Correlation Based Analyses of fMRI

    Directory of Open Access Journals (Sweden)

    Jukka-Pekka eKauppi

    2014-01-01

    In the inter-subject correlation (ISC) based analysis of functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects in the corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modelling the stimulus, and thus ISC is a potential method to analyze fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC based approach for analyzing complex fMRI data, no generic software tools have been made available for this purpose, limiting widespread use of ISC based analysis techniques in the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC based analyses. Many advanced computations, such as comparison of ISCs between different stimuli, time window ISC, and inter-subject phase synchronization, are supported by the toolbox. The analyses are coupled with re-sampling based statistical inference. The ISC based analyses are data and computation intensive, and the ISC Toolbox is equipped with mechanisms to execute the parallel computations in a cluster environment automatically, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine, or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox and its implementation, and demonstrate possible uses of the toolbox by summarizing selected example applications. We also report computation time experiments using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces the computing time.
The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/.

  3. A versatile software package for inter-subject correlation based analyses of fMRI.

    Science.gov (United States)

    Kauppi, Jukka-Pekka; Pajula, Juha; Tohka, Jussi

    2014-01-01

    In the inter-subject correlation (ISC) based analysis of functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects in the corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modeling the stimulus, and thus ISC is a potential method to analyze fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC based approach for analyzing complex fMRI data, no generic software tools have been made available for this purpose, limiting widespread use of ISC based analysis techniques in the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC based analyses. Many advanced computations, such as comparison of ISCs between different stimuli, time window ISC, and inter-subject phase synchronization, are supported by the toolbox. The analyses are coupled with re-sampling based statistical inference. The ISC based analyses are data and computation intensive, and the ISC Toolbox is equipped with mechanisms to execute the parallel computations in a cluster environment automatically, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine, or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox and its implementation, and demonstrate possible uses of the toolbox by summarizing selected example applications. We also report computation time experiments using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces the computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/.
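The core ISC computation described in both records can be sketched as follows (synthetic data and a hypothetical two-voxel layout; the real toolbox operates on 4-D fMRI volumes and adds resampling-based inference): for each voxel, correlate every pair of subjects' time series and average over pairs.

```python
import numpy as np

def isc(data):
    """Mean pairwise Pearson correlation per voxel.

    data: array of shape (n_subjects, n_timepoints, n_voxels).
    Returns an array of shape (n_voxels,).
    """
    n_sub, n_t, n_vox = data.shape
    # z-score each subject's time series voxel-wise (population std)
    z = data - data.mean(axis=1, keepdims=True)
    z /= z.std(axis=1, keepdims=True)
    out = np.zeros(n_vox)
    pairs = 0
    for i in range(n_sub):
        for j in range(i + 1, n_sub):
            out += (z[i] * z[j]).mean(axis=0)  # per-voxel Pearson r
            pairs += 1
    return out / pairs

# Synthetic example: 5 subjects share a stimulus-driven signal in
# voxel 0, while voxel 1 is subject-specific noise.
rng = np.random.default_rng(1)
shared = rng.normal(size=200)
data = rng.normal(size=(5, 200, 2))
data[:, :, 0] += 3 * shared       # strong shared component in voxel 0

r = isc(data)
print(r)  # voxel 0 high, voxel 1 near zero
```

Because each pair's correlation is just the mean product of z-scored series, the computation parallelizes naturally across voxels and subject pairs, which is what the toolbox's cluster support exploits.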

  4. Analysing task design and students' responses to context-based problems through different analytical frameworks

    Science.gov (United States)

    Broman, Karolina; Bernholt, Sascha; Parchmann, Ilka

    2015-05-01

    Background: Context-based learning approaches are used to enhance students' interest in, and knowledge about, science. According to different empirical studies, students' interest is improved by applying these more non-conventional approaches, while effects on learning outcomes are less coherent. Hence, further insights are needed into the structure of context-based problems in comparison to traditional problems, and into students' problem-solving strategies. Therefore, a suitable framework is necessary, both for the analysis of tasks and strategies. Purpose: The aim of this paper is to explore traditional and context-based tasks as well as students' responses to exemplary tasks to identify a suitable framework for future design and analyses of context-based problems. The paper discusses different established frameworks and applies the Higher-Order Cognitive Skills/Lower-Order Cognitive Skills (HOCS/LOCS) taxonomy and the Model of Hierarchical Complexity in Chemistry (MHC-C) to analyse traditional tasks and students' responses. Sample: Upper secondary students (n=236) at the Natural Science Programme, i.e. possible future scientists, are investigated to explore learning outcomes when they solve chemistry tasks, both more conventional as well as context-based chemistry problems. Design and methods: A typical chemistry examination test has been analysed, first the test items in themselves (n=36), and thereafter 236 students' responses to one representative context-based problem. Content analysis using HOCS/LOCS and MHC-C frameworks has been applied to analyse both quantitative and qualitative data, allowing us to describe different problem-solving strategies. Results: The empirical results show that both frameworks are suitable to identify students' strategies, mainly focusing on recall of memorized facts when solving chemistry test items. Almost all test items were also assessing lower order thinking. The combination of frameworks with the chemistry syllabus has been

  5. A protein relational database and protein family knowledge bases to facilitate structure-based design analyses.

    Science.gov (United States)

    Mobilio, Dominick; Walker, Gary; Brooijmans, Natasja; Nilakantan, Ramaswamy; Denny, R Aldrin; Dejoannis, Jason; Feyfant, Eric; Kowticwar, Rupesh K; Mankala, Jyoti; Palli, Satish; Punyamantula, Sairam; Tatipally, Maneesh; John, Reji K; Humblet, Christine

    2010-08-01

    The Protein Data Bank is the most comprehensive source of experimental macromolecular structures. It can, however, be difficult at times to locate relevant structures with the Protein Data Bank search interface. This is particularly true when searching for complexes containing specific interactions between protein and ligand atoms. Moreover, searching within a family of proteins can be tedious. For example, one cannot search for some conserved residue as residue numbers vary across structures. We describe herein three databases, Protein Relational Database, Kinase Knowledge Base, and Matrix Metalloproteinase Knowledge Base, containing protein structures from the Protein Data Bank. In Protein Relational Database, atom-atom distances between protein and ligand have been precalculated allowing for millisecond retrieval based on atom identity and distance constraints. Ring centroids, centroid-centroid and centroid-atom distances and angles have also been included permitting queries for pi-stacking interactions and other structural motifs involving rings. Other geometric features can be searched through the inclusion of residue pair and triplet distances. In Kinase Knowledge Base and Matrix Metalloproteinase Knowledge Base, the catalytic domains have been aligned into common residue numbering schemes. Thus, by searching across Protein Relational Database and Kinase Knowledge Base, one can easily retrieve structures wherein, for example, a ligand of interest is making contact with the gatekeeper residue.
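The precalculated-distance idea behind the Protein Relational Database can be sketched with an in-memory table (hypothetical coordinates, elements, and schema; the actual system is a relational database over Protein Data Bank structures): compute all protein-ligand atom-pair distances once, then answer constraint queries by simple filtering.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical atom records: (id, element, xyz) for protein and ligand.
protein = [(i, e, rng.uniform(0, 20, 3))
           for i, e in enumerate(["N", "O", "C", "O", "S"] * 20)]
ligand = [(j, e, rng.uniform(0, 20, 3))
          for j, e in enumerate(["C", "N", "O", "F"])]

# Precalculate every protein-ligand atom-pair distance (the "table").
table = [(pi, pe, lj, le, float(np.linalg.norm(pxyz - lxyz)))
         for pi, pe, pxyz in protein
         for lj, le, lxyz in ligand]

def query(p_elem, l_elem, dmax):
    """All (protein atom, ligand atom) pairs of the given elements
    closer than dmax (distance units are arbitrary here)."""
    return [(pi, lj, d) for pi, pe, lj, le, d in table
            if pe == p_elem and le == l_elem and d < dmax]

# e.g. protein oxygens within hydrogen-bonding range of ligand nitrogens
hits = query("O", "N", 3.5)
print(len(hits), "candidate O...N contacts")
```

Paying the distance computation once up front is what allows the millisecond retrieval by atom identity and distance constraint that the abstract describes; an indexed SQL table plays the role of `table` in practice.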

  6. Estimation of effective block conductivities based on discrete network analyses using data from the Aespoe site

    International Nuclear Information System (INIS)

    La Pointe, P.R.; Wallmann, P.; Follin, S.

    1995-09-01

    Numerical continuum codes may be used for assessing the role of regional groundwater flow in far-field safety analyses of a nuclear waste repository at depth. The focus of this project is to develop and evaluate one method based on Discrete Fracture Network (DFN) models to estimate block-scale permeability values for continuum codes. Data from the Aespoe HRL and surrounding area are used. 57 refs, 76 figs, 15 tabs

  7. Physical characterization of biomass-based pyrolysis liquids. Application of standard fuel oil analyses

    Energy Technology Data Exchange (ETDEWEB)

    Oasmaa, A; Leppaemaeki, E; Koponen, P; Levander, J; Tapola, E [VTT Energy, Espoo (Finland). Energy Production Technologies

    1998-12-31

    The main purpose of the study was to test the applicability of standard fuel oil methods, developed for petroleum-based fuels, to pyrolysis liquids. In addition, research on sampling, homogeneity, stability, miscibility and corrosivity was carried out. The standard methods have been tested for several different pyrolysis liquids. Recommendations on sampling, sample size and small modifications of the standard methods are presented. In general, most of the methods can be used as such, but the accuracy of the analysis can be improved by minor modifications. Fuel oil analyses not suitable for pyrolysis liquids have been identified. Homogeneity of the liquids is the most critical factor in accurate analysis. The presence of air bubbles may disturb several analyses. Sample preheating and prefiltration should be avoided when possible. The former may cause changes in the composition and structure of the pyrolysis liquid; the latter may remove part of the organic material along with the particles. The size of the sample should be determined on the basis of the homogeneity and the water content of the liquid. The basic analyses of the Technical Research Centre of Finland (VTT) include water, pH, solids, ash, Conradson carbon residue, heating value, CHN, density, viscosity, pour point, flash point, and stability. Additional analyses are carried out when needed. (orig.) 53 refs.

  8. A New Optimization Method for Centrifugal Compressors Based on 1D Calculations and Analyses

    Directory of Open Access Journals (Sweden)

    Pei-Yuan Li

    2015-05-01

    This paper presents an optimization design method for centrifugal compressors based on one-dimensional calculations and analyses. It consists of two parts: (1) centrifugal compressor geometry optimization based on one-dimensional calculations and (2) matching optimization of the vaned diffuser with an impeller based on the required throat area. A low pressure stage centrifugal compressor in a MW-level gas turbine is optimized by this method. One-dimensional calculation results show that D3/D2 is too large in the original design, resulting in the low efficiency of the entire stage. Based on the one-dimensional optimization results, the geometry of the diffuser has been redesigned. The outlet diameter of the vaneless diffuser has been reduced, and the original single stage diffuser has been replaced by a tandem vaned diffuser. After optimization, the entire stage pressure ratio is increased by approximately 4%, and the efficiency is increased by approximately 2%.

  9. FY01 Supplemental Science and Performance Analysis: Volume 1, Scientific Bases and Analyses

    International Nuclear Information System (INIS)

    Bodvarsson, G.S.; Dobson, David

    2001-01-01

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S and ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S and ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S and ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054 [DIRS 124754

  10. FY01 Supplemental Science and Performance Analysis: Volume 1,Scientific Bases and Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Bodvarsson, G.S.; Dobson, David

    2001-05-30

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S&ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S&ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S&ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054 [DIRS 124754

  11. Issues and approaches in risk-based aging analyses of passive components

    International Nuclear Information System (INIS)

    Uryasev, S.P.; Samanta, P.K.; Vesely, W.E.

    1994-01-01

    In previous NRC-sponsored work a general methodology was developed to quantify the risk contributions from aging components at nuclear plants. The methodology allowed Probabilistic Risk Analyses (PRAs) to be modified to incorporate age-dependent component failure rates, and also aging maintenance models to evaluate and prioritize the aging contributions from active components, using the linear aging failure rate model and empirical component aging rates. In the present paper, this methodology is extended to passive components (for example, pipes, heat exchangers, and the vessel). The analyses of passive components raise issues different from those of active components. Here, we specifically focus on three aspects that need to be addressed in risk-based aging prioritization of passive components.
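The linear aging failure rate model mentioned above can be illustrated briefly (parameter values are invented for the example): the component failure rate grows linearly with age, λ(t) = λ0 + a·t, and the failure probability over an operating period follows from integrating λ.

```python
import math

def failure_prob(lam0, a, t):
    """P(failure by time t) under the linear aging model
    lambda(t) = lam0 + a*t, so
    P = 1 - exp(-(lam0*t + a*t**2/2))."""
    return 1.0 - math.exp(-(lam0 * t + 0.5 * a * t * t))

# Illustrative (assumed) parameters, not plant data:
lam0 = 1e-5   # baseline failure rate per hour
a = 1e-9      # aging rate per hour^2

for years in (1, 10, 20):
    t = years * 8760.0
    print(f"{years:2d} y: P(failure) = {failure_prob(lam0, a, t):.4f}")
```

The quadratic term in the exponent is what makes aging contributions grow disproportionately with component age, which is why age-dependent failure rates change PRA prioritization relative to constant-rate models.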

  12. DESIGNING EAP MATERIALS BASED ON INTERCULTURAL CORPUS ANALYSES: THE CASE OF LOGICAL MARKERS IN RESEARCH ARTICLES

    Directory of Open Access Journals (Sweden)

    Pilar Mur Dueñas

    2009-10-01

The ultimate aim of intercultural analyses in English for Academic Purposes (EAP) is to help non-native scholars function successfully in the international disciplinary community in English. The aim of this paper is to show how corpus-based intercultural analyses can inform the design of EAP materials on a particular metadiscourse category, logical markers, in research article writing. The paper first describes an analysis of additive, contrastive and consecutive logical markers in a corpus of research articles in English and in Spanish in a particular discipline, Business Management. Differences were found in their frequency and in the use of each of the sub-categories. Five activities designed on the basis of these results are then presented. They are aimed at raising Spanish Business scholars' awareness of the specific uses and pragmatic functions of frequent logical markers in international research articles in English.

  13. Integrated approach for fusion multi-physics coupled analyses based on hybrid CAD and mesh geometries

    Energy Technology Data Exchange (ETDEWEB)

    Qiu, Yuefeng, E-mail: yuefeng.qiu@kit.edu; Lu, Lei; Fischer, Ulrich

    2015-10-15

    Highlights: • Integrated approach for neutronics, thermal and structural analyses was developed. • MCNP5/6, TRIPOLI-4 were coupled with CFX, Fluent and ANSYS Workbench. • A novel meshing approach has been proposed for describing MC geometry. - Abstract: Coupled multi-physics analyses on fusion reactor devices require high-fidelity neutronic models, and flexible, accurate data exchanging between various calculation codes. An integrated coupling approach has been developed to enable the conversion of CAD, mesh, or hybrid geometries for Monte Carlo (MC) codes MCNP5/6, TRIPOLI-4, and translation of nuclear heating data for CFD codes Fluent, CFX and structural mechanical software ANSYS Workbench. The coupling approach has been implemented based on SALOME platform with CAD modeling, mesh generation and data visualization capabilities. A novel meshing approach has been developed for generating suitable meshes for MC geometry descriptions. The coupling approach has been concluded to be reliable and efficient after verification calculations of several application cases.
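The data-exchange step described above can be illustrated with a toy mapping of nuclear heating results onto a CFD mesh. This is only a sketch under stated assumptions: the function name and the nearest-neighbour transfer rule are invented for illustration; the actual tool converts geometries and exchanges data through the SALOME platform.

```python
def map_heating(mc_centers, mc_power, cfd_centroids):
    """Nearest-neighbour transfer of nuclear heating (e.g. W/cm3) from a
    Monte Carlo mesh-tally grid to CFD cell centroids."""
    def nearest(point):
        # Index of the MC tally cell whose center is closest to `point`
        return min(range(len(mc_centers)),
                   key=lambda i: sum((a - b) ** 2
                                     for a, b in zip(mc_centers[i], point)))
    return [mc_power[nearest(c)] for c in cfd_centroids]

# Two MC tally cells and two CFD centroids (coordinates in cm, made up)
mc_centers = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
mc_power = [10.0, 2.5]
cfd_centroids = [(0.1, 0.0, 0.0), (0.9, 0.1, 0.0)]
heating = map_heating(mc_centers, mc_power, cfd_centroids)
```

A production coupling would conserve total power and interpolate rather than snap to the nearest cell; the sketch only shows the direction of the data flow.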

  14. Tracing common origins of Genomic Islands in prokaryotes based on genome signature analyses.

    Science.gov (United States)

    van Passel, Mark Wj

    2011-09-01

Horizontal gene transfer constitutes a powerful and innovative force in evolution, but often little is known about the actual origins of transferred genes. Sequence alignments are generally of limited use in tracking the original donor, since only a small fraction of the total genetic diversity is thought to have been uncovered. Alternatively, approaches based on similarities in genome-specific relative oligonucleotide frequencies do not require alignments. Even though the exact origins of horizontally transferred genes may still not be established using these compositional analyses, they do suggest that compositionally very similar regions are likely to have had a common origin. These analyses have shown that up to a third of large acquired gene clusters that reside in the same genome are compositionally very similar, indicative of a shared origin. This brings us closer to uncovering the original donors of horizontally transferred genes, and could help in elucidating possible regulatory interactions between previously unlinked sequences.
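The compositional comparison described here can be sketched in a few lines: build relative tetranucleotide frequency vectors for two sequences and compare them with a simple distance. The sequences and the distance metric below are illustrative assumptions, not the study's actual pipeline, which works on full acquired gene clusters.

```python
from collections import Counter
from itertools import product

def tetra_freqs(seq):
    """Relative tetranucleotide frequency vector of a DNA sequence."""
    counts = Counter(seq[i:i + 4] for i in range(len(seq) - 3))
    total = sum(counts.values())
    return [counts.get("".join(k), 0) / total
            for k in product("ACGT", repeat=4)]

def signature_distance(a, b):
    """Mean absolute difference between two frequency vectors
    (lower = more compositionally similar)."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

host = "ACGT" * 250    # hypothetical host-like composition
island = "AATT" * 250  # hypothetical AT-rich acquired region
d_self = signature_distance(tetra_freqs(host), tetra_freqs(host))
d_island = signature_distance(tetra_freqs(host), tetra_freqs(island))
```

Two acquired regions whose signature distance to each other is much smaller than their distance to the host chromosome would, by the logic of the abstract, be candidates for a common donor.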

  15. Features of Computer-Based Decision Aids: Systematic Review, Thematic Synthesis, and Meta-Analyses.

    Science.gov (United States)

    Syrowatka, Ania; Krömker, Dörthe; Meguerditchian, Ari N; Tamblyn, Robyn

    2016-01-26

    Patient information and education, such as decision aids, are gradually moving toward online, computer-based environments. Considerable research has been conducted to guide content and presentation of decision aids. However, given the relatively new shift to computer-based support, little attention has been given to how multimedia and interactivity can improve upon paper-based decision aids. The first objective of this review was to summarize published literature into a proposed classification of features that have been integrated into computer-based decision aids. Building on this classification, the second objective was to assess whether integration of specific features was associated with higher-quality decision making. Relevant studies were located by searching MEDLINE, Embase, CINAHL, and CENTRAL databases. The review identified studies that evaluated computer-based decision aids for adults faced with preference-sensitive medical decisions and reported quality of decision-making outcomes. A thematic synthesis was conducted to develop the classification of features. Subsequently, meta-analyses were conducted based on standardized mean differences (SMD) from randomized controlled trials (RCTs) that reported knowledge or decisional conflict. Further subgroup analyses compared pooled SMDs for decision aids that incorporated a specific feature to other computer-based decision aids that did not incorporate the feature, to assess whether specific features improved quality of decision making. Of 3541 unique publications, 58 studies met the target criteria and were included in the thematic synthesis. The synthesis identified six features: content control, tailoring, patient narratives, explicit values clarification, feedback, and social support. A subset of 26 RCTs from the thematic synthesis was used to conduct the meta-analyses. 
As expected, computer-based decision aids performed better than usual care or alternative aids; however, some features performed better than
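The pooled standardized mean differences behind such comparisons follow standard inverse-variance arithmetic. A minimal fixed-effect sketch (the per-trial SMDs and variances below are made up for illustration; the review itself may have pooled under a random-effects model):

```python
import math

def pool_smd(smds, variances):
    """Fixed-effect inverse-variance pooled SMD with a 95% confidence interval."""
    weights = [1.0 / v for v in variances]          # precision weights
    pooled = sum(w * d for w, d in zip(weights, smds)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))              # standard error of the pool
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Illustrative (invented) trial results for a knowledge outcome
smds = [0.45, 0.30, 0.60]
variances = [0.04, 0.02, 0.09]
pooled, (lo, hi) = pool_smd(smds, variances)
```

Subgroup comparisons like those in the review then contrast the pooled SMD of decision aids with a given feature against the pool of those without it.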

  16. Features of Computer-Based Decision Aids: Systematic Review, Thematic Synthesis, and Meta-Analyses

    Science.gov (United States)

    Krömker, Dörthe; Meguerditchian, Ari N; Tamblyn, Robyn

    2016-01-01

    Background Patient information and education, such as decision aids, are gradually moving toward online, computer-based environments. Considerable research has been conducted to guide content and presentation of decision aids. However, given the relatively new shift to computer-based support, little attention has been given to how multimedia and interactivity can improve upon paper-based decision aids. Objective The first objective of this review was to summarize published literature into a proposed classification of features that have been integrated into computer-based decision aids. Building on this classification, the second objective was to assess whether integration of specific features was associated with higher-quality decision making. Methods Relevant studies were located by searching MEDLINE, Embase, CINAHL, and CENTRAL databases. The review identified studies that evaluated computer-based decision aids for adults faced with preference-sensitive medical decisions and reported quality of decision-making outcomes. A thematic synthesis was conducted to develop the classification of features. Subsequently, meta-analyses were conducted based on standardized mean differences (SMD) from randomized controlled trials (RCTs) that reported knowledge or decisional conflict. Further subgroup analyses compared pooled SMDs for decision aids that incorporated a specific feature to other computer-based decision aids that did not incorporate the feature, to assess whether specific features improved quality of decision making. Results Of 3541 unique publications, 58 studies met the target criteria and were included in the thematic synthesis. The synthesis identified six features: content control, tailoring, patient narratives, explicit values clarification, feedback, and social support. A subset of 26 RCTs from the thematic synthesis was used to conduct the meta-analyses. 
As expected, computer-based decision aids performed better than usual care or alternative aids; however

  17. A web-based endpoint adjudication system for interim analyses in clinical trials.

    Science.gov (United States)

    Nolen, Tracy L; Dimmick, Bill F; Ostrosky-Zeichner, Luis; Kendrick, Amy S; Sable, Carole; Ngai, Angela; Wallace, Dennis

    2009-02-01

A data monitoring committee (DMC) is often employed to assess trial progress and review safety data and efficacy endpoints throughout a trial. Interim analyses performed for the DMC should use data that are as complete and verified as possible. Such analyses are complicated when data verification involves subjective study endpoints or requires clinical expertise to determine each subject's status with respect to the study endpoint. Therefore, procedures are needed to obtain adjudicated data for interim analyses in an efficient manner. In the past, methods for handling such data included using locally reported results as surrogate endpoints, adjusting analysis methods for unadjudicated data, or simply performing the adjudication as rapidly as possible. These methods all have inadequacies that make their sole usage suboptimal. For a study of prophylaxis for invasive candidiasis, adjudication of both study eligibility criteria and clinical endpoints prior to two interim analyses was required. Because the study was expected to enroll at a moderate rate and the sponsor required adjudicated endpoints to be used for interim analyses, an efficient process for adjudication was needed. We created a web-based endpoint adjudication system (WebEAS) that allows for expedited review by the endpoint adjudication committee (EAC). This system automatically identifies when a subject's data are complete, creates a subject profile from the study data, and assigns EAC reviewers. The reviewers use the WebEAS to review the subject profile and submit their completed review form. The WebEAS then compares the reviews, assigns an additional review as a tiebreaker if needed, and stores the adjudicated data. The study for which this system was originally built was administratively closed after 10 months with only 38 subjects enrolled. The adjudication process was finalized and the WebEAS system activated prior to study closure. Some website accessibility issues presented initially. However
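The compare-then-tiebreak step the WebEAS automates can be sketched as follows. The function and verdict strings are hypothetical, since the paper does not publish the system's code; only the decision rule (agreement stands, disagreement triggers a third review) comes from the abstract.

```python
def adjudicate(review1, review2, tiebreak=None):
    """Compare two endpoint reviews; if they disagree, a third review decides.

    Reviews are simple verdict strings (e.g. 'endpoint met' / 'not met').
    Returns the adjudicated verdict, or None while a tiebreaker is pending.
    """
    if review1 == review2:
        return review1           # concordant reviews are final
    return tiebreak              # None until the tiebreaking reviewer submits

case_closed = adjudicate("met", "met")
case_pending = adjudicate("met", "not met")
case_broken = adjudicate("met", "not met", tiebreak="met")
```

In the real system the same comparison would be triggered automatically once a subject's profile is complete and both review forms are stored.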

  18. Determination of the spatial response of neutron based analysers using a Monte Carlo based method

    International Nuclear Information System (INIS)

    Tickner, James

    2000-01-01

    One of the principal advantages of using thermal neutron capture (TNC, also called prompt gamma neutron activation analysis or PGNAA) or neutron inelastic scattering (NIS) techniques for measuring elemental composition is the high penetrating power of both the incident neutrons and the resultant gamma-rays, which means that large sample volumes can be interrogated. Gauges based on these techniques are widely used in the mineral industry for on-line determination of the composition of bulk samples. However, attenuation of both neutrons and gamma-rays in the sample and geometric (source/detector distance) effects typically result in certain parts of the sample contributing more to the measured composition than others. In turn, this introduces errors in the determination of the composition of inhomogeneous samples. This paper discusses a combined Monte Carlo/analytical method for estimating the spatial response of a neutron gauge. Neutron propagation is handled using a Monte Carlo technique which allows an arbitrarily complex neutron source and gauge geometry to be specified. Gamma-ray production and detection is calculated analytically which leads to a dramatic increase in the efficiency of the method. As an example, the method is used to study ways of reducing the spatial sensitivity of on-belt composition measurements of cement raw meal
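The combined Monte Carlo/analytical idea can be illustrated in a deliberately simplified 1-D picture: sample neutron interaction depths by Monte Carlo and weight each by analytic exponential attenuation on the way in (neutrons) and back out (gamma-rays). The attenuation coefficients and slab geometry below are invented for illustration; the actual method handles arbitrarily complex 3-D source and detector geometries.

```python
import math
import random

random.seed(1)

def spatial_response(depth_edges_cm, mu_n=0.3, mu_g=0.15, n_samples=50000):
    """Toy 1-D estimate of the relative contribution of each depth bin.

    Neutrons attenuate as exp(-mu_n * x) going in; capture gamma-rays
    attenuate as exp(-mu_g * x) coming back out.
    """
    weights = [0.0] * (len(depth_edges_cm) - 1)
    for _ in range(n_samples):
        x = random.uniform(depth_edges_cm[0], depth_edges_cm[-1])
        w = math.exp(-mu_n * x) * math.exp(-mu_g * x)  # in + out attenuation
        for i in range(len(depth_edges_cm) - 1):
            if depth_edges_cm[i] <= x < depth_edges_cm[i + 1]:
                weights[i] += w
                break
    total = sum(weights)
    return [w / total for w in weights]

resp = spatial_response([0.0, 2.0, 4.0, 6.0, 8.0])
# Shallow layers contribute most, illustrating the non-uniform response
```

This non-uniformity is exactly the error source the paper quantifies for inhomogeneous bulk samples such as cement raw meal on a belt.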

  19. CrusView: a Java-based visualization platform for comparative genomics analyses in Brassicaceae species.

    Science.gov (United States)

    Chen, Hao; Wang, Xiangfeng

    2013-09-01

    In plants and animals, chromosomal breakage and fusion events based on conserved syntenic genomic blocks lead to conserved patterns of karyotype evolution among species of the same family. However, karyotype information has not been well utilized in genomic comparison studies. We present CrusView, a Java-based bioinformatic application utilizing Standard Widget Toolkit/Swing graphics libraries and a SQLite database for performing visualized analyses of comparative genomics data in Brassicaceae (crucifer) plants. Compared with similar software and databases, one of the unique features of CrusView is its integration of karyotype information when comparing two genomes. This feature allows users to perform karyotype-based genome assembly and karyotype-assisted genome synteny analyses with preset karyotype patterns of the Brassicaceae genomes. Additionally, CrusView is a local program, which gives its users high flexibility when analyzing unpublished genomes and allows users to upload self-defined genomic information so that they can visually study the associations between genome structural variations and genetic elements, including chromosomal rearrangements, genomic macrosynteny, gene families, high-frequency recombination sites, and tandem and segmental duplications between related species. This tool will greatly facilitate karyotype, chromosome, and genome evolution studies using visualized comparative genomics approaches in Brassicaceae species. CrusView is freely available at http://www.cmbb.arizona.edu/CrusView/.

  20. A Game-based Corpus for Analysing the Interplay between Game Context and Player Experience

    DEFF Research Database (Denmark)

    Shaker, Noor; Yannakakis, Georgios N.; Asteriadis, Stylianos

    2011-01-01

Recognizing players' affective state while playing video games has been the focus of many recent research studies. In this paper we describe the process that has been followed to build a corpus based on game events and recorded video sessions from human players while playing Super Mario Bros. We present different types of information that have been extracted from game context, player preferences and perception of the game, as well as user features automatically extracted from video recordings. We run a number of initial experiments to analyse players' behavior while playing video games as a case study.

  1. How distributed processing produces false negatives in voxel-based lesion-deficit analyses.

    Science.gov (United States)

    Gajardo-Vidal, Andrea; Lorca-Puls, Diego L; Crinion, Jennifer T; White, Jitrachote; Seghier, Mohamed L; Leff, Alex P; Hope, Thomas M H; Ludersdorfer, Philipp; Green, David W; Bowman, Howard; Price, Cathy J

    2018-07-01

In this study, we hypothesized that if the same deficit can be caused by damage to one or another part of a distributed neural system, then voxel-based analyses might miss critical lesion sites because preservation of each site will not be consistently associated with preserved function. The first part of our investigation used voxel-based multiple regression analyses of data from 359 right-handed stroke survivors to identify brain regions where lesion load is associated with picture naming abilities after factoring out variance related to object recognition, semantics and speech articulation so as to focus on deficits arising at the word retrieval level. A highly significant lesion-deficit relationship was identified in left temporal and frontal/premotor regions. Post-hoc analyses showed that damage to either of these sites caused the deficit of interest in less than half the affected patients (76/162 = 47%). After excluding all patients with damage to one or both of the identified regions, our second analysis revealed a new region, in the anterior part of the left putamen, which had not been previously detected because many patients had the deficit of interest after temporal or frontal damage that preserved the left putamen. The results illustrate how (i) false negative results arise when the same deficit can be caused by different lesion sites; (ii) some of the missed effects can be unveiled by adopting an iterative approach that systematically excludes patients with lesions to the areas identified in previous analyses; (iii) statistically significant voxel-based lesion-deficit mappings can be driven by a subset of patients; (iv) focal lesions to the identified regions are needed to determine whether the deficit of interest is the consequence of focal damage or much more extensive damage that includes the identified region; and, finally, (v) univariate voxel-based lesion-deficit mappings cannot, in isolation, be used to predict outcome in other patients
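The dilution effect and the iterative-exclusion remedy can be demonstrated on synthetic data: when a deficit can arise from damage to any one of several regions, each region's lesion-deficit association is weakened in the full cohort but becomes clean once patients with damage to the already-identified regions are excluded. Everything below (regions, damage rates, cohort size) is simulated, not the study's data.

```python
import random

random.seed(0)

# Toy cohort: the deficit arises from damage to region A, B OR C
# (a distributed system with three alternative critical sites).
patients = [{r: random.random() < 0.4 for r in "ABC"} for _ in range(2000)]
for p in patients:
    p["deficit"] = p["A"] or p["B"] or p["C"]

def association(region, cohort):
    """P(deficit | region damaged) - P(deficit | region spared)."""
    damaged = [p["deficit"] for p in cohort if p[region]]
    spared = [p["deficit"] for p in cohort if not p[region]]
    return sum(damaged) / len(damaged) - sum(spared) / len(spared)

full = association("C", patients)   # diluted by deficits caused via A or B
subcohort = [p for p in patients if not (p["A"] or p["B"])]
sub = association("C", subcohort)   # clean once A/B-damaged patients are out
```

In the subcohort the association for C is perfect, mirroring how the left putamen effect only emerged after excluding patients with temporal or frontal damage.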

  2. SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.

    Science.gov (United States)

    Chu, Annie; Cui, Jenny; Dinov, Ivo D

    2009-03-01

The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as the t-test in the parametric category; and the Wilcoxon rank sum test, Kruskal-Wallis test, and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), in the hope of contributing to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is ongoing and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for most
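One of the parametric models listed, the two-sample t-test, is simple enough to reproduce outside the toolkit. The stdlib sketch below computes Welch's (unequal-variance) t statistic; SOCR's Java implementation is the authoritative version, and this serves only as an independent cross-check of the arithmetic.

```python
import math

def welch_t(x, y):
    """Welch's two-sample t statistic (unequal-variance t-test)."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)   # sample variance of x
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)   # sample variance of y
    return (mx - my) / math.sqrt(vx / nx + vy / ny)

t = welch_t([1, 2, 3, 4], [2, 3, 4, 5])
```

For these two small samples the statistic is about -1.095; feeding the same data to the SOCR applet should reproduce it.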

  3. Individual-based analyses reveal limited functional overlap in a coral reef fish community.

    Science.gov (United States)

    Brandl, Simon J; Bellwood, David R

    2014-05-01

    Detailed knowledge of a species' functional niche is crucial for the study of ecological communities and processes. The extent of niche overlap, functional redundancy and functional complementarity is of particular importance if we are to understand ecosystem processes and their vulnerability to disturbances. Coral reefs are among the most threatened marine systems, and anthropogenic activity is changing the functional composition of reefs. The loss of herbivorous fishes is particularly concerning as the removal of algae is crucial for the growth and survival of corals. Yet, the foraging patterns of the various herbivorous fish species are poorly understood. Using a multidimensional framework, we present novel individual-based analyses of species' realized functional niches, which we apply to a herbivorous coral reef fish community. In calculating niche volumes for 21 species, based on their microhabitat utilization patterns during foraging, and computing functional overlaps, we provide a measurement of functional redundancy or complementarity. Complementarity is the inverse of redundancy and is defined as less than 50% overlap in niche volumes. The analyses reveal extensive complementarity with an average functional overlap of just 15.2%. Furthermore, the analyses divide herbivorous reef fishes into two broad groups. The first group (predominantly surgeonfishes and parrotfishes) comprises species feeding on exposed surfaces and predominantly open reef matrix or sandy substrata, resulting in small niche volumes and extensive complementarity. In contrast, the second group consists of species (predominantly rabbitfishes) that feed over a wider range of microhabitats, penetrating the reef matrix to exploit concealed surfaces of various substratum types. These species show high variation among individuals, leading to large niche volumes, more overlap and less complementarity. 
These results may have crucial consequences for our understanding of herbivorous processes on
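A pairwise overlap measure of the kind described can be sketched with Schoener's proportional-overlap index on microhabitat utilization distributions. This index and the utilization proportions below are illustrative assumptions; the study itself computed overlaps of multidimensional niche volumes, which is a richer calculation.

```python
def schoener_overlap(p, q):
    """Proportional overlap between two utilization distributions (0 to 1)."""
    return 1.0 - 0.5 * sum(abs(a - b) for a, b in zip(p, q))

# Hypothetical utilization of four microhabitat types (proportions sum to 1)
surgeonfish = [0.7, 0.2, 0.1, 0.0]  # exposed surfaces, open reef matrix
rabbitfish = [0.1, 0.2, 0.3, 0.4]   # penetrates matrix, concealed surfaces
overlap = schoener_overlap(surgeonfish, rabbitfish)
complementary = overlap < 0.5       # < 50% overlap = complementarity
```

Averaging such pairwise overlaps across all species pairs gives a community-level figure analogous to the 15.2% mean overlap reported in the abstract.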

  4. Genome-based comparative analyses of Antarctic and temperate species of Paenibacillus.

    Directory of Open Access Journals (Sweden)

    Melissa Dsouza

Antarctic soils represent a unique environment characterised by extremes of temperature, salinity, elevated UV radiation, low nutrient and low water content. Despite the harshness of this environment, members of 15 bacterial phyla have been identified in soils of the Ross Sea Region (RSR). However, the survival mechanisms and ecological roles of these phyla are largely unknown. The aim of this study was to investigate whether strains of Paenibacillus darwinianus owe their resilience to substantial genomic changes. For this, genome-based comparative analyses were performed on three P. darwinianus strains, isolated from gamma-irradiated RSR soils, together with nine temperate, soil-dwelling Paenibacillus spp. The genome of each strain was sequenced to over 1,000-fold coverage, then assembled into contigs totalling approximately 3 Mbp per genome. Based on the occurrence of essential, single-copy genes, genome completeness was estimated at approximately 88%. Genome analysis revealed between 3,043 and 3,091 protein-coding sequences (CDSs), primarily associated with two-component systems, sigma factors, transporters, sporulation and genes induced by cold-shock, oxidative and osmotic stresses. These comparative analyses provide an insight into the metabolic potential of P. darwinianus, revealing potential adaptive mechanisms for survival in Antarctic soils. However, a large proportion of these mechanisms were also identified in temperate Paenibacillus spp., suggesting that these mechanisms are beneficial for growth and survival in a range of soil environments. These analyses have also revealed that the P. darwinianus genomes contain significantly fewer CDSs and have a lower paralogous content. Notwithstanding the incompleteness of the assemblies, the large differences in genome sizes, determined by the number of genes in paralogous clusters and the CDS content, are indicative of genome content scaling. Finally, these sequences are a resource for further

  5. MULTI-DIMENSIONAL MASS SPECTROMETRY-BASED SHOTGUN LIPIDOMICS AND NOVEL STRATEGIES FOR LIPIDOMIC ANALYSES

    Science.gov (United States)

    Han, Xianlin; Yang, Kui; Gross, Richard W.

    2011-01-01

    Since our last comprehensive review on multi-dimensional mass spectrometry-based shotgun lipidomics (Mass Spectrom. Rev. 24 (2005), 367), many new developments in the field of lipidomics have occurred. These developments include new strategies and refinements for shotgun lipidomic approaches that use direct infusion, including novel fragmentation strategies, identification of multiple new informative dimensions for mass spectrometric interrogation, and the development of new bioinformatic approaches for enhanced identification and quantitation of the individual molecular constituents that comprise each cell’s lipidome. Concurrently, advances in liquid chromatography-based platforms and novel strategies for quantitative matrix-assisted laser desorption/ionization mass spectrometry for lipidomic analyses have been developed. Through the synergistic use of this repertoire of new mass spectrometric approaches, the power and scope of lipidomics has been greatly expanded to accelerate progress toward the comprehensive understanding of the pleiotropic roles of lipids in biological systems. PMID:21755525

  6. An Integrated Software Suite for Surface-based Analyses of Cerebral Cortex

    Science.gov (United States)

    Van Essen, David C.; Drury, Heather A.; Dickson, James; Harwell, John; Hanlon, Donna; Anderson, Charles H.

    2001-01-01

    The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database. PMID:11522765

  7. An integrated software suite for surface-based analyses of cerebral cortex

    Science.gov (United States)

    Van Essen, D. C.; Drury, H. A.; Dickson, J.; Harwell, J.; Hanlon, D.; Anderson, C. H.

    2001-01-01

    The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database.

  8. The Seismic Reliability of Offshore Structures Based on Nonlinear Time History Analyses

    International Nuclear Information System (INIS)

    Hosseini, Mahmood; Karimiyani, Somayyeh; Ghafooripour, Amin; Jabbarzadeh, Mohammad Javad

    2008-01-01

Regarding past earthquake damage to offshore structures, which are vital structures in the oil and gas industries, it is important that their seismic design is performed with very high reliability. Accepting Nonlinear Time History Analyses (NLTHA) as the most reliable seismic analysis method, in this paper an offshore platform of jacket type with a height of 304 feet, a deck of 96 feet by 94 feet, and a weight of 290 million pounds has been studied. At first, some Push-Over Analyses (POA) have been performed to identify the more critical members of the jacket, based on the range of their plastic deformations. Then NLTHA have been performed using the 3-component accelerograms of 100 earthquakes, covering a wide range of frequency content, and normalized to three Peak Ground Acceleration (PGA) levels of 0.3 g, 0.65 g, and 1.0 g. Using the results of NLTHA, the damage and rupture probabilities of the critical members have been studied to assess the reliability of the jacket structure. Since different structural members of the jacket have different effects on the stability of the platform, an "importance factor" has been considered for each critical member based on its location and orientation in the structure, and the reliability of the whole structure has then been obtained by combining the reliabilities of the critical members, each with its specific importance factor
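The final combination step can be sketched numerically. The weighting scheme below, treating importance-scaled member failure probabilities as a series system, is an assumption for illustration: the abstract does not spell out the paper's exact combination rule, and the member probabilities are invented.

```python
def platform_failure_probability(member_pf, importance):
    """Illustrative combination of member reliabilities.

    P(system survives) = product over members of (1 - I_i * p_i), where
    I_i in [0, 1] scales how much member i's rupture threatens global
    stability (its "importance factor") and p_i is its rupture probability.
    """
    survive = 1.0
    for p, imp in zip(member_pf, importance):
        survive *= (1.0 - imp * p)
    return 1.0 - survive

# Hypothetical rupture probabilities of three critical members at one PGA level
pf = platform_failure_probability([0.05, 0.12, 0.02], [1.0, 0.6, 0.3])
```

Repeating the calculation at each PGA level (0.3 g, 0.65 g, 1.0 g) would trace out a crude fragility curve for the jacket.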

  9. Analyses of criticality and reactivity for TRACY experiments based on JENDL-3.3 data library

    International Nuclear Information System (INIS)

    Sono, Hiroki; Miyoshi, Yoshinori; Nakajima, Ken

    2003-01-01

The parameters on criticality and reactivity employed for computational simulations of the TRACY supercritical experiments were analyzed using a recently revised nuclear data library, JENDL-3.3. The parameters based on the JENDL-3.3 library were compared to those based on two previously used libraries, JENDL-3.2 and ENDF/B-VI. The computational codes MVP, MCNP version 4C and TWOTRAN were used in the analyses. The following conclusions were obtained: (1) The computational biases of the effective neutron multiplication factor attributable to the nuclear data libraries and to the computational codes do not depend on the TRACY experimental conditions such as fuel conditions. (2) The fractional discrepancies in the kinetic parameters and coefficients of reactivity are within ∼5% between the three libraries. Comparison between calculations and measurements of the parameters suggests that the JENDL-3.3 library gives values closer to the measurements than the JENDL-3.2 and ENDF/B-VI libraries. (3) While the reactivity worth of transient rods expressed in the $ unit shows ∼5% discrepancy between the three libraries according to their respective β_eff values, there is little discrepancy when expressed in the Δk/k unit. (author)

  10. Novel citation-based search method for scientific literature: application to meta-analyses.

    Science.gov (United States)

    Janssens, A Cecile J W; Gwinn, M

    2015-10-13

    Finding eligible studies for meta-analysis and systematic reviews relies on keyword-based searching as the gold standard, despite its inefficiency. Searching based on direct citations is not sufficiently comprehensive. We propose a novel strategy that ranks articles on their degree of co-citation with one or more "known" articles before reviewing their eligibility. In two independent studies, we aimed to reproduce the results of literature searches for sets of published meta-analyses (n = 10 and n = 42). For each meta-analysis, we extracted co-citations for the randomly selected 'known' articles from the Web of Science database, counted their frequencies and screened all articles with a score above a selection threshold. In the second study, we extended the method by retrieving direct citations for all selected articles. In the first study, we retrieved 82% of the studies included in the meta-analyses while screening only 11% as many articles as were screened for the original publications. Articles that we missed were published in non-English languages, published before 1975, published very recently, or available only as conference abstracts. In the second study, we retrieved 79% of included studies while screening half the original number of articles. Citation searching appears to be an efficient and reasonably accurate method for finding articles similar to one or more articles of interest for meta-analysis and reviews.
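The co-citation ranking strategy reduces to plain counting: a candidate article scores one point each time it appears in the reference list of a paper that also cites a "known" article, and candidates above a threshold are screened. The toy reference lists below are a hypothetical stand-in for a Web of Science export.

```python
from collections import Counter

def rank_by_cocitation(known_articles, citing_reference_lists):
    """Score candidates by how often they are co-cited with known articles.

    citing_reference_lists: for each citing paper, its full list of references.
    Returns candidates sorted by descending co-citation count.
    """
    scores = Counter()
    for refs in citing_reference_lists:
        if any(k in refs for k in known_articles):
            for r in refs:
                if r not in known_articles:
                    scores[r] += 1    # co-cited alongside a known article
    return scores.most_common()

known = {"A"}
papers = [["A", "B", "C"], ["A", "B"], ["X", "Y"], ["A", "C", "B"]]
ranking = rank_by_cocitation(known, papers)
```

Only the top-scoring candidates are then screened for eligibility, which is how the method cut the screening burden reported in the abstract.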

  11. Benefits of Exercise Training For Computer-Based Staff: A Meta Analyses

    Directory of Open Access Journals (Sweden)

    Mothna Mohammed

    2017-04-01

    Full Text Available Background: Office workers sit at work for approximately 8 hours a day and, as a result, many of them do not have enough time for any form of physical exercise. This can lead to musculoskeletal discomfort, especially low back pain, and recently many researchers have focused on home/office-based exercise training for the prevention/treatment of low back pain in this population. Objective: This meta-analysis discusses the most recently suggested exercises for office workers, based on the mechanisms and theories behind low back pain in this group. Method: The author collected relevant papers published previously on the subject, using Google Scholar, Scopus, and PubMed as sources. Only articles published using the same methodology, including the keywords office workers, musculoskeletal discomforts, low back pain, and exercise training, were selected. Studies that failed to report sufficient sample statistics, or lacked a substantial review of past academic scholarship and/or clear methodologies, were excluded. Results: Limited evidence is available regarding the prevention of, and treatment methods for, musculoskeletal discomfort, especially in the low back, among office workers. The findings showed that training exercises had a significant effect (p<0.05) on low back pain discomfort scores, with pain levels decreasing in response to office-based exercise training. Conclusion: Office-based exercise training can improve pain/discomfort scores among office workers through positive effects on muscle flexibility and strength. As such, it should be suggested to occupational therapists as a practical approach to the treatment/prevention of low back pain among office workers.

  12. Reviewing PSA-based analyses to modify technical specifications at nuclear power plants

    International Nuclear Information System (INIS)

    Samanta, P.K.; Martinez-Guridi, G.; Vesely, W.E.

    1995-12-01

    Changes to Technical Specifications (TSs) at nuclear power plants (NPPs) require review and approval by the United States Nuclear Regulatory Commission (USNRC). Currently, many requests for changes to TSs use analyses that are based on a plant's probabilistic safety assessment (PSA). This report presents an approach to reviewing such PSA-based submittals for changes to TSs. We discuss the basic objectives of reviewing a PSA-based submittal to modify NPP TSs, the methodology of reviewing a TS submittal, and the differing roles of a PSA review, a PSA computer code review, and a review of a TS submittal. To illustrate this approach, we discuss our review of changes to allowed outage time (AOT) and surveillance test interval (STI) in the TS for the South Texas Project Nuclear Generating Station. Based on this experience, a checklist of items is given for future reviewers; it can be used to verify that a submittal contains sufficient information and that the review has addressed the relevant issues. Finally, the recommended steps in the review process and the expected findings of each step are discussed.

  13. Performance Analyses of Renewable and Fuel Power Supply Systems for Different Base Station Sites

    Directory of Open Access Journals (Sweden)

    Josip Lorincz

    2014-11-01

    Full Text Available Base station sites (BSSs) powered with renewable energy sources have gained the attention of cellular operators during the last few years. This is because such “green” BSSs impose significant reductions in the operational expenditures (OPEX) of telecom operators due to the possibility of on-site renewable energy harvesting. In this paper, the green BSS power supply system parameters detected through remote and centralized real-time sensing are presented. An implemented sensing system based on a wireless sensor network enables reliable collection and post-processing analyses of many parameters, such as the total charging/discharging current of the power supply system, battery voltage and temperature, wind speed, etc. As an example, yearly sensing results for three different BSS configurations powered by solar and/or wind energy are discussed in terms of renewable energy supply (RES) system performance. For the case of powering those BSSs with standalone systems based on a fuel generator, fuel consumption models expressing the interdependence between generator load and fuel consumption are proposed. This has allowed an energy-efficiency comparison of the fuel-powered and RES systems, which is presented in terms of OPEX and carbon dioxide (CO2) reductions. Additionally, approaches based on different BSS air-conditioning systems and on/off regulation of daily fuel generator activity are proposed and validated in terms of energy and capital expenditure (CAPEX) savings.
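Fuel consumption models of the kind proposed here express fuel use as a function of generator load; a common choice is a straight-line (Willans-type) fit to measured load/consumption pairs. The linear form and least-squares fit below are an illustrative assumption, not the paper's exact model:

```python
def fit_fuel_model(loads_kw, fuel_lph):
    """Least-squares fit of the assumed linear model: fuel = a * load + b."""
    n = len(loads_kw)
    mean_l = sum(loads_kw) / n
    mean_f = sum(fuel_lph) / n
    a = (sum((l - mean_l) * (f - mean_f) for l, f in zip(loads_kw, fuel_lph))
         / sum((l - mean_l) ** 2 for l in loads_kw))
    b = mean_f - a * mean_l          # no-load (idle) consumption intercept
    return a, b

def fuel_rate(load_kw, a, b):
    """Predicted consumption (litres per hour) at the given generator load."""
    return a * load_kw + b
```

With the fitted coefficients, yearly fuel cost and CO2 output follow by integrating the predicted rate over the sensed load profile.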

  14. Comparative Analyses of Zebrafish Anxiety-Like Behavior Using Conflict-Based Novelty Tests.

    Science.gov (United States)

    Kysil, Elana V; Meshalkina, Darya A; Frick, Erin E; Echevarria, David J; Rosemberg, Denis B; Maximino, Caio; Lima, Monica Gomes; Abreu, Murilo S; Giacomini, Ana C; Barcellos, Leonardo J G; Song, Cai; Kalueff, Allan V

    2017-06-01

    Modeling of stress and anxiety in adult zebrafish (Danio rerio) is increasingly utilized in neuroscience research and central nervous system (CNS) drug discovery. Representing the most commonly used zebrafish anxiety models, the novel tank test (NTT) focuses on zebrafish diving in response to potentially threatening stimuli, whereas the light-dark test (LDT) is based on fish scototaxis (innate preference for dark vs. bright areas). Here, we systematically evaluate the utility of these two tests, combining meta-analyses of published literature with comparative in vivo behavioral and whole-body endocrine (cortisol) testing. Overall, the NTT and LDT behaviors demonstrate a generally good cross-test correlation in vivo, whereas meta-analyses of published literature show that both tests have similar sensitivity to zebrafish anxiety-like states. Finally, NTT evokes higher levels of cortisol, likely representing a more stressful procedure than LDT. Collectively, our study reappraises NTT and LDT for studying anxiety-like states in zebrafish, and emphasizes their developing utility for neurobehavioral research. These findings can help optimize drug screening procedures by choosing more appropriate models for testing anxiolytic or anxiogenic drugs.

  15. Chemometrical characterization of four italian rice varieties based on genetic and chemical analyses.

    Science.gov (United States)

    Brandolini, Vincenzo; Coïsson, Jean Daniel; Tedeschi, Paola; Barile, Daniela; Cereti, Elisabetta; Maietti, Annalisa; Vecchiati, Giorgio; Martelli, Aldo; Arlorio, Marco

    2006-12-27

    This paper describes a method for achieving qualitative identification of four rice varieties from two different Italian regions. To estimate the presence of genetic diversity among the four rice varieties, we used polymerase chain reaction-randomly amplified polymorphic DNA (PCR-RAPD) markers, and to elucidate whether a relationship exists between the growing area and the specific characteristics of the product, we studied proximate composition, fatty acid composition, mineral content, and total antioxidant capacity. Using principal component analysis on genomic and compositional data, we were able to classify rice samples according to their variety and their district of production. This work also examined the discrimination ability of the different parameters. It was found that genomic data give the best discrimination between varieties, indicating that RAPD assays could be useful in discriminating among closely related species, while compositional analyses do not depend on the genetic characters only but are also related to the production area.

  16. Stress and deflection analyses of floating roofs based on a load-modifying method

    International Nuclear Information System (INIS)

    Sun Xiushan; Liu Yinghua; Wang Jianbin; Cen Zhangzhi

    2008-01-01

    This paper proposes a load-modifying method for the stress and deflection analyses of floating roofs used in cylindrical oil storage tanks. The formulations of loads and deformations are derived from the equilibrium analysis of floating roofs. Based on these formulations, the load-modifying method is developed to conduct a geometrically nonlinear analysis of floating roofs with finite element (FE) simulation. In the load-modifying procedure, the analysis is carried out through a series of iterative computations until convergence is achieved within the error tolerance. Numerical examples are given to demonstrate the validity and reliability of the proposed method, which provides an effective and practical numerical solution for the design and analysis of floating roofs.
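The iterative scheme can be illustrated with a toy one-degree-of-freedom version: at each step the load is modified for the deflection just computed (here, buoyant support proportional to deflection), the response is re-solved, and the loop repeats until successive deflections agree within tolerance. The scalar stiffness constants stand in for a full finite element solve and are purely illustrative:

```python
def solve_deflection(q, k_plate, k_buoy, tol=1e-9, max_iter=1000):
    """Fixed-point iteration mimicking a load-modifying scheme.

    q        applied load
    k_plate  structural stiffness (stands in for the FE model)
    k_buoy   buoyant support per unit deflection (the load modification)
    """
    w = 0.0
    for _ in range(max_iter):
        load = q - k_buoy * w          # modify the load for the current deflection
        w_new = load / k_plate         # linear-response step (an "FE solve")
        if abs(w_new - w) < tol:       # converged within the error tolerance
            return w_new
        w = w_new
    raise RuntimeError("load-modifying iteration did not converge")
```

The loop converges whenever the load modification is a contraction (here, k_buoy < k_plate), settling on the self-consistent deflection q / (k_plate + k_buoy).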

  17. Seismic fragility analyses of nuclear power plant structures based on the recorded earthquake data in Korea

    International Nuclear Information System (INIS)

    Cho, Sung Gook; Joe, Yang Hee

    2005-01-01

    By nature, seismic fragility analysis results are considerably affected by the statistical data of the design information and the site-dependent ground motions. The engineering characteristics of small-magnitude earthquake spectra recorded in the Korean peninsula during the last several years are analyzed in this paper. An improved method of seismic fragility analysis is evaluated by comparative analyses to verify its efficiency for practical application to nuclear power plant structures. The effects of the recorded earthquakes on the seismic fragilities of Korean nuclear power plant structures are also evaluated in the comparative studies. The obtained results show that the proposed method is more efficient for multi-mode structures. The case study results show that seismic fragility analysis based on Newmark's spectra might over-estimate the seismic capacities of Korean facilities.
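Seismic fragility in such studies is conventionally expressed as a lognormal curve: the conditional failure probability at peak ground acceleration a is Φ(ln(a/A_m)/β), with median capacity A_m and composite logarithmic standard deviation β. A generic sketch of that standard formulation (the parameter values in the test are illustrative, not plant-specific):

```python
import math

def fragility(a, a_median, beta):
    """Lognormal fragility curve: P(failure | PGA = a) = Phi(ln(a / A_m) / beta)."""
    z = math.log(a / a_median) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

Spectra that over-estimate capacity shift A_m upward, lowering the computed failure probability at every ground-motion level.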

  18. Seismic fragility analyses of nuclear power plant structures based on the recorded earthquake data in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Sung Gook [Department of Civil and Environmental System Engineering, University of Incheon, 177 Dohwa-dong, Nam-gu, Incheon 402-749 (Korea, Republic of)]. E-mail: sgcho@incheon.ac.kr; Joe, Yang Hee [Department of Civil and Environmental System Engineering, University of Incheon, 177 Dohwa-dong, Nam-gu, Incheon 402-749 (Korea, Republic of)

    2005-08-01

    By nature, seismic fragility analysis results are considerably affected by the statistical data of the design information and the site-dependent ground motions. The engineering characteristics of small-magnitude earthquake spectra recorded in the Korean peninsula during the last several years are analyzed in this paper. An improved method of seismic fragility analysis is evaluated by comparative analyses to verify its efficiency for practical application to nuclear power plant structures. The effects of the recorded earthquakes on the seismic fragilities of Korean nuclear power plant structures are also evaluated in the comparative studies. The obtained results show that the proposed method is more efficient for multi-mode structures. The case study results show that seismic fragility analysis based on Newmark's spectra might over-estimate the seismic capacities of Korean facilities.

  19. Secondary Data Analyses of Subjective Outcome Evaluation Data Based on Nine Databases

    Directory of Open Access Journals (Sweden)

    Daniel T. L. Shek

    2012-01-01

    Full Text Available The purpose of this study was to evaluate the effectiveness of the Tier 1 Program of the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) in Hong Kong by analyzing 1,327 school-based program reports submitted by program implementers. In each report, program implementers were invited to write down five conclusions based on an integration of the subjective outcome evaluation data collected from the program participants and program implementers. Secondary data analyses were carried out by aggregating nine databases, with 14,390 meaningful units extracted from 6,618 conclusions. Results showed that most of the conclusions were positive in nature. The findings generally showed that the workers perceived the program and program implementers to be positive, and they also pointed out that the program could promote holistic development of the program participants in societal, familial, interpersonal, and personal aspects. However, difficulties encountered during program implementation (2.15%) and recommendations for improvement (16.26%) were also reported. In conjunction with the evaluation findings based on other strategies, the present study suggests that the Tier 1 Program of the Project P.A.T.H.S. is beneficial to the holistic development of the program participants.

  20. What is needed to eliminate new pediatric HIV infections: The contribution of model-based analyses

    Science.gov (United States)

    Doherty, Katie; Ciaranello, Andrea

    2013-01-01

    Purpose of Review Computer simulation models can identify the key clinical, operational, and economic interventions that will be needed to achieve the elimination of new pediatric HIV infections. In this review, we summarize recent findings from model-based analyses of strategies for prevention of mother-to-child HIV transmission (MTCT). Recent Findings In order to achieve elimination of MTCT (eMTCT), model-based studies suggest that scale-up of services will be needed in several domains: uptake of services and retention in care (the PMTCT “cascade”), interventions to prevent HIV infections in women and reduce unintended pregnancies (the “four-pronged approach”), efforts to support medication adherence through long periods of pregnancy and breastfeeding, and strategies to make breastfeeding safer and/or shorter. Models also project the economic resources that will be needed to achieve these goals, as well as the most efficient ways to allocate limited resources for eMTCT. Results suggest that currently recommended PMTCT regimens (WHO Option A, Option B, and Option B+) will be cost-effective in most settings. Summary Model-based results can guide future implementation science by highlighting areas in which additional data are needed to make informed decisions and by outlining critical interventions that will be necessary in order to eliminate new pediatric HIV infections. PMID:23743788

  1. Validation of a fully autonomous phosphate analyser based on a microfluidic lab-on-a-chip

    DEFF Research Database (Denmark)

    Slater, Conor; Cleary, J.; Lau, K.T.

    2010-01-01

    of long-term operation. This was proven by a bench top calibration of the analyser using standard solutions and also by comparing the analyser's performance to a commercially available phosphate monitor installed at a waste water treatment plant. The output of the microfluidic lab-on-a-chip analyser...

  2. Genome based analyses of six hexacorallian species reject the “naked coral” hypothesis

    KAUST Repository

    Wang, Xin

    2017-09-23

    Scleractinian corals are the foundation species of the coral-reef ecosystem. Their calcium carbonate skeletons form extensive structures that are home to millions of species, making coral reefs one of the most diverse ecosystems of our planet. However, our understanding of how reef-building corals have evolved the ability to calcify and become the ecosystem builders they are today is hampered by uncertain relationships within their subclass Hexacorallia. Corallimorpharians have been proposed to originate from a complex scleractinian ancestor that lost the ability to calcify in response to increasing ocean acidification, suggesting the possibility for corals to lose and regain the ability to calcify. Here we employed a phylogenomic approach using whole-genome data from six hexacorallian species to resolve the evolutionary relationship between reef-building corals and their non-calcifying relatives. Phylogenetic analysis based on 1,421 single-copy orthologs, as well as gene presence/absence and synteny information, converged on the same topologies, showing strong support for scleractinian monophyly and a corallimorpharian sister clade. Our broad phylogenomic approach using sequence-based and sequence-independent analyses provides unambiguous evidence for the monophyly of scleractinian corals and the rejection of corallimorpharians as descendants of a complex coral ancestor.

  3. Genome based analyses of six hexacorallian species reject the “naked coral” hypothesis

    KAUST Repository

    Wang, Xin; Drillon, Guénola; Ryu, Taewoo; Voolstra, Christian R.; Aranda, Manuel

    2017-01-01

    Scleractinian corals are the foundation species of the coral-reef ecosystem. Their calcium carbonate skeletons form extensive structures that are home to millions of species, making coral reefs one of the most diverse ecosystems of our planet. However, our understanding of how reef-building corals have evolved the ability to calcify and become the ecosystem builders they are today is hampered by uncertain relationships within their subclass Hexacorallia. Corallimorpharians have been proposed to originate from a complex scleractinian ancestor that lost the ability to calcify in response to increasing ocean acidification, suggesting the possibility for corals to lose and regain the ability to calcify. Here we employed a phylogenomic approach using whole-genome data from six hexacorallian species to resolve the evolutionary relationship between reef-building corals and their non-calcifying relatives. Phylogenetic analysis based on 1,421 single-copy orthologs, as well as gene presence/absence and synteny information, converged on the same topologies, showing strong support for scleractinian monophyly and a corallimorpharian sister clade. Our broad phylogenomic approach using sequence-based and sequence-independent analyses provides unambiguous evidence for the monophyly of scleractinian corals and the rejection of corallimorpharians as descendants of a complex coral ancestor.

  4. Quantitative Prediction of Coalbed Gas Content Based on Seismic Multiple-Attribute Analyses

    Directory of Open Access Journals (Sweden)

    Renfang Pan

    2015-09-01

    Full Text Available Accurate prediction of the planar distribution of gas is crucial to the selection and development of new CBM exploration areas. Based on seismic attributes, well logging and testing data, we found that seismic absorption attenuation, after eliminating the effects of burial depth, shows an evident correlation with CBM gas content; positive structure curvature has a negative correlation with gas content; and density has a negative correlation with gas content. It is feasible to use the hydrocarbon index (P*G) and pseudo-Poisson ratio attributes for detection of gas enrichment zones. Based on seismic multiple-attribute analyses, a multiple linear regression equation was established between the seismic attributes and the gas content at the drilling wells. Application of this equation to the seismic attributes at locations other than the drilling wells yielded a quantitative prediction of the planar gas distribution. Prediction calculations were performed for two different models, one using pre-stack inversion and the other disregarding it. A comparison of the results indicates that both models predicted a similar trend for the gas content distribution, except that the model using pre-stack inversion yielded a prediction with considerably higher precision.
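The regression step described here, gas content as a linear combination of seismic attributes, fitted at the wells and then applied across the survey, can be sketched with an ordinary least-squares fit. The attribute matrix and values in the test are placeholders, not the study's data:

```python
import numpy as np

def fit_gas_model(well_attributes, well_gas):
    """Fit gas = c0 + c1*attr1 + ... + cn*attrn at the drilling wells."""
    X = np.column_stack([np.ones(len(well_gas)), well_attributes])
    coeffs, *_ = np.linalg.lstsq(X, well_gas, rcond=None)
    return coeffs

def predict_gas(coeffs, attributes):
    """Apply the fitted equation to attribute values away from the wells."""
    X = np.column_stack([np.ones(len(attributes)), attributes])
    return X @ coeffs
```

In practice each row of `well_attributes` would hold attributes such as absorption attenuation, curvature and density extracted at a well location.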

  5. Risk-based analyses in support of California hazardous site remediation

    International Nuclear Information System (INIS)

    Ringland, J.T.

    1995-08-01

    The California Environmental Enterprise (CEE) is a joint program of the Department of Energy (DOE), Lawrence Livermore National Laboratory, Lawrence Berkeley Laboratory, and Sandia National Laboratories. Its goal is to make DOE laboratory expertise accessible to hazardous site cleanups in the state. This support might involve working directly with parties responsible for individual cleanups, or working with the California Environmental Protection Agency to develop tools that would be applicable across a broad range of sites. As part of its initial year's activities, the CEE supported a review to examine where laboratory risk and risk-based systems analysis capabilities might be most effectively applied. To this end, this study draws the following observations. The labs have a clear role in analyses supporting the demonstration and transfer of laboratory characterization or remediation technologies. The labs may have opportunities in developing broadly applicable analysis tools and computer codes for problems such as site characterization or efficient management of resources. Analysis at individual sites, separate from supporting lab technologies or prototyping general tools, may be appropriate only in limited circumstances. In any of these roles, the labs' capabilities extend beyond health risk assessment to the broader areas of risk management and risk-based systems analysis.

  6. Analyses of microstructural and elastic properties of porous SOFC cathodes based on focused ion beam tomography

    Science.gov (United States)

    Chen, Zhangwei; Wang, Xin; Giuliani, Finn; Atkinson, Alan

    2015-01-01

    Mechanical properties of porous SOFC electrodes are largely determined by their microstructures. Measurements of the elastic properties and microstructural parameters can be achieved by modelling digitally reconstructed 3D volumes based on the real electrode microstructures. However, the reliability of such measurements depends greatly on the processing of the raw images acquired for reconstruction. In this work, the actual microstructures of La0.6Sr0.4Co0.2Fe0.8O3-δ (LSCF) cathodes sintered at an elevated temperature were reconstructed based on dual-beam FIB/SEM tomography. Key microstructural and elastic parameters were estimated and correlated, and their sensitivity to the grayscale threshold value applied in the image segmentation was analysed. The important microstructural parameters included porosity, tortuosity, specific surface area, particle and pore size distributions, and inter-particle neck size distribution, which may have varying effects on the elastic properties simulated from the microstructures using FEM. Results showed that different threshold value ranges resulted in different degrees of sensitivity for a given parameter: the estimated porosity and tortuosity were more sensitive than the surface area to volume ratio, and pore and neck sizes were less sensitive than particle size. Results also showed that the modulus was essentially sensitive to the porosity, which was largely controlled by the threshold value.
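The threshold-sensitivity analysis has a simple core: segment the grayscale volume at a candidate threshold, recompute the parameter of interest, and sweep the threshold over a range. For porosity (the cheapest parameter to recompute) a sketch might look like the following; the voxel data and threshold range are placeholders:

```python
import numpy as np

def porosity_vs_threshold(grayscale_volume, thresholds):
    """Pore volume fraction when voxels darker than each threshold are labelled pore."""
    return [float((grayscale_volume < t).mean()) for t in thresholds]
```

Plotting the returned porosities against the thresholds shows how steeply the estimate responds near the chosen segmentation value, which is the sensitivity being assessed.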

  7. Statistical parameters of random heterogeneity estimated by analysing coda waves based on finite difference method

    Science.gov (United States)

    Emoto, K.; Saito, T.; Shiomi, K.

    2017-12-01

    Short-period (2 s) seismograms. We found that the energy of the coda of long-period seismograms shows a spatially flat distribution. This phenomenon is well known in short-period seismograms and results from the scattering by small-scale heterogeneities. We estimate the statistical parameters that characterize the small-scale random heterogeneity by modelling the spatiotemporal energy distribution of long-period seismograms. We analyse three moderate-size earthquakes that occurred in southwest Japan. We calculate the spatial distribution of the energy density recorded by a dense seismograph network in Japan at the period bands of 8-16 s, 4-8 s and 2-4 s and model them by using 3-D finite difference (FD) simulations. Compared to conventional methods based on statistical theories, we can calculate more realistic synthetics by using the FD simulation. It is not necessary to assume a uniform background velocity, body or surface waves and scattering properties considered in general scattering theories. By taking the ratio of the energy of the coda area to that of the entire area, we can separately estimate the scattering and the intrinsic absorption effects. Our result reveals the spectrum of the random inhomogeneity in a wide wavenumber range including the intensity around the corner wavenumber as P(m) = 8πε²a³/(1 + a²m²)², where ε = 0.05 and a = 3.1 km, even though past studies analysing higher-frequency records could not detect the corner. Finally, we estimate the intrinsic attenuation by modelling the decay rate of the energy. The method proposed in this study is suitable for quantifying the statistical properties of long-wavelength subsurface random inhomogeneity, which leads the way to characterizing a wider wavenumber range of spectra, including the corner wavenumber.
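The fitted heterogeneity spectrum P(m) = 8πε²a³/(1 + a²m²)² is straightforward to evaluate: it is flat at low wavenumbers and falls to one quarter of that level at the corner wavenumber m = 1/a, which is the feature the study was able to resolve. A small sketch using the reported ε = 0.05 and a = 3.1 km as defaults:

```python
import math

def heterogeneity_spectrum(m, eps=0.05, a=3.1):
    """P(m) = 8*pi*eps^2*a^3 / (1 + a^2 m^2)^2, with m in 1/km (reported fit)."""
    return 8.0 * math.pi * eps ** 2 * a ** 3 / (1.0 + (a * m) ** 2) ** 2
```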

  8. Comprehensive logic based analyses of Toll-like receptor 4 signal transduction pathway.

    Directory of Open Access Journals (Sweden)

    Mahesh Kumar Padwal

    Full Text Available Among the 13 TLRs in vertebrate systems, only TLR4 utilizes both the Myeloid differentiation factor 88 (MyD88) and the Toll/Interleukin-1 receptor (TIR) domain-containing adapter interferon-β-inducing factor (TRIF) adaptors to transduce signals triggering host-protective immune responses. Earlier studies on the pathway combined various experimental data in the form of one comprehensive map of TLR signaling, but in the absence of adequate kinetic parameters, quantitative mathematical models that reveal emerging systems-level properties and dynamic inter-regulation among the kinases/phosphatases of the TLR4 network are not yet available. So, here we used a reaction stoichiometry-based and parameter-independent logical modeling formalism to build a TLR4 signaling network model that captured the feedback regulations, the interdependencies between signaling kinases and phosphatases, and the outcome of simulated infections. The analyses of the TLR4 signaling network revealed 360 feedback loops, 157 negative and 203 positive, of which 334 loops had the phosphatase PP1 as an essential component. The network elements' interdependency (positive or negative dependencies) under perturbation conditions, such as phosphatase knockouts, revealed interdependencies between the dual-specific phosphatases MKP-1 and MKP-3 and the kinases in the MAPK modules, and the role of PP2A in the auto-regulation of Calmodulin kinase-II. Our simulations under specific kinase or phosphatase gene-deficiency or inhibition conditions corroborated several previously reported experimental data. The simulations mimicking Yersinia pestis and E. coli infections identified the key perturbations in the network and potential drug targets. Thus, our analyses of TLR4 signaling highlight the role of phosphatases as key regulatory factors in determining the global interdependencies among the network elements, uncover novel signaling connections, and identify potential drug targets for
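Feedback-loop counting of the kind reported (360 loops, classified by sign) reduces to enumerating simple cycles in a signed directed graph and multiplying edge signs along each cycle: an even number of inhibitory edges gives a positive loop, an odd number a negative one. A small sketch on a toy network (the nodes and edges in the test are illustrative, not the TLR4 model):

```python
def find_feedback_loops(signed_edges):
    """Enumerate simple cycles of a signed digraph and classify them by sign.

    signed_edges: dict mapping (u, v) -> +1 (activation) or -1 (inhibition).
    Returns (positive_loops, negative_loops) as lists of node tuples.
    """
    graph = {}
    for (u, v) in signed_edges:
        graph.setdefault(u, []).append(v)
    positive, negative = [], []

    def dfs(start, node, path, sign):
        for nxt in graph.get(node, ()):
            s = sign * signed_edges[(node, nxt)]
            if nxt == start:                       # closed a cycle
                (positive if s > 0 else negative).append(tuple(path))
            elif nxt not in path and nxt > start:  # visit each cycle exactly once,
                dfs(start, nxt, path + [nxt], s)   # rooted at its smallest node

    for v in sorted(graph):
        dfs(v, v, [v], +1)
    return positive, negative
```

Knockouts are simulated by deleting a node's edges and re-counting, which is how loop dependence on a single phosphatase (such as PP1 above) can be quantified.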

  9. Voxel-based morphometry analyses of in-vivo MRI in the aging mouse lemur primate

    Directory of Open Access Journals (Sweden)

    Stephen John Sawiak

    2014-05-01

    Full Text Available Cerebral atrophy is one of the brain alterations most widely associated with aging, and a clear relationship has been established between age-associated cognitive impairments and cerebral atrophy. The mouse lemur (Microcebus murinus) is a small primate used as a model of age-related neurodegenerative processes, and the first nonhuman primate in which cerebral atrophy has been correlated with cognitive deficits. Previous studies of cerebral atrophy in this model were based on time-consuming manual delineation or measurement of selected brain regions from magnetic resonance images (MRI). These measures could not be used to analyse regions that cannot be easily outlined, such as the nucleus basalis of Meynert or the subiculum. In humans, morphometric assessment of structural changes with age is generally performed with automated procedures such as voxel-based morphometry (VBM). The objective of our work was to perform user-independent assessment of age-related morphological changes in the whole brain of large mouse lemur populations by means of VBM. The study was based on the SPMMouse toolbox of SPM 8 and involved thirty mouse lemurs aged from 1.9 to 11.3 years. The automatic method revealed for the first time atrophy in regions where manual delineation is prohibitive (nucleus basalis of Meynert, subiculum, prepiriform cortex, Brodmann areas 13-16, hypothalamus, putamen, thalamus, corpus callosum). Some of these regions are described as particularly sensitive to age-associated alterations in humans. The method also revealed age-associated atrophy in cortical regions (cingulate, occipital, parietal), the nucleus septalis, and the caudate. Manual measures performed in some of these regions were in good agreement with the results from the automatic measures. The templates generated in this study, as well as the toolbox for SPM8, can be downloaded. These tools will be valuable for future evaluation of various treatments that are tested to modulate cerebral aging in lemurs.

  10. Ecology of Subglacial Lake Vostok (Antarctica, Based on Metagenomic/Metatranscriptomic Analyses of Accretion Ice

    Directory of Open Access Journals (Sweden)

    Tom D'Elia

    2013-03-01

    Full Text Available Lake Vostok is the largest of the nearly 400 subglacial Antarctic lakes and has been continuously buried by glacial ice for 15 million years. Extreme cold, heat (from possible hydrothermal activity), pressure (from the overriding glacier) and dissolved oxygen (delivered by melting meteoric ice), in addition to limited nutrients and complete darkness, combine to produce one of the most extreme environments on Earth. Metagenomic/metatranscriptomic analyses of ice that accreted over a shallow embayment and over the southern main lake basin indicate the presence of thousands of species of organisms (94% Bacteria, 6% Eukarya, and two Archaea). The predominant bacterial sequences were closest to those from species of Firmicutes, Proteobacteria and Actinobacteria, while the predominant eukaryotic sequences were most similar to those from species of ascomycetous and basidiomycetous Fungi. Based on the sequence data, the lake appears to contain a mixture of autotrophs and heterotrophs capable of performing nitrogen fixation, nitrogen cycling, carbon fixation and nutrient recycling. Sequences closest to those of psychrophiles and thermophiles indicate a cold lake with possible hydrothermal activity. Sequences most similar to those from marine and aquatic species suggest the presence of marine and freshwater regions.

  11. Loss of Flow Accident (LOFA) analyses using LabView-based NRR simulator

    Energy Technology Data Exchange (ETDEWEB)

    Arafa, Amany Abdel Aziz; Saleh, Hassan Ibrahim [Atomic Energy Authority, Cairo (Egypt). Radiation Engineering Dept.; Ashoub, Nagieb [Atomic Energy Authority, Cairo (Egypt). Reactor Physics Dept.

    2016-12-15

    This paper presents a generic Loss of Flow Accident (LOFA) scenario module which is integrated in the LabView-based simulator to imitate Nuclear Research Reactor (NRR) behavior for different user-defined LOFA scenarios. It also provides analyses of a LOFA in a single fuel channel and its impact on operational transactions and on the behavior of the reactor. The generic LOFA scenario module includes the graphs needed to clarify the effects of the LOFA under study. Furthermore, the percentage of the loss of mass flow rate, the mode of flow reduction, and the start and transient times of the LOFA are user defined, to add flexibility to the LOFA scenarios. The objective of integrating such a generic LOFA module is to be able to deal with such incidents and avoid their significant effects. It is also useful in developing expertise in this area and in reducing operator training and simulation costs. The results of the implemented generic LOFA module agree well with those of the COBRA-IIIC code and the earlier guidebook for this series of transients.

  12. Taxonomy and genetic relationships of Pangasiidae, Asian catfishes, based on morphological and molecular analyses

    Directory of Open Access Journals (Sweden)

    Rudhy Gustiano

    2007-12-01

    Full Text Available Pangasiids are economically important riverine catfishes generally residing in freshwater from the Indian subcontinent to the Indonesian Archipelago. The systematics of this family are still poorly known. Consequently, the lack of such basic information impedes the understanding of the biology of the Pangasiids and the study of their aquaculture potential as well as improvement of seed production and growth performance. The objectives of the present study are to clarify the phylogeny of this family based on a biometric analysis and molecular evidence using 12S ribosomal mtDNA on a total of 1070 specimens. The study revealed that 28 species are recognised as valid in Pangasiidae. Four genera are also recognised, Helicophagus Bleeker 1858, Pangasianodon Chevey 1930, Pteropangasius Fowler 1937, and Pangasius Valenciennes 1840, instead of two as reported by previous workers. The phylogenetic analysis supported the recognised genera and clarified the genetic relationships among taxa. Overall, trees from the different analyses show similar topologies and confirm the hypothesis derived from geological history, palaeontology, and similar models in other taxa of fishes from the same area. The oldest genus may already have existed when the Asian mainland was still connected to the islands in the southern part about 20 million years ago.

  13. Historical Weathering Based on Chemical Analyses of Two Spodosols in Southern Sweden

    International Nuclear Information System (INIS)

    Melkerud, Per-Arne; Bain, Derek C.; Olsson, Mats T.

    2003-01-01

    Chemical weathering losses were calculated for two conifer stands in relation to ongoing studies on liming effects and ash amendments on chemical status, soil solution chemistry and soil genesis. Weathering losses were based on elemental depletion trends in soil profiles since deglaciation and exposure to the weathering environment. Gradients in total geochemical composition were assumed to reflect alteration over time. Study sites were Horroed and Hassloev in southern Sweden. Both Horroed and Hassloev sites are located on sandy loamy Weichselian till at an altitude of 85 and 190 m a.s.l., respectively. Aliquots from volume-determined samples from a number of soil levels were fused with lithium metaborate, dissolved in HNO₃, and analysed by ICP-AES. Results indicated the highest cumulative weathering losses at Hassloev. The weathering losses for the elements are in the following order: Si > Al > K > Na > Ca > Mg. Total annual losses for Ca+Mg+K+Na, expressed in mmol_c m⁻² yr⁻¹, amounted to c. 28 and 58 at Horroed and Hassloev, respectively. Variations between study sites could not be explained by differences in bulk density, geochemistry or mineralogy. The accumulated weathering losses since deglaciation were larger in the uppermost 15 cm than in deeper B horizons for most elements studied.
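
The abstract does not give the depletion calculation itself, but elemental losses relative to parent material are commonly quantified with the open-system mass-transfer coefficient tau, normalised to an immobile index element such as Zr or Ti. The sketch below assumes that standard formulation; it is an illustration, not necessarily the authors' exact method:

```python
def tau(c_j_w, c_j_p, c_i_w, c_i_p):
    """Open-system mass-transfer coefficient (the 'tau' of Brimhall/Chadwick):
    fractional gain or loss of mobile element j in a weathered horizon (w)
    relative to parent material (p), normalised to an immobile index element i
    (commonly Zr or Ti). tau = -1 means complete depletion; 0 means no change."""
    return (c_j_w / c_j_p) * (c_i_p / c_i_w) - 1.0
```

For example, a horizon holding half the parent's Ca but twice its Zr concentration (residual enrichment) gives tau = -0.75, i.e. 75% of the Ca originally present has been lost.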

  14. Aleatoric and epistemic uncertainties in sampling based nuclear data uncertainty and sensitivity analyses

    International Nuclear Information System (INIS)

    Zwermann, W.; Krzykacz-Hausmann, B.; Gallner, L.; Klein, M.; Pautz, A.; Velkov, K.

    2012-01-01

    Sampling-based uncertainty and sensitivity analyses due to epistemic input uncertainties, i.e. to an incomplete knowledge of uncertain input parameters, can be performed with arbitrary application programs to solve the physical problem under consideration. For the description of steady-state particle transport, direct simulations of the microscopic processes with Monte Carlo codes are often used. This introduces an additional source of uncertainty, the aleatoric sampling uncertainty, which is due to the randomness of the simulation process performed by sampling, and which adds to the total combined output sampling uncertainty. So far, this aleatoric part of the uncertainty is minimized by running a sufficiently large number of Monte Carlo histories for each sample calculation, thus making its impact negligible compared to the impact from sampling the epistemic uncertainties. Obviously, this process may cause high computational costs. The present paper shows that in many applications reliable epistemic uncertainty results can also be obtained with substantially lower computational effort by performing and analyzing two appropriately generated series of samples with a much smaller number of Monte Carlo histories each. The method is applied along with the nuclear data uncertainty and sensitivity code package XSUSA in combination with the Monte Carlo transport code KENO-Va to various critical assemblies and a full-scale reactor calculation. It is shown that the proposed method yields output uncertainties and sensitivities equivalent to the traditional approach, with a reduction of computing time by factors of the order of 100. (authors)
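
The two-series idea can be illustrated with a toy model: if the same epistemic samples are run through two independent short Monte Carlo series, the aleatoric noise is uncorrelated between the series, so the cross-covariance isolates the epistemic variance even when the per-run statistical noise is larger than the epistemic spread. All numbers below are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: an output (think k_eff) depends on an uncertain
# nuclear-data parameter (epistemic), and each short Monte Carlo run adds
# independent statistical noise (aleatoric).
n_samples = 1000
epistemic = rng.normal(1.0, 0.005, n_samples)   # true epistemic spread: sigma = 500 pcm
noise_sigma = 0.010                             # larger aleatoric noise per short run

series_1 = epistemic + rng.normal(0.0, noise_sigma, n_samples)
series_2 = epistemic + rng.normal(0.0, noise_sigma, n_samples)

# Aleatoric noise is independent between the two series, so their
# cross-covariance estimates the epistemic variance alone.
epistemic_var_est = np.cov(series_1, series_2)[0, 1]
print(np.sqrt(epistemic_var_est))   # close to 0.005 despite the larger noise
```

A single series would instead estimate the sum of both variances, which is why simply reducing histories per run is not enough on its own.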

  15. Neural Spike-Train Analyses of the Speech-Based Envelope Power Spectrum Model

    Science.gov (United States)

    Rallapalli, Varsha H.

    2016-01-01

    Diagnosing and treating hearing impairment is challenging because people with similar degrees of sensorineural hearing loss (SNHL) often have different speech-recognition abilities. The speech-based envelope power spectrum model (sEPSM) has demonstrated that the signal-to-noise ratio in the envelope domain (SNRenv) from a modulation filter bank provides a robust speech-intelligibility measure across a wider range of degraded conditions than many long-standing models. In the sEPSM, noise (N) is assumed to: (a) reduce S + N envelope power by filling in dips within clean speech (S) and (b) introduce an envelope noise floor from intrinsic fluctuations in the noise itself. While the promise of SNRenv has been demonstrated for normal-hearing listeners, it has not been thoroughly extended to hearing-impaired listeners because of limited physiological knowledge of how SNHL affects speech-in-noise envelope coding relative to noise alone. Here, envelope coding to speech-in-noise stimuli was quantified from auditory-nerve model spike trains using shuffled correlograms, which were analyzed in the modulation-frequency domain to compute modulation-band estimates of neural SNRenv. Preliminary spike-train analyses show strong similarities to the sEPSM, demonstrating feasibility of neural SNRenv computations. Results suggest that individual differences can occur based on differential degrees of outer- and inner-hair-cell dysfunction in listeners currently diagnosed into the single audiological SNHL category. The predicted acoustic-SNR dependence in individual differences suggests that the SNR-dependent rate of susceptibility could be an important metric in diagnosing individual differences. Future measurements of the neural SNRenv in animal studies with various forms of SNHL will provide valuable insight for understanding individual differences in speech-in-noise intelligibility.
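
A minimal sketch of the modulation-band SNRenv computation described above, using a Hilbert envelope and Butterworth modulation filters in place of the auditory-nerve model and shuffled correlograms. Band edges, filter orders and the zero floor are assumptions for illustration:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def band_env_power(x, fs, f_lo, f_hi):
    """AC envelope power of signal x within one modulation band [f_lo, f_hi] Hz."""
    env = np.abs(hilbert(x))        # Hilbert envelope
    env = env - env.mean()          # keep only the modulation (AC) component
    sos = butter(2, [f_lo, f_hi], btype="band", fs=fs, output="sos")
    return np.mean(sosfiltfilt(sos, env) ** 2)

def snr_env(speech_in_noise, noise, fs, bands=((1, 2), (2, 4), (4, 8), (8, 16))):
    """Per-band envelope SNR in the spirit of the sEPSM:
    SNRenv = (Penv(S+N) - Penv(N)) / Penv(N), floored at zero."""
    out = []
    for f_lo, f_hi in bands:
        p_sn = band_env_power(speech_in_noise, fs, f_lo, f_hi)
        p_n = band_env_power(noise, fs, f_lo, f_hi)
        out.append(max(p_sn - p_n, 0.0) / p_n)
    return out
```

A carrier amplitude-modulated at 3 Hz, compared against the unmodulated carrier plus noise, yields its largest SNRenv in the 2-4 Hz band, mirroring how speech modulations stand out above the envelope noise floor.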

  16. Analysing the operative experience of basic surgical trainees in Ireland using a web-based logbook

    LENUS (Irish Health Repository)

    Lonergan, Peter E

    2011-09-25

    Abstract Background There is concern about the adequacy of operative exposure in surgical training programmes, in the context of changing work practices. We aimed to quantify the operative exposure of all trainees on the National Basic Surgical Training (BST) programme in Ireland and compare the results with arbitrary training targets. Methods Retrospective analysis of data obtained from a web-based logbook (http://www.elogbook.org) for all general surgery and orthopaedic training posts between July 2007 and June 2009. Results 104 trainees recorded 23,918 operations across two 6-month general surgery posts. The most common general surgery operation performed was simple skin excision, with trainees performing an average of 19.7 (± 9.9) over the 2-year training programme. Trainees most frequently assisted with cholecystectomy, with an average of 16.0 (± 11.0) per trainee. Comparison of trainee operative experience to arbitrary training targets found that 2-38% of trainees achieved the targets for 9 emergency index operations and 24-90% of trainees achieved the targets for 8 index elective operations. 72 trainees also completed a 6-month post in orthopaedics and recorded 7,551 operations. The most common orthopaedic operation that trainees performed was removal of metal, with an average of 2.90 (± 3.27) per trainee. The most common orthopaedic operation that trainees assisted with was total hip replacement, with an average of 10.46 (± 6.21) per trainee. Conclusions A centralised web-based logbook provides valuable data to analyse training programme performance. Analysis of logbooks raises concerns about operative experience at junior trainee level. The provision of adequate operative exposure for trainees should be a key performance indicator for training programmes.

  17. [Research on fast classification based on LIBS technology and principal component analyses].

    Science.gov (United States)

    Yu, Qi; Ma, Xiao-Hong; Wang, Rui; Zhao, Hua-Feng

    2014-11-01

    Laser-induced breakdown spectroscopy (LIBS) and principal component analysis (PCA) were combined to study the classification of aluminum alloys in the present article. Classification experiments were performed on thirteen standard aluminum alloy samples belonging to 4 different types, and the results suggested that the LIBS-PCA method can be used for fast classification of aluminum alloys. PCA was used to analyze the spectral data from the LIBS experiments: the three principal components contributing the most were identified, the principal component scores of the spectra were calculated, and the scores were plotted in three-dimensional coordinates. The spectrum sample points showed clear clustering according to the type of aluminum alloy they belong to. This result established the three principal components and a preliminary zoning of the aluminum alloy types. To verify its accuracy, 20 further aluminum alloy samples were subjected to the same experiments to test the type zoning. All of these spectrum sample points fell within the corresponding areas of their aluminum alloy types, confirming the correctness of the zoning derived from the standard samples; on this basis, alloys of unknown type can be identified. The experimental results showed that the accuracy of the principal component analysis method based on laser-induced breakdown spectroscopy is more than 97.14%, and that it can classify the different types effectively. Compared to commonly used chemical methods, laser-induced breakdown spectroscopy can analyse samples in situ and rapidly, with little sample preparation; therefore, combining LIBS and PCA in areas such as quality testing and on-line industrial control can save considerable time and cost, and greatly improve detection efficiency.
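
The LIBS-PCA pipeline described above, projection onto the three leading principal components followed by type zoning in score space, can be sketched with synthetic spectra standing in for the measured ones. All data and sizes here are illustrative, not the paper's:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# Hypothetical stand-in for LIBS spectra: each alloy type gets a characteristic
# pattern of emission-line intensities; repeated laser shots add noise.
n_channels = 500
templates, spectra, labels = [], [], []
for label in range(4):                        # 4 alloy types
    template = rng.random(n_channels)
    templates.append(template)
    for _ in range(30):                       # 30 shots per type
        spectra.append(template + 0.05 * rng.standard_normal(n_channels))
        labels.append(label)
spectra, labels = np.array(spectra), np.array(labels)

# Project onto the three leading principal components, then classify an
# unknown spectrum by its nearest class centroid ("type zone") in PC space.
pca = PCA(n_components=3)
scores = pca.fit_transform(spectra)
centroids = np.array([scores[labels == k].mean(axis=0) for k in range(4)])

def classify(spectrum):
    s = pca.transform(spectrum.reshape(1, -1))[0]
    return int(np.argmin(np.linalg.norm(centroids - s, axis=1)))
```

Nearest-centroid zoning is only one way to draw the type regions; the paper's zoning in three-dimensional score coordinates is conceptually the same step.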

  18. Airway management education: simulation based training versus non-simulation based training-A systematic review and meta-analyses.

    Science.gov (United States)

    Sun, Yanxia; Pan, Chuxiong; Li, Tianzuo; Gan, Tong J

    2017-02-01

    Simulation-based training (SBT) has become a standard for medical education. However, the efficacy of SBT in airway management education remains unclear. The aim of this study was to evaluate all published evidence comparing the effectiveness of SBT for airway management versus non-simulation-based training (NSBT) on learner and patient outcomes. Systematic review with meta-analyses were used. Data were derived from PubMed, EMBASE, CINAHL, Scopus, the Cochrane Controlled Trials Register and Cochrane Database of Systematic Reviews from inception to May 2016. Published comparative trials that evaluated the effect of SBT on airway management training compared with NSBT were considered. The effect sizes with 95% confidence intervals (CI) were calculated for outcome measures. Seventeen eligible studies were included. SBT was associated with improved behavior performance [standardized mean difference (SMD): 0.30, 95% CI: 0.06 to 0.54] in comparison with NSBT. However, the benefits of SBT were not seen in time-skill (SMD: -0.13, 95% CI: -0.82 to 0.52), written examination score (SMD: 0.39, 95% CI: -0.09 to 0.86) and success rate of procedure completion on patients [relative risk (RR): 1.26, 95% CI: 0.96 to 1.66]. SBT may not be superior to NSBT for airway management training.
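
The pooled effect sizes quoted above are standardized mean differences with confidence intervals. The sketch below shows one common way such numbers are produced, Hedges' g per study with fixed-effect inverse-variance pooling; this is a generic illustration, not necessarily the authors' exact procedure:

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference (Hedges' g) and its variance for one
    study comparing group 1 (e.g. SBT) against group 2 (e.g. NSBT)."""
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    j = 1 - 3 / (4 * (n1 + n2) - 9)          # small-sample correction factor
    g = j * d
    var_g = j**2 * ((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return g, var_g

def pooled_smd(studies):
    """Fixed-effect inverse-variance pooling; returns (SMD, 95% CI)."""
    gs_vars = [hedges_g(*s) for s in studies]
    weights = [1 / v for _, v in gs_vars]
    smd = sum(w * g for (g, _), w in zip(gs_vars, weights)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    return smd, (smd - 1.96 * se, smd + 1.96 * se)
```

A random-effects model (e.g. DerSimonian-Laird) would widen the interval when between-study heterogeneity is present, which meta-analyses of training interventions often assume.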

  19. Comparison based on energy and exergy analyses of the potential cogeneration efficiencies for fuel cells and other electricity generation devices

    Energy Technology Data Exchange (ETDEWEB)

    Rosen, M A [Ryerson Polytechnical Inst., Toronto, (CA). Dept. of Mechanical Engineering

    1990-01-01

    Comparisons of the potential cogeneration efficiencies are made, based on energy and exergy analyses, for several devices for electricity generation. The investigation considers several types of fuel cell system (Phosphoric Acid, Alkaline, Solid Polymer Electrolyte, Molten Carbonate and Solid Oxide), and several fossil-fuel and nuclear cogeneration systems based on steam power plants. In the analysis, each system is modelled as a device into which fuel and air enter, and from which electrical- and thermal-energy products and material and thermal-energy wastes exit. The results for all systems considered indicate that exergy analyses should be used when analysing the cogeneration potential of systems for electricity generation, because they weigh the usefulness of heat and electricity on equivalent bases. Energy analyses tend to present overly optimistic views of performance. These findings are particularly significant when large fractions of the heat output from a system are utilized for cogeneration. (author).
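
The core distinction the abstract draws can be made concrete: energy analysis counts heat and electricity equally, while exergy analysis discounts heat by the Carnot factor. A minimal sketch, treating fuel exergy as equal to fuel energy (a common simplification, and an assumption here):

```python
def cogen_efficiencies(w_elec, q_heat, fuel_energy, t_supply, t0=298.15):
    """Energy vs exergy cogeneration efficiency for a device producing
    electricity w_elec and useful heat q_heat at temperature t_supply (K)
    from fuel input fuel_energy (all in consistent units). The exergy of
    heat is weighted by the Carnot factor 1 - T0/T; electricity is pure exergy."""
    energy_eff = (w_elec + q_heat) / fuel_energy
    exergy_eff = (w_elec + q_heat * (1 - t0 / t_supply)) / fuel_energy
    return energy_eff, exergy_eff
```

For heat delivered at 350 K, the Carnot factor is only about 0.15, so a system that looks 80% efficient on an energy basis can be under 40% efficient on an exergy basis, which is the optimism the abstract warns about.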

  20. Teleseism-based Relative Time Corrections for Modern Analyses of Digitized Analog Seismograms

    Science.gov (United States)

    Lee, T. A.; Ishii, M.

    2017-12-01

    With modern-day instruments and seismic networks timed by GPS systems, synchronization of data streams is all but a foregone conclusion. However, during the analog era, when each station had its own clock, comparing data timing from different stations was a far more daunting prospect. Today, with recently developed methods by which analog data can be digitized, having the ability to accurately reconcile the timings of two separate stations would open decades' worth of data to modern analyses. For example, one possible and exciting application would be using noise interferometry with digitized analog data in order to investigate changing structural features (on a volcano, for example) over a much longer timescale than was previously possible. With this in mind, we introduce a new approach to sync time between stations based on teleseismic arrivals. P-wave arrivals are identified at stations for pairs of earthquakes from the digital and analog eras that have nearly identical distances, locations, and depths. Assuming accurate timing of the modern data, relative time corrections between a pair of stations can then be inferred for the analog data. This method for time correction depends upon the analog stations having modern equivalents, and both having sufficiently long durations of operation to allow for recording of usable teleseismic events. The Hawaii Volcano Observatory (HVO) network is an especially ideal environment for this, as it not only has a large and well-preserved collection of analog seismograms, but also has a long operating history (1912 - present) with many of the older stations having modern equivalents. As such, the scope of this project is to calculate and apply relative time corrections to analog data from two HVO stations, HILB (1919-present) and UWE (1928-present) (HILB now part of the Pacific Tsunami network). 
Further application of this method could be for investigation of the effects of relative clock-drift, that is, the determining factor for how
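
The relative time correction described above amounts to estimating the lag between two recordings of the same (or a nearly identical) teleseismic arrival. A minimal cross-correlation sketch with a synthetic Ricker-like pulse; the authors' actual phase-picking procedure may differ:

```python
import numpy as np

def clock_offset(trace_ref, trace_drifted, fs):
    """Estimate the relative time shift (s) between two recordings of the
    same teleseismic P wave by full cross-correlation. A positive value
    means trace_drifted lags trace_ref."""
    corr = np.correlate(trace_drifted, trace_ref, mode="full")
    lag = np.argmax(corr) - (len(trace_ref) - 1)
    return lag / fs

# Synthetic check: the same pulse recorded twice, once with a clock 1.5 s behind
fs = 100.0
t = np.arange(0, 30, 1 / fs)
pulse = lambda tt, t0: (1 - 2 * (np.pi * (tt - t0))**2) * np.exp(-(np.pi * (tt - t0))**2)
rec_a = pulse(t, 10.0)
rec_b = pulse(t, 11.5)
print(clock_offset(rec_a, rec_b, fs))   # → 1.5
```

Applied to an analog-era trace against a GPS-timed modern equivalent of the same event pair, the recovered lag is the relative clock correction to apply to the digitized record.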

  1. Molecular Characterization of Five Potyviruses Infecting Korean Sweet Potatoes Based on Analyses of Complete Genome Sequences

    Directory of Open Access Journals (Sweden)

    Hae-Ryun Kwak

    2015-12-01

    Full Text Available Sweet potatoes (Ipomoea batatas L.) are grown extensively in tropical and temperate regions, and are important food crops worldwide. In Korea, potyviruses, including Sweet potato feathery mottle virus (SPFMV), Sweet potato virus C (SPVC), Sweet potato virus G (SPVG), Sweet potato virus 2 (SPV2), and Sweet potato latent virus (SPLV), have been detected in sweet potato fields at a high (~95% incidence). In the present work, complete genome sequences of 18 isolates, representing the five potyviruses mentioned above, were compared with previously reported genome sequences. The complete genomes consisted of 10,081 to 10,830 nucleotides, excluding the poly-A tails. Their genomic organizations were typical of the Potyvirus genus, including one large open reading frame coding for a putative polyprotein. Based on phylogenetic analyses and sequence comparisons, the Korean SPFMV isolates belonged to the strains RC and O with >98% nucleotide sequence identity. Korean SPVC isolates had 99% identity to the Japanese isolate SPVC-Bungo and 70% identity to the SPFMV isolates. The Korean SPVG isolates showed 99% identity to the three previously reported SPVG isolates. Korean SPV2 isolates had 97% identity to the SPV2 GWB-2 isolate from the USA. Korean SPLV isolates had a relatively low (88% nucleotide sequence identity with the Taiwanese SPLV-TW isolates, and they were phylogenetically distantly related to SPFMV isolates. Recombination analysis revealed that possible recombination events occurred in the P1, HC-Pro and NIa-NIb regions of SPFMV and SPLV isolates and these regions were identified as hotspots for recombination in the sweet potato potyviruses.
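
The nucleotide-identity percentages quoted above come from pairwise comparison of aligned sequences. A minimal sketch of that computation, assuming the alignment has already been produced (e.g. by a multiple-alignment tool):

```python
def percent_identity(seq_a, seq_b):
    """Per cent nucleotide identity over aligned positions, skipping
    gap-gap columns; assumes the two sequences are already aligned
    and of equal length."""
    pairs = [(a, b) for a, b in zip(seq_a.upper(), seq_b.upper())
             if not (a == "-" and b == "-")]
    matches = sum(a == b for a, b in pairs)
    return 100.0 * matches / len(pairs)
```

Real pipelines differ in how they treat single gaps and ambiguity codes, so reported identities (e.g. the 88% SPLV figure) depend on those conventions as well as on the alignment itself.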

  2. Optimization of a Centrifugal Boiler Circulating Pump's Casing Based on CFD and FEM Analyses

    Directory of Open Access Journals (Sweden)

    Zhigang Zuo

    2014-04-01

    Full Text Available It is important to evaluate the economic efficiency of boiler circulating pumps in the manufacturing process from the manufacturers' point of view. The possibility of optimizing the pump casing with respect to structural pressure integrity and hydraulic performance was discussed. CFD analyses of pump models with different pump casing sizes were first carried out for the hydraulic performance evaluation. The effects of the working temperature and the sealing ring on the hydraulic efficiency were discussed. A model with casing diameter of 0.875D40 was selected for further analyses. FEM analyses were then carried out on different combinations of casing sizes, casing wall thickness, and materials, to evaluate its safety related to pressure integrity, with respect to both static and fatigue strength analyses. Two models with forging and cast materials were selected as final results.

  3. Simultaneous acquisition of dual analyser-based phase contrast X-ray images for small animal imaging

    International Nuclear Information System (INIS)

    Kitchen, Marcus J.; Pavlov, Konstantin M.; Hooper, Stuart B.; Vine, David J.; Siu, Karen K.W.; Wallace, Megan J.; Siew, Melissa L.L.; Yagi, Naoto; Uesugi, Kentaro; Lewis, Rob A.

    2008-01-01

    Analyser-based phase contrast X-ray imaging can provide high-contrast images of biological tissues with exquisite sensitivity to the boundaries between tissues. The phase and absorption information can be extracted by processing multiple images acquired at different analyser orientations. Recording both the transmitted and diffracted beams from a thin Laue analyser crystal can make phase retrieval possible for dynamic systems by allowing full field imaging. This technique was used to image the thorax of a mechanically ventilated newborn rabbit pup using a 25 keV beam from the SPring-8 synchrotron radiation facility. The diffracted image was produced from the (1 1 1) planes of a 50 mm x 40 mm, 100 μm thick Si analyser crystal in the Laue geometry. The beam and analyser were large enough to image the entire chest, making it possible to observe changes in anatomy with high contrast and spatial resolution

  4. Simultaneous acquisition of dual analyser-based phase contrast X-ray images for small animal imaging

    Energy Technology Data Exchange (ETDEWEB)

    Kitchen, Marcus J. [School of Physics, Monash University, Victoria 3800 (Australia)], E-mail: Marcus.Kitchen@sci.monash.edu.au; Pavlov, Konstantin M. [School of Physics, Monash University, Victoria 3800 (Australia); Monash Centre for Synchrotron Science, Monash University, Victoria 3800 (Australia); Physics and Electronics, School of Science and Technology, University of New England, NSW 2351 (Australia)], E-mail: Konstantin.Pavlov@sci.monash.edu.au; Hooper, Stuart B. [Department of Physiology, Monash University, Victoria 3800 (Australia)], E-mail: Stuart.Hooper@med.monash.edu.au; Vine, David J. [School of Physics, Monash University, Victoria 3800 (Australia)], E-mail: David.Vine@sci.monash.edu.au; Siu, Karen K.W. [School of Physics, Monash University, Victoria 3800 (Australia); Monash Centre for Synchrotron Science, Monash University, Victoria 3800 (Australia)], E-mail: Karen.Siu@sci.monash.edu.au; Wallace, Megan J. [Department of Physiology, Monash University, Victoria 3800 (Australia)], E-mail: Megan.Wallace@med.monash.edu.au; Siew, Melissa L.L. [Department of Physiology, Monash University, Victoria 3800 (Australia)], E-mail: Melissa.Siew@med.monash.edu.au; Yagi, Naoto [SPring-8/JASRI, Sayo (Japan)], E-mail: yagi@spring8.or.jp; Uesugi, Kentaro [SPring-8/JASRI, Sayo (Japan)], E-mail: ueken@spring8.or.jp; Lewis, Rob A. [School of Physics, Monash University, Victoria 3800 (Australia); Monash Centre for Synchrotron Science, Monash University, Victoria 3800 (Australia)], E-mail: Rob.Lewis@sync.monash.edu.au

    2008-12-15

    Analyser-based phase contrast X-ray imaging can provide high-contrast images of biological tissues with exquisite sensitivity to the boundaries between tissues. The phase and absorption information can be extracted by processing multiple images acquired at different analyser orientations. Recording both the transmitted and diffracted beams from a thin Laue analyser crystal can make phase retrieval possible for dynamic systems by allowing full field imaging. This technique was used to image the thorax of a mechanically ventilated newborn rabbit pup using a 25 keV beam from the SPring-8 synchrotron radiation facility. The diffracted image was produced from the (1 1 1) planes of a 50 mm x 40 mm, 100 {mu}m thick Si analyser crystal in the Laue geometry. The beam and analyser were large enough to image the entire chest, making it possible to observe changes in anatomy with high contrast and spatial resolution.

  5. SeeSway - A free web-based system for analysing and exploring standing balance data.

    Science.gov (United States)

    Clark, Ross A; Pua, Yong-Hao

    2018-06-01

    Computerised posturography can be used to assess standing balance, and can predict poor functional outcomes in many clinical populations. A key limitation is the disparate signal filtering and analysis techniques, with many methods requiring custom computer programs. This paper discusses the creation of a freely available web-based software program, SeeSway (www.rehabtools.org/seesway), which was designed to provide powerful tools for pre-processing, analysing and visualising standing balance data in an easy-to-use and platform-independent website. SeeSway links an interactive web platform with file upload capability to software systems including LabVIEW, Matlab, Python and R to perform the data filtering, analysis and visualisation of standing balance data. Input data can consist of any signal that comprises an anterior-posterior and medial-lateral coordinate trace, such as center of pressure or center of mass displacement. This allows it to be used with systems including criterion-reference commercial force platforms and three-dimensional motion analysis, smartphones, accelerometers and low-cost technology such as the Nintendo Wii Balance Board and Microsoft Kinect. Filtering options include Butterworth, weighted and unweighted moving average, and discrete wavelet transforms. Analysis methods include standard techniques such as path length, amplitude, and root mean square in addition to less common but potentially promising methods such as sample entropy, detrended fluctuation analysis and multiresolution wavelet analysis. These data are visualised using scalograms, which chart the change in frequency content over time, scatterplots and standard line charts. This provides the user with a detailed understanding of their results, and how their different pre-processing and analysis method selections affect their findings. An example of the data analysis techniques is provided in the paper, with graphical representation of how advanced analysis methods can better discriminate
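
A few of the analysis methods listed above, low-pass Butterworth filtering followed by path length and RMS amplitude, can be sketched directly. The cutoff frequency and filter order below are assumptions for illustration, not SeeSway's defaults:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def sway_metrics(ap, ml, fs, cutoff=10.0):
    """Classic posturography summary measures from anterior-posterior (ap)
    and medial-lateral (ml) center-of-pressure traces sampled at fs Hz:
    zero-phase low-pass Butterworth filtering, then total path length
    and RMS amplitude per axis."""
    sos = butter(4, cutoff, btype="low", fs=fs, output="sos")
    ap_f, ml_f = sosfiltfilt(sos, ap), sosfiltfilt(sos, ml)
    path_length = np.sum(np.hypot(np.diff(ap_f), np.diff(ml_f)))
    rms_ap = np.sqrt(np.mean((ap_f - ap_f.mean()) ** 2))
    rms_ml = np.sqrt(np.mean((ml_f - ml_f.mean()) ** 2))
    return path_length, rms_ap, rms_ml
```

The zero-phase filtering (sosfiltfilt) matters here: a causal filter would shift the trace in time and bias path-length comparisons between filter settings.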

  6. New insights into survival trend analyses in cancer population-based studies: the SUDCAN methodology.

    Science.gov (United States)

    Uhry, Zoé; Bossard, Nadine; Remontet, Laurent; Iwaz, Jean; Roche, Laurent

    2017-01-01

    The main objective of the SUDCAN study was to compare, for 15 cancer sites, the trends in net survival and excess mortality rates from cancer 5 years after diagnosis between six European Latin countries (Belgium, France, Italy, Portugal, Spain and Switzerland). The data were extracted from the EUROCARE-5 database. The study period ranged from 6 (Portugal, 2000-2005) to 18 years (Switzerland, 1989-2007). Trend analyses were carried out separately for each country and cancer site; the number of cases ranged from 1500 to 104 000. We developed an original flexible excess rate modelling strategy that accounts for the continuous effects of age, year of diagnosis, time since diagnosis and their interactions. Nineteen models were constructed; they differed in the modelling of the effect of the year of diagnosis in terms of linearity, proportionality and interaction with age. The final model was chosen according to the Akaike Information Criterion. The fit was assessed graphically by comparing model estimates versus nonparametric (Pohar-Perme) net survival estimates. Out of the 90 analyses carried out, the effect of the year of diagnosis on the excess mortality rate depended on age in 61 and was nonproportional in 64; it was nonlinear in 27 out of the 75 analyses where this effect was considered. The model fit was overall satisfactory. We successfully analysed 15 cancer sites in six countries. The refined methodology proved necessary for detailed trend analyses. It is hoped that three-dimensional parametric modelling will be used more widely in net survival trend studies as it has major advantages over stratified analyses.
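
The AIC-based choice among candidate model forms can be illustrated on synthetic data. The sketch below compares polynomial fits by the Gaussian least-squares AIC, a deliberately simple stand-in for the paper's nineteen flexible excess-rate models:

```python
import numpy as np

def aic_gaussian(rss, n, k):
    """AIC for a least-squares fit with Gaussian errors:
    n * ln(RSS/n) + 2k, where k counts estimated parameters."""
    return n * np.log(rss / n) + 2 * k

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 200)
y = 1.0 + 2.0 * x - 1.5 * x**2 + rng.normal(0, 0.05, x.size)  # truth: quadratic

# Candidate models of increasing flexibility (degree 1 to 5), mirroring the
# strategy of comparing nested model forms and keeping the lowest AIC.
aics = []
for degree in range(1, 6):
    coeffs = np.polyfit(x, y, degree)
    rss = np.sum((np.polyval(coeffs, x) - y) ** 2)
    aics.append(aic_gaussian(rss, x.size, degree + 1))
best_degree = 1 + int(np.argmin(aics))
print(best_degree)
```

The underfitting linear model is heavily penalised through its residual sum of squares, while the 2k term discourages the over-flexible fits, the same trade-off the AIC arbitrates among the excess-rate models.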

  7. A Review of Sparsity-Based Methods for Analysing Radar Returns from Helicopter Rotor Blades

    Science.gov (United States)

    2016-09-01

    performance study of these algorithms in the particular problem of analysing backscatter signals from rotating blades. The report is organised as follows...provide further insight into the behaviour of the techniques. Here, the algorithms for MP, OMP, CGP, gOMP and ROMP terminate when 10 atoms are

  8. Conformational determination of [Leu]enkephalin based on theoretical and experimental VA and VCD spectral analyses

    DEFF Research Database (Denmark)

    Abdali, Salim; Jalkanen, Karl J.; Cao, X.

    2004-01-01

    Conformational determination of [Leu]enkephalin in DMSO-d6 is carried out using VA and VCD spectral analyses. Conformational energies, vibrational frequencies and VA and VCD intensities are calculated using DFT at B3LYP/6-31G* level of theory. Comparison between the measured spectra...

  9. Molecular systematics of Indian Alysicarpus (Fabaceae) based on analyses of nuclear ribosomal DNA sequences.

    Science.gov (United States)

    Gholami, Akram; Subramaniam, Shweta; Geeta, R; Pandey, Arun K

    2017-06-01

    Alysicarpus Necker ex Desvaux (Fabaceae, Desmodieae) consists of ~30 species that are distributed in tropical and subtropical regions of the world. In India, the genus is represented by ca. 18 species, of which seven are endemic. Sequences of the nuclear internal transcribed spacer (ITS) from 38 accessions representing 16 Indian species were subjected to phylogenetic analyses. The ITS sequence data strongly support the monophyly of the genus Alysicarpus. Analyses revealed four major well-supported clades within Alysicarpus. Ancestral state reconstructions were done for two morphological characters, namely calyx length in relation to pod (macrocalyx and microcalyx) and pod surface ornamentation (transversely rugose and nonrugose). The present study is the first report on molecular systematics of Indian Alysicarpus.

  10. Performance analyses of naval ships based on engineering level of simulation at the initial design stage

    Directory of Open Access Journals (Sweden)

    Dong-Hoon Jeong

    2017-07-01

    Full Text Available Naval ships are assigned many and varied missions. Their performance is critical for mission success, and depends on the specifications of the components. This is why performance analyses of naval ships are required at the initial design stage. Since the design and construction of naval ships take a very long time and incur a huge cost, Modeling and Simulation (M&S) is an effective method for performance analyses. Thus in this study, a simulation core is proposed to analyze the performance of naval ships considering their specifications. This simulation core can perform the engineering level of simulations, considering the mathematical models for naval ships, such as maneuvering equations and passive sonar equations. Also, the simulation models of the simulation core follow the Discrete EVent system Specification (DEVS) and Discrete Time System Specification (DTSS) formalisms, so that simulations can progress over discrete events and discrete times. In addition, applying the DEVS and DTSS formalisms makes the structure of the simulation models flexible and reusable. To verify its applicability, the simulation core was applied to simulations for the performance analyses of a submarine in an Anti-SUrface Warfare (ASUW) mission. These simulations comprised two scenarios. The first scenario, submarine diving, involved maneuvering performance analysis by analyzing the pitch angle variation and depth variation of the submarine over time. The second scenario, submarine detection, involved detection performance analysis by analyzing how well the sonar of the submarine resolves adjacent targets. The results of these simulations confirm that the simulation core of this study can be applied to the performance analyses of naval ships considering their specifications.
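
The event-driven character of DEVS-style simulation, where the clock jumps from one scheduled event to the next rather than advancing in fixed steps, can be illustrated with a minimal discrete-event engine. This is a generic sketch, not the paper's simulation core, and the sonar-ping example is purely illustrative:

```python
import heapq

class DiscreteEventSimulator:
    """Minimal discrete-event engine in the spirit of DEVS: the clock jumps
    directly to the time of the next scheduled event."""
    def __init__(self):
        self.now = 0.0
        self._queue = []
        self._seq = 0                      # tie-breaker for same-time events
        self.log = []

    def schedule(self, delay, name, action=None):
        heapq.heappush(self._queue, (self.now + delay, self._seq, name, action))
        self._seq += 1

    def run(self, until=float("inf")):
        while self._queue and self._queue[0][0] <= until:
            self.now, _, name, action = heapq.heappop(self._queue)
            self.log.append((self.now, name))
            if action:
                action(self)

# Hypothetical use: a submarine pings its sonar every 5 s while diving.
sim = DiscreteEventSimulator()
def ping(s):
    if s.now < 20:
        s.schedule(5.0, "ping", ping)
sim.schedule(0.0, "ping", ping)
sim.run(until=60)
print(sim.log)   # pings at t = 0, 5, 10, 15, 20
```

A DTSS component would instead be stepped at a fixed interval (e.g. integrating the maneuvering equations), with the two formalisms coupled through shared event times.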

  11. Regional analyses of labor markets and demography: a model based Norwegian example.

    Science.gov (United States)

    Stambol, L S; Stolen, N M; Avitsland, T

    1998-01-01

    The authors discuss the regional REGARD model, developed by Statistics Norway to analyze the regional implications of macroeconomic development of employment, labor force, and unemployment. "In building the model, empirical analyses of regional producer behavior in manufacturing industries have been performed, and the relation between labor market development and regional migration has been investigated. Apart from providing a short description of the REGARD model, this article demonstrates the functioning of the model, and presents some results of an application." excerpt

  12. Optimization of a Centrifugal Boiler Circulating Pump's Casing Based on CFD and FEM Analyses

    OpenAIRE

    Zhigang Zuo; Shuhong Liu; Yizhang Fan; Yulin Wu

    2014-01-01

    It is important to evaluate the economic efficiency of boiler circulating pumps in manufacturing process from the manufacturers' point of view. The possibility of optimizing the pump casing with respect to structural pressure integrity and hydraulic performance was discussed. CFD analyses of pump models with different pump casing sizes were firstly carried out for the hydraulic performance evaluation. The effects of the working temperature and the sealing ring on the hydraulic efficiency were...

  13. Improving Control Room Design and Operations Based on Human Factors Analyses, or How Much Human Factors Upgrade is Enough?

    Energy Technology Data Exchange (ETDEWEB)

    Higgins, J.C.; O'Hara, J.M.; Almeida, P.

    2002-09-19

    The Jose Cabrera nuclear power plant is a one-loop Westinghouse pressurized water reactor. In the control room, the displays and controls used by operators for the emergency operating procedures are distributed on front and back panels. This configuration contributed to risk in the probabilistic safety assessment where important operator actions are required. This study was undertaken to evaluate the impact of the design on crew performance and plant safety and to develop design improvements. Five potential effects were identified. Then NUREG-0711 [1] programmatic human factors analyses were conducted to systematically evaluate the control room layout to determine if there was evidence of the potential effects. These analyses included operating experience review, PSA review, task analyses, and walkthrough simulations. Based on the results of these analyses, a variety of control room modifications were identified. From the alternatives, a selection was made that provided a reasonable balance between performance, risk and economics, and modifications were made to the plant.

  14. Atmospheric radiation environment analyses based-on CCD camera at various mountain altitudes and underground sites

    Directory of Open Access Journals (Sweden)

    Li Cavoli Pierre

    2016-01-01

    The purpose of this paper is to discriminate secondary atmospheric particles and identify muons by measuring the natural radiative environment at atmospheric and underground locations. A CCD camera has been used as a cosmic-ray sensor. The Low Noise Underground Laboratory of Rustrel (LSBB, France) gives access to a unique low-noise scientific environment deep enough to ensure screening from the neutron and proton radiative components. Analyses of the charge levels in pixels of the CCD camera induced by radiation events, and cartographies of the charge events versus the hit pixel, are proposed.

  15. Simple Crosscutting Concerns Are Not So Simple : Analysing Variability in Large-Scale Idioms-Based Implementations

    NARCIS (Netherlands)

    Bruntink, M.; Van Deursen, A.; d’Hondt, M.; Tourwé, T.

    2007-01-01

    This paper describes a method for studying idioms-based implementations of crosscutting concerns, and our experiences with it in the context of a real-world, large-scale embedded software system. In particular, we analyse a seemingly simple concern, tracing, and show that it exhibits significant

  16. Systematics of Plant-Pathogenic and Related Streptomyces Species Based on Phylogenetic Analyses of Multiple Gene Loci

    Science.gov (United States)

    The 10 species of Streptomyces implicated as the etiological agents in scab disease of potatoes or soft rot disease of sweet potatoes are distributed among 7 different phylogenetic clades in analyses based on 16S rRNA gene sequences, but high sequence similarity of this gene among Streptomyces speci...

  17. Identification among morphologically similar Argyreia (Convolvulaceae) based on leaf anatomy and phenetic analyses.

    Science.gov (United States)

    Traiperm, Paweena; Chow, Janene; Nopun, Possathorn; Staples, G; Swangpol, Sasivimon C

    2017-12-01

    The genus Argyreia Lour. is one of the species-rich Asian genera in the family Convolvulaceae. Several species complexes were recognized in which taxon delimitation was imprecise, especially when examining herbarium materials without fully developed open flowers. The main goal of this study is to investigate and describe leaf anatomy for some morphologically similar Argyreia using epidermal peeling, leaf and petiole transverse sections, and scanning electron microscopy. Phenetic analyses including cluster analysis and principal component analysis were used to investigate the similarity of these morpho-types. Anatomical differences observed between the morpho-types include epidermal cell walls and the trichome types on the leaf epidermis. Additional differences in the leaf and petiole transverse sections include the epidermal cell shape of the adaxial leaf blade, the leaf margins, and the petiole transverse sectional outline. The phenogram from cluster analysis using the UPGMA method represented four groups with an R value of 0.87. Moreover, the important quantitative and qualitative leaf anatomical traits of the four groups were confirmed by the principal component analysis of the first two components. The results from phenetic analyses confirmed the anatomical differentiation between the morpho-types. Leaf anatomical features regarded as particularly informative for morpho-type differentiation can be used to supplement macro morphological identification.

  18. Activity Based Learning in a Freshman Global Business Course: Analyses of Preferences and Demographic Differences

    Science.gov (United States)

    Levine, Mark F.; Guy, Paul W.

    2007-01-01

    The present study investigates pre-business students' reaction to Activity Based Learning in a lower division core required course entitled Introduction to Global Business in the business curriculum at California State University Chico. The study investigates students' preference for Activity Based Learning in comparison to a more traditional…

  19. Variability Abstractions: Trading Precision for Speed in Family-Based Analyses

    DEFF Research Database (Denmark)

    Dimovski, Aleksandar; Brabrand, Claus; Wasowski, Andrzej

    2015-01-01

    Family-based (lifted) data-flow analysis for Software Product Lines (SPLs) is capable of analyzing all valid products (variants) without generating any of them explicitly. It takes as input only the common code base, which encodes all variants of a SPL, and produces analysis results corresponding...

  20. Novel citation-based search method for scientific literature: application to meta-analyses

    NARCIS (Netherlands)

    Janssens, A.C.J.W.; Gwinn, M.

    2015-01-01

    Background: Finding eligible studies for meta-analysis and systematic reviews relies on keyword-based searching as the gold standard, despite its inefficiency. Searching based on direct citations is not sufficiently comprehensive. We propose a novel strategy that ranks articles on their degree of

  1. Analyses of integrated aircraft cabin contaminant monitoring network based on Kalman consensus filter.

    Science.gov (United States)

    Wang, Rui; Li, Yanxiao; Sun, Hui; Chen, Zengqiang

    2017-11-01

    Modern civil aircraft use air-ventilated pressurized cabins subject to limited space. In order to monitor multiple contaminants and overcome the hypersensitivity of a single sensor, this paper constructs an output-corrected integrated sensor configuration using sensors with different measurement principles, after comparing it with two other configurations. The proposed configuration works as a node in a distributed wireless sensor network for contaminant monitoring. Corresponding measurement error models of the integrated sensors are also proposed, using the Kalman consensus filter to estimate states and perform data fusion in order to regulate the single-sensor measurement results. The paper proves the stability of the Kalman consensus filter when the system and observation noises are considered, and compares the mean estimation and mean consensus errors of the Kalman consensus filter against a local Kalman filter. Numerical examples show the effectiveness of the algorithm. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
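The filtering scheme the abstract describes can be sketched in a simplified scalar form (an illustrative toy, not the paper's algorithm or noise model): each sensor node performs a local Kalman measurement update, then adds a consensus term pulling its estimate toward those of its network neighbors.

```python
import random

def kalman_consensus_step(estimates, variances, measurements, meas_var,
                          neighbors, eps=0.2):
    """One synchronous step of a scalar Kalman-consensus sketch."""
    n = len(estimates)
    new_est, new_var = [0.0] * n, [0.0] * n
    for i in range(n):
        # local Kalman measurement update
        K = variances[i] / (variances[i] + meas_var)
        x = estimates[i] + K * (measurements[i] - estimates[i])
        new_var[i] = (1 - K) * variances[i]
        # consensus correction toward neighbors' previous estimates
        x += eps * sum(estimates[j] - estimates[i] for j in neighbors[i])
        new_est[i] = x
    return new_est, new_var

random.seed(0)
truth = 5.0                                # true contaminant level (arbitrary units)
neighbors = {0: [1], 1: [0, 2], 2: [1]}    # a line graph of 3 sensor nodes
est, var = [0.0, 10.0, -3.0], [5.0, 5.0, 5.0]
for _ in range(50):
    meas = [truth + random.gauss(0, 0.5) for _ in range(3)]
    est, var = kalman_consensus_step(est, var, meas, meas_var=0.25,
                                     neighbors=neighbors)
```

After a few dozen steps the three node estimates agree with each other and sit near the true value, which is the qualitative behavior the consensus term is meant to provide.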

  2. Toxicity testing and chemical analyses of recycled fibre-based paper for food contact

    DEFF Research Database (Denmark)

    Binderup, Mona-Lise; Pedersen, Gitte Alsing; Vinggaard, Anne

    2002-01-01

    of different qualities as food-contact materials and to Perform a preliminary evaluation of their suitability from a safety point of view, and, second, to evaluate the use of different in vitro toxicity tests for screening of paper and board. Paper produced from three different categories of recycled fibres (B...... of the paper products were extracted with either 99% ethanol or water. Potential migrants in the extracts were identified and semiquantified by GC-1R-MS or GC-HRMS. In parallel to the chemical analyses, a battery of four different in vitro toxicity tests with different endpoints were applied to the same...... was less cytotoxic than the extracts prepared from paper made from recycled fibres, and extracts prepared from C was the most cytotoxic. None of the extracts showed mutagenic activity No conclusion about the oestrogenic activity could be made, because all extracts were cytotoxic to the test organism (yeast...

  3. Aroma profile of Garnacha Tintorera-based sweet wines by chromatographic and sensorial analyses.

    Science.gov (United States)

    Noguerol-Pato, R; González-Álvarez, M; González-Barreiro, C; Cancho-Grande, B; Simal-Gándara, J

    2012-10-15

    The aroma profiles of three Garnacha Tintorera-based wines were studied: a base wine, a naturally sweet wine, and a mixture of the naturally sweet wine with another sweet wine obtained by fortification with spirits. The aroma fingerprint was traced by GC-MS analysis of volatile compounds and by sensorial analysis of odours and tastes. Among the volatile compounds, sotolon (73 μg/L) and acetoin (122 μg/L) were the two main compounds found in the naturally sweet wine. With regard to the odorant series, the most dominant for the Garnacha Tintorera base wine were floral, fruity and spicy, whereas the series most markedly affected by off-vine drying of the grapes were floral, caramelized and vegetal-wood. Finally, the odorant series affected by switching off alcoholic fermentation with ethanol 96% (v/v) fit for human consumption, followed by oak barrel aging, were caramelized and vegetal-wood. A partial least squares analysis (PLS-2) was used to detect correlations between sets of sensory data (those obtained with mouth and nose) with the ultimate aim of improving our current understanding of the flavour of Garnacha Tintorera red wines, both base and sweet. Based on the sensory dataset analysis, the descriptors with the highest weight for separating base and sweet wines from Garnacha Tintorera were sweetness, dried fruit and caramel (for sweet wines) vs. bitterness, astringency and geranium (for base wines). Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Evaluating the Accuracy of Results for Teacher Implemented Trial-Based Functional Analyses.

    Science.gov (United States)

    Rispoli, Mandy; Ninci, Jennifer; Burke, Mack D; Zaini, Samar; Hatton, Heather; Sanchez, Lisa

    2015-09-01

    Trial-based functional analysis (TBFA) allows for the systematic and experimental assessment of challenging behavior in applied settings. The purpose of this study was to evaluate a professional development package focused on training three Head Start teachers to conduct TBFAs with fidelity during ongoing classroom routines. To assess the accuracy of the TBFA results, the effects of a function-based intervention derived from the TBFA were compared with the effects of a non-function-based intervention. Data were collected on child challenging behavior and appropriate communication. An A-B-A-C-D design was utilized in which A represented baseline, B and C consisted of either function-based or non-function-based interventions counterbalanced across participants, and D represented teacher implementation of the most effective intervention. Results showed that the function-based intervention produced greater decreases in challenging behavior and greater increases in appropriate communication than the non-function-based intervention for all three children. © The Author(s) 2015.

  5. Analysing Test-Takers’ Views on a Computer-Based Speaking Test

    Directory of Open Access Journals (Sweden)

    Marian Amengual-Pizarro

    2017-11-01

    This study examines test-takers’ views on a computer-delivered speaking test in order to investigate the aspects they consider most relevant in technology-based oral assessment, and to explore the main advantages and disadvantages computer-based tests may offer as compared to face-to-face speaking tests. A small-scale open questionnaire was administered to 80 test-takers who took the APTIS speaking test at the Universidad de Alcalá in April 2016. Results reveal that examinees believe computer-based tests provide a valid measure of oral competence in English and are considered to be an adequate method for the assessment of speaking. Interestingly, the data suggest that personal characteristics of test-takers seem to play a key role in deciding upon the most suitable and reliable delivery mode.

  6. Phylogenetic tree based on complete genomes using fractal and correlation analyses without sequence alignment

    Directory of Open Access Journals (Sweden)

    Zu-Guo Yu

    2006-06-01

    The complete genomes of living organisms have provided much information on their phylogenetic relationships. Similarly, the complete genomes of chloroplasts have helped resolve the evolution of this organelle in photosynthetic eukaryotes. In this review, we describe two algorithms to construct phylogenetic trees based on the theories of fractals and dynamic language using complete genomes. These algorithms were developed by our research group in the past few years. Our distance-based phylogenetic tree of 109 prokaryotes and eukaryotes agrees with the biologists' "tree of life" based on the 16S-like rRNA genes in a majority of basic branchings and most lower taxa. Our phylogenetic analysis also shows that the chloroplast genomes are separated into two major clades corresponding to chlorophytes s.l. and rhodophytes s.l. The interrelationships among the chloroplasts are largely in agreement with the current understanding on chloroplast evolution.
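Alignment-free phylogenetics of the kind reviewed here starts from whole-genome composition statistics rather than aligned sites. A toy sketch (invented sequences; the review's actual measures come from fractal and dynamic-language theory, not this simple distance) compares k-mer frequency profiles:

```python
from collections import Counter
from math import sqrt

def kmer_freqs(seq, k=3):
    """Normalized k-mer frequency profile of a DNA string (no alignment needed)."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {kmer: c / total for kmer, c in counts.items()}

def distance(f1, f2):
    """Euclidean distance between two k-mer frequency profiles."""
    keys = set(f1) | set(f2)
    return sqrt(sum((f1.get(k, 0.0) - f2.get(k, 0.0)) ** 2 for k in keys))

# hypothetical sequences: taxonB differs from taxonA by one substitution
# per repeat, while taxonC has an unrelated composition
seqs = {
    "taxonA": "ATGCGATTACAGATGCGATTACAG",
    "taxonB": "ATGCGATTACTGATGCGATTACTG",
    "taxonC": "GGCCTTAAGGCCTTAAGGCCTTAA",
}
profs = {name: kmer_freqs(s) for name, s in seqs.items()}
d_ab = distance(profs["taxonA"], profs["taxonB"])
d_ac = distance(profs["taxonA"], profs["taxonC"])
```

A full distance matrix built this way can then feed any standard distance-based tree-building method.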

  7. Numerical Analyses of Subsoil-structure Interaction in Original Non-commercial Software based on FEM

    Science.gov (United States)

    Cajka, R.; Vaskova, J.; Vasek, J.

    2018-04-01

    For decades, attention has been paid to the interaction of foundation structures and subsoil and to the development of interaction models. Given that analytical solutions of subsoil-structure interaction can be deduced only for some simple load shapes, analytical solutions are increasingly being replaced by numerical solutions (e.g. FEM, the finite element method). Numerical analysis provides greater possibilities for taking into account the real factors involved in subsoil-structure interaction and was also used in this article. This makes it possible to design foundation structures more efficiently while remaining reliable and secure. Currently there are several software packages that can deal with the interaction of foundations and subsoil. It has been demonstrated that the non-commercial software MKPINTER (created by Cajka) provides results appropriately close to actual measured values. In the MKPINTER software, stress-strain analysis of the elastic half-space is done by means of Gauss numerical integration and the Jacobian of transformation. Input data for the numerical analysis were obtained from an experimental loading test of a concrete slab. The loading was performed using unique experimental equipment constructed at the Faculty of Civil Engineering, VŠB-TU Ostrava. The purpose of this paper is to compare the resulting deformation of the slab with values observed during the experimental loading test.
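The Gauss numerical integration mentioned above can be shown in miniature with a generic two-point Gauss-Legendre rule (an illustration of the technique only, not MKPINTER's actual implementation):

```python
from math import sqrt

def gauss2(f, a, b):
    """Two-point Gauss-Legendre quadrature on [a, b]; exact for cubics."""
    xm, xr = (a + b) / 2, (b - a) / 2   # midpoint and half-width of interval
    x = 1 / sqrt(3)                      # standard node on [-1, 1]
    return xr * (f(xm - xr * x) + f(xm + xr * x))

# sanity check on a cubic: integral of t^3 + 2t over [0, 1] is 1/4 + 1 = 1.25
val = gauss2(lambda t: t**3 + 2 * t, 0.0, 1.0)
```

With only two function evaluations the rule integrates any polynomial up to degree three exactly, which is why Gauss points are the standard choice for FEM stiffness and half-space integrals.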

  8. Sediment Characteristics of Mergui Basin, Andaman Sea based on Multi-proxy Analyses

    Directory of Open Access Journals (Sweden)

    Rina Zuraida

    2018-02-01

    This paper presents the characteristics of sediment from core BS-36 (6°55.85’ S and 96°7.48’ E, 1147.1 m water depth), acquired in the Mergui Basin, Andaman Sea. The analyses involved megascopic description, core scanning by a multi-sensor core logger, and carbonate content measurement. The purpose of this study is to determine the physical and chemical characteristics of the sediment in order to infer the depositional environment. The results show that the core can be divided into 5 lithologic units that represent various environmental conditions. Units V and IV, at the bottom of the core, were inferred to have been deposited under suboxic to anoxic bottom conditions combined with high productivity and low precipitation. Unit III was deposited during high precipitation and oxic conditions due to ocean ventilation. In the upper part, Units II and I were deposited during higher precipitation, higher carbonate production and suboxic to anoxic conditions. Keywords: sediment characteristics, Mergui Basin, Andaman Sea, suboxic, anoxic, oxic, carbonate content

  9. Revised age of deglaciation of Lake Emma based on new radiocarbon and macrofossil analyses

    Science.gov (United States)

    Elias, S.A.; Carrara, P.E.; Toolin, L.J.; Jull, A.J.T.

    1991-01-01

    Previous radiocarbon ages of detrital moss fragments in basal organic sediments of Lake Emma indicated that extensive deglaciation of the San Juan Mountains occurred prior to 14,900 yr B.P. (Carrara et al., 1984). Paleoecological analyses of insect and plant macrofossils from these basal sediments cast doubt on the reliability of the radiocarbon ages. Subsequent accelerator radiocarbon dates of insect fossils and wood fragments indicate an early Holocene age, rather than a late Pleistocene age, for the basal sediments of Lake Emma. These new radiocarbon ages suggest that by at least 10,000 yr B.P. deglaciation of the San Juan Mountains was complete. The insect and plant macrofossils from the basal organic sediments indicate a higher-than-present treeline during the early Holocene. The insect assemblages consisted of about 30% bark beetles, which contrasts markedly with the composition of insects from modern lake sediments and modern specimens collected in the Lake Emma cirque, in which bark beetles comprise only about 3% of the assemblages. In addition, in the fossil assemblages there were a number of flightless insect species (not subject to upslope transport by wind) indicative of coniferous forest environments. These insects were likewise absent in the modern assemblage. © 1991.

  10. Is autoimmunology a discipline of its own? A big data-based bibliometric and scientometric analyses.

    Science.gov (United States)

    Watad, Abdulla; Bragazzi, Nicola Luigi; Adawi, Mohammad; Amital, Howard; Kivity, Shaye; Mahroum, Naim; Blank, Miri; Shoenfeld, Yehuda

    2017-06-01

    Autoimmunology is a super-specialty of immunology dealing specifically with autoimmune disorders. To assess the extant literature concerning autoimmune disorders, bibliometric and scientometric analyses (namely, research topic/keyword co-occurrence, journal co-citation, citations, crude and normalized scientific output trends, author networks, and analyses of leading authors, countries, and organizations) were carried out using open-source software, namely VOSviewer and SciCurve. A corpus of 169,519 articles containing the keyword "autoimmunity" was utilized, selecting PubMed/MEDLINE as the bibliographic thesaurus. Six journals are specifically devoted to autoimmune disorders, covering approximately 4.15% of the entire scientific production. Compared with the full corpus (from 1946 on), these specialized journals were established only a few decades ago. The top countries were the United States, Japan, Germany, United Kingdom, Italy, China, France, Canada, Australia, and Israel. Trending topics are the role of microRNAs (miRNAs) in the etiopathogenesis of autoimmune disorders, the contributions of genetics and of epigenetic modifications, the role of vitamins, management during pregnancy, and the impact of gender. New subsets of immune cells have been extensively investigated, with a focus on interleukin production and release and on Th17 cells. Autoimmunology is emerging as a new discipline within immunology, with its own bibliometric properties, an identified scientific community, and specifically devoted journals.
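The keyword co-occurrence analysis mentioned above reduces to counting, for every paper, all unordered pairs of its keywords. A minimal sketch (with invented keyword lists, not the authors' VOSviewer pipeline):

```python
from collections import Counter
from itertools import combinations

def cooccurrence(papers):
    """Count unordered keyword pairs that appear together in the same paper."""
    cooc = Counter()
    for keywords in papers:
        # sorted(set(...)) deduplicates and gives each pair a canonical order
        for a, b in combinations(sorted(set(keywords)), 2):
            cooc[(a, b)] += 1
    return cooc

# hypothetical keyword lists attached to three papers
papers = [
    ["autoimmunity", "miRNA", "Th17"],
    ["autoimmunity", "miRNA"],
    ["autoimmunity", "vitamins"],
]
cooc = cooccurrence(papers)
strongest = cooc.most_common(1)[0]   # the most frequent keyword pair
```

The resulting counts are exactly the edge weights of the co-occurrence network that tools like VOSviewer then cluster and draw.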

  11. Shielding analysis method applied to nuclear ship 'MUTSU' and its evaluation based on experimental analyses

    International Nuclear Information System (INIS)

    Yamaji, Akio; Miyakoshi, Jun-ichi; Iwao, Yoshiaki; Tsubosaka, Akira; Saito, Tetsuo; Fujii, Takayoshi; Okumura, Yoshihiro; Suzuoki, Zenro; Kawakita, Takashi.

    1984-01-01

    Procedures are described for the shielding analysis used in the shielding modification design of the Nuclear Ship ''MUTSU''. The calculations of the radiation distribution on board were made using the Sn codes ANISN and TWOTRAN, the point kernel code QAD and the Monte Carlo code MORSE. The accuracies of these calculations were investigated through the analysis of various shielding experiments: the shield tank experiment of the nuclear ship ''Otto Hahn'', the shielding mock-up experiment for ''MUTSU'' performed in JRR-4, the shielding benchmark experiment using the 16N radiation facility of AERE Harwell, and the shielding effect experiment of the ship structure performed in the training ship ''Shintoku-Maru''. The values calculated by ANISN agree with the data measured at ''Otto Hahn'' within a factor of 2 for fast neutrons and within a factor of 3 for epithermal and thermal neutrons. The γ-ray dose rates calculated by QAD agree with the measured values within 30% for the analysis of the experiment in JRR-4. The design values for ''MUTSU'' were determined in consequence of these experimental analyses. (author)

  12. Analyses of Research Topics in the Field of Informetrics Based on the Method of Topic Modeling

    Directory of Open Access Journals (Sweden)

    Sung-Chien Lin

    2014-07-01

    In this study, we used topic modeling to uncover the structure of research topics in the field of Informetrics, to explore the distribution of the topics over the years, and to compare the core journals. In order to infer the structure of the topics in the field, data on the papers published in the Journal of Informetrics and Scientometrics during 2007 to 2013 were retrieved from the Web of Science database as input to the topic modeling. The results of this study show that when the number of topics was set to 10, the topic model had the smallest perplexity. Although the data scope and analysis methods differ from previous studies, the topics generated in this study are consistent with the results produced by expert analyses. Empirical case studies and measurements of bibliometric indicators were considered important in every year of the analytic period, and the field was increasingly stable. Both core journals broadly paid attention to all of the topics in the field of Informetrics; the Journal of Informetrics put particular emphasis on the construction and applications of bibliometric indicators, while Scientometrics focused on the evaluation and the factors of productivity of countries, institutions, domains, and journals.
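Choosing the number of topics by perplexity, as done in the study, means comparing held-out predictive quality across candidate topic counts; perplexity is the exponential of the negative average per-token log-likelihood. A sketch with invented log-likelihood numbers (not the study's data):

```python
from math import exp

def perplexity(log_likelihood, n_tokens):
    """Perplexity of a topic model on held-out text: lower is better."""
    return exp(-log_likelihood / n_tokens)

# hypothetical held-out log-likelihoods for models with different topic counts
held_out = {5: -13100.0, 10: -12400.0, 15: -12650.0, 20: -12900.0}
n_tokens = 2000
scores = {k: perplexity(ll, n_tokens) for k, ll in held_out.items()}
best_k = min(scores, key=scores.get)   # the topic count with smallest perplexity
```

In these made-up numbers the 10-topic model wins, mirroring the selection criterion the abstract reports.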

  13. Data analyses and modelling for risk based monitoring of mycotoxins in animal feed

    NARCIS (Netherlands)

    Ine van der Fels-Klerx, H.J.; Adamse, Paulien; Punt, Ans; Asselt, van Esther D.

    2018-01-01

    Following legislation, European Member States should have multi-annual control programs for contaminants, such as for mycotoxins, in feed and food. These programs need to be risk based implying the checks are regular and proportional to the estimated risk for animal and human health. This study

  14. Handbook of methods for risk-based analyses of technical specifications

    International Nuclear Information System (INIS)

    Samanta, P.K.; Kim, I.S.; Mankamo, T.; Vesely, W.E.

    1994-12-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements are based on deterministic analysis and engineering judgments. Experiences with plant operation indicate that some elements of the requirements are unnecessarily restrictive, while a few may not be conducive to safety. The US Nuclear Regulatory Commission (USNRC) Office of Research has sponsored research to develop systematic risk-based methods to improve various aspects of TS requirements. This handbook summarizes these risk-based methods. The scope of the handbook includes reliability and risk-based methods for evaluating allowed outage times (AOTs), scheduled or preventive maintenance, action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), and management of plant configurations resulting from outages of systems, or components. For each topic, the handbook summarizes analytic methods with data needs, outlines the insights to be gained, lists additional references, and gives examples of evaluations

  15. Analysing a Web-Based E-Commerce Learning Community: A Case Study in Brazil.

    Science.gov (United States)

    Joia, Luiz Antonio

    2002-01-01

    Demonstrates the use of a Web-based participative virtual learning environment for graduate students in Brazil enrolled in an electronic commerce course in a Masters in Business Administration program. Discusses learning communities; computer-supported collaborative work and collaborative learning; influences on student participation; the role of…

  16. Handbook of methods for risk-based analyses of technical specifications

    Energy Technology Data Exchange (ETDEWEB)

    Samanta, P.K.; Kim, I.S. [Brookhaven National Lab., Upton, NY (United States); Mankamo, T. [Avaplan Oy, Espoo (Finland); Vesely, W.E. [Science Applications International Corp., Dublin, OH (United States)

    1994-12-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements are based on deterministic analysis and engineering judgments. Experiences with plant operation indicate that some elements of the requirements are unnecessarily restrictive, while a few may not be conducive to safety. The US Nuclear Regulatory Commission (USNRC) Office of Research has sponsored research to develop systematic risk-based methods to improve various aspects of TS requirements. This handbook summarizes these risk-based methods. The scope of the handbook includes reliability and risk-based methods for evaluating allowed outage times (AOTs), scheduled or preventive maintenance, action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), and management of plant configurations resulting from outages of systems, or components. For each topic, the handbook summarizes analytic methods with data needs, outlines the insights to be gained, lists additional references, and gives examples of evaluations.

  17. Group analyses of connectivity-based cortical parcellation using repeated k-means clustering

    NARCIS (Netherlands)

    Nanetti, Luca; Cerliani, Leonardo; Gazzola, Valeria; Renken, Remco; Keysers, Christian

    2009-01-01

    K-means clustering has become a popular tool for connectivity-based cortical segmentation using Diffusion Weighted Imaging (DWI) data. A sometimes ignored issue is, however, that the output of the algorithm depends on the initial placement of starting points, and that different sets of starting
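The sensitivity to starting points noted above can be probed directly by re-running k-means from several random initializations and checking whether the solutions agree. A 1-D standard-library sketch of the idea (not the authors' DWI parcellation pipeline):

```python
import random

def kmeans(points, k, seed, iters=50):
    """Plain Lloyd's k-means on 1-D data; the result depends on the random init."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: abs(p - centers[c]))
            clusters[i].append(p)
        # empty clusters keep their previous center
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# two well-separated groups: any reasonable init should recover them
data = [1.0, 1.2, 0.9, 1.1, 8.0, 8.2, 7.9, 8.1]
runs = [kmeans(data, k=2, seed=s) for s in range(5)]
stable = all(max(abs(a - b) for a, b in zip(r, runs[0])) < 1e-6 for r in runs)
```

On ambiguous data the same check would fail for some seeds, which is exactly why repeated restarts (and agreement measures across them) matter for connectivity-based parcellation.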

  18. Effects of Computer-Based Training on Procedural Modifications to Standard Functional Analyses

    Science.gov (United States)

    Schnell, Lauren K.; Sidener, Tina M.; DeBar, Ruth M.; Vladescu, Jason C.; Kahng, SungWoo

    2018-01-01

    Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to…

  19. Analyses of Receptive and Productive Korean EFL Vocabulary: Computer-Based Vocabulary Learning Program

    Science.gov (United States)

    Kim, Scott Sungki

    2013-01-01

    The present research study investigated the effects of 8 versions of a computer-based vocabulary learning program on receptive and productive knowledge levels of college students. The participants were 106 male and 103 female Korean EFL students from Kyungsung University and Kwandong University in Korea. Students who participated in versions of…

  20. Statistical analyses of incidents on onshore gas transmission pipelines based on PHMSA database

    International Nuclear Information System (INIS)

    Lam, Chio; Zhou, Wenxing

    2016-01-01

    This article reports statistical analyses of the mileage and pipe-related incident data for onshore gas transmission pipelines in the US between 2002 and 2013, collected by the Pipeline and Hazardous Materials Safety Administration (PHMSA) of the US Department of Transportation. The analysis indicates that there are approximately 480,000 km of gas transmission pipelines in the US, approximately 60% of them more than 45 years old as of 2013. Eighty percent of the pipelines are Class 1 pipelines, and about 20% are Classes 2 and 3 pipelines. It is found that third-party excavation, external corrosion, material failure and internal corrosion are the four leading failure causes, responsible for more than 75% of the total incidents. The 12-year average rate of rupture equals 3.1 × 10⁻⁵ per km-year for all failure causes combined. External corrosion is the leading cause of ruptures: the 12-year average rupture rate due to external corrosion equals 1.0 × 10⁻⁵ per km-year and is twice the rupture rate due to third-party excavation or material failure. The study provides insights into the current state of gas transmission pipelines in the US and baseline failure statistics for quantitative risk assessments of such pipelines. - Highlights: • Analyzes PHMSA pipeline mileage and incident data between 2002 and 2013. • Focuses on gas transmission pipelines. • Identifies the leading causes of pipeline failures. • Provides baseline failure statistics for risk assessments of gas transmission pipelines.
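The reported per km-year rates follow from a simple exposure calculation: incidents divided by pipeline length times observation years. A sketch using round figures of the same magnitude as the abstract (the implied incident count is back-calculated here, not taken from the paper):

```python
def rupture_rate(n_ruptures, km, years):
    """Average rupture rate per km-year over an observation window."""
    return n_ruptures / (km * years)

# ~480,000 km of pipeline observed over the 12-year window
exposure_km, window = 480_000, 12

# back-calculate the all-cause rupture count implied by a 3.1e-5 rate
n_all = round(3.1e-5 * exposure_km * window)
rate_all = rupture_rate(n_all, exposure_km, window)
```

The same division, restricted to ruptures attributed to one cause, yields the cause-specific rates (e.g. 1.0 × 10⁻⁵ per km-year for external corrosion).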

  1. White matter disruption in moderate/severe pediatric traumatic brain injury: Advanced tract-based analyses

    Directory of Open Access Journals (Sweden)

    Emily L. Dennis

    2015-01-01

    Traumatic brain injury (TBI) is the leading cause of death and disability in children and can lead to a wide range of impairments. Brain imaging methods such as diffusion tensor imaging (DTI) are uniquely sensitive to the white matter (WM) damage that is common in TBI. However, higher-level analyses using tractography are complicated by the damage and decreased fractional anisotropy (FA) characteristic of TBI, which can result in premature tract endings. We used the newly developed autoMATE (automated multi-atlas tract extraction) method to identify differences in WM integrity. 63 pediatric patients aged 8–19 years with moderate/severe TBI were examined with cross-sectional scanning at one or two time points after injury: a post-acute assessment 1–5 months post-injury and a chronic assessment 13–19 months post-injury. A battery of cognitive function tests was performed in the same time periods. 56 children were examined in the first phase, 28 TBI patients and 28 healthy controls; in the second phase 34 children were studied, 17 TBI patients and 17 controls (27 participants completed both post-acute and chronic phases). We did not find any significant group differences in the post-acute phase. Chronically, we found extensive group differences, mainly higher mean and radial diffusivity (MD and RD) across a wide range of WM. Additionally, we found correlations between these WM integrity measures and cognitive deficits. This suggests a distributed pattern of WM disruption that continues over the first year following a TBI in children.

  2. Geology of Southern Guinevere Planitia, Venus, based on analyses of Goldstone radar data

    International Nuclear Information System (INIS)

    Arvidson, R.E.; Plaut, J.J.; Jurgens, R.F.; Saunders, R.S.; Slade, M.A.

    1989-01-01

    The ensemble of 41 backscatter images of Venus acquired by the S-band (12.6 cm) Goldstone radar system covers approx. 35 million km² and includes the equatorial portion of Guinevere Planitia, Navka Planitia, Heng-O Chasma, and Tinatin Planitia, and parts of Devana Chasma and Phoebe Regio. The images and associated altimetry data combine relatively high spatial resolution (1 to 10 km) with small incidence angles (less than 10 deg) for regions not covered by either Venera Orbiter or Arecibo radar data. Systematic analyses of the Goldstone data show that: (1) Volcanic plains dominate, including groups of small volcanic constructs, radar bright flows on a NW-SE arm of Phoebe Regio and on Ushas Mons and circular volcano-tectonic depressions; (2) Some of the regions imaged by Goldstone have high radar cross sections, including the flows on Ushas Mons and the NW-SE arm of Phoebe Regio, and several other unnamed hills, ridged terrains, and plains areas; (3) A 1000 km diameter multiringed structure is observed and appears to have a morphology not observed in Venera data (the northern section corresponds to Heng-O Chasma); (4) A 150 km wide, 2 km deep, 1400 km long rift valley with upturned flanks is located on the western flank of Phoebe Regio and extends into Devana Chasma; (5) A number of structures can be discerned in the Goldstone data, mainly trending NW-SE and NE-SW, directions similar to those discerned in Pioneer-Venus topography throughout the equatorial region; and (6) The abundance of circular and impact features is similar to the plains global average defined from Venera and Arecibo data, implying that the terrain imaged by Goldstone has typical crater retention ages, measured in hundreds of millions of years. The rate of resurfacing is less than or equal to 4 km/Ga.

  3. Intra-specific genetic relationship analyses of Elaeagnus angustifolia based on RP-HPLC biochemical markers

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Elaeagnus angustifolia Linn. has various ecological, medicinal and economic uses. An approach was established using RP-HPLC (reversed-phase high-performance liquid chromatography) to classify and analyse the intra-specific genetic relationships of seventeen populations of E. angustifolia collected from the Xinjiang areas of China. Chromatograms of alcohol-soluble proteins produced by the seventeen populations of E. angustifolia were compared. Each chromatogram of alcohol-soluble proteins came from a single seed of one wild plant only. The results showed that when using a Waters Delta Pak C18, 5 μm particle size reversed-phase column (150 mm×3.9 mm), a linear gradient of 25%~60% solvent B with a flow rate of 1 ml/min and a run time of 67 min, the chromatography yielded optimum separation of E. angustifolia alcohol-soluble proteins. Representative peaks in each population were chosen according to peak area and occurrence in every seed. The converted data on the elution peaks of each population were different and could be used to represent those populations. GSC (genetic similarity coefficients) of 41% to 62% showed a medium degree of genetic diversity among the populations in these eco-areas. Cluster analysis showed that the seventeen populations of E. angustifolia could be divided into six clusters at the GSC=0.535 level and indicated the general and unique biochemical markers of these clusters. We suggest that E. angustifolia distribution in these eco-areas could be classified into six variable species. RP-HPLC was shown to be a rapid, repeatable and reliable method for E. angustifolia classification and identification and for analysis of genetic diversity.

  4. Improving the safety of a body composition analyser based on the PGNAA method

    Energy Technology Data Exchange (ETDEWEB)

    Miri-Hakimabad, Hashem; Izadi-Najafabadi, Reza; Vejdani-Noghreiyan, Alireza; Panjeh, Hamed [FUM Radiation Detection And Measurement Laboratory, Ferdowsi University of Mashhad (Iran, Islamic Republic of)

    2007-12-15

    The {sup 252}Cf radioisotope and {sup 241}Am-Be are intense neutron emitters that are readily encapsulated in compact, portable and sealed sources. Features such as their high neutron flux and reliable neutron spectrum make these sources suitable for the prompt gamma neutron activation analysis (PGNAA) method. The PGNAA method can be used in medicine for neutron radiography and body chemical composition analysis. However, {sup 252}Cf and {sup 241}Am-Be sources not only generate neutrons but are also intense gamma emitters. Furthermore, the sample in medical treatments is a human body, so it may be exposed to the bombardment of these gamma-rays. Moreover, accumulation of these high-rate gamma-rays in the detector volume causes simultaneous pulses that can pile up and distort the spectra in the region of interest (ROI). To remove these disadvantages in a practical way without being concerned about losing the thermal neutron flux, a gamma-ray filter made of Pb must be employed. The paper suggests a relatively safe body chemical composition analyser (BCCA) machine that uses a spherical Pb shield enclosing the neutron source. Gamma-ray shielding effects and the optimum radius of the spherical Pb shield have been investigated using the MCNP-4C code and compared with the unfiltered case, the bare source. Finally, experimental results demonstrate that an optimised gamma-ray shield for the neutron source in a BCCA can effectively reduce the risk of exposure to the {sup 252}Cf and {sup 241}Am-Be sources.

  5. Ecogeographical associations between climate and human body composition: analyses based on anthropometry and skinfolds.

    Science.gov (United States)

    Wells, Jonathan C K

    2012-02-01

    In the 19th century, two "ecogeographical rules" were proposed hypothesizing associations of climate with mammalian body size and proportions. Data on human body weight and relative leg length support these rules; however, it is unknown whether such associations are attributable to lean tissue (the heat-producing component) or fat (energy stores). Data on weight, height, and two skinfold thicknesses were obtained from the literature for 137 nonindustrialized populations, providing 145 male and 115 female individual samples. A variety of indices of adiposity and lean mass were analyzed. Preliminary analyses indicated secular increases in skinfolds in men but not women, and associations of age and height with lean mass in both sexes. Decreasing annual temperature was associated with increasing body mass index (BMI), and increasing triceps but not subscapular skinfold. After adjusting for skinfolds, decreasing temperature remained associated with increasing BMI. These results indicate that colder environments favor both greater peripheral energy stores, and greater lean mass. Contrasting results for triceps and subscapular skinfolds might be due to adaptive strategies either constraining central adiposity in cold environments to reduce cardiovascular risk, or favoring central adiposity in warmer environments to maintain energetic support of the immune system. Polynesian populations were analyzed separately and contradicted all of the climate trends, indicating support for the hypothesis that they are cold-adapted despite occupying a tropical region. It is unclear whether such associations emerge through natural selection or through trans-generational and life-course plasticity. These findings nevertheless aid understanding of the wide variability in human physique and adiposity. Copyright © 2011 Wiley Periodicals, Inc.

  6. A bead-based western for high-throughput cellular signal transduction analyses

    Science.gov (United States)

    Treindl, Fridolin; Ruprecht, Benjamin; Beiter, Yvonne; Schultz, Silke; Döttinger, Anette; Staebler, Annette; Joos, Thomas O.; Kling, Simon; Poetz, Oliver; Fehm, Tanja; Neubauer, Hans; Kuster, Bernhard; Templin, Markus F.

    2016-01-01

    Dissecting cellular signalling requires the analysis of large numbers of proteins. The DigiWest approach we describe here transfers the western blot to a bead-based microarray platform. By combining gel-based protein separation with immobilization on microspheres, hundreds of replicas of the initial blot are created, thus enabling the comprehensive analysis of limited material, such as cells collected by laser capture microdissection, and extending traditional western blotting to reach proteomic scales. The combination of molecular weight resolution, sensitivity and signal linearity on an automated platform enables the rapid quantification of hundreds of specific proteins and protein modifications in complex samples. This high-throughput western blot approach allowed us to identify and characterize alterations in cellular signal transduction that occur during the development of resistance to the kinase inhibitor Lapatinib, revealing major changes in the activation state of Ephrin-mediated signalling and a central role for p53-controlled processes. PMID:27659302

  7. Critical experiments analyses by using 70 energy group library based on ENDF/B-VI

    Energy Technology Data Exchange (ETDEWEB)

    Tahara, Yoshihisa; Matsumoto, Hideki [Mitsubishi Heavy Industries Ltd., Yokohama (Japan). Nuclear Energy Systems Engineering Center; Huria, H.C.; Ouisloumen, M.

    1998-03-01

    The newly developed 70-group library has been validated by comparing kinf values from the continuous-energy Monte Carlo code MCNP with those from the two-dimensional spectrum calculation code PHOENIX-CP. The code employs the Discrete Angular Flux Method based on Collision Probability. The library has also been validated against a large number of critical experiments and numerical benchmarks for assemblies with MOX and Gd fuels. (author)

  8. Data Analyses and Modelling for Risk Based Monitoring of Mycotoxins in Animal Feed

    Directory of Open Access Journals (Sweden)

    H.J. (Ine) van der Fels-Klerx

    2018-01-01

    Full Text Available Following legislation, European Member States should have multi-annual control programs for contaminants, such as for mycotoxins, in feed and food. These programs need to be risk based implying the checks are regular and proportional to the estimated risk for animal and human health. This study aimed to prioritize feed products in the Netherlands for deoxynivalenol and aflatoxin B1 monitoring. Historical mycotoxin monitoring results from the period 2007–2016 were combined with data from other sources. Based on occurrence, groundnuts had high priority for aflatoxin B1 monitoring; some feed materials (maize and maize products and several oil seed products) and complete/complementary feed excluding dairy cattle and young animals had medium priority; and all other animal feeds and feed materials had low priority. For deoxynivalenol, maize by-products had a high priority, complete and complementary feed for pigs had a medium priority and all other feed and feed materials a low priority. Also including health consequence estimations showed that feed materials that ranked highest for aflatoxin B1 included sunflower seed and palmkernel expeller/extracts and maize. For deoxynivalenol, maize products were ranked highest, followed by various small grain cereals (products); all other feed materials were of lower concern. Results of this study have proven to be useful in setting up the annual risk based control program for mycotoxins in animal feed and feed materials.

  9. Failure probability analyses for PWSCC in Ni-based alloy welds

    International Nuclear Information System (INIS)

    Udagawa, Makoto; Katsuyama, Jinya; Onizawa, Kunio; Li, Yinsheng

    2015-01-01

    A number of cracks due to primary water stress corrosion cracking (PWSCC) in pressurized water reactors and Ni-based alloy stress corrosion cracking (NiSCC) in boiling water reactors have been detected around Ni-based alloy welds. The causes of crack initiation and growth due to stress corrosion cracking include weld residual stress, operating stress, the materials, and the environment. We have developed the analysis code PASCAL-NP for calculating the failure probability and assessment of the structural integrity of cracked components on the basis of probabilistic fracture mechanics (PFM) considering PWSCC and NiSCC. This PFM analysis code has functions for calculating the incubation time of PWSCC and NiSCC crack initiation, evaluation of crack growth behavior considering certain crack location and orientation patterns, and evaluation of failure behavior near Ni-based alloy welds due to PWSCC and NiSCC in a probabilistic manner. Herein, actual plants affected by PWSCC have been analyzed using PASCAL-NP. Failure probabilities calculated by PASCAL-NP are in reasonable agreement with the detection data. Furthermore, useful knowledge related to leakage due to PWSCC was obtained through parametric studies using this code

  10. Financial and Performance Analyses of Microcontroller Based Solar-Powered Autorickshaw for a Developing Country

    Directory of Open Access Journals (Sweden)

    Abu Raihan Mohammad Siddique

    2016-01-01

    Full Text Available This paper presents a case study examining the economic viability and performance of a microcontroller-based solar-powered battery-operated autorickshaw (m-SBAR) for developing countries, which is compared with different types of rickshaws, such as the pedal rickshaw (PR), battery-operated autorickshaw (BAR), and solar-powered battery-operated autorickshaw (SBAR), available in Bangladesh. The BAR consists of a rickshaw structure, a battery bank, a battery charge controller, a DC motor driver, and a DC motor, whereas the proposed m-SBAR contains additional components such as a solar panel and a microcontroller-based DC motor driver. The complete design considered the local radiation data and load profile of the proposed m-SBAR. Levelized Cost of Energy (LCOE) analysis, Net Present Worth, payback period, and Benefit-to-Cost Ratio methods have been used to evaluate the financial feasibility and sensitivity of the m-SBAR, grid-powered BAR, and PR. The numerical analysis reveals that the LCOE and Benefit-to-Cost Ratio of the proposed m-SBAR are lower than those of the grid-powered BAR. It has also been found that the microcontroller-based DC motor control circuit reduces the battery discharge rate, improves battery life, and controls motor speed efficiently.
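
The LCOE metric used in the comparison divides discounted lifetime costs by discounted lifetime energy output. A minimal sketch of that calculation; all numbers below are hypothetical placeholders, not the paper's Bangladeshi cost data:

```python
def lcoe(capital_cost, annual_om_cost, annual_energy_kwh, discount_rate, lifetime_years):
    """Levelized Cost of Energy: discounted lifetime costs / discounted lifetime energy."""
    years = range(1, lifetime_years + 1)
    cost = capital_cost + sum(annual_om_cost / (1 + discount_rate) ** t for t in years)
    energy = sum(annual_energy_kwh / (1 + discount_rate) ** t for t in years)
    return cost / energy  # currency units per kWh

# Hypothetical m-SBAR-like system: 900 up front, 40/year upkeep,
# 1,100 kWh/year delivered, 8% discount rate, 10-year life.
print(round(lcoe(900, 40, 1100, 0.08, 10), 3))
```

A lower LCOE than the grid-powered alternative, as the abstract reports for the m-SBAR, means each delivered kWh costs less once capital and upkeep are spread over the vehicle's life.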

  11. Data Analyses and Modelling for Risk Based Monitoring of Mycotoxins in Animal Feed

    Science.gov (United States)

    van der Fels-Klerx, H.J. (Ine); Adamse, Paulien; Punt, Ans; van Asselt, Esther D.

    2018-01-01

    Following legislation, European Member States should have multi-annual control programs for contaminants, such as for mycotoxins, in feed and food. These programs need to be risk based implying the checks are regular and proportional to the estimated risk for animal and human health. This study aimed to prioritize feed products in the Netherlands for deoxynivalenol and aflatoxin B1 monitoring. Historical mycotoxin monitoring results from the period 2007–2016 were combined with data from other sources. Based on occurrence, groundnuts had high priority for aflatoxin B1 monitoring; some feed materials (maize and maize products and several oil seed products) and complete/complementary feed excluding dairy cattle and young animals had medium priority; and all other animal feeds and feed materials had low priority. For deoxynivalenol, maize by-products had a high priority, complete and complementary feed for pigs had a medium priority and all other feed and feed materials a low priority. Also including health consequence estimations showed that feed materials that ranked highest for aflatoxin B1 included sunflower seed and palmkernel expeller/extracts and maize. For deoxynivalenol, maize products were ranked highest, followed by various small grain cereals (products); all other feed materials were of lower concern. Results of this study have proven to be useful in setting up the annual risk based control program for mycotoxins in animal feed and feed materials. PMID:29373559

  12. Assessing an organizational culture instrument based on the Competing Values Framework: Exploratory and confirmatory factor analyses

    Science.gov (United States)

    Helfrich, Christian D; Li, Yu-Fang; Mohr, David C; Meterko, Mark; Sales, Anne E

    2007-01-01

    Background: The Competing Values Framework (CVF) has been widely used in health services research to assess organizational culture as a predictor of quality improvement implementation, employee and patient satisfaction, and team functioning, among other outcomes. CVF instruments generally are presented as well-validated with reliable aggregated subscales. However, only one study in the health sector has been conducted for the express purpose of validation, and that study population was limited to hospital managers from a single geographic locale. Methods: We used exploratory and confirmatory factor analyses to examine the underlying structure of data from a CVF instrument. We analyzed cross-sectional data from a work environment survey conducted in the Veterans Health Administration (VHA). The study population comprised all staff in non-supervisory positions. The survey included 14 items adapted from a popular CVF instrument, which measures organizational culture according to four subscales: hierarchical, entrepreneurial, team, and rational. Results: Data from 71,776 non-supervisory employees (approximate response rate 51%) from 168 VHA facilities were used in this analysis. Internal consistency of the subscales was moderate to strong (α = 0.68 to 0.85). However, the entrepreneurial, team, and rational subscales had higher correlations across subscales than within, indicating poor divergent properties. Exploratory factor analysis revealed two factors, with the ten items from the entrepreneurial, team, and rational subscales loading on the first factor, and two items from the hierarchical subscale loading on the second factor, along with one item from the rational subscale that cross-loaded on both factors. Results from confirmatory factor analysis suggested that the two-subscale solution provides a more parsimonious fit to the data as compared to the original four-subscale model. Conclusion: This study suggests that there may be problems applying conventional
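
The internal-consistency values the abstract reports (α = 0.68 to 0.85) are Cronbach's alpha. A minimal sketch of the statistic on toy data; the scores below are illustrative, not VHA survey responses:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score lists (one list per item,
    respondents in the same order): k/(k-1) * (1 - sum(item vars) / var(totals))."""
    k, n = len(items), len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Three fairly consistent 4-point items from five hypothetical respondents;
# for this toy data alpha works out to 0.9.
scores = [[4, 3, 2, 4, 3], [4, 2, 2, 4, 3], [3, 3, 2, 4, 2]]
alpha = cronbach_alpha(scores)
```

High alpha within a subscale, as here, indicates the items move together; the study's problem was that items also correlated strongly *across* subscales, which alpha alone does not detect.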

  13. Analyses of Crime Patterns in NIBRS Data Based on a Novel Graph Theory Clustering Method: Virginia as a Case Study

    Directory of Open Access Journals (Sweden)

    Peixin Zhao

    2014-01-01

    Full Text Available This paper suggests a novel clustering method for analyzing the National Incident-Based Reporting System (NIBRS) data, which includes the determination of correlations between different crime types, the development of a likelihood index for crimes to occur in a jurisdiction, and the clustering of jurisdictions based on crime type. The method was tested using the 2005 assault data from 121 jurisdictions in Virginia as a test case. The analyses of these data show that certain crime types are correlated with one another and that certain crime parameters are correlated with particular crime types. The analyses also show that certain jurisdictions within Virginia share certain crime patterns. This information assists with constructing a pattern for a specific crime type and can be used to determine whether a jurisdiction may be more likely to see this type of crime occur in its area.
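
The crime-type correlations the paper reports can be illustrated with a plain Pearson correlation over per-jurisdiction counts. A minimal sketch; the counts below are invented, not NIBRS figures:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented per-jurisdiction counts for two crime types across six jurisdictions.
assault = [12, 30, 45, 7, 22, 51]
vandalism = [10, 25, 40, 9, 20, 44]
r = pearson(assault, vandalism)  # strongly positive for these toy series
```

A correlation matrix built this way over all crime-type pairs is the natural input to the kind of jurisdiction clustering the paper describes.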

  14. Eye gaze in intelligent user interfaces gaze-based analyses, models and applications

    CERN Document Server

    Nakano, Yukiko I; Bader, Thomas

    2013-01-01

    Remarkable progress in eye-tracking technologies has opened the way to the design of novel attention-based intelligent user interfaces, and highlighted the importance of better understanding eye gaze in human-computer interaction and human-human communication. For instance, a user's focus of attention is useful in interpreting the user's intentions, their understanding of the conversation, and their attitude towards the conversation. In human face-to-face communication, eye gaze plays an important role in floor management, grounding, and engagement in conversation. Eye Gaze in Intelligent User Interfaces

  15. Seismic fragility analyses of nuclear power plant structures based on the recorded earthquake data in Korea

    International Nuclear Information System (INIS)

    Joe, Yang Hee; Cho, Sung Gook

    2003-01-01

    This paper briefly introduces an improved method for evaluating the seismic fragilities of components of nuclear power plants in Korea. Engineering characteristics of small-magnitude earthquake spectra recorded in the Korean peninsula during the last several years are also discussed in this paper. For the purpose of evaluating the effects of the recorded earthquakes on the seismic fragilities of Korean nuclear power plant structures, several comparative studies have been performed. The study results show that seismic fragility analysis based on Newmark's spectra might overestimate the seismic capacities of Korean facilities. (author)

  16. Theoretical and Empirical Analyses of an Improved Harmony Search Algorithm Based on Differential Mutation Operator

    Directory of Open Access Journals (Sweden)

    Longquan Yong

    2012-01-01

    Full Text Available Harmony search (HS) is an emerging metaheuristic optimization algorithm. In this paper, an improved harmony search method based on a differential mutation operator (IHSDE) is proposed to deal with optimization problems. Since population diversity plays an important role in the behavior of evolutionary algorithms, the aim of this paper is to calculate the expected population mean and variance of IHSDE from a theoretical viewpoint. Numerical results, compared with those of HSDE and NGHS, show that the IHSDE method has good convergence properties over a test suite of well-known benchmark functions.
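
The core IHSDE idea, classical harmony search with the pitch-adjustment step replaced by a DE-style differential mutation, can be sketched as follows. This is an assumed reading of the method, not the paper's exact algorithm, and the parameter names (hms, hmcr, par, F) are generic HS/DE conventions:

```python
import random

def ihsde(f, dim, bounds, hms=10, hmcr=0.9, par=0.3, F=0.5, iters=3000, seed=0):
    """Sketch of harmony search with differential mutation as the pitch adjustment."""
    rng = random.Random(seed)
    lo, hi = bounds
    # Harmony memory: hms candidate solutions and their fitness values.
    hm = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    fit = [f(x) for x in hm]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if rng.random() < hmcr:          # recall a value from memory
                v = hm[rng.randrange(hms)][d]
                if rng.random() < par:       # differential mutation instead of
                    r1, r2 = rng.randrange(hms), rng.randrange(hms)  # pitch adjust
                    v += F * (hm[r1][d] - hm[r2][d])
            else:                            # random re-initialization
                v = rng.uniform(lo, hi)
            new.append(min(max(v, lo), hi))
        worst = max(range(hms), key=fit.__getitem__)
        fn = f(new)
        if fn < fit[worst]:                  # replace worst harmony if improved
            hm[worst], fit[worst] = new, fn
    return min(fit)

# Sphere benchmark (minimum 0 at the origin), a standard test function.
best = ihsde(lambda x: sum(v * v for v in x), dim=5, bounds=(-5, 5))
```

The differential term F * (x_r1 - x_r2) scales the perturbation to the current spread of the memory, which is exactly the population-variance behavior the paper analyzes.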

  17. Augmentation of French grunt diet description using combined visual and DNA-based analyses

    Science.gov (United States)

    Hargrove, John S.; Parkyn, Daryl C.; Murie, Debra J.; Demopoulos, Amanda W.J.; Austin, James D.

    2012-01-01

    Trophic linkages within a coral-reef ecosystem may be difficult to discern in fish species that reside on, but do not forage on, coral reefs. Furthermore, dietary analysis of fish can be difficult in situations where prey is thoroughly macerated, resulting in many visually unrecognisable food items. The present study examined whether the inclusion of a DNA-based method could improve the identification of prey consumed by French grunt, Haemulon flavolineatum, a reef fish that possesses pharyngeal teeth and forages on soft-bodied prey items. Visual analysis indicated that crustaceans were most abundant numerically (38.9%), followed by sipunculans (31.0%) and polychaete worms (5.2%), with a substantial number of unidentified prey (12.7%). For the subset of prey with both visual and molecular data, there was a marked reduction in the number of unidentified sipunculans (visual – 31.1%, combined – 4.4%), unidentified crustaceans (visual – 15.6%, combined – 6.7%), and unidentified taxa (visual – 11.1%, combined – 0.0%). Utilising results from both methodologies resulted in an increased number of prey placed at the family level (visual – 6, combined – 33) and species level (visual – 0, combined – 4). Although more costly than visual analysis alone, our study demonstrated the feasibility of DNA-based identification of visually unidentifiable prey in the stomach contents of fish.

  18. VALUE-BASED MEDICINE AND OPHTHALMOLOGY: AN APPRAISAL OF COST-UTILITY ANALYSES

    Science.gov (United States)

    Brown, Gary C; Brown, Melissa M; Sharma, Sanjay; Brown, Heidi; Smithen, Lindsay; Leeser, David B; Beauchamp, George

    2004-01-01

    Purpose: To ascertain the extent to which ophthalmologic interventions have been evaluated in value-based medicine format. Methods: Retrospective literature review. Papers in the healthcare literature utilizing cost-utility analysis were reviewed by researchers at the Center for Value-Based Medicine, Flourtown, Pennsylvania. A literature review of papers addressing the cost-utility analysis of ophthalmologic procedures in the United States over the 12-year period from 1992 to 2003 was undertaken using the National Library of Medicine and EMBASE databases. The cost-utility of ophthalmologic interventions in inflation-adjusted (real) year 2003 US dollars expended per quality-adjusted life-year ($/QALY) was ascertained in all instances. Results: A total of 19 papers were found, covering a total of 25 interventions. The median cost-utility of ophthalmologic interventions was $5,219/QALY, with a range from $746/QALY to $6.5 million/QALY. Conclusions: The majority of ophthalmologic interventions are especially cost-effective by conventional standards. This is because of the substantial value that ophthalmologic interventions confer to patients with eye diseases for the resources expended. PMID:15747756

  19. Group analyses of connectivity-based cortical parcellation using repeated k-means clustering.

    Science.gov (United States)

    Nanetti, Luca; Cerliani, Leonardo; Gazzola, Valeria; Renken, Remco; Keysers, Christian

    2009-10-01

    K-means clustering has become a popular tool for connectivity-based cortical segmentation using Diffusion Weighted Imaging (DWI) data. A sometimes ignored issue is, however, that the output of the algorithm depends on the initial placement of starting points, and that different sets of starting points therefore could lead to different solutions. In this study we explore this issue. We apply k-means clustering a thousand times to the same DWI dataset collected in 10 individuals to segment two brain regions: the SMA-preSMA on the medial wall, and the insula. At the level of single subjects, we found that in both brain regions, repeatedly applying k-means indeed often leads to a variety of rather different cortical parcellations. By assessing the similarity and frequency of these different solutions, we show that approximately 256 k-means repetitions are needed to accurately estimate the distribution of possible solutions. Using nonparametric group statistics, we then propose a method to employ the variability of clustering solutions to assess the reliability with which certain voxels can be attributed to a particular cluster. In addition, we show that the proportion of voxels that can be attributed significantly to either cluster in the SMA and preSMA is relatively higher than in the insula and discuss how this difference may relate to differences in the anatomy of these regions.
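
The initialization-dependence the study quantifies can be reproduced with a toy k-means. This 1-D sketch (synthetic Gaussian data, not DWI connectivity profiles) repeats clustering under different random starts and measures how consistently two points land in the same cluster, a scalar analogue of the voxel-reliability idea:

```python
import random

def kmeans(points, k, rng, iters=20):
    """Minimal Lloyd's algorithm; the instability under study comes from
    the random choice of initial centers."""
    centers = rng.sample(points, k)
    for _ in range(iters):
        labels = [min(range(k), key=lambda j: (p - centers[j]) ** 2) for p in points]
        for j in range(k):
            members = [p for p, l in zip(points, labels) if l == j]
            if members:
                centers[j] = sum(members) / len(members)
    return labels

# Two well-separated 1-D clusters standing in for connectivity profiles.
rng = random.Random(1)
data = [rng.gauss(0, 0.5) for _ in range(30)] + [rng.gauss(5, 0.5) for _ in range(30)]

# Repeat clustering under 64 different random starts (the study used up to
# 1000 repetitions) and check how often points 0 and 1 are co-assigned.
runs = [kmeans(data, 2, random.Random(s)) for s in range(64)]
agree = sum(l[0] == l[1] for l in runs) / len(runs)
```

Co-assignment frequency is label-invariant (cluster numbering differs between runs, but "same cluster or not" does not), which is why it is a convenient stability measure.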

  20. A MULTI-AGENT BASED SOCIAL CRM FRAMEWORK FOR EXTRACTING AND ANALYSING OPINIONS

    Directory of Open Access Journals (Sweden)

    ABDELAZIZ EL FAZZIKI

    2017-08-01

    Full Text Available Social media provide a wide space for people from around the world to communicate, share knowledge and personal experiences. They are increasingly becoming an important data source for opinion mining and sentiment analysis, thanks to shared comments and reviews about products and services, and companies are showing a growing interest in harnessing their potential in order to support the setting up of marketing strategies. Despite the importance of sentiment analysis in decision making, there is a lack of social intelligence integration at the level of customer relationship management systems. Thus, social customer relationship management (SCRM) systems have become an interesting research area. However, they need deep analytic techniques to transform the large amount of data ("Big Data") into actionable insights. Such systems also require advanced modelling and data processing methods, and must consider the emerging paradigm of proactive systems. In this paper, we propose an agent-based social framework that extracts and consolidates the reviews expressed via social media, in order to help enterprises learn more about customers' opinions toward a particular product or service. To illustrate our approach, we present a case study of Twitter reviews that we use to extract opinions and sentiment about a set of products using the SentiGem API. Data extraction, analysis and storage are performed using a framework based on Hadoop MapReduce and HBase.

  1. Molecular genotyping of Colletotrichum species based on arbitrarily primed PCR, A + T-Rich DNA, and nuclear DNA analyses

    Science.gov (United States)

    Freeman, S.; Pham, M.; Rodriguez, R.J.

    1993-01-01

    Molecular genotyping of Colletotrichum species based on arbitrarily primed PCR, A + T-rich DNA, and nuclear DNA analyses. Experimental Mycology 17, 309-322. Isolates of Colletotrichum were grouped into 10 separate species based on arbitrarily primed PCR (ap-PCR), A + T-rich DNA (AT-DNA) and nuclear DNA banding patterns. In general, the grouping of Colletotrichum isolates by these molecular approaches corresponded to that obtained by classical taxonomic identification; however, some exceptions were observed. PCR amplification of genomic DNA using four different primers allowed reliable differentiation between isolates of the 10 species. HaeIII digestion patterns of AT-DNA also distinguished between species of Colletotrichum by generating species-specific band patterns. In addition, hybridization of the repetitive DNA element (GcpR1) to genomic DNA identified a unique set of PstI-digested nuclear DNA fragments in each of the 10 species of Colletotrichum tested. Multiple isolates of C. acutatum, C. coccodes, C. fragariae, C. lindemuthianum, C. magna, C. orbiculare, C. graminicola from maize, and C. graminicola from sorghum showed 86-100% intraspecies similarity based on ap-PCR and AT-DNA analyses. Interspecies similarity determined by ap-PCR and AT-DNA analyses varied between 0 and 33%. Three distinct banding patterns were detected in isolates of C. gloeosporioides from strawberry. Similarly, three different banding patterns were observed among isolates of C. musae from diseased banana.

  2. Safety assessment of historical masonry churches based on pre-assigned kinematic limit analysis, FE limit and pushover analyses

    Energy Technology Data Exchange (ETDEWEB)

    Milani, Gabriele, E-mail: milani@stru.polimi.it; Valente, Marco, E-mail: milani@stru.polimi.it [Department of Architecture, Built Environment and Construction Engineering (ABC), Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milan (Italy)

    2014-10-06

    This study presents some results of a comprehensive numerical analysis of three masonry churches damaged by the recent Emilia-Romagna (Italy) seismic events that occurred in May 2012. The numerical study comprises: (a) pushover analyses conducted with a commercial code, standard nonlinear material models and two different horizontal load distributions; (b) FE kinematic limit analyses performed using a non-commercial software based on a preliminary homogenization of the masonry materials and a subsequent limit analysis with triangular elements and interfaces; (c) kinematic limit analyses conducted in agreement with the Italian code and based on the a-priori assumption of preassigned failure mechanisms, where the masonry material is considered unable to withstand tensile stresses. All models are capable of giving information on the active failure mechanism and the base shear at failure, which, if properly made non-dimensional with the weight of the structure, also gives an indication of the horizontal peak ground acceleration causing the collapse of the church. The results obtained from all three models indicate that the collapse is usually due to the activation of partial mechanisms (apse, façade, lateral walls, etc.). Moreover, the horizontal peak ground acceleration associated with the collapse is considerably lower than that required in that seismic zone by the Italian code for ordinary buildings. These outcomes highlight that structural upgrading interventions would be extremely beneficial for the considerable reduction of the seismic vulnerability of such kinds of historical structures.

  3. Safety assessment of historical masonry churches based on pre-assigned kinematic limit analysis, FE limit and pushover analyses

    International Nuclear Information System (INIS)

    Milani, Gabriele; Valente, Marco

    2014-01-01

This study presents results of a comprehensive numerical analysis of three masonry churches damaged by the Emilia-Romagna (Italy) seismic events of May 2012. The numerical study comprises: (a) pushover analyses conducted with a commercial code, standard nonlinear material models and two different horizontal load distributions; (b) FE kinematic limit analyses performed using non-commercial software based on a preliminary homogenization of the masonry materials and a subsequent limit analysis with triangular elements and interfaces; (c) kinematic limit analyses conducted in agreement with the Italian code and based on the a priori assumption of pre-assigned failure mechanisms, where the masonry material is considered unable to withstand tensile stresses. All models give information on the active failure mechanism and the base shear at failure which, when made non-dimensional with the weight of the structure, also indicates the horizontal peak ground acceleration causing collapse of the church. The results obtained from all three models indicate that collapse is usually due to the activation of partial mechanisms (apse, façade, lateral walls, etc.). Moreover, the horizontal peak ground acceleration associated with collapse is far lower than that required in that seismic zone by the Italian code for ordinary buildings. These outcomes highlight that structural upgrading interventions would considerably reduce the seismic vulnerability of this kind of historical structure.

  4. Non-localization and localization ROC analyses using clinically based scoring

    Science.gov (United States)

    Paquerault, Sophie; Samuelson, Frank W.; Myers, Kyle J.; Smith, Robert C.

    2009-02-01

We are investigating the potential for differences in study conclusions when assessing the estimated impact of a computer-aided detection (CAD) system on readers' performance. The data utilized in this investigation were derived from a multi-reader multi-case observer study involving one hundred mammographic background images to which fixed-size and fixed-intensity Gaussian signals were added, generating low- and high-intensity signal sets. The study setting allowed CAD assessment in two situations: when CAD sensitivity was 1) superior or 2) lower than that of the average reader. Seven readers were asked to review each set in the unaided and CAD-aided reading modes and to mark and rate their findings. Using these data, we studied the effect on study conclusions of three clinically based receiver operating characteristic (ROC) scoring definitions, which included both location-specific and non-location-specific rules. The results showed agreement in the estimated impact of CAD on overall reader performance. In the study setting where CAD sensitivity is superior to the average reader, the mean difference in AUC between the CAD-aided read and unaided read was 0.049 (95%CIs: -0.027; 0.130) for the image scoring definition based on non-location-specific rules, and 0.104 (95%CIs: 0.036; 0.174) and 0.090 (95%CIs: 0.031; 0.155) for image scoring definitions based on location-specific rules. The increases in AUC were statistically significant for the location-specific scoring definitions. It was further observed that the variance of these estimates was reduced when using the location-specific scoring definitions compared to the non-location-specific scoring definition. In the study setting where CAD sensitivity is equivalent to or lower than that of the average reader, the mean differences in AUC are slightly above 0.01 for all image scoring definitions. These increases in AUC were not statistically significant for any of the image scoring definitions.
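The AUC differences and confidence intervals reported above can be estimated, for instance, with a Mann-Whitney AUC and a paired case-resampling bootstrap. This is a generic sketch of that style of analysis, not the study's actual scoring code; all data and parameter names are invented:

```python
import random

def auc(pos, neg):
    """Mann-Whitney AUC: P(score_pos > score_neg) + 0.5 * P(tie)."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def bootstrap_dauc(pos_a, pos_b, neg_a, neg_b, n_boot=2000, seed=1):
    """Paired bootstrap 95% CI for AUC(aided) - AUC(unaided).

    pos_*/neg_* are per-case ratings in the two reading modes, paired by index.
    """
    rng = random.Random(seed)
    diffs = []
    for _ in range(n_boot):
        pi = [rng.randrange(len(pos_a)) for _ in pos_a]   # resample signal cases
        ni = [rng.randrange(len(neg_a)) for _ in neg_a]   # resample normal cases
        pa, pb = [pos_a[i] for i in pi], [pos_b[i] for i in pi]
        na, nb = [neg_a[i] for i in ni], [neg_b[i] for i in ni]
        diffs.append(auc(pb, nb) - auc(pa, na))
    diffs.sort()
    return diffs[int(0.025 * n_boot)], diffs[int(0.975 * n_boot)]
```

A CI that excludes zero corresponds to the "statistically significant" increases reported for the location-specific definitions.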

5. Drive-based recording analyses at >800 Gfc/in² using shingled recording

    International Nuclear Information System (INIS)

    William Cross, R.; Montemorra, Michael

    2012-01-01

Since the introduction of perpendicular recording, conventional perpendicular scaling has enabled the hard disk drive industry to deliver products ranging from ∼130 to well over 500 Gb/in² in a little over 4 years. The incredible areal density growth spurt enabled by perpendicular recording is now endangered by an inability to effectively balance writeability with erasure effects at the system level. Shingled magnetic recording (SMR) offers an effective means to continue perpendicular areal density growth using conventional heads and tuned media designs. The use of specially designed edge-write head structures (also known as 'corner writers') should further increase the areal density gain potential for shingled recording. In this paper, we demonstrate the drive-based recording performance characteristics of a shingled recording system at areal densities in excess of 800 Gb/in² using a conventional head. Using a production drive base, developmental heads/media and a number of sophisticated analytical routines, we have studied the recording performance of a shingled magnetic recording subsystem. Our observations confirm excellent writeability in excess of 400 ktpi and a perpendicular system with acceptable noise balance, especially at extreme ID and OD skews where the benefits of SMR are quite pronounced. We believe that this demonstration illustrates that SMR is not only capable of productization, but is likely the path of least resistance toward production drive areal densities closer to 1 Tb/in² and beyond. - Research highlights: → Drive-based recording at 805 Gfc/in² has been demonstrated using both 95 and 65 mm drive platforms at roughly 430 ktpi and 1.87 Mfci. → Limiting factors for shingled recording include side reading, which is dominated by the reader crosstrack skirt profile, MT10 being a representative metric. → Media jitter and associated DC media SNR further limit areal density, dominated by crosstrack transition curvature, downtrack

  6. Identification of provenance rocks based on EPMA analyses of heavy minerals

    Science.gov (United States)

    Shimizu, M.; Sano, N.; Ueki, T.; Yonaga, Y.; Yasue, K. I.; Masakazu, N.

    2017-12-01

Information on mountain building is significant in the field of geological disposal of high-level radioactive waste, because it affects the long-term stability of the groundwater flow system. Provenance analysis is one of the effective approaches for understanding the building process of mountains. Chemical compositions of heavy minerals, as well as their chronological data, can serve as an index for identifying provenance rocks. Accurate identification requires the measurement of as many grains as possible. In order to achieve an efficient provenance analysis, we developed a method for quick identification of heavy minerals using an Electron Probe Micro Analyzer (EPMA). In this method, heavy mineral grains extracted from a sample were aligned on a glass slide and mounted in a resin. Concentrations of 28 elements were measured for 300-500 grains per sample using EPMA. To measure as many grains as possible, we prioritized swiftness of measurement over precision, configuring a measurement time of about 3.5 minutes for each grain. Identification of heavy minerals was based on their chemical composition. We developed a Microsoft® Excel® spreadsheet implementing mineral-identification criteria that use the typical range of chemical compositions for each mineral. Grains whose analytical totals exceeded 110 wt.% were rejected. The criteria of mineral identification were revised through comparison between mineral identification by optical microscopy and the chemical compositions of grains classified as "unknown minerals". Provenance rocks can be identified based on the abundance ratio of identified minerals. If no significant difference in abundance ratio was found among source rocks, the chemical composition of specific minerals was used as another index. This method was applied to sediments from some regions in Japan where provenance rocks had lithological variations but similar formation ages. Consequently, the provenance rocks were identified based on chemical compositions of heavy minerals resistant to
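The range-based identification scheme described above (a spreadsheet of typical composition windows per mineral) can be sketched as a rule table. The oxide windows below are invented placeholders, not the study's criteria:

```python
# Hypothetical wt.% windows per mineral; the real criteria live in the
# study's Excel sheet and cover many more oxides and minerals.
CRITERIA = {
    "zircon":   {"ZrO2": (55.0, 70.0), "SiO2": (28.0, 36.0)},
    "apatite":  {"CaO": (50.0, 58.0), "P2O5": (38.0, 45.0)},
    "ilmenite": {"TiO2": (45.0, 55.0), "FeO": (40.0, 50.0)},
}

def identify(analysis, criteria=CRITERIA):
    """Return the first mineral whose every oxide window contains the analysis.

    analysis: dict of oxide -> measured wt.%; missing oxides count as 0.
    """
    for mineral, windows in criteria.items():
        if all(lo <= analysis.get(oxide, 0.0) <= hi
               for oxide, (lo, hi) in windows.items()):
            return mineral
    return "unknown"

print(identify({"ZrO2": 64.1, "SiO2": 32.5}))  # → zircon
```

Grains that fall through every window end up as "unknown minerals", which is exactly the population the study used to revise its criteria.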

  7. Fuel assemblies mechanical behaviour improvements based on design changes and loading patterns computational analyses

    International Nuclear Information System (INIS)

    Marin, J.; Aullo, M.; Gutierrez, E.

    2001-01-01

In the past few years, incomplete RCCA insertion (IRI) events have been taking place at some nuclear plants. Large guide thimble distortion caused by high compressive loads, together with irradiation-induced material creep and growth, is considered the primary cause of those events. This disturbing phenomenon is worsened when some fuel assemblies are deformed to the extent that they push the neighbouring fuel assemblies and the distortion is transmitted along the core. In order to better understand this mechanism, ENUSA has developed a methodology based on finite element core simulation to enable assessment of the propensity of a given core loading pattern to propagate distortion along the core. At the same time, the core loading pattern can be decided in interaction with nuclear design to obtain the optimum response from both the nuclear and mechanical points of view, with the objective of progressively attenuating the core distortion. (author)

  8. [The genotype-based haplotype relative risk and transmission disequilibrium test analyses of familial febrile convulsions].

    Science.gov (United States)

    Qi, Y; Wu, X; Guo, Z; Zhang, J; Pan, H; Li, M; Bao, X; Peng, J; Zou, L; Lin, Q

    1999-10-01

To confirm the linkage of familial febrile convulsions to the short arm of chromosome 6 (6p) or the long arm of chromosome 8 (8q), the authors completed genotyping of the Pst I locus in the coding region of heat shock protein (HSP) 70, the 5' untranslated region of HSP70-1, the 3' untranslated region of HSP70-2, D8S84 and D8S85. The data were processed by the genotype-based haplotype relative risk (GHRR) and transmission disequilibrium test (TDT) methods in PPAP. Some signs of association and disequilibrium between D8S85 and FC were shown by GHRR and TDT. A suspected linkage of familial febrile convulsions to the long arm of chromosome 8 has been proposed.
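The TDT named above is a McNemar-type test: it compares how often heterozygous parents transmit a candidate allele to affected children versus how often they do not. A minimal sketch, with made-up transmission counts:

```python
def tdt_chi2(transmitted, not_transmitted):
    """TDT statistic for one allele, counted over heterozygous parents.

    chi2 = (b - c)^2 / (b + c), with 1 degree of freedom.
    """
    b, c = transmitted, not_transmitted
    return (b - c) ** 2 / (b + c)

# Hypothetical counts: allele transmitted 40 times, not transmitted 22 times
chi2 = tdt_chi2(40, 22)
print(f"TDT chi-square = {chi2:.2f}")  # 1 df; values > 3.84 are significant at 0.05
```

Under the null hypothesis of no linkage/association, transmission and non-transmission are equally likely, so b and c should be close.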

  9. Scenario-based analyses of energy system development and its environmental implications in Thailand

    International Nuclear Information System (INIS)

    Shrestha, Ram M.; Malla, Sunil; Liyanage, Migara H.

    2007-01-01

Thailand is one of the fastest growing energy-intensive economies in Southeast Asia. To formulate sound energy policies in the country, it is important to understand the impact of energy use on the environment over the long term. This study examines energy system development and its associated greenhouse gas and local air pollutant emissions under four scenarios in Thailand through the year 2050. The four scenarios involve different growth paths for the economy, population, energy efficiency and penetration of renewable energy technologies. The paper assesses the changes in primary energy supply mix, sector-wise final energy demand, energy import dependency and CO₂, SO₂ and NOₓ emissions under the four scenarios using the end-use-based Asia-Pacific Integrated Assessment Model (AIM/Enduse) for Thailand. (author)

  10. Dugong: a Docker image, based on Ubuntu Linux, focused on reproducibility and replicability for bioinformatics analyses.

    Science.gov (United States)

    Menegidio, Fabiano B; Jabes, Daniela L; Costa de Oliveira, Regina; Nunes, Luiz R

    2018-02-01

This manuscript introduces and describes Dugong, a Docker image based on Ubuntu 16.04, which automates installation of more than 3500 bioinformatics tools (along with their respective libraries and dependencies), in alternative computational environments. The software operates through a user-friendly XFCE4 graphic interface that allows software management and installation by users not fully familiarized with the Linux command line and provides the Jupyter Notebook to assist in the delivery and exchange of consistent and reproducible protocols and results across laboratories, assisting in the development of open science projects. Source code and instructions for local installation are available at https://github.com/DugongBioinformatics, under the MIT open source license. Luiz.nunes@ufabc.edu.br.

  11. Space nuclear-power reactor design based on combined neutronic and thermal-fluid analyses

    International Nuclear Information System (INIS)

    Koenig, D.R.; Gido, R.G.; Brandon, D.I.

    1985-01-01

    The design and performance analysis of a space nuclear-power system requires sophisticated analytical capabilities such as those developed during the nuclear rocket propulsion (Rover) program. In particular, optimizing the size of a space nuclear reactor for a given power level requires satisfying the conflicting requirements of nuclear criticality and heat removal. The optimization involves the determination of the coolant void (volume) fraction for which the reactor diameter is a minimum and temperature and structural limits are satisfied. A minimum exists because the critical diameter increases with increasing void fraction, whereas the reactor diameter needed to remove a specified power decreases with void fraction. The purpose of this presentation is to describe and demonstrate our analytical capability for the determination of minimum reactor size. The analysis is based on combining neutronic criticality calculations with OPTION-code thermal-fluid calculations
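The optimization described above can be sketched numerically: the criticality-limited diameter grows with coolant void fraction while the heat-removal-limited diameter shrinks, so the feasible diameter is the larger of the two and its minimum sits near their crossing. Both diameter functions below are invented linear placeholders, not the neutronic or OPTION-code models:

```python
def critical_diameter(void):
    """Hypothetical critical diameter (cm): grows as void fraction rises."""
    return 40.0 + 60.0 * void

def thermal_diameter(void):
    """Hypothetical heat-removal diameter (cm): shrinks as coolant area rises."""
    return 120.0 - 80.0 * void

def optimum(n=1000):
    """Scan void fractions; the feasible diameter is the max of both limits."""
    best_d, best_v = float("inf"), 0.0
    for i in range(n + 1):
        v = i / n
        d = max(critical_diameter(v), thermal_diameter(v))
        if d < best_d:
            best_d, best_v = d, v
    return best_d, best_v

d_min, v_opt = optimum()
print(f"minimum diameter ≈ {d_min:.1f} cm at void fraction ≈ {v_opt:.2f}")
```

With these placeholder curves the optimum lies where the two constraints intersect, which is the qualitative behaviour the abstract describes.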

  12. Quasi-static earthquake cycle simulation based on nonlinear viscoelastic finite element analyses

    Science.gov (United States)

    Agata, R.; Ichimura, T.; Hyodo, M.; Barbot, S.; Hori, T.

    2017-12-01

To explain earthquake generation processes, methods for simulating earthquake cycles have been studied. For such simulations, the combination of the rate- and state-dependent friction law at the fault plane and the boundary integral method based on Green's function in an elastic half space is widely used (e.g. Hori 2009; Barbot et al. 2012). In this approach, stress change around the fault plane due to crustal deformation can be computed analytically, while the effects of complex physics such as mantle rheology and gravity are generally not taken into account. To consider such effects, we seek to develop an earthquake cycle simulation combining crustal deformation computation based on the finite element (FE) method with the rate- and state-dependent friction law. Since the drawback of this approach is the computational cost of obtaining numerical solutions, we adopt a recently developed fast and scalable FE solver (Ichimura et al. 2016), designed for supercomputers, to solve the problem in realistic time. As in the previous approach, we solve the governing equations consisting of the rate- and state-dependent friction law. In solving the equations, we compute stress changes along the fault plane due to crustal deformation using FE simulation, instead of computing them by superimposing slip response functions as in the previous approach. In the stress change computation, we take into account nonlinear viscoelastic deformation in the asthenosphere. In the presentation, we will show simulation results for a normative three-dimensional problem, where a circular velocity-weakening area is set in a square fault plane. The results with and without nonlinear viscosity in the asthenosphere will be compared. We also plan to apply the developed code to simulate the post-earthquake deformation of a megathrust earthquake, such as the 2011 Tohoku earthquake.
Acknowledgment: The results were obtained using the K computer at the RIKEN (Proposal number
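The rate- and state-dependent friction law used above can be sketched in its common Dieterich (aging-law) form. Parameter values here are illustrative textbook-scale numbers, not those of the study; the state variable relaxes toward its steady state Dc/v at constant slip rate:

```python
import math

def simulate_state(v=1e-6, dc=1e-2, theta0=1.0, dt=10.0, steps=200000):
    """Integrate the aging law d(theta)/dt = 1 - v*theta/dc at constant slip rate v."""
    theta = theta0
    for _ in range(steps):
        theta += dt * (1.0 - v * theta / dc)
    return theta

def friction(v, theta, mu0=0.6, a=0.010, b=0.015, v0=1e-6, dc=1e-2):
    """Rate- and state-dependent friction coefficient:
    mu = mu0 + a*ln(v/v0) + b*ln(v0*theta/dc)."""
    return mu0 + a * math.log(v / v0) + b * math.log(v0 * theta / dc)

theta_ss = simulate_state()       # approaches dc/v = 1e4 s
mu_ss = friction(1e-6, theta_ss)  # steady-state friction at v = v0
print(f"theta_ss ≈ {theta_ss:.1f} s, mu_ss ≈ {mu_ss:.3f}")
```

With b > a (as here) the patch is velocity-weakening, the property assigned to the circular nucleation area in the benchmark problem.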

  13. Actual situation analyses of rat-run traffic on community streets based on car probe data

    Science.gov (United States)

    Sakuragi, Yuki; Matsuo, Kojiro; Sugiki, Nao

    2017-10-01

Reducing so-called "rat-run" traffic on community streets has been one of the significant challenges in improving the living environment of neighborhoods. However, it has been difficult to quantitatively grasp the actual situation of rat-run traffic through traditional surveys such as point observations. This study aims to develop a method for extracting rat-run traffic from car probe data. In addition, based on the rat-run traffic extracted in Toyohashi city, Japan, we analyze its actual situation, such as the time and location distribution of the rat-run traffic. As a result, in Toyohashi city, the rate of rat-run route use increases during peak periods. Focusing on the location distribution of rat-run traffic, moreover, such trips pass through a wide variety of community streets, and there is no great inter-district bias in the routes frequently used for rat-running. Next, we focused on trips passing through a route heavily used for rat-running. We found that drivers may use the route habitually, because their trips had some commonalities: they tend to choose the rat-run route because it is shorter than the alternative highway route, and their travel speeds were faster than on the alternative highway route. In conclusion, we confirmed that the proposed method can quantitatively grasp the actual situation and the phenomenal tendencies of rat-run traffic.
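One simple way to extract rat-run candidates from map-matched probe data is to flag trips that dip from an arterial onto community streets and back onto an arterial. This is a generic sketch, not the paper's algorithm; the road classes and the minimum-link threshold are assumptions:

```python
def is_rat_run(trip, min_links=2):
    """Flag a probe trip that dips onto community streets between arterial links.

    trip: list of (link_id, road_class) with road_class in {"arterial", "community"}.
    A rat-run here = arterial -> community ... -> arterial with at least
    `min_links` community links in between (threshold is illustrative).
    """
    run = 0
    seen_arterial = False
    for _, road_class in trip:
        if road_class == "arterial":
            if seen_arterial and run >= min_links:
                return True
            seen_arterial, run = True, 0
        else:
            run += 1 if seen_arterial else 0
    return False

trip = [("a1", "arterial"), ("c1", "community"),
        ("c2", "community"), ("a2", "arterial")]
print(is_rat_run(trip))  # → True
```

Counting flagged trips per community link and per time-of-day bin then yields the kind of time and location distributions the study reports.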

  14. Pattern Analyses Reveal Separate Experience-Based Fear Memories in the Human Right Amygdala.

    Science.gov (United States)

    Braem, Senne; De Houwer, Jan; Demanet, Jelle; Yuen, Kenneth S L; Kalisch, Raffael; Brass, Marcel

    2017-08-23

Learning fear via the experience of contingencies between a conditioned stimulus (CS) and an aversive unconditioned stimulus (US) is often assumed to be fundamentally different from learning fear via instructions. An open question is whether fear-related brain areas respond differently to experienced CS-US contingencies than to merely instructed CS-US contingencies. Here, we contrasted two experimental conditions where subjects were instructed to expect the same CS-US contingencies while only one condition was characterized by prior experience with the CS-US contingency. Using multivoxel pattern analysis of fMRI data, we found CS-related neural activation patterns in the right amygdala (but not in other fear-related regions) that dissociated between whether a CS-US contingency had been instructed and experienced versus merely instructed. A second experiment further corroborated this finding by showing a category-independent neural response to instructed and experienced, but not merely instructed, CS presentations in the human right amygdala. Together, these findings are in line with previous studies showing that verbal fear instructions have a strong impact on both brain and behavior. However, even in the face of fear instructions, the human right amygdala still shows a separable neural pattern response to experience-based fear contingencies. SIGNIFICANCE STATEMENT In our study, we addressed a fundamental problem of the science of human fear learning and memory, namely whether fear learning via experience in humans relies on a neural pathway that can be separated from fear learning via verbal information. Using two new procedures and recent advances in the analysis of brain imaging data, we localized purely experience-based fear processing and memory in the right amygdala, thereby making a direct link between human and animal research. Copyright © 2017 the authors.

  15. Model-based analyses to compare health and economic outcomes of cancer control: inclusion of disparities.

    Science.gov (United States)

    Goldie, Sue J; Daniels, Norman

    2011-09-21

    Disease simulation models of the health and economic consequences of different prevention and treatment strategies can guide policy decisions about cancer control. However, models that also consider health disparities can identify strategies that improve both population health and its equitable distribution. We devised a typology of cancer disparities that considers types of inequalities among black, white, and Hispanic populations across different cancers and characteristics important for near-term policy discussions. We illustrated the typology in the specific example of cervical cancer using an existing disease simulation model calibrated to clinical, epidemiological, and cost data for the United States. We calculated average reduction in cancer incidence overall and for black, white, and Hispanic women under five different prevention strategies (Strategies A1, A2, A3, B, and C) and estimated average costs and life expectancy per woman, and the cost-effectiveness ratio for each strategy. Strategies that may provide greater aggregate health benefit than existing options may also exacerbate disparities. Combining human papillomavirus vaccination (Strategy A2) with current cervical cancer screening patterns (Strategy A1) resulted in an average reduction of 69% in cancer incidence overall but a 71.6% reduction for white women, 68.3% for black women, and 63.9% for Hispanic women. Other strategies targeting risk-based screening to racial and ethnic minorities reduced disparities among racial subgroups and resulted in more equitable distribution of benefits among subgroups (reduction in cervical cancer incidence, white vs. Hispanic women, 69.7% vs. 70.1%). Strategies that employ targeted risk-based screening and new screening algorithms, with or without vaccination (Strategies B and C), provide excellent value. The most effective strategy (Strategy C) had a cost-effectiveness ratio of $28,200 per year of life saved when compared with the same strategy without
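The cost-effectiveness ratio quoted above (dollars per year of life saved) is an incremental ratio: the extra cost of a strategy divided by the extra life expectancy it buys relative to a comparator. A sketch of the arithmetic, with made-up per-woman inputs chosen only to reproduce a figure of the study's magnitude:

```python
def icer(cost_new, ly_new, cost_old, ly_old):
    """Incremental cost-effectiveness ratio: extra cost per life-year saved."""
    return (cost_new - cost_old) / (ly_new - ly_old)

# Hypothetical per-woman averages (the $28,200/YLS figure is from the study;
# these four inputs are invented to illustrate the calculation)
ratio = icer(cost_new=1500.0, ly_new=28.85, cost_old=1218.0, ly_old=28.84)
print(f"${ratio:,.0f} per year of life saved")
```

Because both numerator and denominator are small differences of large averages, such ratios are sensitive to model calibration, which is why the study reports them alongside subgroup-specific incidence reductions.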

  16. Fatigue Crack Propagation Under Variable Amplitude Loading Analyses Based on Plastic Energy Approach

    Directory of Open Access Journals (Sweden)

    Sofiane Maachou

    2014-04-01

Full Text Available Plasticity effects at the crack tip have long been recognized as the "motor" of crack propagation: the growth of cracks is related to the existence of a crack-tip plastic zone, whose formation and intensification is accompanied by energy dissipation. In the current state of knowledge, fatigue crack propagation is modeled using the crack closure concept. The fatigue crack growth behavior of aluminum alloy 2024 T351 under constant amplitude and variable amplitude loading is analyzed in terms of energy parameters. In the case of VAL (variable amplitude loading) tests, the evolution of the hysteretic energy dissipated per block is shown to be similar to that observed under constant amplitude loading. A linear relationship between the crack growth rate and the hysteretic energy dissipated per block is obtained at high growth rates. For lower growth rates, the relationship between crack growth rate and hysteretic energy dissipated per block can be represented by a power law. In this paper, an analysis of fatigue crack propagation under variable amplitude loading based on the energy approach is proposed.
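A power-law relation like the one reported above between crack growth rate and dissipated hysteretic energy is typically fitted by linear least squares in log-log space. A minimal sketch on synthetic data (the coefficient and exponent below are invented, not the paper's values):

```python
import math

def fit_power_law(x, y):
    """Least-squares fit of y = C * x**m in log-log space; returns (C, m)."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(x)
    sx, sy = sum(lx), sum(ly)
    sxx = sum(v * v for v in lx)
    sxy = sum(a * b for a, b in zip(lx, ly))
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    log_c = (sy - m * sx) / n
    return math.exp(log_c), m

# Synthetic data: da/dN = 2e-7 * Q**1.5 (hypothetical units and constants)
q = [10.0, 20.0, 40.0, 80.0]
dadn = [2e-7 * v ** 1.5 for v in q]
c, m = fit_power_law(q, dadn)
print(f"C ≈ {c:.2e}, m ≈ {m:.2f}")
```

On noise-free synthetic data the fit recovers the generating constants exactly; on measured da/dN data the same fit gives the power-law regime the abstract describes at low growth rates.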

  17. A method of mounting multiple otoliths for beam-based microchemical analyses

    Science.gov (United States)

    Donohoe, C.J.; Zimmerman, C.E.

    2010-01-01

Beam-based analytical methods are widely used to measure the concentrations of elements and isotopes in otoliths. These methods usually require that otoliths be individually mounted and prepared to properly expose the desired growth region to the analytical beam. Most analytical instruments, such as LA-ICPMS and ion and electron microprobes, have sample holders that accept only one to six slides or mounts at a time. We describe a method of mounting otoliths that allows easy transfer of many otoliths to a single mount after they have been prepared. Such an approach increases the number of otoliths that can be analyzed in a single session by reducing the need to open the sample chamber to exchange slides, a particularly time-consuming step on instruments that operate under vacuum. For ion and electron microprobes, the method also greatly reduces the number of slides that must be coated with an electrical conductor prior to analysis. In this method, a narrow strip of cover glass is first glued at one end to a standard microscope slide. The otolith is then mounted in thermoplastic resin on the opposite, free end of the strip. The otolith can then be ground and flipped, if needed, by reheating the mounting medium. After otolith preparation is complete, the cover glass is cut with a scribe to free the otolith, and up to 20 small otoliths can be arranged on a single petrographic slide. © 2010 The Author(s).

  18. Beam transient analyses of Accelerator Driven Subcritical Reactors based on neutron transport method

    Energy Technology Data Exchange (ETDEWEB)

    He, Mingtao; Wu, Hongchun [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049, Shaanxi (China); Zheng, Youqi, E-mail: yqzheng@mail.xjtu.edu.cn [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049, Shaanxi (China); Wang, Kunpeng [Nuclear and Radiation Safety Center, PO Box 8088, Beijing 100082 (China); Li, Xunzhao; Zhou, Shengcheng [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049, Shaanxi (China)

    2015-12-15

Highlights: • A transport-based kinetics code for Accelerator Driven Subcritical Reactors is developed. • The performance of different kinetics methods adapted to the ADSR is investigated. • The impacts of neutronic parameters deteriorating with fuel depletion are investigated. - Abstract: The Accelerator Driven Subcritical Reactor (ADSR) is almost entirely dominated by the external source, since there is no additional reactivity control mechanism in most designs. This paper focuses on beam-induced transients, studied with an in-house dynamic analysis code. The performance of different kinetics methods adapted to the ADSR is investigated, including the point kinetics approximation and space-time kinetics methods. Then, the transient responses to beam trip and beam overpower are calculated and analyzed for an ADSR design dedicated to minor actinide transmutation. The impacts of some safety-related neutronics parameters deteriorating with fuel depletion are also investigated. The results show that the power distribution varying with burnup leads to large differences in temperature responses during transients, while the impacts of kinetic parameters and feedback coefficients are less obvious. Classification: Core physics.
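The point kinetics approximation mentioned above can be illustrated for a beam trip with a one-delayed-group, source-driven model. All parameter values are illustrative, not from the paper's ADSR design; the source term is switched off at t = 0 and the flux collapses toward the level sustained by delayed-neutron precursors:

```python
def beam_trip(rho=-0.03, beta=0.0065, lam=0.08, Lam=1e-6,
              S=1e5, t_end=2.0, dt=1e-5):
    """One-delayed-group point kinetics of a source-driven subcritical core.

    dn/dt = (rho - beta)/Lam * n + lam*c + S
    dc/dt = beta/Lam * n - lam*c
    The external source S is removed at t = 0 (beam trip); returns n(t_end).
    """
    n = -S * Lam / rho           # source-driven steady-state flux amplitude
    c = beta * n / (lam * Lam)   # corresponding precursor concentration
    t = 0.0
    while t < t_end:             # explicit Euler; S = 0 after the trip
        dn = ((rho - beta) / Lam * n + lam * c) * dt
        dc = (beta / Lam * n - lam * c) * dt
        n, c, t = n + dn, c + dc, t + dt
    return n

print(f"flux 2 s after beam trip: {beam_trip():.3f} (from {1e5 * 1e-6 / 0.03:.3f})")
```

The prompt jump downward followed by a slow, precursor-governed decay is the characteristic beam-trip response that drives the thermal transients analyzed in the paper.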

19. Three-dimensional finite element model for flexible pavement analyses based on field modulus measurements

    International Nuclear Information System (INIS)

    Lacey, G.; Thenoux, G.; Rodriguez-Roa, F.

    2008-01-01

In accordance with the present development of empirical-mechanistic tools, this paper presents an alternative to traditional analysis methods for flexible pavements, using a three-dimensional finite element formulation based on a linear-elastic perfectly-plastic Drucker-Prager model for granular soil layers and a linear-elastic stress-strain law for the asphalt layer. From the sensitivity analysis performed, it was found that variations of ±4° in the internal friction angle of granular soil layers did not significantly affect the analyzed pavement response. On the other hand, a null dilation angle is conservatively proposed for design purposes. The use of a Light Falling Weight Deflectometer is also proposed as an effective and practical tool for on-site elastic modulus determination of granular soil layers. However, the stiffness value obtained from the tested layer should be corrected when the measured peak deflection and the peak force do not occur at the same time. In addition, some practical observations are given to achieve successful field measurements. The importance of using a 3D FE analysis to predict the maximum tensile strain at the bottom of the asphalt layer (related to pavement fatigue) and the maximum vertical compressive strain transmitted to the top of the granular soil layers (related to rutting) is also shown. (author)

  20. [Health risks in different living circumstances of mothers. Analyses based on a population study].

    Science.gov (United States)

    Sperlich, Stefanie

    2014-12-01

The objective of this study was to determine the living circumstances ('Lebenslagen') of mothers that are associated with elevated health risks. Data were derived from a cross-sectional population-based sample of German women (n = 3129) with underage children. By means of a two-step cluster analysis, ten different maternal living circumstances were identified which proved to be distinct with respect to indicators of socioeconomic position, employment status and family-related factors. Of the ten living circumstances, one could be attributed to higher socioeconomic status (SES), while five were assigned to a middle SES and four to a lower SES. In line with previous findings, mothers with a high SES predominantly showed the best health, while mothers with a low SES tended to be at higher health risk with respect to subjective health, mental health (anxiety and depression), obesity and smoking. However, there were important health differences between the living circumstances within the middle and lower SES. In addition, varying health risks were found among different living circumstances of single mothers, pointing to the significance of family- and job-related living conditions in establishing health risks. With this exploratory analysis strategy, small-scale living conditions associated with specific health risks could be detected. This approach seems particularly suitable for providing a more precise definition of target groups for health promotion. The findings encourage a more extensive application of the concept of living conditions in medical sociology research as well as in health monitoring.

  1. Pareto frontier analyses based decision making tool for transportation of hazardous waste

    International Nuclear Information System (INIS)

    Das, Arup; Mazumder, T.N.; Gupta, A.K.

    2012-01-01

Highlights: ► A posteriori method using a multi-objective approach to solve a bi-objective routing problem. ► System optimization (with multiple source–destination pairs) in a capacity-constrained network using non-dominated sorting. ► Tools like cost elasticity and angle-based focus used to analyze the Pareto frontier and help stakeholders make informed decisions. ► A real-life case study of the Kolkata Metropolitan Area to explain the workability of the model. - Abstract: Transportation of hazardous wastes through a region poses an immense threat to development along its road network. The risk to the population exposed to such activities has been documented in the past. However, a comprehensive framework for routing hazardous wastes has often been overlooked. A regional hazardous waste management scheme should incorporate a comprehensive framework for hazardous waste transportation, one that involves the various stakeholders in decision making. Hence, a multi-objective approach is required to safeguard the interests of all the concerned stakeholders. The objective of this study is to design a methodology for routing hazardous wastes between generating units and disposal facilities through a capacity-constrained network. The proposed methodology uses an a posteriori multi-objective approach to find non-dominated solutions for a system consisting of multiple origins and destinations. A case study of transportation of hazardous wastes in the Kolkata Metropolitan Area is provided to elucidate the methodology.
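The non-dominated sorting named above can be sketched for a bi-objective (cost, risk) routing problem: a route stays on the Pareto frontier unless some other route is at least as good in both objectives and different. The route data below are invented:

```python
def pareto_front(routes):
    """Keep routes not dominated in (cost, risk); lower is better in both."""
    front = []
    for r in routes:
        dominated = any(
            o["cost"] <= r["cost"] and o["risk"] <= r["risk"] and o != r
            for o in routes
        )
        if not dominated:
            front.append(r)
    return front

# Hypothetical candidate routes between one source-destination pair
routes = [
    {"id": "R1", "cost": 10.0, "risk": 0.9},
    {"id": "R2", "cost": 14.0, "risk": 0.4},
    {"id": "R3", "cost": 15.0, "risk": 0.8},  # worse than R2 in both objectives
]
print([r["id"] for r in pareto_front(routes)])  # → ['R1', 'R2']
```

Measures such as cost elasticity along the resulting frontier then let stakeholders trade a small cost increase for a large risk reduction, which is the decision-support role the study assigns to its Pareto analysis.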

  2. UniPrimer: A Web-Based Primer Design Tool for Comparative Analyses of Primate Genomes

    Directory of Open Access Journals (Sweden)

    Nomin Batnyam

    2012-01-01

    Full Text Available Whole genome sequences of various primates have been released thanks to advances in DNA-sequencing technology. A combination of computational data mining and polymerase chain reaction (PCR) assays to validate the data is an excellent method for conducting comparative genomics. Thus, designing primers for PCR is an essential step in a comparative analysis of primate genomes. Here, we developed and introduce UniPrimer for use in such studies. UniPrimer is a web-based tool that designs PCR- and DNA-sequencing primers. It compares sequences from six different primates (human, chimpanzee, gorilla, orangutan, gibbon, and rhesus macaque) and designs primers in regions conserved across species. UniPrimer is linked to the RepeatMasker, Primer3Plus, and OligoCalc software to produce primers with high accuracy, and to UCSC In-Silico PCR to confirm whether the designed primers work. To test the performance of UniPrimer, we designed primers on sample sequences using UniPrimer and manually designed primers for the same sequences. The comparison of the two processes showed that UniPrimer was more effective than manual work in terms of saving time and reducing errors.
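    The core idea of placing primers only where all species agree can be sketched as a scan for fully conserved, gap-free windows in a multiple alignment. This is a toy illustration with made-up sequences; UniPrimer's real pipeline additionally masks repeats and scores candidates with Primer3Plus and OligoCalc.

```python
def conserved_windows(aligned, size):
    """Yield (start, subsequence) for every window of `size` bases that is
    identical across all aligned sequences and contains no gaps."""
    length = len(aligned[0])
    for i in range(length - size + 1):
        cols = {seq[i:i + size] for seq in aligned}
        if len(cols) == 1 and '-' not in aligned[0][i:i + size]:
            yield i, aligned[0][i:i + size]

# Hypothetical aligned fragments from three primates
alignment = [
    "ATGCCGTA-ACGTTGCA",
    "ATGCCGTAAACGTTGCA",
    "ATGCCGTA-ACGTTGCA",
]
windows = list(conserved_windows(alignment, 6))
```

    Windows overlapping the indel column are rejected; the survivors are candidate primer-binding sites.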

  3. Analyses of Large Coal-Based SOFCs for High Power Stack Block Development

    Energy Technology Data Exchange (ETDEWEB)

    Recknagle, Kurtis P; Koeppel, Brian J

    2010-10-01

    This report summarizes the numerical modeling and analytical efforts for SOFC stack development performed for the coal-based SOFC program. The stack modeling activities began in 2004, but this report focuses on the most relevant results obtained since August 2008. This includes the latter half of Phase-I and all of Phase-II activities under technical guidance of VPS and FCE. The models developed to predict the thermal-flow-electrochemical behaviors and thermal-mechanical responses of generic planar stacks and towers are described. The effects of cell geometry, fuel gas composition, on-cell reforming, operating conditions, cell performance, seal leak, voltage degradation, boundary conditions, and stack height are studied. The modeling activities to evaluate and achieve technical targets for large stack blocks are described, and results from the latest thermal-fluid-electrochemical and structural models are summarized. Modeling results for stack modifications such as scale-up and component thickness reduction to realize cost reduction are presented. Supporting modeling activities in the areas of cell fabrication and loss of contact are also described.

  4. Metabolonote: A wiki-based database for managing hierarchical metadata of metabolome analyses

    Directory of Open Access Journals (Sweden)

    Takeshi eAra

    2015-04-01

    Full Text Available Metabolomics—technology for comprehensive detection of small molecules in an organism—lags behind the other omics in terms of publication and dissemination of experimental data. Among the reasons for this are the difficulty of precisely recording information about complicated analytical experiments (metadata), the existence of various databases with their own metadata descriptions, and the low reusability of published data, all of which leave submitters (the researchers who generate the data) insufficiently motivated. To tackle these issues, we developed Metabolonote, a Semantic MediaWiki-based database designed specifically for managing metabolomic metadata. We also defined a metadata and data description format, called TogoMD, with an ID system that is required for unique access to each level of the tree-structured metadata, such as study purpose, sample, analytical method, and data analysis. Separation of the management of metadata from that of data, and permission to attach related information to the metadata, provide advantages for submitters, readers, and database developers. The metadata are enriched with information such as links to comparable data, thereby functioning as a hub of related data resources. They also enhance not only readers' understanding and use of data, but also submitters' motivation to publish the data. The metadata are computationally shared among other systems via APIs, which facilitates the construction of novel databases by database developers. A permission system that allows publication of immature metadata and feedback from readers also helps submitters to improve their metadata. Hence, this aspect of Metabolonote, as a metadata preparation tool, is complementary to high-quality and persistent data repositories such as MetaboLights. A total of 808 metadata records for analyzed data obtained from 35 biological species are currently published.
Metabolonote and related tools are available free of cost at http://metabolonote.kazusa.or.jp/.

  5. Population genomic analyses based on 1 million SNPs in commercial egg layers.

    Directory of Open Access Journals (Sweden)

    Mahmood Gholami

    Full Text Available Identifying signatures of selection can provide valuable insight into the genes or genomic regions that are or have been under selective pressure, which can lead to a better understanding of genotype-phenotype relationships. A common strategy for selection signature detection is to compare samples from several populations and search for genomic regions with outstanding genetic differentiation. Wright's fixation index, FST, is a useful measure of genetic differentiation between populations. The aim of this study was to detect selection signatures between different chicken groups based on SNP-wise FST calculation. A total of 96 individuals of three commercial layer breeds and 14 non-commercial fancy breeds were genotyped with three different 600K SNP chips. After filtering, a total of 1 million SNPs were available for FST calculation. Average FST values were calculated over overlapping windows, and comparisons were then conducted between commercial egg layers and non-commercial fancy breeds, as well as between white egg layers and brown egg layers. Comparing non-commercial and commercial breeds resulted in the detection of 630 selection signatures, while 656 selection signatures were detected in the comparison between the commercial egg-layer breeds. Annotation of the selection signature regions revealed various genes corresponding to production traits for which layer breeds were selected. Among them were NCOA1, SREBF2 and RALGAPA1, associated with reproductive traits, broodiness and egg production. Furthermore, several of the detected genes were associated with growth and carcass traits, including POMC, PRKAB2, SPP1, IGF2, CAPN1, TGFb2 and IGFBP2. Our approach demonstrates that including different populations with a specific breeding history can provide a unique opportunity for a better understanding of farm animal selection.
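    The SNP-wise FST plus windowed-averaging scheme can be sketched as below. The allele frequencies are hypothetical, and a simple heterozygosity-based FST estimator with equal population weights is used here; the study's actual estimator and window sizes may differ.

```python
def snp_fst(p1, p2):
    """Wright's FST for one biallelic SNP, from the reference-allele
    frequencies in two equally weighted populations: (HT - HS) / HT."""
    p_bar = (p1 + p2) / 2
    ht = 2 * p_bar * (1 - p_bar)                       # total heterozygosity
    hs = (2 * p1 * (1 - p1) + 2 * p2 * (1 - p2)) / 2   # mean subpopulation heterozygosity
    return 0.0 if ht == 0 else (ht - hs) / ht

def window_means(values, size, step):
    """Average per-SNP FST over overlapping windows of `size` SNPs."""
    return [sum(values[i:i + size]) / size
            for i in range(0, len(values) - size + 1, step)]

# Hypothetical allele frequencies in white vs. brown egg layers
freqs = [(0.1, 0.9), (0.5, 0.5), (0.2, 0.8), (0.0, 1.0)]
fsts = [snp_fst(p1, p2) for p1, p2 in freqs]
windows = window_means(fsts, size=2, step=1)   # three overlapping windows
```

    A fixed SNP (frequencies 0.0 vs. 1.0) gives FST = 1, identical frequencies give 0, and windows with consistently high averages flag candidate selection signatures.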

  6. How and for whom does web-based acceptance and commitment therapy work? Mediation and moderation analyses of web-based ACT for depressive symptoms.

    Science.gov (United States)

    Pots, Wendy T M; Trompetter, Hester R; Schreurs, Karlein M G; Bohlmeijer, Ernst T

    2016-05-23

    Acceptance and Commitment Therapy (ACT) has been demonstrated to be effective in reducing depressive symptoms. However, little is known about how and for whom therapeutic change occurs, specifically in web-based interventions. This study focuses on the mediators, moderators and predictors of change during a web-based ACT intervention. Data from 236 adults from the general population with mild to moderate depressive symptoms, randomized to either web-based ACT (n = 82) or one of two control conditions (web-based Expressive Writing (EW; n = 67) and a waiting list (n = 87)), were analysed. Single and multiple mediation analyses and exploratory linear regression analyses were performed using PROCESS and linear regression to examine mediators, moderators and predictors of pre- to post-treatment and follow-up change in depressive symptoms. The treatment effect of ACT versus the waiting list was mediated by psychological flexibility and two mindfulness facets. The treatment effect of ACT versus EW was not significantly mediated. The moderator analyses demonstrated that the effects of web-based ACT did not vary according to baseline patient characteristics when compared to both control groups. However, higher baseline depressive symptoms and positive mental health and lower baseline anxiety were identified as predictors of outcome across all conditions. Similar results were found at follow-up. The findings of this study corroborate the evidence that psychological flexibility and mindfulness are distinct process mechanisms that mediate the effects of a web-based ACT intervention. The results indicate that there are no restrictions on the allocation of a web-based ACT intervention and that web-based ACT can work for different subpopulations. Netherlands Trial Register NTR2736. Registered 6 February 2011.
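    The logic of a single mediation analysis can be sketched with the product-of-coefficients approach on simulated data. All numbers below are hypothetical, and PROCESS additionally bootstraps confidence intervals for the indirect effect; this only illustrates the two regression paths.

```python
import numpy as np

def ols_slope(x, y, covariate=None):
    """Slope of x in an OLS regression of y on x (plus an optional covariate)."""
    cols = [np.ones_like(x), x]
    if covariate is not None:
        cols.append(covariate)
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(beta[1])

# Simulated trial: treatment raises the mediator (psychological flexibility),
# which in turn lowers depressive symptoms.
rng = np.random.default_rng(0)
treat = rng.integers(0, 2, 500).astype(float)    # 0 = control, 1 = ACT
flex = 0.8 * treat + rng.normal(0, 0.5, 500)     # mediator
dep = -0.6 * flex + rng.normal(0, 0.5, 500)      # outcome

a = ols_slope(treat, flex)                  # path a: treatment -> mediator
b = ols_slope(flex, dep, covariate=treat)   # path b: mediator -> outcome, given treatment
indirect = a * b                            # mediated (indirect) effect, negative here
```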

  7. Analysing Amazonian forest productivity using a new individual and trait-based model (TFS v.1)

    Science.gov (United States)

    Fyllas, N. M.; Gloor, E.; Mercado, L. M.; Sitch, S.; Quesada, C. A.; Domingues, T. F.; Galbraith, D. R.; Torre-Lezama, A.; Vilanova, E.; Ramírez-Angulo, H.; Higuchi, N.; Neill, D. A.; Silveira, M.; Ferreira, L.; Aymard C., G. A.; Malhi, Y.; Phillips, O. L.; Lloyd, J.

    2014-07-01

    Repeated long-term censuses have revealed large-scale spatial patterns in Amazon basin forest structure and dynamism, with some forests in the west of the basin having up to twice as high a rate of aboveground biomass production and tree recruitment as forests in the east. Possible causes for this variation could be the climatic and edaphic gradients across the basin and/or the spatial distribution of tree species composition. To help understand the causes of this variation, a new individual-based model of tropical forest growth, designed to take full advantage of the forest census data available from the Amazonian Forest Inventory Network (RAINFOR), has been developed. The model allows for within-stand variations in tree size distribution and key functional traits, and for between-stand differences in climate and soil physical and chemical properties. It runs at the stand level with four functional traits - leaf dry mass per area (Ma), leaf nitrogen (NL) and phosphorus (PL) content and wood density (DW) - varying from tree to tree in a way that replicates the observed continua found within each stand. We first applied the model to validate canopy-level water fluxes at three eddy covariance flux measurement sites. For all three sites the canopy-level water fluxes were adequately simulated. We then applied the model at seven plots where intensive measurements of carbon allocation are available. Tree-by-tree multi-annual growth rates generally agreed well with observations for small trees, but deviations were identified for larger trees. At the stand level, simulations at 40 plots were used to explore the influence of climate and soil nutrient availability on the gross (ΠG) and net (ΠN) primary production rates as well as the carbon use efficiency (CU). Simulated ΠG, ΠN and CU were not associated with temperature. On the other hand, all three measures of stand-level productivity were positively related to both mean annual precipitation and soil nutrient status.

  8. Investigation of a wet ethanol operated HCCI engine based on first and second law analyses

    International Nuclear Information System (INIS)

    Khaliq, Abdul; Trivedi, Shailesh K.; Dincer, Ibrahim

    2011-01-01

    are in the HCCI engine (around 89%) followed by fuel vaporizer (4.9%) and catalytic converter (4.5%). → Based on simulation results, it is found that second law efficiency of wet ethanol operated HCCI engine is higher than the pure ethanol fuelled HCCI engine.

  9. Performance Analyses of Counter-Flow Closed Wet Cooling Towers Based on a Simplified Calculation Method

    Directory of Open Access Journals (Sweden)

    Xiaoqing Wei

    2017-02-01

    Full Text Available As one of the most widely used units in water cooling systems, closed wet cooling towers (CWCTs) have two typical counter-flow constructions, in which the spray water flows from the top to the bottom, and the moist air and cooling water flow in the opposite direction vertically (parallel) or horizontally (cross), respectively. This study aims to present a simplified calculation method for conveniently and accurately analyzing the thermal performance of the two types of counter-flow CWCTs, viz. the parallel counter-flow CWCT (PCFCWCT) and the cross counter-flow CWCT (CCFCWCT). A simplified cooling capacity model that includes just two characteristic parameters is developed. The Levenberg–Marquardt method is employed to determine the model parameters by curve fitting of experimental data. Based on the proposed model, the predicted outlet temperatures of the process water are compared with the measurements of a PCFCWCT and a CCFCWCT, respectively, reported in the literature. The results indicate that the predicted values agree well with the experimental data in previous studies. The maximum absolute errors in predicting the process water outlet temperatures are 0.20 and 0.24 °C for the PCFCWCT and CCFCWCT, respectively. These results indicate that the simplified method is reliable for performance prediction of counter-flow CWCTs. Although the flow patterns of the two towers are different, the variation trends of thermal performance are similar to each other under various operating conditions. The inlet air wet-bulb temperature, inlet cooling water temperature, air flow rate, and cooling water flow rate are crucial for determining the cooling capacity of a counter-flow CWCT, while the cooling tower effectiveness is mainly determined by the flow rates of air and cooling water. Compared with the CCFCWCT, the PCFCWCT is much more applicable in a large-scale cooling water system, and the superiority would be amplified when the scale of water
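    The Levenberg–Marquardt fitting step for a two-parameter model can be sketched as below. The power-law model form and all measurements are hypothetical stand-ins (the paper's actual cooling capacity model is not reproduced here); only the damped Gauss–Newton iteration is illustrated.

```python
import numpy as np

def fit_lm(x, y, c=(5.0, 0.5), lam=1e-3, iters=100):
    """Minimal Levenberg-Marquardt fit of the model y = c1 * x ** c2."""
    c = np.array(c, dtype=float)
    for _ in range(iters):
        r = y - c[0] * x ** c[1]                       # residuals
        # Jacobian of the model with respect to (c1, c2)
        J = np.column_stack([x ** c[1], c[0] * x ** c[1] * np.log(x)])
        step = np.linalg.solve(J.T @ J + lam * np.eye(2), J.T @ r)
        c_new = c + step
        if np.sum((y - c_new[0] * x ** c_new[1]) ** 2) < np.sum(r ** 2):
            c, lam = c_new, lam * 0.5                  # accept step, relax damping
        else:
            lam *= 2.0                                 # reject step, increase damping
    return c

# Hypothetical (air mass flow rate, cooling capacity) measurements
m_air = np.array([1.0, 1.5, 2.0, 2.5, 3.0])
q_meas = np.array([10.2, 13.1, 15.6, 17.4, 19.3])
c1, c2 = fit_lm(m_air, q_meas)
```

    The damping parameter interpolates between gradient descent (large lam) and Gauss–Newton (small lam), which is what makes the method robust to a poor starting guess.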

  10. Quantitative X-ray Map Analyser (Q-XRMA): A new GIS-based statistical approach to Mineral Image Analysis

    Science.gov (United States)

    Ortolano, Gaetano; Visalli, Roberto; Godard, Gaston; Cirrincione, Rosolino

    2018-06-01

    We present a new ArcGIS®-based tool developed in the Python programming language for calibrating EDS/WDS X-ray element maps, with the aim of acquiring quantitative information of petrological interest. The calibration procedure is based on a multiple linear regression technique that takes into account interdependence among elements and is constrained by the stoichiometry of minerals. The procedure requires an appropriate number of spot analyses for use as internal standards and provides several test indexes for a rapid check of calibration accuracy. The code is based on an earlier image-processing tool designed primarily for classifying minerals in X-ray element maps; the original Python code has now been enhanced to yield calibrated maps of mineral end-members or the chemical parameters of each classified mineral. The semi-automated procedure can be used to extract a dataset that is automatically stored within queryable tables. As a case study, the software was applied to an amphibolite-facies garnet-bearing micaschist. The calibrated images obtained for both anhydrous (i.e., garnet and plagioclase) and hydrous (i.e., biotite) phases show a good fit with corresponding electron microprobe analyses. This new GIS-based tool package can thus find useful application in petrology and materials science research. Moreover, the huge quantity of data extracted opens new opportunities for the development of a thin-section microchemical database that, using a GIS platform, can be linked with other major global geoscience databases.
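    The calibration idea of regressing known spot-analysis concentrations on the raw counts of all channels at once, so that inter-element interference is absorbed into the coefficients, can be sketched with made-up numbers (the real tool works on ArcGIS raster maps and adds stoichiometric constraints and accuracy tests):

```python
import numpy as np

# Hypothetical internal standards: raw X-ray counts in two element channels
# at five spot-analysis locations, with the known wt% of one element there.
counts = np.array([[100.0, 40.0],
                   [220.0, 60.0],
                   [150.0, 90.0],
                   [300.0, 30.0],
                   [180.0, 70.0]])
wt = np.array([2.8, 5.1, 3.55, 6.85, 4.25])   # known concentrations (wt%)

# Multiple linear regression: intercept plus one coefficient per channel,
# so the second channel corrects for interference on the first.
X = np.column_stack([np.ones(len(counts)), counts])
beta, *_ = np.linalg.lstsq(X, wt, rcond=None)

# Apply the fitted calibration to any map pixel's counts:
pixel = np.array([1.0, 200.0, 50.0])
wt_pixel = float(pixel @ beta)
```

    In the actual workflow this prediction is evaluated at every pixel, turning a counts map into a calibrated concentration map.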

  11. Sensitivity analyses of woody species exposed to air pollution based on ecophysiological measurements.

    Science.gov (United States)

    Wen, Dazhi; Kuang, Yuanwen; Zhou, Guoyi

    2004-01-01

    variation of Fv/Fm appeared in the other two species, particularly in M. chinensis, suggesting that they were more sensitive to air pollutants than I. rotunda. The mean LA was reduced for all species growing at the polluted site. The mean LMA for all species exceeded the sclerophylly threshold given by Cowling and Campbell and increased for those under pollution stress, which could be explained as one of the acclimation strategies of plants to air pollution stress. Little difference in leaf chlorophyll content was observed in F. microcarpa and M. chinensis, while remarkable differences were found in I. rotunda growing at the polluted and the clean site. The content of leaf carotenoids was largely reduced in I. rotunda growing at the polluted site, but increased in F. microcarpa and M. chinensis, compared with plants growing at the clean site. Plants growing at the clean site had a lower leaf N content than those growing at the polluted site. In addition, species with a higher resistance to pollution stress showed less difference in leaf N content than the sensitive species. Based on Fv/Fm measurements of the three woody species, I. rotunda showed the highest resistance to air pollutants from ceramic industries, followed by F. microcarpa. M. chinensis was the most sensitive species to air pollution and had the lowest capacity to cope with air pollution stress, which was consistent with the visual injury symptoms observed in the crown profiles of plants at the polluted site. Fv/Fm, LMA, LA, leaf pigments and N content could be used alone or in combination to diagnose the extent of physiological injury. The ratio Fv/Fm, however, was the best and most effective parameter. Tree species with higher air-pollutant resistance, as diagnosed by such ecophysiological parameters, should be considered first and planted widely for urban afforestation or forest regeneration in areas where the forest was seriously degraded or forest health was markedly affected by the same kind of

  12. Audio-visual perception of 3D cinematography: an fMRI study using condition-based and computation-based analyses.

    Directory of Open Access Journals (Sweden)

    Akitoshi Ogawa

    Full Text Available The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard "condition-based" designs, as well as "computational" methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-source signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli.

  13. Audio-visual perception of 3D cinematography: an fMRI study using condition-based and computation-based analyses.

    Science.gov (United States)

    Ogawa, Akitoshi; Bordier, Cecile; Macaluso, Emiliano

    2013-01-01

    The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard "condition-based" designs, as well as "computational" methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-source signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli.

  14. CrusView: A Java-Based Visualization Platform for Comparative Genomics Analyses in Brassicaceae Species

    Science.gov (United States)

    Chen, Hao; Wang, Xiangfeng

    2013-01-01

    In plants and animals, chromosomal breakage and fusion events based on conserved syntenic genomic blocks lead to conserved patterns of karyotype evolution among species of the same family. However, karyotype information has not been well utilized in genomic comparison studies. We present CrusView, a Java-based bioinformatic application utilizing Standard Widget Toolkit/Swing graphics libraries and a SQLite database for performing visualized analyses of comparative genomics data in Brassicaceae (crucifer) plants. Compared with similar software and databases, one of the unique features of CrusView is its integration of karyotype information when comparing two genomes. This feature allows users to perform karyotype-based genome assembly and karyotype-assisted genome synteny analyses with preset karyotype patterns of the Brassicaceae genomes. Additionally, CrusView is a local program, which gives its users high flexibility when analyzing unpublished genomes and allows users to upload self-defined genomic information so that they can visually study the associations between genome structural variations and genetic elements, including chromosomal rearrangements, genomic macrosynteny, gene families, high-frequency recombination sites, and tandem and segmental duplications between related species. This tool will greatly facilitate karyotype, chromosome, and genome evolution studies using visualized comparative genomics approaches in Brassicaceae species. CrusView is freely available at http://www.cmbb.arizona.edu/CrusView/. PMID:23898041

  15. Devising a New Model of Demand-Based Learning Integrated with Social Networks and Analyses of its Performance

    Directory of Open Access Journals (Sweden)

    Bekim Fetaji

    2018-02-01

    Full Text Available The focus of this research study is to devise a new model for demand-based learning that is integrated with social networks such as Facebook and Twitter. The study investigates this by reviewing the published literature and carrying out a case-study analysis of the practical implementation of the new model. The study focuses on analyzing demand-based learning and investigating how it can be improved by devising a specific model that incorporates the use of social networks. Statistical analysis of the questionnaire results, addressing the research questions and hypotheses, showed that there is a need for introducing new models into the teaching process. The originality lies in the introduction of the social-login approach to an educational environment; this approach contributes to the development of a demand-based web application that aims to modernize the educational pattern of communication, introduce the social-login approach, and increase knowledge transfer as well as improve learners' performance and skills. Insights and recommendations are provided, argued and discussed.

  16. Design and development of microcontroller-based clinical chemistry analyser for measurement of various blood biochemistry parameters.

    Science.gov (United States)

    Taneja, S R; Gupta, R C; Kumar, Jagdish; Thariyan, K K; Verma, Sanjeev

    2005-01-01

    A clinical chemistry analyser is a high-performance microcontroller-based photometric biochemical analyser used to measure various blood biochemistry parameters such as blood glucose, urea, protein, bilirubin, and so forth, and also to measure and observe the enzyme activity that develops while performing other biochemical tests such as ALT (alanine aminotransferase), amylase, AST (aspartate aminotransferase), and so forth. These tests are of great significance in biochemistry and are used for diagnostic purposes and for classifying various disorders and diseases such as diabetes, liver malfunction, and renal diseases. An inexpensive clinical chemistry analyser developed by the authors is described in this paper. It is an open system in which any reagent kit available in the market can be used. The system is based on the principle of absorbance-transmittance photometry. The system design is built around an 80C31 microcontroller with RAM, EPROM, and peripheral interface devices. The developed system incorporates a light source, an optical module, interference filters of various wavelengths, a Peltier device for maintaining the required temperature of the mixture in the flow cell, a peristaltic pump for sample aspiration, a graphic LCD for displaying blood parameters, patient test results and kinetic test graphs, a 40-column mini thermal printer, and a 32-key keyboard for executing various functions. Laboratory tests conducted on the instrument covered the versatility of the analyser, the flexibility of the software, and the treatment of samples. The prototype was tested and evaluated successfully over 1000 blood samples for seventeen blood parameters. Evaluation was carried out at the Department of Biochemistry, Government Medical College and Hospital. The test results were found to be comparable with those of other standard instruments.
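    The absorbance-transmittance principle behind such an endpoint assay reduces to the Beer–Lambert law plus a linear scaling against a standard of known concentration. The detector readings and the 100 mg/dL standard below are hypothetical illustrations:

```python
import math

def absorbance(i_sample, i_blank):
    """Beer-Lambert absorbance from transmitted intensities: A = -log10(T)."""
    return -math.log10(i_sample / i_blank)

def concentration(a_sample, a_standard, c_standard):
    """Endpoint assay: concentration scales linearly with absorbance."""
    return a_sample / a_standard * c_standard

# Hypothetical photometer readings (arbitrary units) against a reagent blank
a_std = absorbance(50.0, 100.0)      # standard of known concentration
a_smp = absorbance(25.0, 100.0)      # patient sample (darker mixture)
glucose = concentration(a_smp, a_std, 100.0)   # 100 mg/dL standard -> 200 mg/dL
```

    Kinetic tests (ALT, AST, amylase) instead track the rate of absorbance change over time, but each reading uses the same relation.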

  17. Candelariella placodizans (Candelariaceae) reported new to mainland China and Taiwan based on morphological, chemical and molecular phylogenetic analyses

    Directory of Open Access Journals (Sweden)

    Lidia Yakovchenko

    2016-06-01

    Full Text Available Candelariella placodizans is newly reported from China. It was collected on exposed rocks with mosses in alpine areas of Taiwan and Yunnan Province, China, at elevations between 3200 and 4400 m. Molecular phylogenetic analyses based on ITS rDNA sequences were also performed to confirm the monophyly of the Chinese populations with respect to already existing sequences of the species, and further to examine their relationships to other members of the genus. An identification key to all 14 known taxa of Candelariella in China is provided.

  18. Exergy, exergoeconomic and environmental analyses and evolutionary algorithm based multi-objective optimization of combined cycle power plants

    International Nuclear Information System (INIS)

    Ahmadi, Pouria; Dincer, Ibrahim; Rosen, Marc A.

    2011-01-01

    A comprehensive exergy, exergoeconomic and environmental impact analysis and optimization of several combined cycle power plants (CCPPs) is reported. In the first part, thermodynamic analyses of the CCPPs based on energy and exergy are performed, and the effect of supplementary firing on the natural gas-fired CCPP is investigated. This step includes the effect of supplementary firing on the performance of the bottoming cycle and on CO2 emissions, and utilizes the first and second laws of thermodynamics. In the second part, a multi-objective optimization is performed to determine the 'best' design parameters, accounting for exergetic, economic and environmental factors. The optimization considers three objective functions: CCPP exergy efficiency, total cost rate of the system products and CO2 emissions of the overall plant. The environmental impact in terms of CO2 emissions is integrated with the exergoeconomic objective function as a new objective function. The results of both the exergy and exergoeconomic analyses show that the largest exergy destruction occurs in the CCPP combustion chamber, and that increasing the gas turbine inlet temperature decreases the CCPP cost of exergy destruction. The optimization results demonstrate that CO2 emissions are reduced by selecting the best components and using a low fuel injection rate into the combustion chamber. -- Highlights: → Comprehensive thermodynamic modeling of a combined cycle power plant. → Exergy, economic and environmental analyses of the system. → Investigation of the role of multi-objective exergoenvironmental optimization as a tool for more environmentally benign design.

  19. Using a laser-based CO2 carbon isotope analyser to investigate gas transfer in geological media

    International Nuclear Information System (INIS)

    Guillon, S.; Pili, E.; Agrinier, P.

    2012-01-01

    CO2 stable carbon isotopes are very attractive in environmental research for investigating both natural and anthropogenic carbon sources. Laser-based CO2 carbon isotope analysis provides continuous measurement at high temporal resolution and is a promising alternative to isotope ratio mass spectrometry (IRMS). We performed a thorough assessment of a commercially available CO2 Carbon Isotope Analyser (CCIA DLT-100, Los Gatos Research) that allows in situ measurement of 13C in CO2. Using a set of reference gases of known CO2 concentration and carbon isotopic composition, we evaluated the precision, long-term stability, temperature sensitivity and concentration dependence of the analyser. Despite good precision calculated from the Allan variance (5.0 ppm for CO2 concentration and 0.05‰ for δ13C at 60 s averaging), real performance is degraded by two main sources of error: temperature sensitivity and the dependence of δ13C on CO2 concentration. Data processing is required to correct for these errors. Following application of these corrections, we achieve an accuracy of 8.7 ppm for CO2 concentration and 1.3‰ for δ13C, which is worse than mass spectrometry performance but still adequate for field applications. With this portable analyser we measured the CO2 flux degassed from rock in an underground tunnel. The obtained carbon isotopic composition agrees with IRMS measurements and can be used to identify the carbon source. (authors)
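    The Allan-variance precision estimate quoted above can be sketched on a synthetic trace. The noise level and series length below are hypothetical; the point is that for white noise the Allan deviation falls as one over the square root of the averaging window, which is how the 60 s averaging time is justified.

```python
import numpy as np

def allan_deviation(y, m):
    """Allan deviation for an averaging window of m samples: the square root
    of half the mean squared difference of consecutive m-sample means."""
    n = len(y) // m
    bins = y[:n * m].reshape(n, m).mean(axis=1)
    return float(np.sqrt(0.5 * np.mean(np.diff(bins) ** 2)))

# Synthetic analyser trace: 400 ppm CO2 with white measurement noise
rng = np.random.default_rng(1)
signal = 400.0 + rng.normal(0.0, 5.0, 6000)

adev_1 = allan_deviation(signal, 1)     # per-sample precision
adev_60 = allan_deviation(signal, 60)   # precision after 60-sample averaging
```

    In practice the Allan deviation is plotted against m; the minimum of that curve marks the averaging time beyond which drift dominates.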

  20. Parent-based adolescent sexual health interventions and effect on communication outcomes: a systematic review and meta-analyses.

    Science.gov (United States)

    Santa Maria, Diane; Markham, Christine; Bluethmann, Shirley; Mullen, Patricia Dolan

    2015-03-01

    Parent-based adolescent sexual health interventions aim to reduce sexual risk behaviors by bolstering parental protective behaviors. Few studies of theory use, methods, applications, delivery and outcomes of parent-based interventions have been conducted. A systematic search of databases for the period 1998-2013 identified 28 published trials of U.S. parent-based interventions to examine theory use, setting, reach, delivery mode, dose and effects on parent-child communication. Established coding schemes were used to assess use of theory and describe methods employed to achieve behavioral change; intervention effects were explored in meta-analyses. Most interventions were conducted with minority parents in group sessions or via self-paced activities; interventions averaged seven hours, and most used theory extensively. Meta-analyses found improvements in sexual health communication: Analysis of 11 controlled trials indicated a medium effect on increasing communication (Cohen's d, 0.5), and analysis of nine trials found a large effect on increasing parental comfort with communication (0.7); effects were positive regardless of delivery mode or intervention dose. Intervention participants were 68% more likely than controls to report increased communication and 75% more likely to report increased comfort. These findings point to gaps in the range of programs examined in published trials-for example, interventions for parents of sexual minority youth, programs for custodial grandparents and faith-based services. Yet they provide support for the effectiveness of parent-based interventions in improving communication. Innovative delivery approaches could extend programs' reach, and further research on sexual health outcomes would facilitate the meta-analysis of intervention effectiveness in improving adolescent sexual health behaviors. Copyright © 2015 by the Guttmacher Institute.
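
The effect sizes reported above (Cohen's d of 0.5 and 0.7) are standardized mean differences. A minimal sketch of how such a per-trial effect and a simple sample-size-weighted pooling could be computed; the pooling shown is a simplified fixed-effect-style average, not necessarily the exact meta-analytic model used in the review:

```python
import math

def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference between treatment and control groups,
    using the pooled standard deviation (Cohen's d)."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2)
                          / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

def combined_effect(ds, ns):
    """Sample-size-weighted average of per-trial effects (illustrative pooling)."""
    return sum(d * n for d, n in zip(ds, ns)) / sum(ns)
```

With hypothetical communication scores (treatment mean 10, control mean 9, common SD 2), `cohens_d(10, 2, 50, 9, 2, 50)` yields the review's "medium" effect of 0.5.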

  1. Failure assessments of corroded pipelines with axial defects using stress-based criteria: Numerical studies and verification analyses

    International Nuclear Information System (INIS)

    Chiodo, Mario S.G.; Ruggieri, Claudio

    2009-01-01

    Conventional procedures used to assess the integrity of corroded piping systems with axial defects generally employ simplified failure criteria based upon a plastic collapse failure mechanism incorporating the tensile properties of the pipe material. These methods establish acceptance criteria for defects based on limited experimental data for low strength structural steels which do not necessarily address specific requirements for the high grade steels currently used. For these cases, failure assessments may be overly conservative or provide significant scatter in their predictions, which leads to unnecessary repair or replacement of in-service pipelines. Motivated by these observations, this study examines the applicability of a stress-based criterion based upon plastic instability analysis to predict the failure pressure of corroded pipelines with axial defects. A central focus is to gain additional insight into effects of defect geometry and material properties on the attainment of a local limit load to support the development of stress-based burst strength criteria. The work provides an extensive body of results which lend further support to adopting failure criteria for corroded pipelines based upon ligament instability analyses. A verification study conducted on burst testing of large-diameter pipe specimens with different defect lengths shows the effectiveness of a stress-based criterion using local ligament instability in burst pressure predictions, even though the adopted burst criterion exhibits a potential dependence on defect geometry and possibly on the material's strain hardening capacity. Overall, the results presented here suggest that the use of stress-based criteria based upon plastic instability analysis of the defect ligament is a valid engineering tool for integrity assessments of pipelines with axial corroded defects.

  2. Failure assessments of corroded pipelines with axial defects using stress-based criteria: Numerical studies and verification analyses

    Energy Technology Data Exchange (ETDEWEB)

    Chiodo, Mario S.G. [Department of Naval Architecture and Ocean Engineering, University of Sao Paulo, Av. Prof. Mello Moraes, 2231 (PNV-EPUSP), Sao Paulo, SP 05508-030 (Brazil); Ruggieri, Claudio [Department of Naval Architecture and Ocean Engineering, University of Sao Paulo, Av. Prof. Mello Moraes, 2231 (PNV-EPUSP), Sao Paulo, SP 05508-030 (Brazil)], E-mail: claudio.ruggieri@poli.usp.br

    2009-02-15

    Conventional procedures used to assess the integrity of corroded piping systems with axial defects generally employ simplified failure criteria based upon a plastic collapse failure mechanism incorporating the tensile properties of the pipe material. These methods establish acceptance criteria for defects based on limited experimental data for low strength structural steels which do not necessarily address specific requirements for the high grade steels currently used. For these cases, failure assessments may be overly conservative or provide significant scatter in their predictions, which leads to unnecessary repair or replacement of in-service pipelines. Motivated by these observations, this study examines the applicability of a stress-based criterion based upon plastic instability analysis to predict the failure pressure of corroded pipelines with axial defects. A central focus is to gain additional insight into effects of defect geometry and material properties on the attainment of a local limit load to support the development of stress-based burst strength criteria. The work provides an extensive body of results which lend further support to adopting failure criteria for corroded pipelines based upon ligament instability analyses. A verification study conducted on burst testing of large-diameter pipe specimens with different defect lengths shows the effectiveness of a stress-based criterion using local ligament instability in burst pressure predictions, even though the adopted burst criterion exhibits a potential dependence on defect geometry and possibly on the material's strain hardening capacity. Overall, the results presented here suggest that the use of stress-based criteria based upon plastic instability analysis of the defect ligament is a valid engineering tool for integrity assessments of pipelines with axial corroded defects.

  3. Ventilation/perfusion SPECT/CT in patients with pulmonary emphysema. Evaluation of software-based analysing.

    Science.gov (United States)

    Schreiter, V; Steffen, I; Huebner, H; Bredow, J; Heimann, U; Kroencke, T J; Poellinger, A; Doellinger, F; Buchert, R; Hamm, B; Brenner, W; Schreiter, N F

    2015-01-01

    The purpose of this study was to evaluate the reproducibility of a new software-based analysis system for ventilation/perfusion single-photon emission computed tomography/computed tomography (V/P SPECT/CT) in patients with pulmonary emphysema and to compare it to visual interpretation. 19 patients (mean age: 68.1 years) with pulmonary emphysema who underwent V/P SPECT/CT were included. Data were analysed by two independent observers using visual interpretation (VI) and a software-based analysis system (SBAS). For SBAS, PMOD version 3.4 (PMOD Technologies Ltd, Zurich, Switzerland) was used to assess counts and volume per lung lobe/per lung and to calculate the count density per lung and lobe, the ratio of counts and the ratio of count density. VI was performed using a visual scale to assess the mean counts per lung lobe. Interobserver variability and association for SBAS and VI were analysed using Spearman's rho correlation coefficient. Interobserver agreement correlated highly in perfusion (rho: 0.982, 0.957, 0.90, 0.979) and ventilation (rho: 0.972, 0.924, 0.941, 0.936) for count/count density per lobe and ratio of counts/count density in SBAS. Interobserver agreement correlated clearly for perfusion (rho: 0.655) and weakly for ventilation (rho: 0.458) in VI. SBAS provides more reproducible measures than VI for the relative tracer uptake in V/P SPECT/CT in patients with pulmonary emphysema. However, SBAS has to be improved for routine clinical use.
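
The interobserver agreement above is quantified with Spearman's rho, i.e. the Pearson correlation of the observers' ranks. A self-contained sketch (pure Python, with average ranks for ties); the observer readings in any real use would come from the SBAS/VI measurements:

```python
def ranks(xs):
    """1-based ranks of xs; tied values share their average rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1                            # extend over the run of ties
        avg = (i + j) / 2.0 + 1.0             # mean of the tied positions
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5
```

Because only ranks matter, rho is 1.0 for any strictly increasing relationship between the two observers' values, linear or not.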

  4. Deciphering chicken gut microbial dynamics based on high-throughput 16S rRNA metagenomics analyses.

    Science.gov (United States)

    Mohd Shaufi, Mohd Asrore; Sieo, Chin Chin; Chong, Chun Wie; Gan, Han Ming; Ho, Yin Wan

    2015-01-01

    Chicken gut microbiota has paramount roles in host performance, health and immunity. Understanding the topological differences in gut microbial community composition is crucial to provide knowledge on the functions of each member of the microbiota in the physiological maintenance of the host. Gut microbiota profiling of the chicken was previously performed using culture-dependent and early culture-independent methods, which had limited coverage and accuracy. Advances in technology based on next-generation sequencing (NGS) offer unparalleled coverage and depth in determining microbial gut dynamics. Thus, the aim of this study was to investigate ileal and caecal microbiota development as the chicken aged, which is important for future effective gut modulation. Ileal and caecal contents of broiler chickens were extracted from 7-, 14-, 21- and 42-day-old chickens. Genomic DNA was then extracted and amplified based on the V3 hyper-variable region of 16S rRNA. Bioinformatics, ecological and statistical analyses such as Principal Coordinate Analysis (PCoA) were performed in the mothur software and plotted using PRIMER 6. Additional analyses for predicted metagenomes were performed through the PICRUSt and STAMP software packages based on Greengenes databases. A distinctive difference in bacterial communities was observed between ilea and caeca as the chickens aged (P < 0.05), and microbial communities in the caeca were more diverse than the ileal communities. Potentially pathogenic bacteria such as Clostridium were elevated as the chickens aged, and the population of beneficial microbes such as Lactobacillus was low at all intervals. On the other hand, based on the predicted metagenomes analysed, clear distinctions in the functions and roles of gut microbiota, such as gene pathways related to nutrient absorption (e.g. sugar and amino acid metabolism) and bacterial proliferation and colonization (e.g. bacterial motility proteins, two-component system and bacterial secretion system), were
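
The Principal Coordinate Analysis used above (run in mothur and PRIMER 6 in the study) is classical metric scaling of a sample-by-sample distance matrix. A minimal NumPy sketch; the three-sample distance matrix is a toy stand-in for a real beta-diversity matrix:

```python
import numpy as np

def pcoa(dist, k=2):
    """Classical principal coordinate analysis: double-center the squared
    distance matrix (Gower centering) and project onto the top-k eigenvectors."""
    d2 = np.asarray(dist, dtype=float) ** 2
    n = d2.shape[0]
    centering = np.eye(n) - np.ones((n, n)) / n
    b = -0.5 * centering @ d2 @ centering       # centered inner-product matrix
    eigvals, eigvecs = np.linalg.eigh(b)
    top = np.argsort(eigvals)[::-1][:k]         # largest eigenvalues first
    return eigvecs[:, top] * np.sqrt(np.clip(eigvals[top], 0.0, None))

# Three samples lying on a line at positions 0, 1 and 3; PCoA on their
# pairwise distances recovers the 1-D layout exactly.
d = np.array([[0.0, 1.0, 3.0],
              [1.0, 0.0, 2.0],
              [3.0, 2.0, 0.0]])
coords = pcoa(d, k=1)
```

For Euclidean input distances PCoA reproduces the pairwise distances exactly; for ecological dissimilarities (e.g. Bray-Curtis) the negative eigenvalues are clipped, as above.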

  5. Model-based performance and energy analyses of reverse osmosis to reuse wastewater in a PVC production site.

    Science.gov (United States)

    Hu, Kang; Fiedler, Thorsten; Blanco, Laura; Geissen, Sven-Uwe; Zander, Simon; Prieto, David; Blanco, Angeles; Negro, Carlos; Swinnen, Nathalie

    2017-11-10

    A pilot-scale reverse osmosis (RO) stage downstream of a membrane bioreactor (MBR) was developed for desalination to reuse wastewater in a PVC production site. The solution-diffusion-film model (SDFM), based on the solution-diffusion model (SDM) and film theory, was proposed to describe rejections of electrolyte mixtures in the MBR effluent, which consists of dominant ions (Na+ and Cl-) and several trace ions (Ca2+, Mg2+, K+ and SO42-). The universal global optimisation method was used to estimate the ion permeability coefficients (B) and mass transfer coefficients (K) in the SDFM. The membrane performance was then evaluated based on the estimated parameters, which demonstrated that the theoretical simulations were in line with the experimental results for the dominant ions. Moreover, an energy analysis model accounting for the limitation imposed by the thermodynamic restriction was proposed to analyse the specific energy consumption of the pilot-scale RO system in various scenarios.
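
The solution-diffusion-film idea above can be sketched with the standard textbook relations: the solution-diffusion model gives an intrinsic rejection from the water flux and solute permeability B, and film theory corrects it for concentration polarization via the mass transfer coefficient k. The coefficient values below are illustrative, not the fitted values from the paper:

```python
import math

def real_rejection(jv, b):
    """Intrinsic (membrane-wall) rejection from the solution-diffusion model:
    R_real = Jv / (Jv + B), with water flux Jv and solute permeability B (m/s)."""
    return jv / (jv + b)

def observed_rejection(jv, b, k):
    """Film-theory correction: polarization at the wall concentration lowers the
    observed rejection, ln((1-Ro)/Ro) = ln((1-Rr)/Rr) + Jv/k."""
    r_real = real_rejection(jv, b)
    ratio = (1.0 - r_real) / r_real * math.exp(jv / k)
    return 1.0 / (1.0 + ratio)
```

As the mass transfer coefficient k grows (strong mixing, thin boundary layer), the observed rejection approaches the intrinsic one; small k means strong polarization and visibly lower observed rejection.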

  6. An Evaluation Quality Framework for Analysing School-Based Learning (SBL) to Work-Based Learning (WBL) Transition Module

    International Nuclear Information System (INIS)

    Alseddiqi, M; Mishra, R; Pislaru, C

    2012-01-01

    The paper presents the results from a quality framework to measure the effectiveness of a new engineering course entitled 'school-based learning (SBL) to work-based learning (WBL) transition module' in the Technical and Vocational Education (TVE) system in Bahrain. The framework is an extended version of existing information quality frameworks with respect to pedagogical and technological contexts. It incorporates specific pedagogical and technological dimensions as per the requirements of modern industry in Bahrain. A questionnaire on users' views of the effectiveness of the new transition module was distributed to various stakeholders, including TVE teachers and students. The aim was to receive critical information in diagnosing, monitoring and evaluating different views and perceptions about the effectiveness of the new module. The analysis categorised the quality dimensions by their relative importance, using the principal component analysis available in SPSS. The analysis clearly identified the most important quality dimensions integrated in the new module for the SBL-to-WBL transition. It was also apparent that the new module contains workplace proficiencies, prepares TVE students for work placement, provides effective teaching and learning methodologies, integrates innovative technology in the process of learning, meets modern industrial needs, and presents a cooperative learning environment for TVE students. From the principal component analysis findings, the percentage of relative importance of each factor and its quality dimensions was calculated; this percentage comparison identifies the most important factor as well as the most important quality dimensions. The re-arranged quality dimensions, with an extended number of factors, refine the extended information quality framework into a revised quality framework.
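
The principal component analysis used above (run in SPSS in the study) reduces correlated questionnaire items to a few components and reports how much variance each explains. A minimal NumPy sketch on synthetic questionnaire-like data, not the study's survey responses:

```python
import numpy as np

def pca(x, k=2):
    """PCA via SVD of the mean-centered data matrix: returns the component
    scores and the fraction of total variance each of the first k explains."""
    xc = x - x.mean(axis=0)
    u, s, vt = np.linalg.svd(xc, full_matrices=False)
    explained = s ** 2 / np.sum(s ** 2)
    return xc @ vt[:k].T, explained[:k]

# Synthetic data: two items driven by one latent "quality" factor, plus one
# item that is nearly pure noise, so the first component should dominate.
rng = np.random.default_rng(1)
latent = rng.normal(size=(200, 1))
responses = np.hstack([latent, 2.0 * latent, 0.05 * rng.normal(size=(200, 1))])
scores, explained = pca(responses, k=2)
```

Ranking factors by their explained-variance share is the same idea as the paper's "percentage of relative importance" comparison.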

  7. Precursor analyses - The use of deterministic and PSA based methods in the event investigation process at nuclear power plants

    International Nuclear Information System (INIS)

    2004-09-01

    The efficient feedback of operating experience (OE) is a valuable source of information for improving the safety and reliability of nuclear power plants (NPPs). It is therefore essential to collect information on abnormal events from both internal and external sources. Internal operating experience is analysed to obtain a complete understanding of an event and of its safety implications. Corrective or improvement measures may then be developed, prioritized and implemented in the plant if considered appropriate. Information from external events may also be analysed in order to learn lessons from others' experience and prevent similar occurrences at our own plant. The traditional ways of investigating operational events have been predominantly qualitative. In recent years, a PSA-based method called probabilistic precursor event analysis has been developed, used and applied on a significant scale in many places for a number of plants. The method enables a quantitative estimation of the safety significance of operational events to be incorporated. The purpose of this report is to outline a synergistic process that makes more effective use of operating experience event information by combining the insights and knowledge gained from both approaches, traditional deterministic event investigation and PSA-based event analysis. The PSA-based view on operational events and PSA-based event analysis can support the process of operational event analysis at the following stages of the operational event investigation: (1) Initial screening stage. (It introduces an element of quantitative analysis into the selection process. Quantitative analysis of the safety significance of nuclear plant events can be a very useful measure when it comes to selecting internal and external operating experience information for its relevance.) (2) In-depth analysis. (PSA based event evaluation provides a quantitative measure for judging the significance of operational events, contributors to

  8. Fossil-based comparative analyses reveal ancient marine ancestry erased by extinction in ray-finned fishes.

    Science.gov (United States)

    Betancur-R, Ricardo; Ortí, Guillermo; Pyron, Robert Alexander

    2015-05-01

    The marine-freshwater boundary is a major biodiversity gradient and few groups have colonised both systems successfully. Fishes have transitioned between habitats repeatedly, diversifying in rivers, lakes and oceans over evolutionary time. However, their history of habitat colonisation and diversification is unclear based on available fossil and phylogenetic data. We estimate ancestral habitats and diversification and transition rates using a large-scale phylogeny of extant fish taxa and one containing a massive number of extinct species. Extant-only phylogenetic analyses indicate freshwater ancestry, but inclusion of fossils reveal strong evidence of marine ancestry in lineages now restricted to freshwaters. Diversification and colonisation dynamics vary asymmetrically between habitats, as marine lineages colonise and flourish in rivers more frequently than the reverse. Our study highlights the importance of including fossils in comparative analyses, showing that freshwaters have played a role as refuges for ancient fish lineages, a signal erased by extinction in extant-only phylogenies. © 2015 John Wiley & Sons Ltd/CNRS.

  9. Reduction and technical simplification of testing protocol for walking based on repeatability analyses: An Interreg IVa pilot study

    Directory of Open Access Journals (Sweden)

    Nejc Sarabon

    2010-12-01

    Full Text Available The aim of this study was to define the most appropriate gait measurement protocols to be used in our future studies in the Mobility in Ageing project. A group of young healthy volunteers took part in the study. Each subject carried out a 10-metre walking test at five different speeds (preferred, very slow, very fast, slow, and fast). Each walking speed was repeated three times, making a total of 15 trials which were carried out in a random order. Each trial was simultaneously analysed by three observers using three different technical approaches: a stop watch, photo cells and an electronic kinematic dress. In analysing the repeatability of the trials, the results showed that of the five self-selected walking speeds, three of them (preferred, very fast, and very slow) had a significantly higher repeatability of the average walking velocity, step length and cadence than the other two speeds. Additionally, the data showed that one of the three technical methods for gait assessment has better metric characteristics than the other two. In conclusion, based on repeatability, technical and organizational simplification, this study helped us to successfully define a simple and reliable walking test to be used in the main study of the project.

  10. SieveSifter: a web-based tool for visualizing the sieve analyses of HIV-1 vaccine efficacy trials.

    Science.gov (United States)

    Fiore-Gartland, Andrew; Kullman, Nicholas; deCamp, Allan C; Clenaghan, Graham; Yang, Wayne; Magaret, Craig A; Edlefsen, Paul T; Gilbert, Peter B

    2017-08-01

    Analysis of HIV-1 virions from participants infected in a randomized controlled preventive HIV-1 vaccine efficacy trial can help elucidate mechanisms of partial protection. By comparing the genetic sequence of viruses from vaccine and placebo recipients to the sequence of the vaccine itself, a technique called 'sieve analysis', one can identify functional specificities of vaccine-induced immune responses. We have created an interactive web-based visualization and data access tool for exploring the results of sieve analyses performed on four major preventive HIV-1 vaccine efficacy trials: (i) the HIV Vaccine Trial Network (HVTN) 502/Step trial, (ii) the RV144/Thai trial, (iii) the HVTN 503/Phambili trial and (iv) the HVTN 505 trial. The tool acts simultaneously as a platform for rapid reinterpretation of sieve effects and as a portal for organizing and sharing the viral sequence data. Access to these valuable datasets also enables the development of novel methodology for future sieve analyses. Visualization: http://sieve.fredhutch.org/viz . Source code: https://github.com/nkullman/SIEVE . Data API: http://sieve.fredhutch.org/data . agartlan@fredhutch.org. © The Author(s) 2017. Published by Oxford University Press.

  11. Neutronics-processing interface analyses for the Accelerator Transmutation of Waste (ATW) aqueous-based blanket system

    International Nuclear Information System (INIS)

    Davidson, J.W.; Battat, M.E.

    1993-01-01

    Neutronics-processing interface parameters have large impacts on the neutron economy and transmutation performance of an aqueous-based Accelerator Transmutation of Waste (ATW) system. A detailed assessment of the interdependence of these blanket neutronic and chemical processing parameters has been performed. Neutronic performance analyses require that neutron transport calculations for the ATW blanket systems be fully coupled with the blanket processing and include all neutron absorptions in candidate waste nuclides as well as in fission and transmutation products. The effects of processing rates, flux levels, flux spectra, and external-to-blanket inventories on blanket neutronic performance were determined. In addition, the inventories and isotopics in the various subsystems were also calculated for various actinide and long-lived fission product transmutation strategies

  12. Ongoing Analyses of Rocket Based Combined Cycle Engines by the Applied Fluid Dynamics Analysis Group at Marshall Space Flight Center

    Science.gov (United States)

    Ruf, Joseph H.; Holt, James B.; Canabal, Francisco

    2001-01-01

    This paper presents the status of analyses on three Rocket Based Combined Cycle (RBCC) configurations underway in the Applied Fluid Dynamics Analysis Group (TD64). TD64 is performing computational fluid dynamics (CFD) analysis on a Penn State RBCC test rig, the proposed Draco axisymmetric RBCC engine and the Trailblazer engine. The intent of the analysis on the Penn State test rig is to benchmark the Finite Difference Navier Stokes (FDNS) code for ejector mode fluid dynamics. The Draco analysis was a trade study to determine the ejector mode performance as a function of three engine design variables. The Trailblazer analysis is to evaluate the nozzle performance in scramjet mode. Results to date of each analysis are presented.

  13. Multi-person and multi-attribute design evaluations using evidential reasoning based on subjective safety and cost analyses

    International Nuclear Information System (INIS)

    Wang, J.; Yang, J.B.; Sen, P.

    1996-01-01

    This paper presents an approach for ranking proposed design options based on subjective safety and cost analyses. Hierarchical system safety analysis is carried out using fuzzy sets and evidential reasoning. This involves safety modelling by fuzzy sets at the bottom level of a hierarchy and safety synthesis by evidential reasoning at higher levels. Fuzzy sets are also used to model the cost incurred for each design option. An evidential reasoning approach is then employed to synthesise the estimates of safety and cost, which are made by multiple designers. The developed approach is capable of dealing with problems of multiple designers, multiple attributes and multiple design options to select the best design. Finally, a practical engineering example is presented to demonstrate the proposed multi-person and multi-attribute design selection approach
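
The evidential-reasoning synthesis described above can be illustrated, in simplified form, with Dempster's rule of combination over focal sets of safety grades. The actual ER algorithm weights attributes differently, and the two designers' mass assignments below are invented for illustration:

```python
def dempster_combine(m1, m2):
    """Combine two basic probability assignments (dicts mapping frozenset focal
    elements to mass) with Dempster's rule, normalizing away conflicting mass."""
    combined, conflict = {}, 0.0
    for a, mass_a in m1.items():
        for b, mass_b in m2.items():
            inter = a & b
            if inter:                                   # compatible evidence
                combined[inter] = combined.get(inter, 0.0) + mass_a * mass_b
            else:                                       # contradictory evidence
                conflict += mass_a * mass_b
    return {focal: mass / (1.0 - conflict) for focal, mass in combined.items()}

# Two designers' hypothetical safety assessments of one design option;
# mass on the full frame {good, poor} represents unassigned belief (ignorance).
GOOD, ANY = frozenset({"good"}), frozenset({"good", "poor"})
designer1 = {GOOD: 0.8, ANY: 0.2}
designer2 = {GOOD: 0.6, ANY: 0.4}
fused = dempster_combine(designer1, designer2)
```

Fusing the two assessments concentrates belief: the combined mass on "good" (0.92) exceeds either designer's individual assignment, while residual ignorance shrinks to 0.08.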

  14. Pseudogenes and DNA-based diet analyses: A cautionary tale from a relatively well sampled predator-prey system

    DEFF Research Database (Denmark)

    Dunshea, G.; Barros, N. B.; Wells, R. S.

    2008-01-01

    Mitochondrial ribosomal DNA is commonly used in DNA-based dietary analyses. In such studies, these sequences are generally assumed to be the only version present in DNA of the organism of interest. However, nuclear pseudogenes that display variable similarity to the mitochondrial versions are common in many taxa. The presence of nuclear pseudogenes that co-amplify with their mitochondrial paralogues can lead to several possible confounding interpretations when applied to estimating animal diet. Here, we investigate the occurrence of nuclear pseudogenes in fecal samples taken from bottlenose dolphins (Tursiops truncatus) that were assayed for prey DNA with a universal primer technique. We found pseudogenes in 13 of 15 samples and 1-5 pseudogene haplotypes per sample representing 5-100% of all amplicons produced. The proportion of amplicons that were pseudogenes and the diversity of prey DNA

  15. The occurrence of Toxocara malaysiensis in cats in China, confirmed by sequence-based analyses of ribosomal DNA.

    Science.gov (United States)

    Li, Ming-Wei; Zhu, Xing-Quan; Gasser, Robin B; Lin, Rui-Qing; Sani, Rehana A; Lun, Zhao-Rong; Jacobs, Dennis E

    2006-10-01

    Non-isotopic polymerase chain reaction (PCR)-based single-strand conformation polymorphism and sequence analyses of the second internal transcribed spacer (ITS-2) of nuclear ribosomal DNA (rDNA) were utilized to genetically characterise ascaridoids from dogs and cats from China by comparison with those from other countries. The study showed that Toxocara canis, Toxocara cati, and Toxascaris leonina from China were genetically the same as those from other geographical origins. Specimens from cats from Guangzhou, China, which were morphologically consistent with Toxocara malaysiensis, were the same genetically as those from Malaysia, with the exception of a polymorphism in the ITS-2 but no unequivocal sequence difference. This is the first report of T. malaysiensis in cats outside of Malaysia (from where it was originally described), supporting the proposal that this species has a broader geographical distribution. The molecular approach employed provides a powerful tool for elucidating the biology, epidemiology, and zoonotic significance of T. malaysiensis.

  16. Formalisation des bases méthodologiques et conceptuelles d'une analyse spatiale des accidents de la route

    Directory of Open Access Journals (Sweden)

    Florence Huguenin Richard

    1999-06-01

    Full Text Available This article lays out the methodological and conceptual foundations of a spatial analysis of road-accident risk. The study of this phenomenon requires a large body of data describing the different dimensions of an accident, which can be managed in a geographic information system. It also calls for methodological reflection on risk mapping, scales of observation, the aggregation of qualitative and quantitative data, the use of statistical methods suited to road risk, and the integration of space as a factor in insecurity.

  17. A Systematic Review of Cardiovascular Outcomes-Based Cost-Effectiveness Analyses of Lipid-Lowering Therapies.

    Science.gov (United States)

    Wei, Ching-Yun; Quek, Ruben G W; Villa, Guillermo; Gandra, Shravanthi R; Forbes, Carol A; Ryder, Steve; Armstrong, Nigel; Deshpande, Sohan; Duffy, Steven; Kleijnen, Jos; Lindgren, Peter

    2017-03-01

    Previous reviews have evaluated economic analyses of lipid-lowering therapies using lipid levels as surrogate markers for cardiovascular disease. However, drug approval and health technology assessment agencies have stressed that surrogates should only be used in the absence of clinical endpoints. The aim of this systematic review was to identify and summarise the methodologies, weaknesses and strengths of economic models based on atherosclerotic cardiovascular disease event rates. Cost-effectiveness evaluations of lipid-lowering therapies using cardiovascular event rates in adults with hyperlipidaemia were sought in Medline, Embase, Medline In-Process, PubMed and NHS EED and conference proceedings. Search results were independently screened, extracted and quality checked by two reviewers. Searches until February 2016 retrieved 3443 records, from which 26 studies (29 publications) were selected. Twenty-two studies evaluated secondary prevention (four also assessed primary prevention), two considered only primary prevention and two included mixed primary and secondary prevention populations. Most studies (18) based treatment-effect estimates on single trials, although more recent evaluations deployed meta-analyses (5/10 over the last 10 years). Markov models (14 studies) were most commonly used and only one study employed discrete event simulation. Models varied particularly in terms of health states and treatment-effect duration. No studies used a systematic review to obtain utilities. Most studies took a healthcare perspective (21/26) and sourced resource use from key trials instead of local data. Overall, reporting quality was suboptimal. This review reveals methodological changes over time, but reporting weaknesses remain, particularly with respect to transparency of model reporting.

  18. Exergy and energy analyses of two different types of PCM based thermal management systems for space air conditioning applications

    International Nuclear Information System (INIS)

    Tyagi, V.V.; Pandey, A.K.; Buddhi, D.; Tyagi, S.K.

    2013-01-01

    Highlights: ► Calcium chloride hexahydrate (CaCl2·6H2O) was used as the PCM in this study. ► Two different encapsulated systems (HDPE-based panels and balls) were designed. ► The results for CaCl2·6H2O are very attractive for space air conditioning. ► Energy and exergy analyses for space cooling applications. - Abstract: This communication presents an experimental study of PCM-based thermal management systems for space heating and cooling applications using energy and exergy analysis. Two different types of PCM-based thermal management system (TMS-I and TMS-II) using calcium chloride hexahydrate as the heat carrier have been designed, fabricated and studied for space heating and cooling applications in a typical climatic zone in India. In the first experimental arrangement, the charging of the PCM was carried out with an air conditioning system while discharging was carried out using an electric heater, for both thermal management systems. In the second arrangement, the charging of the PCM was carried out by solar energy and the discharging by circulating the cooler ambient air during the night. In the first experiment, TMS-I was found to be more effective than TMS-II, while the reverse was found in the second experiment, for both the charging and discharging processes and for both the energetic and the exergetic performances.

  19. Analyses of the soil surface dynamic of South African Kalahari salt pans based on hyperspectral and multitemporal data

    Science.gov (United States)

    Milewski, Robert; Chabrillat, Sabine; Behling, Robert; Mielke, Christian; Schleicher, Anja Maria; Guanter, Luis

    2016-04-01

    The consequences of climate change represent a major threat to sustainable development and growth in Southern Africa. Understanding the impact on the geo- and biosphere is therefore of great importance in this particular region. In this context the Kalahari salt pans (also known as playas or sabkhas) and their peripheral saline and alkaline habitats are an ecosystem of major interest. They are very sensitive to environmental conditions, and thus hydrological, mineralogical and ecological responses to climatic variations can be analysed. Up to now the soil composition of salt pans in this area has been assessed only mono-temporally and on a coarse regional scale. Furthermore, the dynamic of the salt pans, especially the formation of evaporites, is still uncertain and poorly understood. High spectral resolution remote sensing can estimate the evaporite content and mineralogy of soils based on analyses of the surface reflectance properties within the Visible-Near InfraRed (VNIR, 400-1000 nm) and Short-Wave InfraRed (SWIR, 1000-2500 nm) regions. In these wavelength regions major chemical components of the soil interact with the electromagnetic radiation and produce characteristic absorption features that can be used to derive the properties of interest. Although such techniques are well established at the laboratory and field scale, the potential of current (Hyperion) and upcoming spaceborne sensors such as EnMAP for quantitative mineralogical and salt spectral mapping is still to be demonstrated. Combined with hyperspectral methods, multitemporal remote sensing techniques allow us to derive the recent dynamic of these salt pans and link the mineralogical analysis of the pan surface to major physical processes in these dryland environments. In this study we focus on the analysis of the Namibian Omongwa salt pans based on satellite hyperspectral imagery and multispectral time-series data. First, a change detection analysis is applied using the Iterative

  20. Population-based cost-offset analyses for disorder-specific treatment of anorexia nervosa and bulimia nervosa in Germany.

    Science.gov (United States)

    Bode, Katharina; Götz von Olenhusen, Nina Maria; Wunsch, Eva-Maria; Kliem, Sören; Kröger, Christoph

    2017-03-01

    Previous research has shown that anorexia nervosa (AN) and bulimia nervosa (BN) are expensive illnesses to treat. To reduce their economic burden, adequate interventions need to be established. Our objective was to conduct cost-offset analyses for evidence-based treatment of eating disorders using outcome data from a psychotherapy trial involving cognitive behavioral therapy (CBT) and focal psychodynamic therapy (FPT) for AN and a trial involving CBT for BN. Assuming a currently running, ideal healthcare system using a 12-month, prevalence-based approach and varying the willingness to participate in treatment, we investigated whether the potential financial benefits of AN- and BN-related treatment outweigh the therapy costs at the population level. We elaborated on a formula that allows calculating cost-benefit relationships whereby the calculation of the parameters is based on estimates from data of health institutions within the German healthcare system. Additional intangible benefits were calculated with the aid of Quality-Adjusted Life Years. The annual costs of an untreated eating disorder were 2.38 billion EUR for AN and 617.69 million EUR for BN. Independent of the willingness to participate in treatment, the cost-benefit relationships for the treatment remained constant at 2.51 (CBT) and 2.33 (FPT) for AN and 4.05 (CBT) for BN. This consistency implies that for each EUR invested in the treatment, between 2.33 and 4.05 EUR could be saved each year. Our findings suggest that the implementation of evidence-based psychotherapy treatments for AN and BN may achieve substantial cost savings at the population level. © 2017 Wiley Periodicals, Inc.
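
    The constancy of the cost-benefit relationship across participation rates can be illustrated with a toy calculation (all figures except the 2.38 billion EUR illness cost are hypothetical and are not taken from the study): because both the avoided illness costs and the therapy costs scale with the number of patients treated, the participation rate cancels out of the ratio.

```python
def cost_benefit_ratio(annual_illness_cost, treatment_cost_per_case,
                       cases, participation, remission_rate, qaly_value=0.0):
    """Population-level benefit per EUR of therapy (toy sketch).

    Illustrates the abstract's key observation: if avoided costs and
    therapy costs both scale with the number treated, the ratio is
    independent of the willingness to participate.
    """
    treated = cases * participation
    avoided = (annual_illness_cost / cases) * treated * remission_rate
    benefit = avoided + qaly_value * treated
    cost = treatment_cost_per_case * treated
    return benefit / cost

# participation rate cancels out, mirroring the paper's constant ratios
r_low = cost_benefit_ratio(2.38e9, 10_000.0, 50_000, 0.2, 0.4)
r_high = cost_benefit_ratio(2.38e9, 10_000.0, 50_000, 0.8, 0.4)
```

Here `r_low` and `r_high` are identical, which is exactly the consistency the authors report for their cost-benefit relationships.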

  1. Limitations of Species Delimitation Based on Phylogenetic Analyses: A Case Study in the Hypogymnia hypotrypa Group (Parmeliaceae, Ascomycota).

    Directory of Open Access Journals (Sweden)

    Xinli Wei

    Delimiting species boundaries among closely related lineages often requires a range of independent data sets and analytical approaches. Similar to other organismal groups, robust species circumscriptions in fungi are increasingly investigated within an empirical framework. Here we attempt to delimit species boundaries in a closely related clade of lichen-forming fungi endemic to Asia, the Hypogymnia hypotrypa group (Parmeliaceae). In the current classification, the Hypogymnia hypotrypa group includes two species: H. hypotrypa and H. flavida, which are separated based on distinctive reproductive modes: the former produces soredia, which are absent in the latter. We reexamined the relationship between these two species using phenotypic characters and molecular sequence data (ITS, GPD, and MCM7 sequences) to address species boundaries in this group. In addition to morphological investigations, we used Bayesian clustering to identify potential genetic groups in the H. hypotrypa/H. flavida clade. We also used a variety of empirical, sequence-based species delimitation approaches, including the "Automatic Barcode Gap Discovery" (ABGD), the Poisson tree process model (PTP), the General Mixed Yule Coalescent (GMYC), and the multispecies coalescent approach BPP. Different species delimitation scenarios were compared using Bayes factor delimitation analysis, in addition to comparisons of pairwise genetic distances and pairwise fixation indices (FST). The majority of the species delimitation analyses implemented in this study failed to support H. hypotrypa and H. flavida as distinct lineages, as did the Bayesian clustering analysis. However, strong support for the evolutionary independence of H. hypotrypa and H. flavida was inferred using BPP and further supported by Bayes factor delimitation. In spite of rigorous morphological comparisons and a wide range of sequence-based approaches to delimit species, species boundaries in the H. hypotrypa group remain uncertain.

  2. Distribution of Prochlorococcus Ecotypes in the Red Sea Basin Based on Analyses of rpoC1 Sequences

    KAUST Repository

    Shibl, Ahmed A.; Haroon, Mohamed; Ngugi, David; Thompson, Luke R.; Stingl, Ulrich

    2016-01-01

    The marine picocyanobacteria Prochlorococcus represent a significant fraction of the global pelagic bacterioplankton community. Specifically, in the surface waters of the Red Sea, they account for around 91% of the phylum Cyanobacteria. Previous work suggested a widespread presence of high-light (HL)-adapted ecotypes in the Red Sea with the occurrence of low-light (LL)-adapted ecotypes at intermediate depths in the water column. To obtain a more comprehensive dataset over a wider biogeographical scope, we used a 454-pyrosequencing approach to analyze the diversity of the Prochlorococcus rpoC1 gene from a total of 113 samples at various depths (up to 500 m) from 45 stations spanning the Red Sea basin from north to south. In addition, we analyzed 45 metagenomes from eight stations using hidden Markov models based on a set of reference Prochlorococcus genomes to (1) estimate the relative abundance of Prochlorococcus based on 16S rRNA gene sequences, and (2) identify and classify rpoC1 sequences as an assessment of the community structure of Prochlorococcus in the northern, central and southern regions of the basin without amplification bias. Analyses of metagenomic data indicated that Prochlorococcus occurs at a relative abundance of around 9% in samples from surface waters (25, 50, 75 m), 3% in intermediate waters (100 m) and around 0.5% in deep-water samples (200–500 m). Results based on rpoC1 sequences using both methods showed that HL II cells dominate surface waters and were also present in deep-water samples. Prochlorococcus communities in intermediate waters (100 m) showed a higher diversity and co-occurrence of low-light and high-light ecotypes. Prochlorococcus communities at each depth range (surface, intermediate, deep sea) did not change significantly over the sampled transects spanning most of the Saudi waters in the Red Sea. Statistical analyses of rpoC1 sequences from metagenomes indicated that the vertical distribution of Prochlorococcus in the water

  3. Distribution of Prochlorococcus Ecotypes in the Red Sea Basin Based on Analyses of rpoC1 Sequences

    KAUST Repository

    Shibl, Ahmed A.

    2016-06-25

    The marine picocyanobacteria Prochlorococcus represent a significant fraction of the global pelagic bacterioplankton community. Specifically, in the surface waters of the Red Sea, they account for around 91% of the phylum Cyanobacteria. Previous work suggested a widespread presence of high-light (HL)-adapted ecotypes in the Red Sea with the occurrence of low-light (LL)-adapted ecotypes at intermediate depths in the water column. To obtain a more comprehensive dataset over a wider biogeographical scope, we used a 454-pyrosequencing approach to analyze the diversity of the Prochlorococcus rpoC1 gene from a total of 113 samples at various depths (up to 500 m) from 45 stations spanning the Red Sea basin from north to south. In addition, we analyzed 45 metagenomes from eight stations using hidden Markov models based on a set of reference Prochlorococcus genomes to (1) estimate the relative abundance of Prochlorococcus based on 16S rRNA gene sequences, and (2) identify and classify rpoC1 sequences as an assessment of the community structure of Prochlorococcus in the northern, central and southern regions of the basin without amplification bias. Analyses of metagenomic data indicated that Prochlorococcus occurs at a relative abundance of around 9% in samples from surface waters (25, 50, 75 m), 3% in intermediate waters (100 m) and around 0.5% in deep-water samples (200–500 m). Results based on rpoC1 sequences using both methods showed that HL II cells dominate surface waters and were also present in deep-water samples. Prochlorococcus communities in intermediate waters (100 m) showed a higher diversity and co-occurrence of low-light and high-light ecotypes. Prochlorococcus communities at each depth range (surface, intermediate, deep sea) did not change significantly over the sampled transects spanning most of the Saudi waters in the Red Sea. Statistical analyses of rpoC1 sequences from metagenomes indicated that the vertical distribution of Prochlorococcus in the water

  4. Post test analyses of Revisa benchmark based on a creep test at 1100 Celsius degrees performed on a notched tube

    International Nuclear Information System (INIS)

    Fischer, M.; Bernard, A.; Bhandari, S.

    2001-01-01

    In the Euratom 4th Framework Programme of the European Commission, the REVISA project deals with Reactor Vessel Integrity under Severe Accidents. One of its tasks consists of the experimental validation of the models developed in the project. To this end, a benchmark was designed in which the participants test their models against an experiment. The experiment, called RUPTHER 15, was conducted by the coordinating organisation, CEA (Commissariat a l'Energie Atomique), in France. It is a 'delayed fracture' test on a notched tube. The thermal loading is an axial gradient with a temperature of about 1130 °C in the mid-part, and the internal pressure is maintained at 0.8 MPa. This paper presents the results of finite element calculations performed by Framatome-ANP using the SYSTUS code. Two types of analyses were made: one based on the 'time hardening' Norton-Bailey creep law, and the other based on the coupled creep/damage Lemaitre-Chaboche model. The purpose of this paper is in particular to show the influence of temperature on the simulation results. At the high temperatures dealt with here, slight errors in the temperature measurements can lead to very large differences in the deformation behaviour. (authors)
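
    The 'time hardening' Norton-Bailey law referred to above expresses creep strain as a power law in stress and time, eps_c = A * sigma^n * t^m. A minimal sketch (the material constants below are hypothetical, not the values used in the SYSTUS analyses):

```python
def norton_bailey_strain(stress_mpa, t_hours, A=1e-12, n=4.0, m=0.5):
    """Time-hardening creep strain: eps_c = A * sigma**n * t**m.

    A, n, m are hypothetical material constants for illustration only.
    """
    return A * stress_mpa ** n * t_hours ** m

def norton_bailey_rate(stress_mpa, t_hours, A=1e-12, n=4.0, m=0.5):
    # creep strain rate d(eps_c)/dt under time hardening
    return A * stress_mpa ** n * m * t_hours ** (m - 1)

eps_1h = norton_bailey_strain(50.0, 1.0)   # 50 MPa, 1 h
eps_4h = norton_bailey_strain(50.0, 4.0)   # with m = 0.5, 4x time doubles strain
```

With m < 1 the creep rate decays over time (primary creep), which is why the model is sensitive to the temperature-dependent constants the abstract discusses.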

  5. Hedysarum L. (Fabaceae: Hedysareae) Is Not Monophyletic - Evidence from Phylogenetic Analyses Based on Five Nuclear and Five Plastid Sequences.

    Science.gov (United States)

    Liu, Pei-Liang; Wen, Jun; Duan, Lei; Arslan, Emine; Ertuğrul, Kuddisi; Chang, Zhao-Yang

    2017-01-01

    The legume family (Fabaceae) exhibits a high level of species diversity and evolutionary success worldwide. Previous phylogenetic studies of the genus Hedysarum L. (Fabaceae: Hedysareae) showed that the nuclear and the plastid topologies might be incongruent, and the systematic position of the Hedysarum sect. Stracheya clade was uncertain. In this study, phylogenetic relationships of Hedysarum were investigated based on the nuclear ITS, ETS, PGDH, SQD1, TRPT and the plastid psbA-trnH, trnC-petN, trnL-trnF, trnS-trnG, petN-psbM sequences. Both nuclear and plastid data support two major lineages in Hedysarum: the Hedysarum s.s. clade and the Sartoria clade. In the nuclear tree, Hedysarum is biphyletic with the Hedysarum s.s. clade sister to the Corethrodendron + Eversmannia + Greuteria + Onobrychis clade (the CEGO clade), whereas the Sartoria clade is sister to the genus Taverniera DC. In the plastid tree, Hedysarum is monophyletic and sister to Taverniera. The incongruent position of the Hedysarum s.s. clade between the nuclear and plastid trees may be best explained by a chloroplast capture hypothesis via introgression. The Hedysarum sect. Stracheya clade is resolved as sister to the H. sect. Hedysarum clade in both nuclear and plastid trees, and our analyses support merging Stracheya into Hedysarum. Based on our new evidence from multiple sequences, Hedysarum is not monophyletic, and its generic delimitation needs to be reconsidered.

  6. Hedysarum L. (Fabaceae: Hedysareae) Is Not Monophyletic – Evidence from Phylogenetic Analyses Based on Five Nuclear and Five Plastid Sequences

    Science.gov (United States)

    Liu, Pei-Liang; Wen, Jun; Duan, Lei; Arslan, Emine; Ertuğrul, Kuddisi; Chang, Zhao-Yang

    2017-01-01

    The legume family (Fabaceae) exhibits a high level of species diversity and evolutionary success worldwide. Previous phylogenetic studies of the genus Hedysarum L. (Fabaceae: Hedysareae) showed that the nuclear and the plastid topologies might be incongruent, and the systematic position of the Hedysarum sect. Stracheya clade was uncertain. In this study, phylogenetic relationships of Hedysarum were investigated based on the nuclear ITS, ETS, PGDH, SQD1, TRPT and the plastid psbA-trnH, trnC-petN, trnL-trnF, trnS-trnG, petN-psbM sequences. Both nuclear and plastid data support two major lineages in Hedysarum: the Hedysarum s.s. clade and the Sartoria clade. In the nuclear tree, Hedysarum is biphyletic with the Hedysarum s.s. clade sister to the Corethrodendron + Eversmannia + Greuteria + Onobrychis clade (the CEGO clade), whereas the Sartoria clade is sister to the genus Taverniera DC. In the plastid tree, Hedysarum is monophyletic and sister to Taverniera. The incongruent position of the Hedysarum s.s. clade between the nuclear and plastid trees may be best explained by a chloroplast capture hypothesis via introgression. The Hedysarum sect. Stracheya clade is resolved as sister to the H. sect. Hedysarum clade in both nuclear and plastid trees, and our analyses support merging Stracheya into Hedysarum. Based on our new evidence from multiple sequences, Hedysarum is not monophyletic, and its generic delimitation needs to be reconsidered. PMID:28122062

  7. A DNA microarray-based methylation-sensitive (MS)-AFLP hybridization method for genetic and epigenetic analyses.

    Science.gov (United States)

    Yamamoto, F; Yamamoto, M

    2004-07-01

    We previously developed a PCR-based DNA fingerprinting technique named the Methylation Sensitive (MS)-AFLP method, which permits comparative genome-wide scanning of methylation status with a manageable number of fingerprinting experiments. The technique uses the methylation sensitive restriction enzyme NotI in the context of the existing Amplified Fragment Length Polymorphism (AFLP) method. Here we report the successful conversion of this gel electrophoresis-based DNA fingerprinting technique into a DNA microarray hybridization technique (DNA Microarray MS-AFLP). By performing a total of 30 (15 x 2 reciprocal labeling) DNA Microarray MS-AFLP hybridization experiments on genomic DNA from two breast and three prostate cancer cell lines in all pairwise combinations, and Southern hybridization experiments using more than 100 different probes, we have demonstrated that the DNA Microarray MS-AFLP is a reliable method for genetic and epigenetic analyses. No statistically significant differences were observed in the number of differences between the breast-prostate hybridization experiments and the breast-breast or prostate-prostate comparisons.

  8. Measurements and simulations analysing the noise behaviour of grating-based X-ray phase-contrast imaging

    Energy Technology Data Exchange (ETDEWEB)

    Weber, T., E-mail: thomas.weber@physik.uni-erlangen.de [University of Erlangen-Nuremberg, ECAP - Erlangen Center for Astroparticle Physics, Erwin-Rommel-Str. 1, 91058 Erlangen (Germany); Bartl, P.; Durst, J. [University of Erlangen-Nuremberg, ECAP - Erlangen Center for Astroparticle Physics, Erwin-Rommel-Str. 1, 91058 Erlangen (Germany); Haas, W. [University of Erlangen-Nuremberg, ECAP - Erlangen Center for Astroparticle Physics, Erwin-Rommel-Str. 1, 91058 Erlangen (Germany); University of Erlangen-Nuremberg, Pattern Recognition Lab, Martensstr. 3, 91058 Erlangen (Germany); Michel, T.; Ritter, A.; Anton, G. [University of Erlangen-Nuremberg, ECAP - Erlangen Center for Astroparticle Physics, Erwin-Rommel-Str. 1, 91058 Erlangen (Germany)

    2011-08-21

    In the last decade, phase-contrast imaging with a Talbot-Lau grating interferometer has become possible even with a low-brilliance X-ray source. Because of its potential to increase soft-tissue contrast, this method is on its way into medical imaging. This requires knowledge of the underlying physics of the technique. With this paper, we would like to contribute to the understanding of grating-based phase-contrast imaging by presenting results from measurements and simulations of the noise behaviour of the differential phases. The measurements were made using a microfocus X-ray tube with a hybrid, photon-counting, semiconductor Medipix2 detector. The additional simulations were performed with our in-house developed phase-contrast simulation tool 'SPHINX', which combines both wave and particle contributions of the simulated photons. The results obtained by both methods show the same behaviour: increasing the number of photons leads to a linear decrease of the standard deviation of the phase. The number of phase steps used has no influence on the standard deviation if the total number of photons is held constant. Furthermore, the probability density function (pdf) of the reconstructed differential phases was analysed. It turned out that the so-called von Mises distribution is the physically correct pdf, which was also confirmed by measurements. This information advances the understanding of grating-based phase-contrast imaging and can be used to improve image quality.
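
    The von Mises distribution identified above as the correct pdf of the reconstructed differential phases is the circular analogue of the Gaussian. A short sketch of its density (a generic illustration of the distribution, not the SPHINX implementation):

```python
import numpy as np

def von_mises_pdf(theta, mu, kappa):
    # f(theta) = exp(kappa * cos(theta - mu)) / (2 * pi * I0(kappa)),
    # where I0 is the modified Bessel function of order zero
    return np.exp(kappa * np.cos(theta - mu)) / (2.0 * np.pi * np.i0(kappa))

theta = np.linspace(-np.pi, np.pi, 20001)
pdf = von_mises_pdf(theta, mu=0.0, kappa=4.0)
area = np.sum(pdf) * (theta[1] - theta[0])   # numerically close to 1
```

Larger kappa (more detected photons per pixel) concentrates the density around mu, which matches the reported decrease of the phase standard deviation with photon count.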

  9. Cost consequences due to reduced ulcer healing times - analyses based on the Swedish Registry of Ulcer Treatment.

    Science.gov (United States)

    Öien, Rut F; Forssell, Henrik; Ragnarson Tennvall, Gunnel

    2016-10-01

    Resource use and costs for topical treatment of hard-to-heal ulcers based on data from the Swedish Registry of Ulcer Treatment (RUT) were analysed in patients recorded in RUT as having healed between 2009 and 2012, in order to estimate potential cost savings from reductions in frequency of dressing changes and healing times. RUT is used to capture areas of improvement in ulcer care and to enable structured wound management by registering patients with hard-to-heal leg, foot and pressure ulcers. Patients included in the registry are treated in primary care, community care, private care, and inpatient hospital care. Cost calculations were based on resource use data on healing time and frequency of dressing changes in Swedish patients with hard-to-heal ulcers who healed between 2009 and 2012. Per-patient treatment costs decreased from SEK 38,223 in 2009 to SEK 20,496 in 2012, mainly because of shorter healing times. The frequency of dressing changes was essentially the same during these years, varying from 1.4 to 1.6 per week. The total healing time was reduced by 38%. Treatment costs for the management of hard-to-heal ulcers can be reduced with well-developed treatment strategies resulting in shortened healing times as shown in RUT. © 2015 Medicalhelplines.com Inc and John Wiley & Sons Ltd.

  10. UAV-based detection and spatial analyses of periglacial landforms on Demay Point (King George Island, South Shetland Islands, Antarctica)

    Science.gov (United States)

    Dąbski, Maciej; Zmarz, Anna; Pabjanek, Piotr; Korczak-Abshire, Małgorzata; Karsznia, Izabela; Chwedorzewska, Katarzyna J.

    2017-08-01

    High-resolution aerial images allow detailed analyses of periglacial landforms, which is of particular importance in light of climate change and resulting changes in active layer thickness. The aim of this study is to show the possibilities of using UAV-based photography to perform spatial analysis of periglacial landforms on the Demay Point peninsula, King George Island, and hence to supplement previous geomorphological studies of the South Shetland Islands. Photogrammetric flights were performed using a PW-ZOOM fixed-wing unmanned aerial vehicle. Digital elevation models (DEM) and maps of slope and contour lines were prepared in ESRI ArcGIS 10.3 with the Spatial Analyst extension, and three-dimensional visualizations in ESRI ArcScene 10.3 software. Careful interpretation of the orthophoto and DEM allowed us to vectorize polygons of landforms, such as (i) solifluction landforms (solifluction sheets, tongues, and lobes); (ii) scarps, taluses, and a protalus rampart; (iii) patterned ground (hummocks, sorted circles, stripes, nets and labyrinths, and nonsorted nets and stripes); (iv) coastal landforms (cliffs and beaches); (v) landslides and mud flows; and (vi) stone fields and bedrock outcrops. We conclude that geomorphological studies based on commonly accessible aerial and satellite images can underestimate the spatial extent of periglacial landforms and result in incomplete inventories. The PW-ZOOM UAV is well suited to gather detailed geomorphological data and can be used in spatial analysis of periglacial landforms in the Western Antarctic Peninsula region.

  11. Free Vibration Analyses of FGM Thin Plates by Isogeometric Analysis Based on Classical Plate Theory and Physical Neutral Surface

    Directory of Open Access Journals (Sweden)

    Shuohui Yin

    2013-01-01

    The isogeometric analysis (IGA) with nonuniform rational B-splines (NURBS) based on the classical plate theory (CPT) is developed for free vibration analyses of functionally graded material (FGM) thin plates. The objective of this work is to provide an efficient and accurate numerical simulation approach for nonhomogeneous thin plates and shells. Higher order basis functions can be easily obtained in IGA; thus the formulation of the CPT based on IGA can be simplified. For FGM thin plates, the material property gradient in the thickness direction is unsymmetrical about the midplane, so the effects of midplane displacements cannot be ignored, whereas the CPT neglects midplane displacements. To eliminate the effects of midplane displacements without introducing new unknown variables, the physical neutral surface is introduced into the CPT. The approximation of the deflection field and the geometric description are performed by using the NURBS basis functions. Compared with the first-order shear deformation theory, the present method has lower memory consumption and higher efficiency. Several numerical results show that the present method yields highly accurate solutions.
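
    The physical neutral surface mentioned above lies where the in-plane stiffness moments balance; for a through-thickness modulus E(z) its offset from the midplane is z0 = ∫E(z)·z dz / ∫E(z) dz. A numerical sketch for a power-law FGM (the material values are hypothetical, not those of the paper's examples):

```python
import numpy as np

def neutral_surface_offset(E_metal, E_ceramic, n, h, samples=200001):
    """Offset z0 of the physical neutral surface from the midplane.

    Power-law FGM through the thickness: ceramic volume fraction
    V_c(z) = (z/h + 1/2)**n, E(z) = E_metal + (E_ceramic - E_metal)*V_c(z).
    """
    z = np.linspace(-h / 2.0, h / 2.0, samples)
    E = E_metal + (E_ceramic - E_metal) * (z / h + 0.5) ** n
    return np.sum(E * z) / np.sum(E)       # z0 = int(E*z) / int(E)

# e.g. aluminium/alumina-like moduli, n = 2, 20 mm plate (hypothetical)
z0 = neutral_surface_offset(70e9, 380e9, n=2.0, h=0.02)
```

For a homogeneous plate (equal moduli) the offset vanishes and the neutral surface coincides with the midplane; for a graded plate it shifts toward the stiffer face, which is why the CPT formulated about the midplane needs the correction.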

  12. Design and Execution of make-like, distributed Analyses based on Spotify’s Pipelining Package Luigi

    Science.gov (United States)

    Erdmann, M.; Fischer, B.; Fischer, R.; Rieger, M.

    2017-10-01

    In high-energy particle physics, workflow management systems are primarily used as tailored solutions in dedicated areas such as Monte Carlo production. However, physicists performing data analyses are usually required to steer their individual workflows manually, which is time-consuming and often leads to undocumented relations between particular workloads. We present a generic analysis design pattern that copes with the sophisticated demands of end-to-end HEP analyses and provides a make-like execution system. It is based on the open-source pipelining package Luigi, which was developed at Spotify and enables the definition of arbitrary workloads, so-called Tasks, and the dependencies between them in a lightweight and scalable structure. Further features are multi-user support, automated dependency resolution and error handling, central scheduling, and status visualization in the web. In addition to already built-in features for remote jobs and file systems like Hadoop and HDFS, we added support for WLCG infrastructure such as LSF and CREAM job submission, as well as remote file access through the Grid File Access Library. Furthermore, we implemented automated resubmission functionality, software sandboxing, and a command line interface with auto-completion for a convenient working environment. For the implementation of a ttH cross-section measurement, we created a generic Python interface that provides programmatic access to all external information such as datasets, physics processes, statistical models, and additional files and values. In summary, the setup enables the execution of the entire analysis in a parallelized and distributed fashion with a single command.
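
    Luigi's make-like model, in which a Task declares its dependencies via requires() and is skipped when its output already exists, can be sketched in plain Python (a simplified stand-in for illustration, not the actual Luigi API; the task names are hypothetical):

```python
class Task:
    _registry = {}  # completed task name -> output value (stands in for targets)

    def requires(self):
        return []          # dependency tasks; none by default

    def key(self):
        return type(self).__name__

    def complete(self):
        return self.key() in Task._registry

    def run_with_deps(self):
        # make-like execution: recurse into dependencies first,
        # then skip any task whose output already exists
        for dep in self.requires():
            dep.run_with_deps()
        if not self.complete():
            Task._registry[self.key()] = self.run()
        return Task._registry[self.key()]

class Selection(Task):
    def run(self):
        return [1, 2, 3, 4]            # pretend: selected events

class Histogram(Task):
    def requires(self):
        return [Selection()]
    def run(self):
        events = Task._registry["Selection"]
        return {"entries": len(events)}

result = Histogram().run_with_deps()   # runs Selection, then Histogram
```

Luigi itself layers parameters, file-system targets, a central scheduler, and remote-execution backends on top of this basic dependency-resolution pattern.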

  13. Age and gender effects on normal regional cerebral blood flow studied using two different voxel-based statistical analyses

    International Nuclear Information System (INIS)

    Pirson, A.S.; George, J.; Krug, B.; Vander Borght, T.; Van Laere, K.; Jamart, J.; D'Asseler, Y.; Minoshima, S.

    2009-01-01

    Fully automated analysis programs have been applied more and more to aid the reading of regional cerebral blood flow SPECT studies. They are increasingly based on the comparison of the patient study with a normal database. In this study, we evaluate the ability of Three-Dimensional Stereotactic Surface Projection (3D-SSP) to isolate effects of age and gender in a previously studied normal population. The results were also compared with those obtained using Statistical Parametric Mapping (SPM99). Methods: Eighty-nine 99mTc-ECD SPECT studies performed in carefully screened healthy volunteers (46 females, 43 males; age 20-81 years) were analysed using 3D-SSP. A multivariate analysis based on the general linear model was performed with regions as intra-subject factor, gender as inter-subject factor and age as covariate. Results: Both age and gender had a significant interaction effect with regional tracer uptake. An age-related decline (p < 0.001) was found in the anterior cingulate gyrus, left frontal association cortex and left insula. Bilateral occipital association and left primary visual cortical uptake showed a significant relative increase with age (p < 0.001). Concerning the gender effect, women showed higher uptake (p < 0.01) in the parietal and right sensorimotor cortices. An age-by-gender interaction (p < 0.01) was only found in the left medial frontal cortex. The results were consistent with those obtained with SPM99. Conclusion: 3D-SSP analysis of normal rCBF variability is consistent with the literature and other automated voxel-based techniques, which highlight the effects of both age and gender. (authors)
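
    The general linear model described above (regional uptake explained by gender as a factor and age as a covariate) can be sketched with synthetic data and an ordinary least-squares fit; all numbers below are made up for illustration and are not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 89                                   # same sample size as the study
age = rng.uniform(20, 81, n)
gender = rng.integers(0, 2, n)           # 0 = male, 1 = female
# synthetic regional uptake: a decline with age, higher uptake in females
uptake = 100.0 - 0.15 * age + 2.0 * gender + rng.normal(0.0, 1.0, n)

# design matrix: intercept, age covariate, gender factor
X = np.column_stack([np.ones(n), age, gender])
beta, *_ = np.linalg.lstsq(X, uptake, rcond=None)
# beta[1] recovers the age slope, beta[2] the gender effect
```

The fitted coefficients recover the simulated age-related decline and gender difference, the two effects the voxel-based analyses test for at every surface pixel.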

  14. Using uncertainty and sensitivity analyses in socioecological agent-based models to improve their analytical performance and policy relevance.

    Science.gov (United States)

    Ligmann-Zielinska, Arika; Kramer, Daniel B; Spence Cheruvelil, Kendra; Soranno, Patricia A

    2014-01-01

    Agent-based models (ABMs) have been widely used to study socioecological systems. They are useful for studying such systems because of their ability to incorporate micro-level behaviors among interacting agents, and to understand emergent phenomena due to these interactions. However, ABMs are inherently stochastic and require proper handling of uncertainty. We propose a simulation framework based on quantitative uncertainty and sensitivity analyses to build parsimonious ABMs that serve two purposes: exploration of the outcome space to simulate low-probability but high-consequence events that may have significant policy implications, and explanation of model behavior to describe the system with higher accuracy. The proposed framework is applied to the problem of modeling farmland conservation resulting in land use change. We employ output variance decomposition based on quasi-random sampling of the input space and perform three computational experiments. First, we perform uncertainty analysis to improve model legitimacy, where the distribution of results informs us about the expected value that can be validated against independent data, and provides information on the variance around this mean as well as the extreme results. In our last two computational experiments, we employ sensitivity analysis to produce two simpler versions of the ABM. First, input space is reduced only to inputs that produced the variance of the initial ABM, resulting in a model with output distribution similar to the initial model. Second, we refine the value of the most influential input, producing a model that maintains the mean of the output of initial ABM but with less spread. These simplifications can be used to 1) efficiently explore model outcomes, including outliers that may be important considerations in the design of robust policies, and 2) conduct explanatory analysis that exposes the smallest number of inputs influencing the steady state of the modeled system.
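
    The first-order sensitivity index underlying the variance decomposition described above is S_i = Var(E[Y|X_i]) / Var(Y), the share of output variance attributable to input X_i alone. A self-contained Monte Carlo sketch on a toy model (the linear model and its coefficients are stand-ins, not the farmland ABM):

```python
import random

random.seed(42)

def model(x1, x2):
    # toy stand-in for an ABM output
    return 3.0 * x1 + 1.0 * x2

N, BINS = 100_000, 50
xs = [(random.random(), random.random()) for _ in range(N)]
ys = [model(x1, x2) for x1, x2 in xs]
mean_y = sum(ys) / N
var_y = sum((y - mean_y) ** 2 for y in ys) / N

# estimate Var(E[Y | X1]) by binning X1 and averaging Y within each bin
bins = [[] for _ in range(BINS)]
for (x1, _), y in zip(xs, ys):
    bins[min(int(x1 * BINS), BINS - 1)].append(y)
bin_means = [sum(b) / len(b) for b in bins if b]
s1 = (sum((m - mean_y) ** 2 for m in bin_means) / len(bin_means)) / var_y
# analytic first-order index for this toy model: 9 / (9 + 1) = 0.9
```

An input with a small index can be fixed at a nominal value, which is exactly the model-simplification step the authors apply to the ABM.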

  15. Using uncertainty and sensitivity analyses in socioecological agent-based models to improve their analytical performance and policy relevance.

    Directory of Open Access Journals (Sweden)

    Arika Ligmann-Zielinska

    Agent-based models (ABMs) have been widely used to study socioecological systems. They are useful for studying such systems because of their ability to incorporate micro-level behaviors among interacting agents, and to understand emergent phenomena due to these interactions. However, ABMs are inherently stochastic and require proper handling of uncertainty. We propose a simulation framework based on quantitative uncertainty and sensitivity analyses to build parsimonious ABMs that serve two purposes: exploration of the outcome space to simulate low-probability but high-consequence events that may have significant policy implications, and explanation of model behavior to describe the system with higher accuracy. The proposed framework is applied to the problem of modeling farmland conservation resulting in land use change. We employ output variance decomposition based on quasi-random sampling of the input space and perform three computational experiments. First, we perform uncertainty analysis to improve model legitimacy, where the distribution of results informs us about the expected value that can be validated against independent data, and provides information on the variance around this mean as well as the extreme results. In our last two computational experiments, we employ sensitivity analysis to produce two simpler versions of the ABM. First, input space is reduced only to inputs that produced the variance of the initial ABM, resulting in a model with output distribution similar to the initial model. Second, we refine the value of the most influential input, producing a model that maintains the mean of the output of initial ABM but with less spread. These simplifications can be used to 1) efficiently explore model outcomes, including outliers that may be important considerations in the design of robust policies, and 2) conduct explanatory analysis that exposes the smallest number of inputs influencing the steady state of the modeled system.

  16. Treatment of visceral leishmaniasis: model-based analyses on the spread of antimony-resistant L. donovani in Bihar, India.

    Directory of Open Access Journals (Sweden)

    Anette Stauch

    BACKGROUND: Pentavalent antimonials have been the mainstay of antileishmanial therapy for decades, but increasing failure rates under antimonial treatment have challenged further use of these drugs in the Indian subcontinent. Experimental evidence has suggested that parasites which are resistant against antimonials survive better than sensitive ones, even in the absence of antimonial treatment. METHODS AND FINDINGS: We use simulation studies based on a mathematical L. donovani transmission model to identify parameters which can explain why treatment failure rates under antimonial treatment increased up to 65% in Bihar between 1980 and 1997. Model analyses suggest that resistance to treatment alone cannot explain the observed treatment failure rates. We explore two hypotheses referring to an increased fitness of antimony-resistant parasites: the additional fitness is (i) disease-related, by causing more clinical cases (higher pathogenicity) or more severe disease (higher virulence), or (ii) transmission-related, by increasing the transmissibility from sand flies to humans or vice versa. CONCLUSIONS: Both hypotheses can potentially explain the Bihar observations. However, increased transmissibility as an explanation appears more plausible because it can occur in the background of asymptomatically transmitted infection, whereas disease-related factors would most probably be observable. Irrespective of the cause of fitness, parasites with a higher fitness will finally replace sensitive parasites, even if antimonials are replaced by another drug.
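
    The replacement argument in the conclusions can be illustrated with a minimal two-strain competition sketch (discrete replicator dynamics with hypothetical fitness values, not the paper's transmission model): any constant fitness advantage, however small, eventually drives the sensitive strain out.

```python
def competition(fit_sensitive, fit_resistant, s0=0.99, r0=0.01, steps=300):
    # discrete-generation replicator dynamics for two parasite strains:
    # each generation, strain frequencies are reweighted by fitness
    s, r = s0, r0
    for _ in range(steps):
        ws, wr = s * fit_sensitive, r * fit_resistant
        s, r = ws / (ws + wr), wr / (ws + wr)
    return s, r

# resistant strain starts at 1% frequency but is 5% fitter
s_end, r_end = competition(1.0, 1.05)
```

With equal fitness the frequencies never change, while a 5% advantage takes the resistant strain from 1% to near fixation, mirroring the conclusion that the fitter parasite wins irrespective of the drug in use.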

  17. Optimisation of recovery protocols for double-base smokeless powder residues analysed by total vaporisation (TV) SPME/GC-MS.

    Science.gov (United States)

    Sauzier, Georgina; Bors, Dana; Ash, Jordan; Goodpaster, John V; Lewis, Simon W

    2016-09-01

    The investigation of explosive events requires appropriate evidential protocols to recover and preserve residues from the scene. In this study, a central composite design was used to determine statistically validated optimum recovery parameters for double-base smokeless powder residues on steel, analysed using total vaporisation (TV) SPME/GC-MS. It was found that maximum recovery was obtained using isopropanol-wetted swabs stored under refrigerated conditions, then extracted for 15 min into acetone on the same day as sample collection. These parameters were applied to the recovery of post-blast residues deposited on steel witness surfaces following a PVC pipe bomb detonation, resulting in detection of all target components across the majority of samples. Higher overall recoveries were obtained from plates facing the sides of the device, consistent with the point of first failure occurring in the pipe body as observed in previous studies. The methodology employed here may be readily applied to a variety of other explosive compounds, and thus assist in establishing 'best practice' procedures for explosive investigations. Copyright © 2016 Elsevier B.V. All rights reserved.
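
A central composite design of the kind used in this study is straightforward to generate in coded units. This is a generic sketch (rotatable axial distance, single centre point), not the specific design matrix from the paper:

```python
import itertools
import numpy as np

def central_composite(k, alpha=None):
    """Coded design matrix for a k-factor CCD: 2^k factorial corners,
    2k axial (star) points at +/-alpha, and one centre point."""
    if alpha is None:
        alpha = (2 ** k) ** 0.25  # rotatable design
    corners = np.array(list(itertools.product([-1.0, 1.0], repeat=k)))
    axial = np.zeros((2 * k, k))
    for i in range(k):
        axial[2 * i, i] = -alpha
        axial[2 * i + 1, i] = alpha
    centre = np.zeros((1, k))
    return np.vstack([corners, axial, centre])

D = central_composite(2)  # 4 corner + 4 axial + 1 centre = 9 runs
```

In practice the centre point is replicated several times to estimate pure error, and the coded levels are mapped back to physical factor ranges (e.g. extraction time, solvent volume).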

  18. Grid Mapping for Spatial Pattern Analyses of Recurrent Urban Traffic Congestion Based on Taxi GPS Sensing Data

    Directory of Open Access Journals (Sweden)

    Yang Liu

    2017-03-01

    Full Text Available Traffic congestion is one of the most serious problems that impact urban transportation efficiency, especially in big cities. Identifying traffic congestion locations and their patterns of occurrence is a prerequisite for urban transportation managers in order to take proper countermeasures for mitigating traffic congestion. In this study, the historical GPS sensing data of about 12,000 floating-car taxis in Beijing were used for pattern analyses of recurrent traffic congestion based on the grid mapping method. Through the use of ArcGIS software, 2D and 3D maps of the road network congestion were generated for traffic congestion pattern visualization. The study results showed that three types of traffic congestion patterns were identified, namely: point type, stemming from insufficient capacities at the nodes of the road network; line type, caused by high traffic demand or bottleneck issues in the road segments; and region type, resulting from multiple high-demand expressways merging and connecting to each other. The study illustrated that the proposed method would be effective for discovering traffic congestion locations and patterns and helpful for decision makers to take corresponding traffic engineering countermeasures in order to relieve the urban traffic congestion issues.
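
The grid mapping step can be illustrated with a minimal sketch: probe points are binned into fixed-size cells, and a cell is flagged as congested when its mean speed falls below a threshold. The cell size and the speed threshold below are illustrative assumptions, not the study's parameters.

```python
import numpy as np

def grid_congestion(lon, lat, speed, cell=0.01, slow=15.0):
    """Bin GPS probe points into a lon/lat grid; flag cells whose mean
    speed (e.g. km/h) falls below a congestion threshold."""
    i = np.floor((lon - lon.min()) / cell).astype(int)
    j = np.floor((lat - lat.min()) / cell).astype(int)
    keys = i * (j.max() + 1) + j          # one integer key per grid cell
    order = np.argsort(keys)
    keys, speed = keys[order], speed[order]
    uniq, start = np.unique(keys, return_index=True)
    means = np.array([s.mean() for s in np.split(speed, start[1:])])
    return uniq, means, means < slow
```

Recurrent congestion is then found by repeating this per time slice and keeping cells flagged across many days.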

  19. Voxel-based analyses of gray/white matter volume and diffusion tensor data in major depression. Presidential award proceedings

    International Nuclear Information System (INIS)

    Abe, Osamu; Yamasue, Hidenori; Kasai, Kiyoto

    2008-01-01

    Previous neuroimaging studies have revealed that frontolimbic dysfunction may contribute to the pathophysiology of major depressive disorder. We used voxel-based analysis to simultaneously elucidate regional changes in gray/white matter volume, mean diffusivity (MD), and fractional anisotropy (FA) in the central nervous system of patients with unipolar major depression. We studied 21 right-handed patients and 42 age- and gender-matched right-handed normal subjects without central nervous system disorders. All image processing and statistical analyses were performed using SPM5 software. Local areas showing significant gray matter volume reduction in depressive patients compared with normal controls were observed in the right parahippocampal gyrus, hippocampus, bilateral middle frontal gyri, bilateral anterior cingulate cortices, left parietal and occipital lobes, and right superior temporal gyrus. Local areas showing increased mean diffusivity in depressive patients were observed in the bilateral parahippocampal gyri, hippocampus, pons, cerebellum, left frontal and temporal lobes, and right frontal lobe. There was no significant difference between the 2 groups for fractional anisotropy and white matter volume in the entire brain. Although there was no local area in which FA and MD were significantly correlated with disease severity, FA tended to correlate negatively with depression days (total accumulated days in depressive state) in the right anterior cingulate and the left frontal white matter (FDR-corrected P=0.055 for both areas). These results suggest that the frontolimbic neural circuit may play an important role in the neuropathology of patients with major depression. (author)

  20. Treatment algorithm based on the multivariate survival analyses in patients with advanced hepatocellular carcinoma treated with trans-arterial chemoembolization.

    Directory of Open Access Journals (Sweden)

    Hasmukh J Prajapati

    Full Text Available To develop a treatment algorithm from multivariate survival analyses (MVA) in patients with Barcelona Clinic Liver Cancer (BCLC) stage C (advanced) hepatocellular carcinoma (HCC) treated with trans-arterial chemoembolization (TACE). Consecutive unresectable and non-transplantable patients with advanced HCC who received DEB TACE were studied. A total of 238 patients (mean age, 62.4 yrs) was included in the study. Survivals were analyzed according to different parameters from the time of the 1st DEB TACE. Kaplan-Meier and Cox proportional hazard models were used for survival analysis. The SS was constructed from the MVA and named the BCLC C HCC Prognostic (BCHP) staging system (SS). Overall median survival (OS) was 16.2 months. HCC patients with venous thrombosis (VT) of a large vein [main portal vein (PV), right or left PV, hepatic vein, inferior vena cava] (22.7%) versus a small vein (segmental/subsegmental PV) (9.7%) versus no VT had OSs of 6.4 months versus 20 months versus 22.8 months, respectively (p<0.001). On MVA, the significant independent prognostic factors (PFs) of survival were CP class, Eastern Cooperative Oncology Group (ECOG) performance status (PS), single HCC<5 cm, site of VT, metastases, serum creatinine and serum alpha-fetoprotein. Based on these PFs, the BCHP staging system was constructed. The OSs of stages I, II and III were 28.4 months, 11.8 months and 2.4 months, respectively (p<0.001). The treatment plan was proposed according to the different stages. On MVA of patients with advanced HCC treated with TACE, significant independent PFs of survival were CP class, ECOG PS, single HCC<5 cm or others, site of VT, metastases, serum creatinine and serum alpha-fetoprotein. The new BCHP SS was proposed based on MVA data to identify suitable advanced HCC patients for TACE treatments.
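
The Kaplan-Meier product-limit estimator underlying survival analyses like this one can be sketched directly; the sample times and event flags below are invented for illustration, not patient data from the study.

```python
import numpy as np

def kaplan_meier(time, event):
    """Product-limit survival estimate; event=1 for death, 0 for censoring.
    Returns a list of (event time, survival probability) pairs."""
    order = np.argsort(time)
    time, event = np.asarray(time)[order], np.asarray(event)[order]
    surv, s = [], 1.0
    for t in np.unique(time[event == 1]):
        at_risk = np.sum(time >= t)               # subjects still under observation
        deaths = np.sum((time == t) & (event == 1))
        s *= 1.0 - deaths / at_risk               # conditional survival at t
        surv.append((float(t), s))
    return surv

km = kaplan_meier([6, 7, 10, 15, 19, 25], [1, 0, 1, 1, 0, 1])
```

Median survival is then read off as the first time at which the curve drops to 0.5 or below; Cox regression extends this to multivariate prognostic factors.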

  1. Detection of T790M, the acquired resistance EGFR mutation, by tumor biopsy versus noninvasive blood-based analyses

    Science.gov (United States)

    Sundaresan, Tilak K.; Sequist, Lecia V.; Heymach, John V.; Riely, Gregory J.; Jänne, Pasi A.; Koch, Walter H.; Sullivan, James P.; Fox, Douglas B.; Maher, Robert; Muzikansky, Alona; Webb, Andrew; Tran, Hai T.; Giri, Uma; Fleisher, Martin; Yu, Helena A.; Wei, Wen; Johnson, Bruce E.; Barber, Thomas A.; Walsh, John R.; Engelman, Jeffrey A.; Stott, Shannon L.; Kapur, Ravi; Maheswaran, Shyamala; Toner, Mehmet

    2015-01-01

    Purpose The T790M gatekeeper mutation in the Epidermal Growth Factor Receptor (EGFR) is acquired by some EGFR-mutant non-small cell lung cancers (NSCLC) as they become resistant to selective tyrosine kinase inhibitors (TKIs). As third generation EGFR TKIs that overcome T790M-associated resistance become available, noninvasive approaches to T790M detection will become critical to guide management. Experimental Design As part of a multi-institutional Stand-Up-To-Cancer collaboration, we performed an exploratory analysis of 40 patients with EGFR-mutant tumors progressing on EGFR TKI therapy. We compared the T790M genotype from tumor biopsies with analysis of simultaneously collected circulating tumor cells (CTC) and circulating tumor DNA (ctDNA). Results T790M genotypes were successfully obtained in 30 (75%) tumor biopsies, 28 (70%) CTC samples and 32 (80%) ctDNA samples. The resistance-associated mutation was detected in 47–50% of patients using each of the genotyping assays, with concordance among them ranging from 57–74%. While CTC- and ctDNA-based genotyping were each unsuccessful in 20–30% of cases, the two assays together enabled genotyping in all patients with an available blood sample, and they identified the T790M mutation in 14 (35%) patients in whom the concurrent biopsy was negative or indeterminate. Conclusion Discordant genotypes between tumor biopsy and blood-based analyses may result from technological differences, as well as sampling different tumor cell populations. The use of complementary approaches may provide the most complete assessment of each patient’s cancer, which should be validated in predicting response to T790M-targeted inhibitors. PMID:26446944

  2. Ultrastructure of spermatozoa of spider crabs, family Mithracidae (Crustacea, Decapoda, Brachyura): Integrative analyses based on morphological and molecular data.

    Science.gov (United States)

    Assugeni, Camila de O; Magalhães, Tatiana; Bolaños, Juan A; Tudge, Christopher C; Mantelatto, Fernando L; Zara, Fernando J

    2017-12-01

    Recent studies based on morphological and molecular data provide a new perspective concerning taxonomic aspects of the brachyuran family Mithracidae. These studies proposed a series of nominal changes and indicated that the family is actually represented by a different number and representatives of genera than previously thought. Here, we provide a comparative description of the ultrastructure of spermatozoa and spermatophores of some species of Mithracidae in a phylogenetic context. The ultrastructure of the spermatozoa and spermatophore was observed by scanning and transmission electron microscopy. The most informative morphological characters analysed were thickness of the operculum, shape of the perforatorial chamber and shape and thickness of the inner acrosomal zone. As a framework, we used a topology based on a phylogenetic analysis using mitochondrial data obtained here and from previous studies. Our results indicate that closely related species share a series of morphological characteristics of the spermatozoa. A thick operculum, for example, is a feature observed in species of the genera Amphithrax, Teleophrys, and Omalacantha in contrast to the slender operculum observed in Mithraculus and Mithrax. Amphithrax and Teleophrys have a rhomboid perforatorial chamber, while Mithraculus, Mithrax, and Omalacantha show a wider, deltoid morphology. Furthermore, our results are in agreement with recently proposed taxonomic changes including the separation of the genera Mithrax (previously Damithrax), Amphithrax (previously Mithrax) and Mithraculus, and the synonymy of Mithrax caribbaeus with Mithrax hispidus. Overall, the spermiotaxonomy of these species of Mithracidae represent a novel set of data that corroborates the most recent taxonomic revision of the family and can be used in future taxonomic and phylogenetic studies within this family. © 2017 Wiley Periodicals, Inc.

  3. Use of results of microbiological analyses for risk-based control of Listeria monocytogenes in marinated broiler legs.

    Science.gov (United States)

    Aarnisalo, Kaarina; Vihavainen, Elina; Rantala, Leila; Maijala, Riitta; Suihko, Maija-Liisa; Hielm, Sebastian; Tuominen, Pirkko; Ranta, Jukka; Raaska, Laura

    2008-02-10

    Microbial risk assessment provides a means of estimating consumer risks associated with food products. The methods can also be applied at the plant level. In this study, results of microbiological analyses were used to develop a robust single-plant-level risk assessment. Furthermore, the prevalence and numbers of Listeria monocytogenes in marinated broiler legs in Finland were estimated. These estimates were based on information on the prevalence, numbers and genotypes of L. monocytogenes in 186 marinated broiler legs from 41 retail stores. The products were from three main Finnish producers, which produce 90% of all marinated broiler legs sold in Finland. The prevalence and numbers of L. monocytogenes were estimated by Monte Carlo simulation using WinBUGS, but the model is applicable to any software featuring standard probability distributions. The estimated mean annual number of L. monocytogenes-positive broiler legs sold in Finland was 7.2×10⁶, with a 95% credible interval (CI) of 6.7×10⁶-7.7×10⁶. That would be 34% ± 1% of the marinated broiler legs sold in Finland. The mean number of L. monocytogenes in marinated broiler legs estimated at the sell-by date was 2 CFU/g, with a 95% CI of 0-14 CFU/g. Producer-specific L. monocytogenes strains were recovered from the products throughout the year, which emphasizes the importance of characterizing the isolates and identifying strains that may cause problems as part of risk assessment studies. As the levels of L. monocytogenes were low, the risk of acquiring listeriosis from these products proved to be insignificant. Consequently, there was no need for a thorough national-level risk assessment. However, an approach using worst-case and average point estimates was applied to produce an example of single-producer-level risk assessment based on limited data. This assessment also indicated that the risk from these products was low. The risk-based approach presented in this work can provide estimation of public health risk
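
The study's WinBUGS model is not reproduced here, but the same style of Monte Carlo prevalence estimate can be sketched with a beta-binomial posterior; the survey counts and annual sales figure below are hypothetical placeholders, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(1)
n_sampled, n_positive = 186, 64      # hypothetical survey counts
annual_units = 21_000_000            # hypothetical annual number of legs sold

# Posterior for prevalence with a uniform Beta(1, 1) prior, sampled by Monte Carlo
prev = rng.beta(1 + n_positive, 1 + n_sampled - n_positive, size=100_000)
positives = prev * annual_units      # implied annual number of positive units
lo, mid, hi = np.percentile(positives, [2.5, 50, 97.5])
```

The 2.5th and 97.5th percentiles of the simulated distribution give the 95% credible interval, in the same spirit as the interval reported in the abstract.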

  4. In situ analyses of Ag speciation in tissues of cucumber and wheat using synchrotron-based X-ray absorption spectroscopy

    Data.gov (United States)

    U.S. Environmental Protection Agency — In situ analyses of Ag speciation in tissues of cucumber and wheat using synchrotron-based X-ray absorption spectroscopy showing spectral fitting and linear...

  5. Using FOSM-Based Data Worth Analyses to Design Geophysical Surveys to Reduce Uncertainty in a Regional Groundwater Model Update

    Science.gov (United States)

    Smith, B. D.; White, J.; Kress, W. H.; Clark, B. R.; Barlow, J.

    2016-12-01

    Hydrogeophysical surveys have become an integral part of understanding hydrogeological frameworks used in groundwater models. Regional models cover a large area where water well data are, at best, scattered and irregular. Since budgets are finite, priorities must be assigned to select optimal areas for geophysical surveys. For airborne electromagnetic (AEM) geophysical surveys, optimization of mapping depth and line spacing needs to take into account the objectives of the groundwater models. The approach discussed here uses a first-order, second-moment (FOSM) uncertainty analysis, which assumes an approximately linear relation between model parameters and observations. This assumption allows FOSM analyses to be applied to estimate the value of increased parameter knowledge in reducing forecast uncertainty. FOSM is used to facilitate optimization of yet-to-be-completed geophysical surveying to reduce model forecast uncertainty. The main objective of geophysical surveying is assumed to be the estimation of values and spatial variation in hydrologic parameters (i.e. hydraulic conductivity), as well as mapping lower-permeability layers that influence the spatial distribution of recharge flux. The proposed data worth analysis was applied to the Mississippi Embayment Regional Aquifer Study (MERAS), which is being updated. The objective of MERAS is to assess the groundwater availability (status and trends) of the Mississippi embayment aquifer system. The study area covers portions of eight states including Alabama, Arkansas, Illinois, Kentucky, Louisiana, Mississippi, Missouri, and Tennessee. The active model grid covers approximately 70,000 square miles, and incorporates some 6,000 miles of major rivers and over 100,000 water wells. In the FOSM analysis, a dense network of pilot points was used to capture uncertainty in hydraulic conductivity and recharge. To simulate the effect of AEM flight lines, the prior uncertainty for hydraulic conductivity and recharge pilots along potential flight lines was
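
The FOSM data-worth calculation rests on linear uncertainty propagation: condition the prior parameter covariance on candidate observations, then propagate through the forecast sensitivities. A minimal sketch, with made-up Jacobians standing in for the model sensitivities:

```python
import numpy as np

def fosm_forecast_var(J_obs, J_fc, C_p, C_obs):
    """Linear (FOSM) forecast uncertainty: the prior parameter covariance
    C_p is conditioned on observations with sensitivity J_obs and noise
    covariance C_obs, then propagated through forecast sensitivity J_fc."""
    S = J_obs @ C_p @ J_obs.T + C_obs
    C_post = C_p - C_p @ J_obs.T @ np.linalg.solve(S, J_obs @ C_p)
    return float(J_fc @ C_post @ J_fc.T)

# toy example: 2 parameters, one candidate observation sensitive to parameter 0
J_obs = np.array([[1.0, 0.0]])
J_fc = np.array([[1.0, 1.0]])
C_p = np.eye(2)
C_obs = np.array([[0.01]])
post_var = fosm_forecast_var(J_obs, J_fc, C_p, C_obs)
```

Data worth is the drop from the prior forecast variance (here `J_fc @ C_p @ J_fc.T`) to `post_var`; ranking candidate AEM flight lines by this drop identifies where surveying reduces forecast uncertainty most.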

  6. Cost and quality effectiveness of objective-based and statistically-based quality control for volatile organic compounds analyses of gases

    International Nuclear Information System (INIS)

    Bennett, J.T.; Crowder, C.A.; Connolly, M.J.

    1994-01-01

    Gas samples from drums of radioactive waste at the Department of Energy (DOE) Idaho National Engineering Laboratory are being characterized for 29 volatile organic compounds to determine the feasibility of storing the waste in DOE's Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. Quality requirements for the gas chromatography (GC) and GC/mass spectrometry chemical methods used to analyze the waste are specified in the Quality Assurance Program Plan for the WIPP Experimental Waste Characterization Program. Quality requirements consist of both objective criteria (data quality objectives, DQOs) and statistical criteria (process control). The DQOs apply to routine sample analyses, while the statistical criteria serve to determine and monitor precision and accuracy (P&A) of the analysis methods and are also used to assign upper confidence limits to measurement results close to action levels. After over two years and more than 1000 sample analyses, there are two general conclusions concerning the two approaches to quality control: (1) Objective criteria (e.g., ±25% precision, ±30% accuracy) based on customer needs and the usually prescribed criteria for similar EPA-approved methods are consistently attained during routine analyses. (2) Statistical criteria based on short-term method performance are almost an order of magnitude more stringent than objective criteria and are difficult to satisfy following the same routine laboratory procedures which satisfy the objective criteria. A more cost-effective and representative approach to establishing statistical method performance criteria would be either to utilize a moving average of P&A from control samples over a several-month time period, or to determine within-sample variation by one-way analysis of variance of several months' replicate sample analysis results, or both. Confidence intervals for results near action levels could also be determined by replicate analysis of the sample in
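
The suggested moving-average alternative for P&A control limits can be sketched as a rolling-window control chart; the window length and 3-sigma multiplier below are illustrative choices, not values from the report.

```python
import numpy as np

def moving_pa_limits(recoveries, window=20, k=3.0):
    """Rolling mean and +/- k-sigma control limits for control-sample
    percent recovery, as a moving-window alternative to limits fixed
    from short-term method performance."""
    r = np.asarray(recoveries, float)
    out = []
    for end in range(window, len(r) + 1):
        w = r[end - window:end]
        m, s = w.mean(), w.std(ddof=1)
        out.append((m, m - k * s, m + k * s))
    return out
```

Because the window spans several months of control samples, the limits track realistic long-term method performance rather than the tighter short-term scatter.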

  7. A Derivation of Source-based Kinetics Equation with Time Dependent Fission Kernel for Reactor Transient Analyses

    International Nuclear Information System (INIS)

    Kim, Song Hyun; Woo, Myeong Hyun; Shin, Chang Ho; Pyeon, Cheol Ho

    2015-01-01

    In this study, a new balance equation to overcome the problems generated by the previous methods is proposed using a source-based balance equation, and a simple problem is then analyzed with the proposed method. A source-based balance equation with a time-dependent fission kernel was derived to simplify the kinetics equation. To analyze partial variations of reactor characteristics, two representative methods were introduced in previous studies: (1) the quasi-statics method and (2) the multipoint technique. The main idea of the quasi-statics method is to use a low-order approximation for large integration times. To realize the quasi-statics method, the time-dependent flux is first separated into shape and amplitude functions, and the shape function is calculated. The method has good accuracy; however, it can be expensive in terms of calculation cost because the shape function must be fully recalculated to obtain accurate results. To improve calculation efficiency, the multipoint method was proposed. The multipoint method is based on the classic kinetics equation, using Green's function to analyze the flight probability from region r' to r. These previous methods have been used for reactor kinetics analysis; however, they can have some limitations. First, three group variables (r_g, E_g, t_g) must be considered to solve the time-dependent balance equation, which severely limits application to large-system problems with good accuracy. Second, energy-group neutrons must be used to analyze reactor kinetics problems. In a time-dependent problem, the neutron energy distribution can change over time; this affects the group cross sections and can therefore lead to accuracy problems. Third, neutrons in one space-time region continually affect other space-time regions, but this is not properly considered in the previous methods. Using birth history of the neutron sources

  8. A Derivation of Source-based Kinetics Equation with Time Dependent Fission Kernel for Reactor Transient Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Song Hyun; Woo, Myeong Hyun; Shin, Chang Ho [Hanyang University, Seoul (Korea, Republic of); Pyeon, Cheol Ho [Kyoto University, Osaka (Japan)

    2015-10-15

    In this study, a new balance equation to overcome the problems generated by the previous methods is proposed using a source-based balance equation, and a simple problem is then analyzed with the proposed method. A source-based balance equation with a time-dependent fission kernel was derived to simplify the kinetics equation. To analyze partial variations of reactor characteristics, two representative methods were introduced in previous studies: (1) the quasi-statics method and (2) the multipoint technique. The main idea of the quasi-statics method is to use a low-order approximation for large integration times. To realize the quasi-statics method, the time-dependent flux is first separated into shape and amplitude functions, and the shape function is calculated. The method has good accuracy; however, it can be expensive in terms of calculation cost because the shape function must be fully recalculated to obtain accurate results. To improve calculation efficiency, the multipoint method was proposed. The multipoint method is based on the classic kinetics equation, using Green's function to analyze the flight probability from region r' to r. These previous methods have been used for reactor kinetics analysis; however, they can have some limitations. First, three group variables (r_g, E_g, t_g) must be considered to solve the time-dependent balance equation, which severely limits application to large-system problems with good accuracy. Second, energy-group neutrons must be used to analyze reactor kinetics problems. In a time-dependent problem, the neutron energy distribution can change over time; this affects the group cross sections and can therefore lead to accuracy problems. Third, neutrons in one space-time region continually affect other space-time regions, but this is not properly considered in the previous methods. Using birth history of the

  9. Japanese standard method for safety evaluation using best estimate code based on uncertainty and scaling analyses with statistical approach

    International Nuclear Information System (INIS)

    Mizokami, Shinya; Hotta, Akitoshi; Kudo, Yoshiro; Yonehara, Tadashi; Watada, Masayuki; Sakaba, Hiroshi

    2009-01-01

    Current licensing practice in Japan consists of using conservative boundary and initial conditions (BIC), assumptions and analytical codes. The safety analyses for licensing purposes are inherently deterministic. Therefore, conservative BIC and assumptions, such as single failure, must be employed for the analyses. However, using conservative analytical codes is not considered essential. The standards committee of the Atomic Energy Society of Japan (AESJ) drew up the standard for using best estimate codes for safety analyses in 2008, after three years of discussions reflecting recent domestic and international findings. (author)
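
Best-estimate-plus-uncertainty methodologies with a statistical approach of this kind commonly size the number of code runs with Wilks' nonparametric tolerance-limit formula (e.g., 59 runs for a first-order one-sided 95%/95% limit). The AESJ standard itself is not quoted here; this is a generic sketch of that well-known sample-size calculation:

```python
import math

def wilks_n(coverage=0.95, confidence=0.95, order=1):
    """Smallest N such that the order-th largest of N code runs bounds
    the `coverage` quantile of the output with the given confidence
    (one-sided Wilks tolerance limit)."""
    n = order
    while True:
        # probability that fewer than `order` of n runs exceed the quantile
        tail = sum(math.comb(n, i) * coverage ** (n - i) * (1 - coverage) ** i
                   for i in range(order))
        if tail <= 1 - confidence:
            return n
        n += 1
```

For the standard 95%/95% criterion this gives 59 runs at first order and 93 at second order, which is why BEPU analyses are often quoted with those run counts.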

  10. Discrimination, correlation, and provenance of Bed I tephrostratigraphic markers, Olduvai Gorge, Tanzania, based on multivariate analyses of phenocryst compositions

    Science.gov (United States)

    Habermann, Jörg M.; McHenry, Lindsay J.; Stollhofen, Harald; Tolosana-Delgado, Raimon; Stanistreet, Ian G.; Deino, Alan L.

    2016-06-01

    The chronology of Pleistocene flora and fauna, including hominin remains and associated Oldowan industries in Bed I, Olduvai Gorge, Tanzania, is primarily based on 40Ar/39Ar dating of intercalated tuffs and lavas, combined with detailed tephrostratigraphic correlations within the basin. Although a high-resolution chronostratigraphic framework has been established for the eastern part of the Olduvai Basin, the western subbasin is less well known due in part to major lateral facies changes within Bed I combined with discontinuous exposure. We address these correlation difficulties using the discriminative power of the chemical composition of the major juvenile mineral phases (augite, anorthoclase, plagioclase) from tuffs, volcaniclastic sandstones, siliciclastic units, and lavas. We statistically evaluate these compositions, obtained from electron probe micro-analysis, applying principal component analysis and discriminant analysis to develop discriminant models that successfully classify most Bed I volcanic units. The correlations, resulting from integrated analyses of all target minerals, provide a basin-wide Bed I chemostratigraphic framework at high lateral and vertical resolution, consistent with the known geological context, that expands and refines the geochemical databases currently available. Correlation of proximal ignimbrites at the First Fault with medial and distal Lower Bed I successions of the western basin enables assessment of lateral facies and thickness trends that confirm Ngorongoro Volcano as the primary source for Lower Bed I, whereas Upper Bed I sediment supply is mainly from Olmoti Volcano. Compositional similarity between Tuff IA, Bed I lava, and Mafic Tuffs II and III single-grain fingerprints, together with north- and northwestward thinning of Bed I lava, suggests a common Ngorongoro source for these units. The techniques applied herein improve upon previous work by evaluating compositional affinities with statistical rigor rather than
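
The PCA-plus-discriminant workflow for classifying tephra units by phenocryst chemistry can be sketched generically; the nearest-centroid step below is a crude stand-in for the discriminant models in the study, and the compositions are invented.

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Principal-component scores of mean-centred data via SVD."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

def nearest_centroid(train, labels, query):
    """Assign a query analysis to the class with the nearest centroid,
    a simplified stand-in for linear discriminant classification."""
    classes = sorted(set(labels))
    cents = {c: train[np.array(labels) == c].mean(axis=0) for c in classes}
    return min(classes, key=lambda c: np.linalg.norm(query - cents[c]))

# hypothetical 2-oxide compositions for two tephra units 'A' and 'B'
X = np.array([[0.0, 0.0], [0.2, 0.1], [5.0, 5.0], [5.2, 5.1]])
labels = ['A', 'A', 'B', 'B']
```

An unknown sample is then projected into the same component space and assigned to the unit whose centroid it falls closest to, with cross-validation used to check misclassification rates.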

  11. Palaeohydrology of the Southwest Yukon Territory, Canada, based on multiproxy analyses of lake sediment cores from a depth transect

    Science.gov (United States)

    Anderson, L.; Abbott, M.B.; Finney, B.P.; Edwards, M.E.

    2005-01-01

    Lake-level variations at Marcella Lake, a small, hydrologically closed lake in the southwestern Yukon Territory, document changes in effective moisture since the early Holocene. Former water levels, driven by regional palaeohydrology, were reconstructed by multiproxy analyses of sediment cores from four sites spanning shallow to deep water. Marcella Lake today is thermally stratified, being protected from wind by its position in a depression. It is alkaline and undergoes bio-induced calcification. Relative accumulations of calcium carbonate and organic matter at the sediment-water interface depend on the location of the depositional site relative to the thermocline. We relate lake-level fluctuations to down-core stratigraphic variations in composition, geochemistry, sedimentary structures and to the occurrence of unconformities in four cores based on observations of modern limnology and sedimentation processes. Twenty-four AMS radiocarbon dates on macrofossils and pollen provide the lake-level chronology. Prior to 10 000 cal. BP water levels were low, but then they rose to 3 to 4 m below modern levels. Between 7500 and 5000 cal. BP water levels were 5 to 6 m below modern but rose by 4000 cal. BP. Between 4000 and 2000 cal. BP they were higher than modern. During the last 2000 years, water levels were either near or 1 to 2 m below modern levels. Marcella Lake water-level fluctuations correspond with previously documented palaeoenvironmental and palaeoclimatic changes and provide new, independent effective moisture information. The improved geochronology and quantitative water-level estimates are a framework for more detailed studies in the southwest Yukon. © 2005 Edward Arnold (Publishers) Ltd.

  12. Comprehensive phylogenetic reconstruction of amoebozoa based on concatenated analyses of SSU-rDNA and actin genes.

    Directory of Open Access Journals (Sweden)

    Daniel J G Lahr

    Full Text Available Evolutionary relationships within Amoebozoa have been the subject of controversy for two reasons: 1) paucity of morphological characters in traditional surveys and 2) haphazard taxonomic sampling in modern molecular reconstructions. These along with other factors have prevented the erection of a definitive system that resolves confidently both higher and lower-level relationships. Additionally, the recent recognition that many protosteloid amoebae are in fact scattered throughout the Amoebozoa suggests that phylogenetic reconstructions have been excluding an extensive and integral group of organisms. Here we provide a comprehensive phylogenetic reconstruction based on 139 taxa using molecular information from both SSU-rDNA and actin genes. We provide molecular data for 13 of those taxa, 12 of which had not been previously characterized. We explored the dataset extensively by generating 18 alternative reconstructions that assess the effect of missing data, long-branched taxa, unstable taxa, fast evolving sites and inclusion of environmental sequences. We compared reconstructions with each other as well as against previously published phylogenies. Our analyses show that many of the morphologically established lower-level relationships (defined here as relationships roughly equivalent to Order level or below) are congruent with molecular data. However, the data are insufficient to corroborate or reject the large majority of proposed higher-level relationships (above the Order-level), with the exception of Tubulinea, Archamoebae and Myxogastrea, which are consistently recovered. Moreover, contrary to previous expectations, the inclusion of available environmental sequences does not significantly improve the Amoebozoa reconstruction. This is probably because key amoebozoan taxa are not easily amplified by environmental sequencing methodology due to high rates of molecular evolution and regular occurrence of large indels and introns. 
Finally, in an effort

  13. Comprehensive phylogenetic reconstruction of amoebozoa based on concatenated analyses of SSU-rDNA and actin genes.

    Science.gov (United States)

    Lahr, Daniel J G; Grant, Jessica; Nguyen, Truc; Lin, Jian Hua; Katz, Laura A

    2011-01-01

    Evolutionary relationships within Amoebozoa have been the subject of controversy for two reasons: 1) paucity of morphological characters in traditional surveys and 2) haphazard taxonomic sampling in modern molecular reconstructions. These along with other factors have prevented the erection of a definitive system that resolves confidently both higher and lower-level relationships. Additionally, the recent recognition that many protosteloid amoebae are in fact scattered throughout the Amoebozoa suggests that phylogenetic reconstructions have been excluding an extensive and integral group of organisms. Here we provide a comprehensive phylogenetic reconstruction based on 139 taxa using molecular information from both SSU-rDNA and actin genes. We provide molecular data for 13 of those taxa, 12 of which had not been previously characterized. We explored the dataset extensively by generating 18 alternative reconstructions that assess the effect of missing data, long-branched taxa, unstable taxa, fast evolving sites and inclusion of environmental sequences. We compared reconstructions with each other as well as against previously published phylogenies. Our analyses show that many of the morphologically established lower-level relationships (defined here as relationships roughly equivalent to Order level or below) are congruent with molecular data. However, the data are insufficient to corroborate or reject the large majority of proposed higher-level relationships (above the Order-level), with the exception of Tubulinea, Archamoebae and Myxogastrea, which are consistently recovered. Moreover, contrary to previous expectations, the inclusion of available environmental sequences does not significantly improve the Amoebozoa reconstruction. This is probably because key amoebozoan taxa are not easily amplified by environmental sequencing methodology due to high rates of molecular evolution and regular occurrence of large indels and introns. 
Finally, in an effort to facilitate

  14. Subtypes of familial hemophagocytic lymphohistiocytosis in Japan based on genetic and functional analyses of cytotoxic T lymphocytes.

    Directory of Open Access Journals (Sweden)

    Kozo Nagai

    Full Text Available BACKGROUND: Familial hemophagocytic lymphohistiocytosis (FHL) is a rare disease of infancy or early childhood. To clarify the incidence and subtypes of FHL in Japan, we performed genetic and functional analyses of cytotoxic T lymphocytes (CTLs) in Japanese patients with FHL. DESIGN AND METHODS: Among the Japanese children with hemophagocytic lymphohistiocytosis (HLH) registered at our laboratory, those with more than one of the following findings were eligible for study entry under a diagnosis of FHL: positive for known genetic mutations, a family history of HLH, and impaired CTL-mediated cytotoxicity. Mutations of the newly identified causative gene for FHL5, STXBP2, and the cytotoxicity and degranulation activity of CTLs in FHL patients were analyzed. RESULTS: Among 31 FHL patients who satisfied the above criteria, PRF1 mutation was detected in 17 (FHL2) and UNC13D mutation in 10 (FHL3). In 2 other patients, 3 novel mutations of the STXBP2 gene were confirmed (FHL5). Finally, the remaining 2 were classified as having FHL with unknown genetic mutations. In all FHL patients, CTL-mediated cytotoxicity was low or deficient, and degranulation activity was also low or absent except in FHL2 patients. In the 2 patients with unknown genetic mutations, the cytotoxicity and degranulation activity of CTLs appeared to be deficient in one patient and moderately impaired in the other. CONCLUSIONS: FHL can be diagnosed and classified on the basis of CTL-mediated cytotoxicity, degranulation activity, and genetic analysis. Based on the data obtained from functional analysis of CTLs, other unknown gene(s) responsible for FHL remain to be identified.

  15. Sorption data bases for argillaceous rocks and bentonite for the provisional safety analyses for SGT-E2

    International Nuclear Information System (INIS)

    Baeyens, B.; Thoenen, T.; Bradbury, M. H.; Marques Fernandes, M.

    2014-11-01

    In Stage 1 of the Sectoral Plan for Deep Geological Repositories, four rock types have been identified as suitable host rocks for a radioactive waste repository, namely Opalinus Clay for a high-level (HLW) and a low- and intermediate-level (L/ILW) repository, and 'Brauner Dogger', Effingen Member and Helvetic Marls for a L/ILW repository. Sorption data bases (SDBs) for all of these host rocks are required for the provisional safety analyses, including all of the bounding porewater and mineralogical composition combinations. In addition, SDBs are needed for the rock formations lying below Opalinus Clay (lower confining units) and for the bentonite backfill in the HLW repository. In previous work, Bradbury et al. (2010) described a methodology for developing sorption data bases for argillaceous rocks and compacted bentonite. The main factors influencing sorption in such systems are the phyllosilicate mineral content, particularly the 2:1 clay mineral content (illite/smectite/illite-smectite mixed layers), and the water chemistry, which determines the radionuclide species in the aqueous phase. The source sorption data were taken predominantly from measurements on illite (or montmorillonite in the case of bentonite) and converted to the defined conditions in each system considered using a series of so-called conversion factors to take into account differences in mineralogy, in pH and in radionuclide speciation. Finally, a Lab → Field conversion factor was applied to adapt sorption data measured in dispersed systems (batch experiments) to intact rock under in-situ conditions. This methodology to develop sorption data bases has been applied to the selected host rocks, lower confining units and compacted bentonite, taking into account the mineralogical and porewater composition ranges defined. Confidence in the validity and correctness of this methodology has been built up through additional studies: (i) sorption values obtained in the manner

  16. Sorption data bases for argillaceous rocks and bentonite for the provisional safety analyses for SGT-E2

    Energy Technology Data Exchange (ETDEWEB)

    Baeyens, B.; Thoenen, T.; Bradbury, M. H.; Marques Fernandes, M.

    2014-11-15

    In Stage 1 of the Sectoral Plan for Deep Geological Repositories, four rock types have been identified as suitable host rocks for a radioactive waste repository, namely Opalinus Clay for a high-level (HLW) and a low- and intermediate-level (L/ILW) repository, and 'Brauner Dogger', Effingen Member and Helvetic Marls for a L/ILW repository. Sorption data bases (SDBs) for all of these host rocks are required for the provisional safety analyses, including all of the bounding porewater and mineralogical composition combinations. In addition, SDBs are needed for the rock formations lying below Opalinus Clay (lower confining units) and for the bentonite backfill in the HLW repository. In previous work, Bradbury et al. (2010) described a methodology for developing sorption data bases for argillaceous rocks and compacted bentonite. The main factors influencing sorption in such systems are the phyllosilicate mineral content, particularly the 2:1 clay mineral content (illite/smectite/illite-smectite mixed layers), and the water chemistry, which determines the radionuclide species in the aqueous phase. The source sorption data were taken predominantly from measurements on illite (or montmorillonite in the case of bentonite) and converted to the defined conditions in each system considered using a series of so-called conversion factors to take into account differences in mineralogy, in pH and in radionuclide speciation. Finally, a Lab → Field conversion factor was applied to adapt sorption data measured in dispersed systems (batch experiments) to intact rock under in-situ conditions. This methodology to develop sorption data bases has been applied to the selected host rocks, lower confining units and compacted bentonite, taking into account the mineralogical and porewater composition ranges defined. Confidence in the validity and correctness of this methodology has been built up through additional studies: (i) sorption values obtained in the manner

  17. Process of Integrating Screening and Detailed Risk-based Modeling Analyses to Ensure Consistent and Scientifically Defensible Results

    International Nuclear Information System (INIS)

    Buck, John W.; McDonald, John P.; Taira, Randal Y.

    2002-01-01

    To support cleanup and closure of these tanks, modeling is performed to understand and predict potential impacts to human health and the environment. Pacific Northwest National Laboratory developed a screening tool for the United States Department of Energy, Office of River Protection that estimates the long-term human health risk, from a strategic planning perspective, posed by potential tank releases to the environment. This tool is being conditioned to more detailed model analyses to ensure consistency between studies and to provide scientific defensibility. Once the conditioning is complete, the system will be used to screen alternative cleanup and closure strategies. The integration of screening and detailed models provides consistent analyses, efficiencies in resources, and positive feedback between the various modeling groups. This approach of conditioning a screening methodology to more detailed analyses provides decision-makers with timely and defensible information and increases confidence in the results on the part of clients, regulators, and stakeholders

  18. Using meta-analytic path analysis to test theoretical predictions in health behavior: An illustration based on meta-analyses of the theory of planned behavior

    OpenAIRE

    Hagger, Martin; Chan, Dervin K. C.; Protogerou, Cleo; Chatzisarantis, Nikos L. D.

    2016-01-01

    Objective Synthesizing research on social cognitive theories applied to health behavior is an important step in the development of an evidence base of psychological factors as targets for effective behavioral interventions. However, few meta-analyses of research on social cognitive theories in health contexts have conducted simultaneous tests of theoretically-stipulated pattern effects using path analysis. We argue that conducting path analyses of meta-analytic effects among constructs fr...

  19. Uptake of systematic reviews and meta-analyses based on individual participant data in clinical practice guidelines: descriptive study

    NARCIS (Netherlands)

    Vale, C.L.; Rydzewska, L.H.; Rovers, M.M.; Emberson, J.R.; Gueyffier, F.; Stewart, L.A.

    2015-01-01

    OBJECTIVE: To establish the extent to which systematic reviews and meta-analyses of individual participant data (IPD) are being used to inform the recommendations included in published clinical guidelines. DESIGN: Descriptive study. SETTING: Database maintained by the Cochrane IPD Meta-analysis

  20. The Neural Bases of Difficult Speech Comprehension and Speech Production: Two Activation Likelihood Estimation (ALE) Meta-Analyses

    Science.gov (United States)

    Adank, Patti

    2012-01-01

    The role of speech production mechanisms in difficult speech comprehension is the subject of on-going debate in speech science. Two Activation Likelihood Estimation (ALE) analyses were conducted on neuroimaging studies investigating difficult speech comprehension or speech production. Meta-analysis 1 included 10 studies contrasting comprehension…

  1. Structural changes in Parkinson's disease: voxel-based morphometry and diffusion tensor imaging analyses based on 123I-MIBG uptake.

    Science.gov (United States)

    Kikuchi, Kazufumi; Hiwatashi, Akio; Togao, Osamu; Yamashita, Koji; Somehara, Ryo; Kamei, Ryotaro; Baba, Shingo; Yamaguchi, Hiroo; Kira, Jun-Ichi; Honda, Hiroshi

    2017-12-01

    Patients with Parkinson's disease (PD) may exhibit symptoms of sympathetic dysfunction that can be measured using 123I-metaiodobenzylguanidine (MIBG) myocardial scintigraphy. We investigated the relationship between microstructural brain changes and 123I-MIBG uptake in patients with PD using voxel-based morphometry (VBM) and diffusion tensor imaging (DTI) analyses. This retrospective study included 24 patients with PD who underwent 3 T magnetic resonance imaging and 123I-MIBG scintigraphy. They were divided into two groups: 12 MIBG-positive and 12 MIBG-negative cases (10 men and 14 women; age range: 60-81 years, corrected for gender and age). The heart/mediastinum count (H/M) ratio was calculated on anterior planar 123I-MIBG images obtained 4 h post-injection. VBM and DTI were performed to detect structural differences between these two groups. Patients with low H/M ratio had significantly reduced brain volume at the right inferior frontal gyrus (uncorrected p < 0.0001, K > 90). Patients with low H/M ratios also exhibited significantly lower fractional anisotropy than those with high H/M ratios (p < 0.05) at the left anterior thalamic radiation, the left inferior fronto-occipital fasciculus, the left superior longitudinal fasciculus, and the left uncinate fasciculus. VBM and DTI may reveal microstructural changes related to the degree of 123I-MIBG uptake in patients with PD. • Voxel-based morphometry can detect grey matter changes in Parkinson's disease. • Diffusion tensor imaging can detect white matter changes in Parkinson's disease.

  2. Identification and characterization of rock slope instabilities in Val Canaria (TI, Switzerland) based on field and DEM analyses

    Science.gov (United States)

    Ponzio, Maria; Pedrazzini, Andrea; Matasci, Battista; Jaboyedoff, Michel

    2013-04-01

    In Alpine areas, rockslides and rock avalanches are common gravitational hazards that potentially endanger people and infrastructure. The aim of this study is to characterize and understand the different factors influencing the distribution of large slope instabilities affecting Val Canaria (southern Switzerland). In particular, the importance of the tectonic and lithological settings as well as the influence of groundwater circulation are investigated in detail. Val Canaria is a SW-NE trending lateral valley that displays potential for large rock slope failures. Located just above one of the main N-S communication routes (highway, railway) through the Alps, the development of large instabilities in Val Canaria might have dramatic consequences for the main valley downstream. The dominant geological structure of the study area is a major tectonic boundary separating two basement nappes of gneissic lithologies, the Gotthard massif and the Lucomagno nappe, located in the northern and southern parts of the valley respectively. The basement units are separated by meta-sediments of the Piora syncline, composed of gypsum, dolomitic breccia and fractured calc-mica schists. Along with detailed geological mapping, the use of remote sensing techniques (aerial and terrestrial laser scanning) allows us to propose a multi-disciplinary approach that combines geological mapping and interpretation with periodic monitoring of the most active rockslide areas. A large array of TLS point cloud datasets (first acquisition in 2006) constitutes a notable input for monitoring purposes, as well as for structural analysis, rock mass characterization and failure mechanism interpretation. The analyses highlighted that both valley flanks are affected by deep-seated gravitational slope deformation covering a total area of about 8 km2 (corresponding to 40% of the catchment area). The most active area corresponds to the lower part of the valley

  3. Structural changes in Parkinson's disease. Voxel-based morphometry and diffusion tensor imaging analyses based on {sup 123}I-MIBG uptake

    Energy Technology Data Exchange (ETDEWEB)

    Kikuchi, Kazufumi; Hiwatashi, Akio; Togao, Osamu; Yamashita, Koji; Somehara, Ryo; Kamei, Ryotaro; Baba, Shingo; Honda, Hiroshi [Kyushu University, Department of Clinical Radiology, Graduate School of Medical Sciences, Fukuoka (Japan); Yamaguchi, Hiroo; Kira, Jun-ichi [Kyushu University, Department of Neurology, Graduate School of Medical Sciences, Fukuoka (Japan)

    2017-12-15

    Patients with Parkinson's disease (PD) may exhibit symptoms of sympathetic dysfunction that can be measured using {sup 123}I-metaiodobenzylguanidine (MIBG) myocardial scintigraphy. We investigated the relationship between microstructural brain changes and {sup 123}I-MIBG uptake in patients with PD using voxel-based morphometry (VBM) and diffusion tensor imaging (DTI) analyses. This retrospective study included 24 patients with PD who underwent 3 T magnetic resonance imaging and {sup 123}I-MIBG scintigraphy. They were divided into two groups: 12 MIBG-positive and 12 MIBG-negative cases (10 men and 14 women; age range: 60-81 years, corrected for gender and age). The heart/mediastinum count (H/M) ratio was calculated on anterior planar {sup 123}I-MIBG images obtained 4 h post-injection. VBM and DTI were performed to detect structural differences between these two groups. Patients with low H/M ratio had significantly reduced brain volume at the right inferior frontal gyrus (uncorrected p < 0.0001, K > 90). Patients with low H/M ratios also exhibited significantly lower fractional anisotropy than those with high H/M ratios (p < 0.05) at the left anterior thalamic radiation, the left inferior fronto-occipital fasciculus, the left superior longitudinal fasciculus, and the left uncinate fasciculus. VBM and DTI may reveal microstructural changes related to the degree of {sup 123}I-MIBG uptake in patients with PD. (orig.)

  4. Structural changes in Parkinson's disease. Voxel-based morphometry and diffusion tensor imaging analyses based on 123I-MIBG uptake

    International Nuclear Information System (INIS)

    Kikuchi, Kazufumi; Hiwatashi, Akio; Togao, Osamu; Yamashita, Koji; Somehara, Ryo; Kamei, Ryotaro; Baba, Shingo; Honda, Hiroshi; Yamaguchi, Hiroo; Kira, Jun-ichi

    2017-01-01

    Patients with Parkinson's disease (PD) may exhibit symptoms of sympathetic dysfunction that can be measured using 123 I-metaiodobenzylguanidine (MIBG) myocardial scintigraphy. We investigated the relationship between microstructural brain changes and 123 I-MIBG uptake in patients with PD using voxel-based morphometry (VBM) and diffusion tensor imaging (DTI) analyses. This retrospective study included 24 patients with PD who underwent 3 T magnetic resonance imaging and 123 I-MIBG scintigraphy. They were divided into two groups: 12 MIBG-positive and 12 MIBG-negative cases (10 men and 14 women; age range: 60-81 years, corrected for gender and age). The heart/mediastinum count (H/M) ratio was calculated on anterior planar 123 I-MIBG images obtained 4 h post-injection. VBM and DTI were performed to detect structural differences between these two groups. Patients with low H/M ratio had significantly reduced brain volume at the right inferior frontal gyrus (uncorrected p < 0.0001, K > 90). Patients with low H/M ratios also exhibited significantly lower fractional anisotropy than those with high H/M ratios (p < 0.05) at the left anterior thalamic radiation, the left inferior fronto-occipital fasciculus, the left superior longitudinal fasciculus, and the left uncinate fasciculus. VBM and DTI may reveal microstructural changes related to the degree of 123 I-MIBG uptake in patients with PD. (orig.)

  5. Shinguards effective in preventing lower leg injuries in football: Population-based trend analyses over 25 years.

    Science.gov (United States)

    Vriend, Ingrid; Valkenberg, Huib; Schoots, Wim; Goudswaard, Gert Jan; van der Meulen, Wout J; Backx, Frank J G

    2015-09-01

    The majority of football injuries are caused by trauma to the lower extremities. Shinguards are considered an important measure in preventing lower leg impact abrasions, contusions and fractures. Given these benefits, the Fédération Internationale de Football Association introduced the shinguard law in 1990, which made wearing shinguards during matches mandatory. This study evaluated the effect of the introduction of the shinguard law for amateur players in the Netherlands in the 1999/2000 football season on the incidence of lower leg injuries. Time trend analyses were performed on injury data covering 25 years of continuous registration (1986-2010). Data were retrieved from a system that records all emergency department treatments in a random, representative sample of Dutch hospitals. All injuries sustained in football by patients aged 6-65 years were included, except for injuries of the Achilles tendon and Weber fractures. Time trends were analysed with multiple regression analyses; a model was fitted consisting of multiple straight lines, each representing a 5-year period. Patients were predominantly males (92%) treated for fractures (48%) or abrasions/contusions (52%) of the lower leg. The incidence of lower leg football injuries decreased significantly following the introduction of the shinguard law (1996-2000: -20%; 2001-2005: -25%), whereas the incidence of all other football injuries did not. This effect was more prominent at weekends/match days. No gender differences were found. The results show a significant preventive effect of the shinguard law, underlining the relevance of rule changes as a preventive measure and of wearing shinguards during both matches and training sessions. Copyright © 2014 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  6. Food web functioning of the benthopelagic community in a deep-sea seamount based on diet and stable isotope analyses

    Science.gov (United States)

    Preciado, Izaskun; Cartes, Joan E.; Punzón, Antonio; Frutos, Inmaculada; López-López, Lucía; Serrano, Alberto

    2017-03-01

    Trophic interactions in the deep-sea fish community of the Galicia Bank seamount (NE Atlantic) were inferred using stomach content analyses (SCA) and stable isotope analyses (SIA) of 27 fish species and their main prey items. Samples were collected during three surveys performed in 2009, 2010 and 2011 between 625 and 1800 m depth. Three main trophic guilds were determined using SCA data: pelagic, benthopelagic and benthic feeders. Vertically migrating macrozooplankton and meso-bathypelagic shrimps were identified as playing a key role as pelagic prey for the deep-sea fish community of the Galicia Bank. Habitat overlap was hardly detected; in fact, most coexisting species showed low dietary overlap, indicating a high degree of resource partitioning. A high potential for competition, however, was observed among benthopelagic feeders, i.e. Etmopterus spinax, Hoplostethus mediterraneus and Epigonus telescopus. A significant correlation was found between δ15N and δ13C for all the analysed species. When calculating trophic levels (TLs) for the main fish species using both the SCA and SIA approaches, some discrepancies arose: TLs calculated from SIA were significantly higher than those obtained from SCA, probably indicating a higher consumption of benthic-suprabenthic prey in the previous months. During the summer, food web functioning in the Galicia Bank was more influenced by the assemblages dwelling in the water column than by the deep-sea benthos, which was rather scarce in the summer samples. These discrepancies demonstrate the importance of using both approaches, SCA (a snapshot of diet) and SIA (food assimilated in previous months), when undertaking trophic studies, if an overview of food web dynamics in different compartments of the ecosystem is to be obtained.
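SIA-based trophic levels such as those discussed above are conventionally computed from δ15N with a baseline correction. The sketch below is illustrative only: the study's actual baseline organism, baseline trophic level and enrichment factor are not stated here, so the TL-2 baseline and the widely used 3.4‰ per-step trophic enrichment factor are assumptions.

```python
def trophic_level(d15n_consumer, d15n_baseline, tl_baseline=2.0, tef=3.4):
    """Estimate trophic level from nitrogen stable isotopes.

    Conventional formulation: TL = TL_base + (d15N_consumer - d15N_base) / TEF,
    where TEF is the assumed per-trophic-step d15N enrichment (~3.4 permil)
    and the baseline is typically a primary consumer at TL 2.
    """
    return tl_baseline + (d15n_consumer - d15n_baseline) / tef

# Hypothetical per-mil values, not measurements from the Galicia Bank study:
tl_fish = trophic_level(d15n_consumer=12.8, d15n_baseline=6.0)  # ~4.0 here
```

A systematic offset between such SIA estimates and SCA-derived TLs, as reported in the abstract, is expected whenever recently ingested prey (the SCA snapshot) differs from the prey assimilated over the preceding months.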

  7. Carbon sources and trophic structure in an eelgrass (Zostera marina L.) bed based on stable isotope and fatty acid analyses

    OpenAIRE

    Jaschinski, Sybill; Brepohl, Daniela C.; Sommer, Ulrich

    2008-01-01

    Multiple stable isotope and fatty acid analyses were applied to examine food web dynamics in an eelgrass Zostera marina L. system in the western Baltic Sea. Samples of eelgrass, epiphytic algae, sand microflora, red algae, phytoplankton and main consumer species were collected in June 2002. δ13C values of primary producers ranged from -9.6‰ for eelgrass to the most depleted value of -34.9‰ for the most abundant red alga, Delesseria sanguinea. Epiphyte δ13C (-11.3 parts per thous...

  8. A new internet-based tool for reporting and analysing patient-reported outcomes and the feasibility of repeated data collection from patients with myeloproliferative neoplasms

    DEFF Research Database (Denmark)

    Brochmann, Nana; Zwisler, Ann-Dorthe; Kjerholt, Mette

    2016-01-01

    PURPOSE: An Internet-based tool for reporting and analysing patient-reported outcomes (PROs) has been developed. The tool enables merging PROs with blood test results and allows for computation of treatment responses. Data may be visualized by graphical analysis and may be exported for downstream...

  9. Clustering structures of large proteins using multifractal analyses based on a 6-letter model and hydrophobicity scale of amino acids

    International Nuclear Information System (INIS)

    Yang Jianyi; Yu Zuguo; Anh, Vo

    2009-01-01

    The Schneider and Wrede hydrophobicity scale of amino acids and the 6-letter model of protein are proposed to study the relationship between the primary structure and the secondary structural classification of proteins. Two kinds of multifractal analyses are performed on the two measures obtained from these two kinds of data on large proteins. Nine parameters from the multifractal analyses are considered to construct the parameter spaces. Each protein is represented by one point in these spaces. A procedure is proposed to separate large proteins in the α, β, α + β and α/β structural classes in these parameter spaces. Fisher's linear discriminant algorithm is used to assess our clustering accuracy on the 49 selected large proteins. Numerical results indicate that the discriminant accuracies are satisfactory. In particular, they reach 100.00% and 84.21% in separating the α proteins from the {β, α + β, α/β} proteins in a parameter space; 92.86% and 86.96% in separating the β proteins from the {α + β, α/β} proteins in another parameter space; 91.67% and 83.33% in separating the α/β proteins from the α + β proteins in the last parameter space.
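Fisher's linear discriminant, which the authors use to assess clustering accuracy, projects each protein's point in the multifractal parameter space onto a single axis chosen to maximize between-class separation relative to within-class scatter. A minimal two-class sketch follows; the 2-D toy data merely stand in for the paper's nine-parameter spaces, and all names and values are illustrative.

```python
import numpy as np

def fisher_lda(X1, X2):
    """Two-class Fisher linear discriminant.

    The discriminant direction is w = Sw^{-1} (m1 - m2), where Sw is the
    pooled within-class scatter matrix; points are classified by comparing
    their projection x.w against the midpoint of the projected class means.
    """
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    Sw = (np.cov(X1, rowvar=False) * (len(X1) - 1)
          + np.cov(X2, rowvar=False) * (len(X2) - 1))
    w = np.linalg.solve(Sw, m1 - m2)
    threshold = w @ (m1 + m2) / 2.0  # midpoint between projected class means
    return w, threshold

# Toy, well-separated clusters (e.g. stand-ins for alpha vs. beta proteins):
rng = np.random.default_rng(0)
A = rng.normal([0.0, 0.0], 0.3, size=(20, 2))
B = rng.normal([2.0, 2.0], 0.3, size=(20, 2))
w, t = fisher_lda(A, B)
# Class 1 projects above the threshold, class 2 below it:
acc = np.concatenate([A @ w > t, B @ w < t]).mean()
```

Discriminant accuracies such as the 100.00% and 84.21% quoted in the abstract are exactly this kind of fraction of points falling on the correct side of the learned threshold.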

  10. Long-term fertilization alters chemically-separated soil organic carbon pools: Based on stable C isotope analyses

    Science.gov (United States)

    Dou, Xiaolin; He, Ping; Cheng, Xiaoli; Zhou, Wei

    2016-01-01

    Quantification of the dynamics of soil organic carbon (SOC) pools under the influence of long-term fertilization is essential for predicting carbon (C) sequestration. We combined soil chemical fractionation with stable C isotope analyses to investigate the C dynamics of the various SOC pools after 25 years of fertilization. Five types of soil samples (0-20, 20-40 cm) including the initial level (CK) and four fertilization treatments (inorganic nitrogen fertilizer, IN; balanced inorganic fertilizer, NPK; inorganic fertilizer plus farmyard manure, MNPK; inorganic fertilizer plus corn straw residue, SNPK) were separated into recalcitrant and labile fractions, and the fractions were analysed for C content, C:N ratios, δ13C values, and soil C and N recalcitrance indexes (RIC and RIN). Chemical fractionation showed long-term MNPK fertilization strongly increased the SOC storage in both soil layers (0-20 cm = 1492.4 g C m⁻² and 20-40 cm = 1770.6 g C m⁻²) because of enhanced recalcitrant C (RC) and labile C (LC). The 25 years of inorganic fertilizer treatment did not increase the SOC storage, mainly because of the offsetting effects of enhanced RC and decreased LC, whereas the lack of a clear SOC increase under SNPK fertilization resulted from the fast decay rates of soil C.
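Stable C isotope analyses of SOC pools commonly partition stored carbon between pre-existing soil C and new organic inputs with a two-source mixing model. The sketch below shows that standard calculation; the δ13C end-member values are hypothetical and not taken from the study.

```python
def fraction_new_carbon(d13c_sample, d13c_old, d13c_new):
    """Two-source isotope mixing model.

    f = (d_sample - d_old) / (d_new - d_old) gives the fraction of the
    measured SOC pool derived from the new C input; the remainder (1 - f)
    is attributed to the pre-existing soil carbon.
    """
    return (d13c_sample - d13c_old) / (d13c_new - d13c_old)

# Hypothetical per-mil values: a C3-derived soil (-26) receiving C4 maize
# residue (-12), with the bulk sample measured at -22:
f_new = fraction_new_carbon(-22.0, d13c_old=-26.0, d13c_new=-12.0)
```

Applied per fraction, such a calculation lets the recalcitrant and labile pools of each treatment be tracked separately, which is why combining fractionation with δ13C is informative here.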

  11. A geometric buckling expression for regular polygons: II. Analyses based on the multiple reciprocity boundary element method

    International Nuclear Information System (INIS)

    Itagaki, Masafumi; Miyoshi, Yoshinori; Hirose, Hideyuki

    1993-01-01

    A procedure is presented for the determination of geometric buckling for regular polygons. A new computation technique, the multiple reciprocity boundary element method (MRBEM), has been applied to solve the one-group neutron diffusion equation. The main difficulty in applying the ordinary boundary element method (BEM) to neutron diffusion problems has been the need to compute a domain integral, resulting from the fission source. The MRBEM has been developed for transforming this type of domain integral into an equivalent boundary integral. The basic idea of the MRBEM is to apply repeatedly the reciprocity theorem (Green's second formula) using a sequence of higher order fundamental solutions. The MRBEM requires discretization of the boundary only rather than of the domain. This advantage is useful for extensive survey analyses of buckling for complex geometries. The results of survey analyses have indicated that the general form of geometric buckling is B_g^2 = (a_n/R_c)^2, where R_c represents the radius of the circumscribed circle of the regular polygon under consideration. The geometric constant a_n depends on the type of regular polygon and takes the value of π for a square and 2.405 for a circle, an extreme case that has an infinite number of sides. Values of a_n for a triangle, pentagon, hexagon, and octagon have been calculated as 4.190, 2.281, 2.675, and 2.547, respectively.
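Once the geometric constant a_n is tabulated, the survey result above reduces to a one-line computation per geometry. A minimal sketch using the a_n values reported in the abstract (the dictionary and function names are ours):

```python
import math

# Geometric constants a_n from the abstract's survey analyses; the circle
# value 2.405 is the familiar first zero of the Bessel function J0.
A_N = {
    "triangle": 4.190,
    "square": math.pi,
    "pentagon": 2.281,
    "hexagon": 2.675,
    "octagon": 2.547,
    "circle": 2.405,
}

def geometric_buckling(polygon, r_c):
    """B_g^2 = (a_n / R_c)^2, with R_c the circumscribed-circle radius."""
    return (A_N[polygon] / r_c) ** 2

# Example: unit circumscribed radius, square cross-section -> B_g^2 = pi^2.
b2_square = geometric_buckling("square", r_c=1.0)
```

Because R_c is the only geometry-dependent length, comparing shapes at fixed R_c amounts to comparing the tabulated a_n values directly.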

  12. Ancient DNA analyses of museum specimens from selected Presbytis (primate: Colobinae) based on partial Cyt b sequences

    Science.gov (United States)

    Aifat, N. R.; Yaakop, S.; Md-Zain, B. M.

    2016-11-01

    The IUCN Red List of Threatened Species has categorized Malaysian primates from data deficient to critically endangered. Ancient DNA analyses therefore hold great potential for understanding the phylogeny, phylogeography and population history of extinct and extant species. Museum samples are one alternative that provides an important source of biological material for a large proportion of ancient DNA studies. In this study, a total of six museum skin samples from the species Presbytis hosei (4 samples) and Presbytis frontata (2 samples), aged between 43 and 124 years, were extracted to obtain DNA. Extraction was done using the QIAGEN QIAamp DNA Investigator Kit, and the ability of this kit to extract DNA from museum skin samples was tested by amplification of a partial Cyt b sequence using species-specific designed primers. Two primer pairs were designed specifically for P. hosei and P. frontata, respectively. These primer pairs proved efficient in amplifying 200 bp fragments of the targeted species under the optimized PCR conditions. The performance of the sequences was tested by determining genetic distances within the genus Presbytis in Malaysia. From the analyses, P. hosei is closely related to P. chrysomelas and P. frontata, with values of 0.095 and 0.106, respectively. Cyt b gave clear data for determining relationships among Bornean species. Thus, with the optimized conditions, museum specimens can be used for molecular systematic studies of Malaysian primates.

  13. Genomic analyses of tropical beef cattle fertility based on genotyping pools of Brahman cows with unknown pedigree.

    Science.gov (United States)

    Reverter, A; Porto-Neto, L R; Fortes, M R S; McCulloch, R; Lyons, R E; Moore, S; Nicol, D; Henshall, J; Lehnert, S A

    2016-10-01

    We introduce an innovative approach to lowering the overall cost of obtaining genomic EBV (GEBV) and encourage their use in commercial extensive herds of Brahman beef cattle. In our approach, DNA genotyping of cow herds from 2 independent properties was performed using a high-density bovine SNP chip on DNA from pooled blood samples, grouped according to the result of a pregnancy test following their first and second joining opportunities. For the DNA pooling strategy, 15 to 28 blood samples from the same phenotype and contemporary group were allocated to pools. Across the 2 properties, a total of 183 pools were created representing 4,164 cows. In addition, blood samples from 309 bulls from the same properties were also taken. After genotyping and quality control, the 74,584 remaining SNP were used for analyses. Pools and individual DNA samples were related by means of a "hybrid" genomic relationship matrix. The pooled genotyping analysis of 2 large and independent commercial populations of tropical beef cattle was able to recover significant and plausible associations between SNP and pregnancy test outcome. We discuss 24 SNP with significant associations ( < 1.0 × 10) that mapped within 40 kb of an annotated gene. We have established a method to estimate the GEBV in young herd bulls for a trait that currently cannot be predicted at all. In summary, our novel approach allowed us to conduct genomic analyses of fertility in 2 large commercial Brahman herds managed under extensive pastoral conditions.

  14. A Microsoft-Excel-based tool for running and critically appraising network meta-analyses--an overview and application of NetMetaXL.

    Science.gov (United States)

    Brown, Stephen; Hutton, Brian; Clifford, Tammy; Coyle, Doug; Grima, Daniel; Wells, George; Cameron, Chris

    2014-09-29

    The use of network meta-analysis has increased dramatically in recent years. WinBUGS, a freely available Bayesian software package, has been the most widely used software package to conduct network meta-analyses. However, the learning curve for WinBUGS can be daunting, especially for new users. Furthermore, critical appraisal of network meta-analyses conducted in WinBUGS can be challenging given its limited data manipulation capabilities and the fact that generation of graphical output from network meta-analyses often relies on different software packages than the analyses themselves. We developed a freely available Microsoft-Excel-based tool called NetMetaXL, programmed in Visual Basic for Applications, which provides an interface for conducting a Bayesian network meta-analysis using WinBUGS from within Microsoft Excel. This tool allows the user to easily prepare and enter data, set model assumptions, and run the network meta-analysis, with results being automatically displayed in an Excel spreadsheet. It also contains macros that use NetMetaXL's interface to generate evidence network diagrams, forest plots, league tables of pairwise comparisons, probability plots (rankograms), and inconsistency plots within Microsoft Excel. All figures generated are publication quality, thereby increasing the efficiency of knowledge transfer and manuscript preparation. We demonstrate the application of NetMetaXL using data from a network meta-analysis published previously, which compares combined resynchronization and implantable defibrillator therapy in left ventricular dysfunction. We replicate results from the previous publication while demonstrating result summaries generated by the software. Use of the freely available NetMetaXL successfully demonstrated its ability to make running network meta-analyses more accessible to novice WinBUGS users by allowing analyses to be conducted entirely within Microsoft Excel. NetMetaXL also allows for more efficient and transparent

  15. Analysing the Correlation between Social Network Analysis Measures and Performance of Students in Social Network-Based Engineering Education

    Science.gov (United States)

    Putnik, Goran; Costa, Eric; Alves, Cátia; Castro, Hélio; Varela, Leonilde; Shah, Vaibhav

    2016-01-01

    Social network-based engineering education (SNEE) is designed and implemented as a model of the Education 3.0 paradigm. SNEE represents a new learning methodology, which is based on the concept of social networks and represents an extended model of project-led education. The concept of social networks was applied in the real-life experiment,…

  16. Studies and analyses of the space shuttle main engine. Failure information propagation model data base and software

    Science.gov (United States)

    Tischer, A. E.

    1987-01-01

    The failure information propagation model (FIPM) data base was developed to store and manipulate the large amount of information anticipated for the various Space Shuttle Main Engine (SSME) FIPMs. The organization and structure of the FIPM data base is described, including a summary of the data fields and key attributes associated with each FIPM data file. The menu-driven software developed to facilitate and control the entry, modification, and listing of data base records is also discussed. The transfer of the FIPM data base and software to the NASA Marshall Space Flight Center is described. Complete listings of all of the data base definition commands and software procedures are included in the appendixes.

  17. A robust University-NGO partnership: Analysing school efficiencies in Bolivia with community-based management techniques

    Directory of Open Access Journals (Sweden)

    Joao Neiva de Figueiredo

    2013-09-01

    Full Text Available Community-based management research is a collaborative effort between management, academics and communities in need with the specific goal of achieving social change to foster social justice. Because it is designed to promote and validate joint methods of discovery and community-based sources of knowledge, community-based management research has several unique characteristics, which may affect its execution. This article describes the process of a community-based management research project which is descriptive in nature and uses quantitative techniques to examine school efficiencies in low-income communities in a developing country – Bolivia. The article describes the partnership between a US-based university and a Bolivian not-for-profit organisation, the research context and the history of the research project, including its various phases. It focuses on the (yet unpublished) process of the community-based research as opposed to its content (which has been published elsewhere). The article also makes the case that the robust partnership between the US-based university and the Bolivian NGO has been a determining factor in achieving positive results. Strengths and limitations are examined in the hope that the experience may be helpful to others conducting descriptive quantitative management research using community-engaged frameworks in cross-cultural settings. Keywords: international partnership, community-engaged scholarship, education efficiency, multicultural low-income education.

  18. Cost-effectiveness and harm-benefit analyses of risk-based screening strategies for breast cancer.

    Directory of Open Access Journals (Sweden)

    Ester Vilaprinyo

    Full Text Available The one-size-fits-all paradigm in organized screening of breast cancer is shifting towards a personalized approach. The present study has two objectives: 1) To perform an economic evaluation and to assess the harm-benefit ratios of screening strategies that vary in their intensity and interval ages based on breast cancer risk; and 2) To estimate the gain in terms of cost and harm reductions using risk-based screening with respect to the usual practice. We used a probabilistic model and input data from Spanish population registries and screening programs, as well as from clinical studies, to estimate the benefit, harm, and costs over time of 2,624 screening strategies, uniform or risk-based. We defined four risk groups, low, moderate-low, moderate-high and high, based on breast density, family history of breast cancer and personal history of breast biopsy. The risk-based strategies were obtained combining the exam periodicity (annual, biennial, triennial and quinquennial), the starting ages (40, 45 and 50 years) and the ending ages (69 and 74 years) in the four risk groups. Incremental cost-effectiveness and harm-benefit ratios were used to select the optimal strategies. Compared to risk-based strategies, the uniform ones result in a much lower benefit for a specific cost. Reductions close to 10% in costs and higher than 20% in false-positive results and overdiagnosed cases were obtained for risk-based strategies. Optimal screening is characterized by quinquennial or triennial periodicities for the low or moderate risk-groups and annual periodicity for the high-risk group. Risk-based strategies can reduce harm and costs. It is necessary to develop accurate measures of individual risk and to work on how to implement risk-based screening strategies.

  19. A proposal of Fourier-Bessel expansion with optimized ensembles of bases to analyse two dimensional image

    Science.gov (United States)

    Yamasaki, K.; Fujisawa, A.; Nagashima, Y.

    2017-09-01

    It is a critical issue to find the best set of fitting function bases in mode structural analysis of two dimensional images like plasma emission profiles. The paper proposes a method to optimize a set of the bases in the case of Fourier-Bessel function series, using their orthonormal property, for more efficient and precise analysis. The method is applied on a tomography image of plasma emission obtained with the Maximum-likelihood expectation maximization method in a linear cylindrical device. The result demonstrates the effectiveness of the method, which achieves a smaller residual error and a lower Akaike information criterion with a smaller number of fitting function bases.
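The selection criterion described above can be sketched in a one-dimensional toy setting. The radial profile, the choice of zeroth-order Bessel bases, and the AIC form below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np
from scipy.special import jv, jn_zeros

# Hypothetical radial emission profile on r in [0, 1]: two Bessel modes
# plus noise, standing in for a tomographically reconstructed image slice.
rng = np.random.default_rng(0)
r = np.linspace(0.0, 1.0, 200)
zeros3 = jn_zeros(0, 3)                        # first three zeros of J0
profile = jv(0, zeros3[0] * r) + 0.3 * jv(0, zeros3[2] * r)
profile += rng.normal(0.0, 0.01, r.size)

def aic_for_n_bases(n):
    """Least-squares fit with the first n Fourier-Bessel bases; return AIC."""
    alphas = jn_zeros(0, n)
    B = jv(0, np.outer(r, alphas))             # each basis vanishes at r = 1
    coef, *_ = np.linalg.lstsq(B, profile, rcond=None)
    rss = float(np.sum((profile - B @ coef) ** 2))
    return r.size * np.log(rss / r.size) + 2 * n

# Select the number of bases minimizing the Akaike information criterion.
aics = {n: aic_for_n_bases(n) for n in range(1, 8)}
best = min(aics, key=aics.get)
print("number of bases minimizing AIC:", best)
```

The AIC penalty stops the expansion from absorbing noise: adding the third basis sharply reduces the residual, while further bases cost more in the 2n term than they recover in fit.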

  20. Development of a model performance-based sign sheeting specification based on the evaluation of nighttime traffic signs using legibility and eye-tracker data : data and analyses.

    Science.gov (United States)

    2010-09-01

    This report presents data and technical analyses for Texas Department of Transportation Project 0-5235. This : project focused on the evaluation of traffic sign sheeting performance in terms of meeting the nighttime : driver needs. The goal was to de...

  1. GT-WGS: an efficient and economic tool for large-scale WGS analyses based on the AWS cloud service.

    Science.gov (United States)

    Wang, Yiqi; Li, Gen; Ma, Mark; He, Fazhong; Song, Zhuo; Zhang, Wei; Wu, Chengkun

    2018-01-19

    Whole-genome sequencing (WGS) plays an increasingly important role in clinical practice and public health. Due to the big data size, WGS data analysis is usually compute-intensive and IO-intensive. Currently it usually takes 30 to 40 h to finish a 50× WGS analysis task, which is far from the ideal speed required by the industry. Furthermore, the high-end infrastructure required by WGS computing is costly in terms of time and money. In this paper, we aim to improve the time efficiency of WGS analysis and minimize the cost by elastic cloud computing. We developed a distributed system, GT-WGS, for large-scale WGS analyses utilizing the Amazon Web Services (AWS). Our system won the first prize on the Wind and Cloud challenge held by the Genomics and Cloud Technology Alliance conference (GCTA) committee. The system makes full use of the dynamic pricing mechanism of AWS. We evaluate the performance of GT-WGS with a 55× WGS dataset (400GB fastq) provided by the GCTA 2017 competition. In the best case, it only took 18.4 min to finish the analysis and the AWS cost of the whole process is only 16.5 US dollars. The accuracy of GT-WGS is 99.9% consistent with that of the Genome Analysis Toolkit (GATK) best practice. We also evaluated the performance of GT-WGS on a real-world dataset provided by the XiangYa hospital, which consists of a 5× whole-genome dataset with 500 samples; on average GT-WGS managed to finish one 5× WGS analysis task in 2.4 min at a cost of $3.6. WGS is already playing an important role in guiding therapeutic intervention. However, its application is limited by the time cost and computing cost. GT-WGS excelled as an efficient and affordable WGS analyses tool to address this problem. The demo video and supplementary materials of GT-WGS can be accessed at https://github.com/Genetalks/wgs_analysis_demo.

  2. A systematic literature review on reviews and meta-analyses of biologically based CAM-practices for cancer patients

    DEFF Research Database (Denmark)

    Paludan-Müller, Christine; Lunde, Anita; Johannessen, Helle

    2010-01-01

    Purpose To provide an overview and evaluate the evidence of biologically based CAM-practices for cancer patients. Methods Pubmed, Social Science Citation Index, AMED and the Cochrane library were systematically searched for reviews on effects of biologically based CAM-practices, including herbal… … levels of evidence and were excluded from further evaluation. Among the 32 high-quality reviews the most reviewed practices were soy/plant hormones (7), Chinese herbal medicine (7), antioxidants (5) and mistletoe (4). Fifteen of the 32 reviews included data on the efficacy of biologically-based CAM-practices against cancer, but none of the reviews concluded a positive effect on the cancer. Reviews including data on quality of life (10) and/or reduction of side effects (12) showed promising, but yet insufficient, evidence for Chinese herbal medicine against pain and side effects of chemotherapy, and mistletoe…

  3. Using meta-analytic path analysis to test theoretical predictions in health behavior: An illustration based on meta-analyses of the theory of planned behavior.

    Science.gov (United States)

    Hagger, Martin S; Chan, Derwin K C; Protogerou, Cleo; Chatzisarantis, Nikos L D

    2016-08-01

    Synthesizing research on social cognitive theories applied to health behavior is an important step in the development of an evidence base of psychological factors as targets for effective behavioral interventions. However, few meta-analyses of research on social cognitive theories in health contexts have conducted simultaneous tests of theoretically-stipulated pattern effects using path analysis. We argue that conducting path analyses of meta-analytic effects among constructs from social cognitive theories is important to test nomological validity, account for mediation effects, and evaluate unique effects of theory constructs independent of past behavior. We illustrate our points by conducting new analyses of two meta-analyses of a popular theory applied to health behaviors, the theory of planned behavior. We conducted meta-analytic path analyses of the theory in two behavioral contexts (alcohol and dietary behaviors) using data from the primary studies included in the original meta-analyses augmented to include intercorrelations among constructs and relations with past behavior missing from the original analysis. Findings supported the nomological validity of the theory and its hypotheses for both behaviors, confirmed important model processes through mediation analysis, demonstrated the attenuating effect of past behavior on theory relations, and provided estimates of the unique effects of theory constructs independent of past behavior. Our analysis illustrates the importance of conducting a simultaneous test of theory-stipulated effects in meta-analyses of social cognitive theories applied to health behavior. We recommend researchers adopt this analytic procedure when synthesizing evidence across primary tests of social cognitive theories in health. Copyright © 2016 Elsevier Inc. All rights reserved.
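The second stage of such a meta-analytic path analysis can be illustrated with a toy example: given a pooled correlation matrix among theory-of-planned-behavior constructs (the values below are hypothetical, not the paper's synthesized estimates), standardized path coefficients for intention regressed on its antecedents follow from a direct linear solve:

```python
import numpy as np

# Hypothetical stage-1 pooled correlations among attitude, subjective norm,
# perceived behavioural control, and intention (order: att, sn, pbc, int).
names = ["att", "sn", "pbc", "int"]
R = np.array([
    [1.00, 0.35, 0.30, 0.50],
    [0.35, 1.00, 0.25, 0.35],
    [0.30, 0.25, 1.00, 0.40],
    [0.50, 0.35, 0.40, 1.00],
])

# Stage 2: standardized path coefficients for intention on its three
# antecedents, obtained by solving R_xx @ beta = r_xy.
Rxx = R[:3, :3]
rxy = R[:3, 3]
beta = np.linalg.solve(Rxx, rxy)
r2 = float(rxy @ beta)            # variance in intention explained
for name, b in zip(names[:3], beta):
    print(f"beta[{name}] = {b:.3f}")
print(f"R^2 = {r2:.3f}")
```

Extending the matrix with a past-behavior row, as the authors did, shows how each construct's unique effect attenuates once past behavior competes for the same variance.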

  4. Exploratory multinomial logit model-based driver injury severity analyses for teenage and adult drivers in intersection-related crashes.

    Science.gov (United States)

    Wu, Qiong; Zhang, Guohui; Ci, Yusheng; Wu, Lina; Tarefder, Rafiqul A; Alcántara, Adélamar Dely

    2016-05-18

    Teenage drivers are more likely to be involved in severely incapacitating and fatal crashes compared to adult drivers. Moreover, because two thirds of urban vehicle miles traveled are on signal-controlled roadways, significant research efforts are needed to investigate intersection-related teenage driver injury severities and their contributing factors in terms of driver behavior, vehicle-infrastructure interactions, environmental characteristics, roadway geometric features, and traffic compositions. Therefore, this study aims to explore the characteristic differences between teenage and adult drivers in intersection-related crashes, identify the significant contributing attributes, and analyze their impacts on driver injury severities. Using crash data collected in New Mexico from 2010 to 2011, 2 multinomial logit regression models were developed to analyze injury severities for teenage and adult drivers, respectively. Elasticity analyses and transferability tests were conducted to better understand the quantitative impacts of these factors and the teenage driver injury severity model's generality. The results showed that although many of the same contributing factors were found to be significant in both the teenage and adult driver models, certain different attributes must be distinguished to specifically develop effective safety solutions for the 2 driver groups. The research findings are helpful to better understand teenage crash uniqueness and develop cost-effective solutions to reduce intersection-related teenage injury severities and facilitate driver injury mitigation research.
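A multinomial-logit elasticity analysis of this kind can be sketched on synthetic data. The covariates, coefficients, and severity levels below are illustrative stand-ins, not the study's New Mexico dataset, and the pseudo-elasticity is computed numerically rather than from the closed-form MNL expression:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Synthetic crash records: two covariates (speed limit, driver age) and
# three injury-severity outcomes (0 = none, 1 = injury, 2 = fatal).
n = 2000
X = np.column_stack([rng.normal(50, 10, n), rng.normal(35, 12, n)])
logits = np.column_stack([
    np.zeros(n),                  # baseline: no injury
    0.03 * X[:, 0] - 1.5,         # injury propensity rises with speed limit
    0.06 * X[:, 0] - 3.5,         # fatal propensity rises faster
])
p = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
y = np.array([rng.choice(3, p=row) for row in p])

model = LogisticRegression(max_iter=1000).fit(X, y)   # multinomial logit

# Pseudo-elasticity: % change in P(fatal) per 1% increase in speed limit.
base = model.predict_proba(X)[:, 2]
X_up = X.copy()
X_up[:, 0] *= 1.01
elasticity = np.mean((model.predict_proba(X_up)[:, 2] - base) / base) / 0.01
print(f"pseudo-elasticity of P(fatal) w.r.t. speed limit: {elasticity:.2f}")
```

Fitting the same specification separately to two subgroups and comparing such elasticities is the essence of the teenage-versus-adult contrast the abstract describes.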

  5. ATOP - The Advanced Taiwan Ocean Prediction System Based on the mpiPOM. Part 1: Model Descriptions, Analyses and Results

    Directory of Open Access Journals (Sweden)

    Leo Oey

    2013-01-01

    Full Text Available A data-assimilated Taiwan Ocean Prediction (ATOP) system is being developed at the National Central University, Taiwan. The model simulates sea-surface height, three-dimensional currents, temperature and salinity and turbulent mixing. The model has options for tracer and particle-tracking algorithms, as well as for wave-induced Stokes drift and wave-enhanced mixing and bottom drag. Two different forecast domains have been tested: a large-grid domain that encompasses the entire North Pacific Ocean at 0.1° × 0.1° horizontal resolution and 41 vertical sigma levels, and a smaller western North Pacific domain which at present also has the same horizontal resolution. In both domains, 25-year spin-up runs from 1988 - 2011 were first conducted, forced by six-hourly Cross-Calibrated Multi-Platform (CCMP) and NCEP reanalysis Global Forecast System (GFS) winds. The results are then used as initial conditions to conduct ocean analyses from January 2012 through February 2012, when updated hindcasts and real-time forecasts begin using the GFS winds. This paper describes the ATOP system and compares the forecast results against satellite altimetry data for assessing model skills. The model results are also shown to compare well with observations of (i) the Kuroshio intrusion in the northern South China Sea, and (ii) the subtropical counter current. A review and comparison with other models in the literature of (i) are also given.

  6. Are decisions using cost-utility analyses robust to choice of SF-36/SF-12 preference-based algorithm?

    Directory of Open Access Journals (Sweden)

    Walton Surrey M

    2005-03-01

    Full Text Available Abstract Background Cost utility analysis (CUA) using SF-36/SF-12 data has been facilitated by the development of several preference-based algorithms. The purpose of this study was to illustrate how decision-making could be affected by the choice of preference-based algorithms for the SF-36 and SF-12, and provide some guidance on selecting an appropriate algorithm. Methods Two sets of data were used: (1) a clinical trial of adult asthma patients; and (2) a longitudinal study of post-stroke patients. Incremental costs were assumed to be $2000 per year over standard treatment, and QALY gains realized over a 1-year period. Ten published algorithms were identified, denoted by first author: Brazier (SF-36), Brazier (SF-12), Shmueli, Fryback, Lundberg, Nichol, Franks (3 algorithms), and Lawrence. Incremental cost-utility ratios (ICURs) for each algorithm, stated in dollars per quality-adjusted life year ($/QALY), were ranked and compared between datasets. Results In the asthma patients, estimated ICURs ranged from Lawrence's SF-12 algorithm at $30,769/QALY (95% CI: 26,316 to 36,697) to Brazier's SF-36 algorithm at $63,492/QALY (95% CI: 48,780 to 83,333). ICURs for the stroke cohort varied slightly more dramatically. The MEPS-based algorithm by Franks et al. provided the lowest ICUR at $27,972/QALY (95% CI: 20,942 to 41,667). The Fryback and Shmueli algorithms provided ICURs that were greater than $50,000/QALY and did not have confidence intervals that overlapped with most of the other algorithms. The ICUR-based ranking of algorithms was strongly correlated between the asthma and stroke datasets (r = 0.60). Conclusion SF-36/SF-12 preference-based algorithms produced a wide range of ICURs that could potentially lead to different reimbursement decisions.
Brazier's SF-36 and SF-12 algorithms have a strong methodological and theoretical basis and tended to generate relatively higher ICUR estimates, considerations that support a preference for these algorithms over the
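The sensitivity the authors describe comes down to simple ICUR arithmetic: the same incremental cost divided by an algorithm-dependent QALY gain. A toy illustration with hypothetical utility scores (the algorithm names and utilities below are invented for the example):

```python
# Incremental cost-utility ratio (ICUR) under two hypothetical SF-36
# scoring algorithms applied to the same trial; all numbers illustrative.
incremental_cost = 2000.0   # dollars per year over standard treatment

# (mean utility before, mean utility after) as scored by each algorithm.
algorithms = {
    "algorithm_A": (0.700, 0.765),   # QALY gain 0.065 over one year
    "algorithm_B": (0.700, 0.732),   # QALY gain 0.032 over one year
}

icurs = {}
for name, (before, after) in algorithms.items():
    icurs[name] = incremental_cost / (after - before)   # $ per QALY
    print(f"{name}: ${icurs[name]:,.0f}/QALY")
# The same intervention looks clearly cost-effective under one algorithm
# and borderline under the other, mirroring the spread reported above.
```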

  7. A multi-criteria evaluation system for marine litter pollution based on statistical analyses of OSPAR beach litter monitoring time series.

    Science.gov (United States)

    Schulz, Marcus; Neumann, Daniel; Fleet, David M; Matthies, Michael

    2013-12-01

    During the last decades, marine pollution with anthropogenic litter has become a worldwide major environmental concern. Standardized monitoring of litter since 2001 on 78 beaches selected within the framework of the Convention for the Protection of the Marine Environment of the North-East Atlantic (OSPAR) has been used to identify temporal trends of marine litter. Based on statistical analyses of this dataset a two-part multi-criteria evaluation system for beach litter pollution of the North-East Atlantic and the North Sea is proposed. Canonical correlation analyses, linear regression analyses, and non-parametric analyses of variance were used to identify different temporal trends. A classification of beaches was derived from cluster analyses and served to define different states of beach quality according to abundances of 17 input variables. The evaluation system is easily applicable and relies on the above-mentioned classification and on significant temporal trends implied by significant rank correlations. Copyright © 2013 Elsevier Ltd. All rights reserved.
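The trend-detection component described above, a significant rank correlation between survey year and litter abundance, can be sketched with scipy; the beach counts below are hypothetical, not OSPAR monitoring data:

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical annual abundances of one litter category on one beach.
years = np.arange(2001, 2013)
counts = np.array([120, 135, 118, 150, 160, 148, 175, 182, 170, 195, 210, 205])

# A significant Spearman rank correlation between year and abundance marks
# a monotonic temporal trend, the criterion used in the evaluation system.
rho, p = spearmanr(years, counts)
if p < 0.05:
    direction = "increasing" if rho > 0 else "decreasing"
    print(f"rho = {rho:.2f}, p = {p:.4f}: significant {direction} trend")
else:
    print(f"rho = {rho:.2f}, p = {p:.4f}: no significant trend")
```

Running this per category and per beach yields the matrix of significant trends on which the two-part evaluation system builds.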

  8. A Structural Equation Model to Analyse the Antecedents to Students' Web-Based Problem-Solving Performance

    Science.gov (United States)

    Hwang, Gwo-Jen; Kuo, Fan-Ray

    2015-01-01

    Web-based problem-solving, a compound ability of critical thinking, creative thinking, reasoning thinking and information-searching abilities, has been recognised as an important competence for elementary school students. Some researchers have reported the possible correlations between problem-solving competence and information searching ability;…

  9. Comparing direct image and wavelet transform-based approaches to analysing remote sensing imagery for predicting wildlife distribution

    NARCIS (Netherlands)

    Murwira, A.; Skidmore, A.K.

    2010-01-01

    In this study we tested the ability to predict the probability of elephant (Loxodonta africana) presence in an agricultural landscape of Zimbabwe based on three methods of measuring the spatial heterogeneity in vegetation cover, where vegetation cover was measured using the Landsat Thematic Mapper

  10. Implication of the cause of differences in 3D structures of proteins with high sequence identity based on analyses of amino acid sequences and 3D structures.

    Science.gov (United States)

    Matsuoka, Masanari; Sugita, Masatake; Kikuchi, Takeshi

    2014-09-18

    Proteins that share a high sequence homology while exhibiting drastically different 3D structures are investigated in this study. Recently, artificial proteins related to the sequences of the GA and IgG binding GB domains of human serum albumin have been designed. These artificial proteins, referred to as GA and GB, share 98% amino acid sequence identity but exhibit different 3D structures, namely, a 3α bundle versus a 4β + α structure. Discriminating between their 3D structures based on their amino acid sequences is a very difficult problem. In the present work, in addition to using bioinformatics techniques, an analysis based on inter-residue average distance statistics is used to address this problem. Ordinary analyses such as BLAST searches and conservation analyses alone could not distinguish which structure a given sequence would adopt. However, by supplementing these with the analysis based on inter-residue average distance statistics and our sequence tendency analysis, we could infer which parts play an important role in structural formation. The results suggest possible determinants of the different 3D structures for sequences with high sequence identity. The possibility of discriminating between the 3D structures based on the given sequences is also discussed.

  11. The effect of English-language restriction on systematic review-based meta-analyses: a systematic review of empirical studies.

    Science.gov (United States)

    Morrison, Andra; Polisena, Julie; Husereau, Don; Moulton, Kristen; Clark, Michelle; Fiander, Michelle; Mierzwinski-Urban, Monika; Clifford, Tammy; Hutton, Brian; Rabb, Danielle

    2012-04-01

    The English language is generally perceived to be the universal language of science. However, the exclusive reliance on English-language studies may not represent all of the evidence. Excluding languages other than English (LOE) may introduce a language bias and lead to erroneous conclusions. We conducted a comprehensive literature search using bibliographic databases and grey literature sources. Studies were eligible for inclusion if they measured the effect of excluding randomized controlled trials (RCTs) reported in LOE from systematic review-based meta-analyses (SR/MA) for one or more outcomes. None of the included studies found major differences between summary treatment effects in English-language restricted meta-analyses and LOE-inclusive meta-analyses. Findings differed about the methodological and reporting quality of trials reported in LOE. The precision of pooled estimates improved with the inclusion of LOE trials. Overall, we found no evidence of a systematic bias from the use of language restrictions in systematic review-based meta-analyses in conventional medicine. Further research is needed to determine the impact of language restriction on systematic reviews in particular fields of medicine.
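The comparison the included studies perform, pooling with and without trials in languages other than English, can be sketched with a fixed-effect inverse-variance model; the trial effects below are hypothetical:

```python
import numpy as np

# Hypothetical trial effects (log relative risks) and standard errors;
# the last two trials are reported in languages other than English (LOE).
effects = np.array([-0.20, -0.35, -0.15, -0.30, -0.25, -0.28])
ses = np.array([0.15, 0.20, 0.18, 0.25, 0.30, 0.22])
is_loe = np.array([False, False, False, False, True, True])

def pool(e, s):
    """Fixed-effect inverse-variance pooled estimate and its standard error."""
    w = 1.0 / s**2
    return np.sum(w * e) / np.sum(w), np.sqrt(1.0 / np.sum(w))

full = pool(effects, ses)
english_only = pool(effects[~is_loe], ses[~is_loe])
print(f"all trials:   {full[0]:.3f} (SE {full[1]:.3f})")
print(f"English only: {english_only[0]:.3f} (SE {english_only[1]:.3f})")
# Including LOE trials shifts the summary little but tightens precision,
# the pattern the review reports across the empirical studies.
```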

  12. The affinities of Homo floresiensis based on phylogenetic analyses of cranial, dental, and postcranial characters.

    Science.gov (United States)

    Argue, Debbie; Groves, Colin P; Lee, Michael S Y; Jungers, William L

    2017-06-01

    Although the diminutive Homo floresiensis has been known for a decade, its phylogenetic status remains highly contentious. A broad range of potential explanations for the evolution of this species has been explored. One view is that H. floresiensis is derived from Asian Homo erectus that arrived on Flores and subsequently evolved a smaller body size, perhaps to survive the constrained resources they faced in a new island environment. Fossil remains of H. erectus, well known from Java, have not yet been discovered on Flores. The second hypothesis is that H. floresiensis is directly descended from an early Homo lineage with roots in Africa, such as Homo habilis; the third is that it is Homo sapiens with pathology. We use parsimony and Bayesian phylogenetic methods to test these hypotheses. Our phylogenetic data build upon those characters previously presented in support of these hypotheses by broadening the range of traits to include the crania, mandibles, dentition, and postcrania of Homo and Australopithecus. The new data and analyses support the hypothesis that H. floresiensis is an early Homo lineage: H. floresiensis is sister either to H. habilis alone or to a clade consisting of at least H. habilis, H. erectus, Homo ergaster, and H. sapiens. A close phylogenetic relationship between H. floresiensis and H. erectus or H. sapiens can be rejected; furthermore, most of the traits separating H. floresiensis from H. sapiens are not readily attributable to pathology (e.g., Down syndrome). The results suggest H. floresiensis is a long-surviving relict of an early (>1.75 Ma) hominin lineage and a hitherto unknown migration out of Africa, and not a recent derivative of either H. erectus or H. sapiens. Copyright © 2017 Elsevier Ltd. All rights reserved.
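Parsimony scoring, one of the two phylogenetic approaches used, can be illustrated with Fitch's algorithm on a single character. The tree topology and character states below are a hypothetical toy, not the study's 100+-character data matrix:

```python
# A minimal sketch of parsimony scoring (Fitch's algorithm) for one
# character on a fixed tree; taxa and states are illustrative only.
def fitch(node, states):
    """Return (state set, parsimony score) for the subtree at `node`.

    `node` is a taxon name (leaf) or a (left, right) tuple (internal node);
    `states` maps taxon names to their observed character state.
    """
    if isinstance(node, str):                  # leaf: its observed state
        return {states[node]}, 0
    (ls, lc), (rs, rc) = fitch(node[0], states), fitch(node[1], states)
    common = ls & rs
    if common:                                 # intersection: no extra change
        return common, lc + rc
    return ls | rs, lc + rc + 1                # union: count one state change

# Hypothetical binary character (e.g. a cranial trait, 0/1) scored on a
# topology grouping H. floresiensis with early Homo:
tree = ((("H_habilis", "H_floresiensis"), "H_erectus"), "H_sapiens")
states = {"H_habilis": 0, "H_floresiensis": 0, "H_erectus": 1, "H_sapiens": 1}
_, score = fitch(tree, states)
print("parsimony score:", score)   # minimum number of state changes
```

Summing such scores over all characters and all candidate topologies, and preferring the lowest total, is how a parsimony analysis discriminates among the three hypotheses for H. floresiensis.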

  13. GSHR, a Web-Based Platform Provides Gene Set-Level Analyses of Hormone Responses in Arabidopsis

    Directory of Open Access Journals (Sweden)

    Xiaojuan Ran

    2018-01-01

    Full Text Available Phytohormones regulate diverse aspects of plant growth and environmental responses. Recent high-throughput technologies have promoted a more comprehensive profiling of genes regulated by different hormones. However, these omics data generally result in large gene lists that make it challenging to interpret the data and extract insights into biological significance. With the rapid accumulation of these large-scale experiments, especially the transcriptomic data available in public databases, a means of using this information to explore the transcriptional networks is needed. Different platforms have different architectures and designs, and even similar studies using the same platform may obtain data with large variances because of the highly dynamic and flexible effects of plant hormones; this makes it difficult to make comparisons across different studies and platforms. Here, we present a web server providing gene set-level analyses of Arabidopsis thaliana hormone responses. GSHR collected 333 RNA-seq and 1,205 microarray datasets from the Gene Expression Omnibus, characterizing transcriptomic changes in Arabidopsis in response to phytohormones including abscisic acid, auxin, brassinosteroids, cytokinins, ethylene, gibberellins, jasmonic acid, salicylic acid, and strigolactones. These data were further processed and organized into 1,368 gene sets regulated by different hormones or hormone-related factors. By comparing input gene lists to these gene sets, GSHR helps to identify gene sets from the input gene list regulated by different phytohormones or related factors. Together, GSHR links prior information regarding transcriptomic changes induced by hormones and related factors to newly generated data and facilitates cross-study and cross-platform comparisons; this aids the mining of biologically significant information from large-scale datasets. The GSHR is freely available at http://bioinfo.sibs.ac.cn/GSHR/.
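A gene set-level comparison of the kind described is typically scored with a hypergeometric overlap test. The sketch below uses assumed counts and is not GSHR's actual implementation:

```python
from scipy.stats import hypergeom

# Hypothetical gene-set comparison: does an input gene list overlap a
# hormone-regulated gene set more often than expected by chance?
N = 20000   # genes in the Arabidopsis background
K = 300     # genes in one hormone-response gene set (e.g. ABA-induced)
n = 150     # genes in the user's input list
k = 12      # genes shared between the two

# P(overlap >= k) under random sampling without replacement.
p_value = hypergeom.sf(k - 1, N, K, n)
fold = (k / n) / (K / N)    # fold enrichment over the chance expectation
print(f"fold enrichment = {fold:.1f}, p = {p_value:.2e}")
```

Repeating this test against each of the curated gene sets, with multiple-testing correction, ranks the hormones and factors most likely to underlie an input list.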

  14. Environmental risk factors for autism: an evidence-based review of systematic reviews and meta-analyses.

    Science.gov (United States)

    Modabbernia, Amirhossein; Velthorst, Eva; Reichenberg, Abraham

    2017-01-01

    According to recent evidence, up to 40-50% of variance in autism spectrum disorder (ASD) liability might be determined by environmental factors. In the present paper, we conducted a review of systematic reviews and meta-analyses of environmental risk factors for ASD. We assessed each review for quality of evidence and provided a brief overview of putative mechanisms of environmental risk factors for ASD. Current evidence suggests that several environmental factors including vaccination, maternal smoking, thimerosal exposure, and most likely assisted reproductive technologies are unrelated to risk of ASD. On the contrary, advanced parental age is associated with higher risk of ASD. Birth complications that are associated with trauma or ischemia and hypoxia have also shown strong links to ASD, whereas other pregnancy-related factors such as maternal obesity, maternal diabetes, and caesarian section have shown a less strong (but significant) association with risk of ASD. The reviews on nutritional elements have been inconclusive about the detrimental effects of deficiency in folic acid and omega 3, but vitamin D seems to be deficient in patients with ASD. The studies on toxic elements have been largely limited by their design, but there is enough evidence for the association between some heavy metals (most importantly, inorganic mercury and lead) and ASD that warrants further investigation. Mechanisms of the association between environmental factors and ASD are debated but might include non-causative association (including confounding), gene-related effect, oxidative stress, inflammation, hypoxia/ischemia, endocrine disruption, neurotransmitter alterations, and interference with signaling pathways. Compared to genetic studies of ASD, studies of environmental risk factors are in their infancy and have significant methodological limitations. 
Future studies of ASD risk factors would benefit from a developmental psychopathology approach, prospective design, precise exposure

  15. Insights into SCP/TAPS proteins of liver flukes based on large-scale bioinformatic analyses of sequence datasets.

    Directory of Open Access Journals (Sweden)

    Cinzia Cantacessi

    Full Text Available BACKGROUND: SCP/TAPS proteins of parasitic helminths have been proposed to play key roles in fundamental biological processes linked to the invasion of and establishment in their mammalian host animals, such as the transition from free-living to parasitic stages and the modulation of host immune responses. Despite the evidence that SCP/TAPS proteins of parasitic nematodes are involved in host-parasite interactions, there is a paucity of information on this protein family for parasitic trematodes of socio-economic importance. METHODOLOGY/PRINCIPAL FINDINGS: We conducted the first large-scale study of SCP/TAPS proteins of a range of parasitic trematodes of both human and veterinary importance (including the liver flukes Clonorchis sinensis, Opisthorchis viverrini, Fasciola hepatica and F. gigantica as well as the blood flukes Schistosoma mansoni, S. japonicum and S. haematobium). We mined all current transcriptomic and/or genomic sequence datasets from public databases, predicted secondary structures of full-length protein sequences, undertook systematic phylogenetic analyses and investigated the differential transcription of SCP/TAPS genes in O. viverrini and F. hepatica, with an emphasis on those that are up-regulated in the developmental stages infecting the mammalian host. CONCLUSIONS: This work, which sheds new light on SCP/TAPS proteins, guides future structural and functional explorations of key SCP/TAPS molecules associated with diseases caused by flatworms. Future fundamental investigations of these molecules in parasites and the integration of structural and functional data could lead to new approaches for the control of parasitic diseases.

  16. Metagenome-based diversity analyses suggest a significant contribution of non-cyanobacterial lineages to carbonate precipitation in modern microbialites

    Directory of Open Access Journals (Sweden)

    Purificacion eLopez-Garcia

    2015-08-01

Full Text Available Cyanobacteria are thought to play a key role in carbonate formation due to their metabolic activity, but other organisms carrying out oxygenic photosynthesis (photosynthetic eukaryotes) or other metabolisms (e.g. anoxygenic photosynthesis, sulfate reduction) may also contribute to carbonate formation. To obtain more quantitative information than that provided by more classical PCR-dependent methods, we studied the microbial diversity of microbialites from the Alchichica crater lake (Mexico) by mining for 16S/18S rRNA genes in metagenomes obtained by direct sequencing of environmental DNA. We studied samples collected at the Western (AL-W) and Northern (AL-N) shores of the lake and, at the latter site, along a depth gradient (1, 5, 10 and 15 m depth). The associated microbial communities were mainly composed of bacteria, most of which seemed heterotrophic, whereas archaea were negligible. Eukaryotes composed a relatively minor fraction dominated by photosynthetic lineages: diatoms in AL-W, influenced by Si-rich seepage waters, and green algae in AL-N samples. Members of the Gammaproteobacteria and Alphaproteobacteria classes of Proteobacteria, Cyanobacteria and Bacteroidetes were the most abundant bacterial taxa, followed by Planctomycetes, Deltaproteobacteria (Proteobacteria), Verrucomicrobia, Actinobacteria, Firmicutes and Chloroflexi. Community composition varied among sites and with depth. Although cyanobacteria were the most important bacterial group contributing to the carbonate precipitation potential, photosynthetic eukaryotes, anoxygenic photosynthesizers and sulfate reducers were also very abundant. Cyanobacteria affiliated to Pleurocapsales largely increased with depth. Scanning electron microscopy (SEM) observations showed considerable areas of aragonite-encrusted Pleurocapsa-like cyanobacteria at microscale. Multivariate statistical analyses showed a strong positive correlation of Pleurocapsales and Chroococcales with aragonite formation at

  17. Chemical deamidation: a common pitfall in large-scale N-linked glycoproteomic mass spectrometry-based analyses

    DEFF Research Database (Denmark)

    Palmisano, Giuseppe; Melo-Braga, Marcella Nunes; Engholm-Keller, Kasper

    2012-01-01

    for false positives. The confusion arises since the protein N-glycosidase F (PNGase F) reaction used to separate N-glycans from formerly glycosylated peptides catalyses the cleavage and deamidates the asparagine residue. This is typically viewed as beneficial since it acts to highlight the modification site......-linked consensus sites based on common N-linked glycoproteomics strategies without proper control experiments. Beside showing the spontaneous deamidation we provide alternative methods for validation that should be used in such experiments....

  18. Resting State Functional Connectivity in Mild Traumatic Brain Injury at the Acute Stage: Independent Component and Seed-Based Analyses

    Science.gov (United States)

    Iraji, Armin; Benson, Randall R.; Welch, Robert D.; O'Neil, Brian J.; Woodard, John L.; Imran Ayaz, Syed; Kulek, Andrew; Mika, Valerie; Medado, Patrick; Soltanian-Zadeh, Hamid; Liu, Tianming; Haacke, E. Mark

    2015-01-01

Mild traumatic brain injury (mTBI) accounts for more than 1 million emergency visits each year. Most of the injured stay in the emergency department for a few hours and are discharged home without a specific follow-up plan because of their negative clinical structural imaging. Advanced magnetic resonance imaging (MRI), particularly functional MRI (fMRI), has been reported as being sensitive to functional disturbances after brain injury. In this study, a cohort of 12 patients with mTBI was prospectively recruited from the emergency department of our local Level-1 trauma center for an advanced MRI scan at the acute stage. Sixteen age- and sex-matched controls were also recruited for comparison. Both group-based and individual-based independent component analysis of resting-state fMRI (rsfMRI) demonstrated reduced functional connectivity in both posterior cingulate cortex (PCC) and precuneus regions in comparison with controls, which is part of the default mode network (DMN). Further seed-based analysis confirmed reduced functional connectivity in these two regions and also demonstrated increased connectivity between these regions and other regions of the brain in mTBI. Seed-based analysis using the thalamus, hippocampus, and amygdala regions further demonstrated increased functional connectivity between these regions and other regions of the brain, particularly in the frontal lobe, in mTBI. Our data demonstrate alterations of multiple brain networks at the resting state, particularly increased functional connectivity in the frontal lobe, in response to brain concussion at the acute stage. Resting-state functional connectivity of the DMN could serve as a potential biomarker for improved detection of mTBI in the acute setting. PMID:25285363

  19. FluxPyt: a Python-based free and open-source software for 13C-metabolic flux analyses.

    Science.gov (United States)

    Desai, Trunil S; Srivastava, Shireesh

    2018-01-01

13C-Metabolic flux analysis (MFA) is a powerful approach to estimate intracellular reaction rates which could be used in strain analysis and design. Processing and analysis of labeling data for calculation of fluxes and associated statistics is an essential part of MFA. However, various software currently available for data analysis employ proprietary platforms and thus limit accessibility. We developed FluxPyt, a Python-based truly open-source software package for conducting stationary 13C-MFA data analysis. The software is based on the efficient elementary metabolite unit framework. The standard deviations in the calculated fluxes are estimated using Monte-Carlo analysis. FluxPyt also automatically creates flux maps based on a template for visualization of the MFA results. The flux distributions calculated by FluxPyt for two separate models (a small tricarboxylic acid cycle model and a larger Corynebacterium glutamicum model) were found to be in good agreement with those calculated by a previously published software. FluxPyt was tested in Microsoft™ Windows 7 and 10, as well as in Linux Mint 18.2. The availability of a free and open 13C-MFA software that works in various operating systems will enable more researchers to perform 13C-MFA and to further modify and develop the package.
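The Monte-Carlo uncertainty estimate mentioned above can be sketched as follows: perturb the measured labeling data with Gaussian noise, refit the flux, and take the standard deviation of the refitted values. This is only an illustration of the statistical idea; the `fit_flux` stand-in and the numbers below are invented for the sketch and are not FluxPyt's actual EMU-based solver.

```python
import random
import statistics

def fit_flux(measurements):
    """Toy 'fit': estimate a single flux as the mean of measurements that
    depend linearly on it (a stand-in for the real EMU-based least-squares fit)."""
    return sum(measurements) / len(measurements)

def monte_carlo_flux_sd(measurements, meas_sd, n_iter=1000, seed=42):
    """Estimate the standard deviation of a fitted flux by repeatedly
    perturbing the measured labeling data with Gaussian noise and refitting."""
    rng = random.Random(seed)
    fits = []
    for _ in range(n_iter):
        perturbed = [m + rng.gauss(0.0, meas_sd) for m in measurements]
        fits.append(fit_flux(perturbed))
    return statistics.stdev(fits)

# Hypothetical labeling measurements with an assumed 0.02 absolute noise level
data = [0.40, 0.42, 0.39, 0.41]
sd = monte_carlo_flux_sd(data, meas_sd=0.02)
```

The same resampling loop generalizes to a full flux vector: each Monte-Carlo iteration re-runs the complete fit, and per-flux standard deviations are read off the ensemble of fitted vectors.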

  20. Guided waves based SHM systems for composites structural elements: statistical analyses finalized at probability of detection definition and assessment

    Science.gov (United States)

    Monaco, E.; Memmolo, V.; Ricci, F.; Boffa, N. D.; Maio, L.

    2015-03-01

Maintenance approaches based on sensorised structures and Structural Health Monitoring (SHM) systems have long been among the most promising innovations in aerostructures, especially where composite materials (fibre-reinforced resins) are concerned. Layered materials still suffer from drastic reductions of the maximum allowable stress values at the design phase, as well as from costly recurrent inspections during the life cycle, which prevent their structural and economic potential from being fully exploited in today's aircraft. These penalising measures are necessary mainly to account for undetected hidden flaws within the layered sequence (delaminations) or in bonded areas (partial disbonds). To relax design and maintenance constraints, a system based on sensors permanently installed on the structure to detect and locate such flaws (an SHM system) can be considered, once its effectiveness and reliability have been statistically demonstrated through a rigorous definition and evaluation of the Probability of Detection (POD) function. This paper presents an experimental approach, with a statistical procedure, for evaluating the detection threshold of a guided-wave-based SHM system aimed at delamination detection on a typical composite layered wing panel. The experimental tests are mainly oriented to characterising the statistical distribution of measurements and damage metrics, and hence the detection capability of the system. Numerical simulation cannot replace the experimental tests aimed at POD, because the noise in the system response is crucial. Results of the experiments are presented in the paper and analysed.

  1. Evidence for simvastatin anti-inflammatory actions based on quantitative analyses of NETosis and other inflammation/oxidation markers

    Science.gov (United States)

    Al-Ghoul, Walid M.; Kim, Margarita S.; Fazal, Nadeem; Azim, Anser C.; Ali, Ashraf

    2014-01-01

Simvastatin (SMV) has been shown to exhibit promising anti-inflammatory properties alongside its classic cholesterol lowering action. We tested these emerging effects in a major thermal injury mouse model (3rd degree scald, ~20% TBSA) with previously documented, inflammation-mediated intestinal defects. Neutrophil extracellular traps (NETs) inflammation measurement methods were used alongside classic gut mucosa inflammation and leakiness measurements, with exogenous melatonin treatment as a positive control. Our hypothesis is that simvastatin has protective therapeutic effects against early postburn gut mucosa inflammation and leakiness. To test this hypothesis, we compared untreated thermal injury (TI) adult male mice with TI littermates treated with simvastatin (0.2 mg/kg i.p., TI + SMV) immediately following burn injury and two hours before being sacrificed the day after; melatonin-treated (1.86 mg/kg i.p., TI + Mel) mice were compared as a positive control. Mice were assessed for the following: (1) tissue oxidation and neutrophil infiltration in terminal ileum mucosa using classic carbonyl, Gr-1, and myeloperoxidase immunohistochemical or biochemical assays, (2) NETosis in terminal ileum and colon mucosa homogenates and peritoneal fluid and blood samples utilizing flow cytometric analyses of the surrogate NETosis biomarkers, picogreen and Gr-1, and (3) transepithelial gut leakiness as measured in terminal ileum and colon with FITC-dextran and transepithelial electrical resistance (TEER). Our results reveal that simvastatin and melatonin exhibit consistently comparable therapeutic protective effects against the following: (1) gut mucosa oxidative stress as revealed in the terminal ileum by markers of protein carbonylation as well as myeloperoxidase (MPO) and Gr-1 infiltration, (2) NETosis as revealed in the gut milieu, peritoneal lavage and plasma utilizing picogreen and Gr-1 flow cytometry and microscopy, and (3) transepithelial gut leakiness as

  2. Suicidality and aggression during antidepressant treatment: systematic review and meta-analyses based on clinical study reports.

    Science.gov (United States)

    Sharma, Tarang; Guski, Louise Schow; Freund, Nanna; Gøtzsche, Peter C

    2016-01-27

Objective: To study serious harms associated with selective serotonin and serotonin-norepinephrine reuptake inhibitors. Design: Systematic review and meta-analyses. Main outcome measures: Mortality and suicidality; secondary outcomes were aggressive behaviour and akathisia. Data sources: Clinical study reports for duloxetine, fluoxetine, paroxetine, sertraline, and venlafaxine obtained from the European and UK drug regulators, and summary trial reports for duloxetine and fluoxetine from Eli Lilly's website. Eligibility criteria: Double blind placebo controlled trials that contained any patient narratives or individual patient listings of harms. Two researchers extracted data independently; the outcomes were meta-analysed by Peto's exact method (fixed effect model). Results: We included 70 trials (64,381 pages of clinical study reports) with 18,526 patients. These trials had limitations in the study design and discrepancies in reporting, which may have led to serious under-reporting of harms. For example, some outcomes appeared only in individual patient listings in appendices, which we had for only 32 trials, and we did not have case report forms for any of the trials. Differences in mortality (all deaths were in adults, odds ratio 1.28, 95% confidence interval 0.40 to 4.06), suicidality (1.21, 0.84 to 1.74), and akathisia (2.04, 0.93 to 4.48) were not significant, whereas patients taking antidepressants displayed more aggressive behaviour (1.93, 1.26 to 2.95). For adults, the odds ratios were 0.81 (0.51 to 1.28) for suicidality, 1.09 (0.55 to 2.14) for aggression, and 2.00 (0.79 to 5.04) for akathisia. The corresponding values for children and adolescents were 2.39 (1.31 to 4.33), 2.79 (1.62 to 4.81), and 2.15 (0.48 to 9.65). In the summary trial reports on Eli Lilly's website, almost all deaths were noted, but all suicidal ideation events were missing, and the information on the remaining outcomes was incomplete.
Because of the shortcomings identified and having only partial access to appendices with no access to case report forms, the harms
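Peto's exact method, named in the record above, pools the log odds ratio across trials as Σ(O−E)/ΣV, where O is the observed event count in the treatment arm, E its expectation, and V the hypergeometric variance. A minimal sketch follows; the two trial count tables are invented purely for illustration, not data from this review.

```python
import math

def peto_pooled_or(trials):
    """Fixed-effect Peto odds ratio with a 95% confidence interval.
    trials: list of (events_treat, n_treat, events_ctrl, n_ctrl) tuples."""
    sum_o_minus_e = 0.0
    sum_v = 0.0
    for et, nt, ec, nc in trials:
        n = nt + nc
        events = et + ec
        non_events = n - events
        e = nt * events / n  # expected events in treatment arm under the null
        v = events * non_events * nt * nc / (n ** 2 * (n - 1))  # hypergeometric variance
        sum_o_minus_e += et - e
        sum_v += v
    log_or = sum_o_minus_e / sum_v
    se = 1.0 / math.sqrt(sum_v)
    ci = (math.exp(log_or - 1.96 * se), math.exp(log_or + 1.96 * se))
    return math.exp(log_or), ci

# Two hypothetical trials: (events on drug, N drug, events on placebo, N placebo)
or_hat, (lo, hi) = peto_pooled_or([(4, 100, 2, 100), (6, 200, 3, 200)])
```

Because the method avoids per-trial variance estimates for sparse 2x2 tables, it is a common choice when event counts are low, as with the rare harms pooled in this review.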

  3. Quantitative analyses of impurity silicon-carbide (SiC) and high-purity-titanium by neutron activation analyses based on k0-standardization method. Development of irradiation silicon technology in productivity using research reactor (Joint research)

    International Nuclear Information System (INIS)

    Motohashi, Jun; Takahashi, Hiroyuki; Magome, Hirokatsu; Sasajima, Fumio; Tokunaga, Okihiro; Kawasaki, Kozo; Onizawa, Koji; Isshiki, Masahiko

    2009-07-01

JRR-3 and JRR-4 have been providing neutron-transmutation-doped silicon (NTD-Si) by using the silicon NTD process, which is a method to produce a high-quality semiconductor. The domestic supply of NTD-Si is insufficient for the demand, and the market for NTD-Si is growing significantly at present, so it is very important to increase production. To fulfill this requirement, we have been investigating a neutron filter made of high-purity titanium for uniform doping. Silicon-carbide (SiC) semiconductor doped with NTD technology is considered suitable for high-power devices with performance superior to conventional Si-based devices, and we are very interested in SiC as well. This report presents the results obtained after the impurity contents of the high-purity titanium and SiC were analyzed by neutron activation analysis (NAA) using the k0-standardization method. There were 6 and 9 impurity elements detected in the high-purity titanium and SiC, respectively. Among these, Sc (from the high-purity titanium) and Fe (from the SiC) are nuclides with comparatively long half-lives. From the viewpoint of exposure in handling these materials, the impurity control of the materials needs to be examined. (author)

  4. Chemical and structural analyses of subsurface crevices formed during spontaneous deposition of cerium-based conversion coatings

    Energy Technology Data Exchange (ETDEWEB)

Heller, Daimon K, E-mail: dkheller@mmm.com; Fahrenholtz, William G., E-mail: billf@mst.edu; O'Keefe, Matthew J., E-mail: mjokeefe@mst.edu

    2011-11-15

Subsurface crevices formed during the deposition of cerium-based conversion coatings were analyzed in cross-section to assess the effect of deposition and post-treatment on the structure and chemistry of phases present. An Al-O containing phase, believed to be amorphous Al(OH)₃, was formed in crevices during coating deposition. Analysis by energy dispersive X-ray spectroscopy revealed the presence of up to 1.6 at.% chlorine within the Al-O phase, which was likely a product of soluble chlorides that were present in the coating solution. Cerium was not detected within crevices. After post-treatment in an 85 °C aqueous phosphate solution, the chloride concentration was reduced to ≤ 0.30 at.% and electron diffraction of the Al-O phase produced ring patterns, indicating it had crystallized. Some diffraction patterns could be indexed to gibbsite (Al(OH)₃), but others are believed to be a combination of hydrated aluminum hydroxides and/or oxides. Aluminum phosphate was not identified. Separately from its effect on cerium-based conversion coatings, phosphate post-treatment improved the corrosion resistance of Al 2024-T3 substrates by acting to crystallize Al(OH)₃ present on interior surfaces of crevices and by reducing the chloride concentration in this phase. - Highlights: • Analysis of subsurface crevices formed during deposition of Ce-based conversion coatings. • Phosphate post-treatment improved corrosion protection in salt spray testing. • Post-treatment affected the composition and structure of regions within crevices. • Crystallized Al(OH)₃ within crevices acted as a more effective barrier to chloride ions.

  5. Group-Level EEG-Processing Pipeline for Flexible Single Trial-Based Analyses Including Linear Mixed Models.

    Science.gov (United States)

    Frömer, Romy; Maier, Martin; Abdel Rahman, Rasha

    2018-01-01

Here we present an application of an EEG processing pipeline customizing EEGLAB and FieldTrip functions, specifically optimized to flexibly analyze EEG data based on single trial information. The key component of our approach is to create a comprehensive 3-D EEG data structure including all trials and all participants maintaining the original order of recording. This allows straightforward access to subsets of the data based on any information available in a behavioral data structure matched with the EEG data (experimental conditions, but also performance indicators, such as accuracy or RTs of single trials). In the present study we exploit this structure to compute linear mixed models (LMMs, using lmer in R) including random intercepts and slopes for items. This information can easily be read out from the matched behavioral data, whereas it might not be accessible in traditional ERP approaches without substantial effort. We further provide easily adaptable scripts for performing cluster-based permutation tests (as implemented in FieldTrip), as a more robust alternative to traditional omnibus ANOVAs. Our approach is particularly advantageous for data with parametric within-subject covariates (e.g., performance) and/or multiple complex stimuli (such as words, faces or objects) that vary in features affecting cognitive processes and ERPs (such as word frequency, salience or familiarity), which are sometimes hard to control experimentally or might themselves constitute variables of interest. The present dataset was recorded from 40 participants who performed a visual search task on previously unfamiliar objects, presented either visually intact or blurred. MATLAB as well as R scripts are provided that can be adapted to different datasets.
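The core idea of the 3-D trial structure described above — EEG trials kept in recording order, sliced by any column of a matched behavioral table — can be sketched in a few lines of NumPy. The dimensions, column names, and condition labels below are hypothetical stand-ins, not the authors' actual MATLAB/R data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-D structure: 120 trials x 32 channels x 500 time points,
# concatenated across participants in the original order of recording.
eeg = rng.standard_normal((120, 32, 500))

# Matched behavioral table: one entry per trial, same order as EEG axis 0.
behavior = {
    "subject":   np.repeat(np.arange(1, 4), 40),           # 3 participants, 40 trials each
    "condition": np.tile(np.array(["intact", "blurred"]), 60),
    "rt":        rng.uniform(0.3, 1.2, size=120),          # single-trial reaction times
}

# Any behavioral column yields a boolean mask that slices the EEG directly,
# e.g. all fast trials in the 'blurred' condition:
fast_blurred = (behavior["condition"] == "blurred") & (behavior["rt"] < 0.7)
erp = eeg[fast_blurred].mean(axis=0)   # averaged ERP: channels x time
```

Because the mask is built from trial-level covariates, the same structure feeds condition averages, single-trial regressions, or LMM predictors without re-epoching the data.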

  6. Development of a Relap based Nuclear Plant Analyser with 3-D graphics using OpenGL and Object Relap

    International Nuclear Information System (INIS)

    Lee, Young Jin

    2010-01-01

A 3-D Graphic Nuclear Plant Analyzer (NPA) program was developed using GLScene and TRelap. GLScene is an OpenGL-based 3D graphics library for the Delphi object-oriented programming language, and it implements the OpenGL functions in forms suitable for programming with Delphi. TRelap is an object wrapper developed by the author to easily implement the Relap5 thermal hydraulic code under an object-oriented programming environment. The 3-D Graphic NPA was developed to demonstrate the superiority of the object-oriented programming approach in developing complex programs.

  7. SARAPAN-A simulated-annealing-based tool to generate random patterned-channel-age in CANDU fuel management analyses

    Energy Technology Data Exchange (ETDEWEB)

    Kastanya, Doddy [Safety and Licensing Department, Candesco Division of Kinectrics Inc., Toronto (Canada)

    2017-02-15

    In any reactor physics analysis, the instantaneous power distribution in the core can be calculated when the actual bundle-wise burnup distribution is known. Considering the fact that CANDU (Canada Deuterium Uranium) utilizes on-power refueling to compensate for the reduction of reactivity due to fuel burnup, in the CANDU fuel management analysis, snapshots of power and burnup distributions can be obtained by simulating and tracking the reactor operation over an extended period using various tools such as the *SIMULATE module of the Reactor Fueling Simulation Program (RFSP) code. However, for some studies, such as an evaluation of a conceptual design of a next-generation CANDU reactor, the preferred approach to obtain a snapshot of the power distribution in the core is based on the patterned-channel-age model implemented in the *INSTANTAN module of the RFSP code. The objective of this approach is to obtain a representative snapshot of core conditions quickly. At present, such patterns could be generated by using a program called RANDIS, which is implemented within the *INSTANTAN module. In this work, we present an alternative approach to derive the patterned-channel-age model where a simulated-annealing-based algorithm is used to find such patterns, which produce reasonable power distributions.
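The simulated-annealing search that SARAPAN applies to channel-age patterns can be sketched generically: propose a swap of two channel ages, accept it if it improves the objective or, with temperature-dependent probability, even if it does not. The `roughness` objective below is an invented stand-in for the code's real power-distribution criterion and is not the RFSP/*INSTANTAN implementation.

```python
import math
import random

def anneal_pattern(ages, cost, t0=1.0, cooling=0.995, steps=5000, seed=1):
    """Generic simulated-annealing search over permutations of channel ages.
    ages: initial list of channel ages; cost: callable scoring a pattern (lower is better)."""
    rng = random.Random(seed)
    current = ages[:]
    best = current[:]
    t = t0
    for _ in range(steps):
        # Propose a neighbouring pattern by swapping two channel ages.
        i, j = rng.sample(range(len(current)), 2)
        candidate = current[:]
        candidate[i], candidate[j] = candidate[j], candidate[i]
        delta = cost(candidate) - cost(current)
        # Accept improvements always; accept worsenings with Boltzmann probability.
        if delta < 0 or rng.random() < math.exp(-delta / t):
            current = candidate
            if cost(current) < cost(best):
                best = current[:]
        t *= cooling  # geometric cooling schedule
    return best

# Toy objective: neighbouring channels should differ in age as much as possible,
# a crude stand-in for producing a reasonable, non-clustered power distribution.
def roughness(pattern):
    return -sum(abs(a - b) for a, b in zip(pattern, pattern[1:]))

ages = list(range(16))          # 16 hypothetical channel ages, initially sorted
best = anneal_pattern(ages, roughness)
```

The attraction of this scheme for the snapshot use case is speed: a few thousand swap proposals explore the pattern space far faster than tracking years of simulated refuelling.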

  8. MRI-based Brain Healthcare Quotients: A bridge between neural and behavioral analyses for keeping the brain healthy.

    Directory of Open Access Journals (Sweden)

    Kiyotaka Nemoto

Full Text Available Neurological and psychiatric disorders are a burden on social and economic resources. Therefore, maintaining brain health and preventing these disorders are important. While the physiological functions of the brain are well studied, few studies have focused on keeping the brain healthy from a neuroscientific viewpoint. We propose a magnetic resonance imaging (MRI)-based quotient for monitoring brain health, the Brain Healthcare Quotient (BHQ), which is based on the volume of gray matter (GM) and the fractional anisotropy (FA) of white matter (WM). We recruited 144 healthy adults to acquire structural neuroimaging data, including T1-weighted images and diffusion tensor images, and data associated with both physical (BMI, blood pressure, and daily time use) and social (subjective socioeconomic status, subjective well-being, post-materialism and Epicureanism) factors. We confirmed that the BHQ was sensitive to an age-related decline in GM volume and WM integrity. Further analysis revealed that the BHQ was critically affected by both physical and social factors. We believe that our BHQ is a simple yet highly sensitive, valid measure for brain health research that will bridge the needs of the scientific community and society and help us lead better lives in which we stay healthy, active, and sharp.

  9. SARAPAN—A Simulated-Annealing-Based Tool to Generate Random Patterned-Channel-Age in CANDU Fuel Management Analyses

    Directory of Open Access Journals (Sweden)

    Doddy Kastanya

    2017-02-01

Full Text Available In any reactor physics analysis, the instantaneous power distribution in the core can be calculated when the actual bundle-wise burnup distribution is known. Considering the fact that CANDU (Canada Deuterium Uranium) utilizes on-power refueling to compensate for the reduction of reactivity due to fuel burnup, in the CANDU fuel management analysis, snapshots of power and burnup distributions can be obtained by simulating and tracking the reactor operation over an extended period using various tools such as the *SIMULATE module of the Reactor Fueling Simulation Program (RFSP) code. However, for some studies, such as an evaluation of a conceptual design of a next-generation CANDU reactor, the preferred approach to obtain a snapshot of the power distribution in the core is based on the patterned-channel-age model implemented in the *INSTANTAN module of the RFSP code. The objective of this approach is to obtain a representative snapshot of core conditions quickly. At present, such patterns could be generated by using a program called RANDIS, which is implemented within the *INSTANTAN module. In this work, we present an alternative approach to derive the patterned-channel-age model where a simulated-annealing-based algorithm is used to find such patterns, which produce reasonable power distributions.

  10. The influence of sampling unit size and spatial arrangement patterns on neighborhood-based spatial structure analyses of forest stands

    Energy Technology Data Exchange (ETDEWEB)

    Wang, H.; Zhang, G.; Hui, G.; Li, Y.; Hu, Y.; Zhao, Z.

    2016-07-01

Aim of study: Neighborhood-based stand spatial structure parameters can quantify and characterize forest spatial structure effectively. How these neighborhood-based structure parameters are influenced by the selection of different numbers of nearest-neighbor trees is unclear, and there is some disagreement in the literature regarding the appropriate number of nearest-neighbor trees to sample around reference trees. Understanding how to efficiently characterize forest structure is critical for forest management. Area of study: Multi-species uneven-aged forests of Northern China. Material and methods: We simulated stands with different spatial structural characteristics and systematically compared their structure parameters when two to eight neighboring trees were selected. Main results: Values of the uniform angle index calculated in the same stand differed with different sizes of the structure unit. When tree species and sizes were completely randomly interspersed, different numbers of neighbors had little influence on the mingling and dominance indices. Changes in the mingling or dominance indices caused by different numbers of neighbors occurred when the tree species or size classes were not randomly interspersed, and their changing characteristics can be detected from the spatial arrangement patterns of tree species and sizes. Research highlights: The number of neighboring trees selected for analyzing stand spatial structure parameters should be fixed. We propose that the four-tree structure unit is the best compromise between sampling accuracy and cost for practical forest management. (Author)
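As a concrete example of the neighborhood-based parameters discussed above, the species mingling index of a k-tree structure unit is the fraction of each reference tree's k nearest neighbours that belong to a different species, averaged over all reference trees. The sketch below uses a synthetic checkerboard plot for illustration; edge effects are deliberately ignored.

```python
import math

def mingling_index(trees, k=4):
    """Mean species mingling for a k-tree structure unit.
    trees: list of (x, y, species) tuples for a fully mapped plot."""
    values = []
    for i, (xi, yi, si) in enumerate(trees):
        # Distances from the reference tree to every other tree, nearest first.
        neighbours = sorted(
            (math.hypot(xi - xj, yi - yj), sj)
            for j, (xj, yj, sj) in enumerate(trees) if j != i
        )[:k]
        values.append(sum(1 for _, sj in neighbours if sj != si) / k)
    return sum(values) / len(values)

# Hypothetical mapped plot: two species in a checkerboard arrangement on a 5x5 grid,
# i.e. maximal interspersion in the plot interior.
plot = [(x, y, "A" if (x + y) % 2 == 0 else "B")
        for x in range(5) for y in range(5)]
m = mingling_index(plot, k=4)
```

With the four-tree structure unit proposed in the record, interior trees of this checkerboard reach the maximum mingling of 1, while border and corner trees pull the stand mean below 1.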

  11. MRI-based Brain Healthcare Quotients: A bridge between neural and behavioral analyses for keeping the brain healthy.

    Science.gov (United States)

    Nemoto, Kiyotaka; Oka, Hiroki; Fukuda, Hiroki; Yamakawa, Yoshinori

    2017-01-01

    Neurological and psychiatric disorders are a burden on social and economic resources. Therefore, maintaining brain health and preventing these disorders are important. While the physiological functions of the brain are well studied, few studies have focused on keeping the brain healthy from a neuroscientific viewpoint. We propose a magnetic resonance imaging (MRI)-based quotient for monitoring brain health, the Brain Healthcare Quotient (BHQ), which is based on the volume of gray matter (GM) and the fractional anisotropy (FA) of white matter (WM). We recruited 144 healthy adults to acquire structural neuroimaging data, including T1-weighted images and diffusion tensor images, and data associated with both physical (BMI, blood pressure, and daily time use) and social (subjective socioeconomic status, subjective well-being, post-materialism and Epicureanism) factors. We confirmed that the BHQ was sensitive to an age-related decline in GM volume and WM integrity. Further analysis revealed that the BHQ was critically affected by both physical and social factors. We believe that our BHQ is a simple yet highly sensitive, valid measure for brain health research that will bridge the needs of the scientific community and society and help us lead better lives in which we stay healthy, active, and sharp.

  12. Design for tsunami barrier wall based on numerical analyses of tsunami inundation at Shimane Nuclear Power Plant

    International Nuclear Information System (INIS)

    Kiyoshige, Naoya; Yoshitsugu, Shinich; Kawahara, Kazufumi; Ookubo, Yoshimi; Nishihata, Takeshi; Ino, Hitoshi; Kotoura, Tsuyoshi

    2014-01-01

The conventional tsunami assessment for the active fault beneath the Japan Sea in front of the Shimane nuclear power plant, and for the earthquake feared to occur at the eastern margin of the Japan Sea, does not predict a tsunami as large as those assumed on the Pacific coast. Hence, the huge tsunami observed at the power plant located near the source of the Tohoku Pacific earthquake tsunami, whose run-up height reached TP+15 m, is regarded as the Level 2 tsunami for the Shimane nuclear power plant, and tsunami barrier walls are planned to withstand this supposed Level 2 tsunami. In this study, the setting of the Level 2 tsunami using numerical analysis based on non-linear shallow-water theory, and the evaluation of the design tsunami wave pressure exerted on the countermeasures using CADMAS-SURF/3D, are discussed. The designed tsunami barrier walls, which suit the power plant layout and were determined from the design tsunami wave pressure distribution based on Tanimoto's formulae and the standard earthquake ground motion Ss, are also addressed. (author)

  13. On-site phytoremediation applicability assessment in Alur Ilmu, Universiti Kebangsaan Malaysia based on spatial and pollution removal analyses.

    Science.gov (United States)

    Mahmud, Mohd Hafiyyan; Lee, Khai Ern; Goh, Thian Lai

    2017-10-01

The present paper aims to assess the phytoremediation performance based on pollution removal efficiency of the highly polluted region of Alur Ilmu urban river for its applicability of on-site treatment. Thirteen stations along Alur Ilmu were selected to produce thematic maps through spatial distribution analysis based on six water quality parameters of Malaysia's Water Quality Index (WQI) for dry and rainy seasons. The maps generated were used to identify the highly polluted region for phytoremediation applicability assessment. Four free-floating plants were tested in treating water samples from the highly polluted region under three different conditions, namely controlled, aerated and normal treatments. The selected free-floating plants were water hyacinth (Eichhornia crassipes), water lettuce (Pistia stratiotes), rose water lettuce (Pistia sp.) and pennywort (Centella asiatica). The results showed that Alur Ilmu was more polluted during the dry season compared to the rainy season based on the water quality analysis. During the dry season, four parameters were marked as polluted along Alur Ilmu, namely dissolved oxygen (DO), 4.72 mg/L (class III); ammoniacal nitrogen (NH3-N), 0.85 mg/L (class IV); total suspended solid (TSS), 402 mg/L (class V) and biological oxygen demand (BOD), 3.89 mg/L (class III), whereas two parameters were classed as polluted during the rainy season, namely total suspended solid (TSS), 571 mg/L (class V) and biological oxygen demand (BOD), 4.01 mg/L (class III). The thematic maps generated from spatial distribution analysis using the Kriging gridding method showed that the highly polluted region was recorded at station AL 5. Hence, water samples were taken from this station for pollution removal analysis. All the free-floating plants were able to reduce TSS and COD in less than 14 days. However, water hyacinth showed the least detrimental effect from the phytoremediation process compared to other free-floating plants, thus made it a suitable
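Pollution removal efficiency of the kind assessed above is conventionally computed as the relative drop in concentration over the treatment period. A minimal sketch follows; the influent value reuses the dry-season TSS figure from the record purely for illustration, and the effluent value is an invented example.

```python
def removal_efficiency(c_initial, c_final):
    """Percentage pollutant removal over a treatment period:
    100 * (C_initial - C_final) / C_initial."""
    return (c_initial - c_final) / c_initial * 100.0

# Hypothetical TSS readings (mg/L) before and after 14 days of treatment
eff = removal_efficiency(402.0, 80.0)
```

Computed per parameter (TSS, COD, BOD, NH3-N) and per plant species, this single number is what allows the four free-floating plants to be ranked against one another.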

  14. An empirical comparison of the process-tracing methods Mouselab and eye tracking in preference measurement using choice-based conjoint analysis

    DEFF Research Database (Denmark)

    Meissner, Martin; Decker, Reinhold; Pfeiffer, Jella

    2010-01-01

In choice-based conjoint (CBC) analysis, respondents' decisions in choice settings are used to determine relevant attributes and attribute levels of the products considered. Yet, the cognitive process preceding the choice decision is usually ignored. The eye-tracking technique can be used to gain additional insights into how information is processed in the context of preference measurement. However, because of the technical requirements, an efficient application of eye tracking is often hard to realize in managerial practice. Therefore, the use of alternative process-tracing approaches like Mouselab is examined: whether Mouselab, instead of eye tracking, influences the way information is acquired and processed, and whether, or how, this affects the validity of CBC results. The empirical study shows that Mouselab in fact changes the information acquisition process, but this does not affect the quality of the preference measurement.

  15. EUROCONTROL-Systemic Occurrence Analysis Methodology (SOAM)-A 'Reason'-based organisational methodology for analysing incidents and accidents

    International Nuclear Information System (INIS)

    Licu, Tony; Cioran, Florin; Hayward, Brent; Lowe, Andrew

    2007-01-01

    The Safety Occurrence Analysis Methodology (SOAM) developed for EUROCONTROL is an accident investigation methodology based on the Reason Model of organisational accidents. The purpose of a SOAM is to broaden the focus of an investigation from human involvement issues, also known as 'active failures of operational personnel' under Reason's original model, to include analysis of the latent conditions deeper within the organisation that set the context for the event. Such an approach is consistent with the tenets of Just Culture in which people are encouraged to provide full and open information about how incidents occurred, and are not penalised for errors. A truly systemic approach is not simply a means of transferring responsibility for a safety occurrence from front-line employees to senior managers. A consistent philosophy must be applied, where the investigation process seeks to correct deficiencies wherever they may be found, without attempting to apportion blame or liability

  16. SEDIMENTATION AND DEPOSITIONAL ENVIRONMENT BASED ON SEISMIC AND DRILLING CORE ANALYSES IN CIMANUK DELTA INDRAMAYU, WEST JAVA

    Directory of Open Access Journals (Sweden)

    I Nyoman Astawa

    2017-07-01

    Full Text Available Core drilling was carried out in three locations: Brondong Village (BH-01), Pasekan Village (BH-02), and Karangsong Village (BH-03). The three cores are similar in lithology, consisting of clay. They are correlated based on fragment content, such as fine sand lenses, mollusk shells, rock and carbonate materials, recovered from different depths. Single-side-band shallow seismic reflection recorded paleochannels in sequence E at the north and the west of the investigated area. It is inferred that the northern paleochannels were part of the Lawas River or Tegar River, while the western paleochannels were part of the Rambatan Lama River. The microfauna content of all three cores indicates that the interval from 0.00 m down to 25.00 m is Holocene/Recent, and that from 25.00 m to the bottom is Pleistocene, deposited in a bay to middle neritic environment.

  17. Estimation of time-variable fast flow path chemical concentrations for application in tracer-based hydrograph separation analyses

    Science.gov (United States)

    Kronholm, Scott C.; Capel, Paul D.

    2016-01-01

    Mixing models are a commonly used method for hydrograph separation, but can be hindered by the subjective choice of the end-member tracer concentrations. This work tests a new variant of mixing model that uses high-frequency measures of two tracers and streamflow to separate total streamflow into water from slowflow and fastflow sources. The ratio between the concentrations of the two tracers is used to create a time-variable estimate of the concentration of each tracer in the fastflow end-member. Multiple synthetic data sets, and data from two hydrologically diverse streams, are used to test the performance and limitations of the new model (two-tracer ratio-based mixing model: TRaMM). When applied to the synthetic streams under many different scenarios, the TRaMM produces results that were reasonable approximations of the actual values of fastflow discharge (±0.1% of maximum fastflow) and fastflow tracer concentrations (±9.5% and ±16% of maximum fastflow nitrate concentration and specific conductance, respectively). With real stream data, the TRaMM produces high-frequency estimates of slowflow and fastflow discharge that align with expectations for each stream based on their respective hydrologic settings. The use of two tracers with the TRaMM provides an innovative and objective approach for estimating high-frequency fastflow concentrations and contributions of fastflow water to the stream. This provides useful information for tracking chemical movement to streams and allows for better selection and implementation of water quality management strategies.
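
The TRaMM described above builds on the classical two-component tracer mass balance. The sketch below shows only that baseline separation step, not the paper's ratio-based, time-variable end-member estimation; the function name and the concentration values are hypothetical:

```python
def fastflow_fraction(c_stream, c_slow, c_fast):
    """Two-component mass balance: solve Q*C = Qs*Cs + Qf*Cf with
    Q = Qs + Qf for the fastflow fraction Qf/Q."""
    if c_fast == c_slow:
        raise ValueError("end-member concentrations must differ")
    frac = (c_stream - c_slow) / (c_fast - c_slow)
    return min(max(frac, 0.0), 1.0)  # clamp to the physically meaningful range

# Hypothetical specific conductance (uS/cm): stream 300, slowflow 500, fastflow 100
print(fastflow_fraction(300.0, 500.0, 100.0))  # 0.5
```

The TRaMM's innovation is replacing the fixed `c_fast` above with a time-variable estimate derived from the ratio of two tracers.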

  18. Flood Mapping and Flood Dynamics of the Mekong Delta: ENVISAT-ASAR-WSM Based Time Series Analyses

    Directory of Open Access Journals (Sweden)

    Stefan Dech

    2013-02-01

    Full Text Available Satellite remote sensing is a valuable tool for monitoring flooding. Microwave sensors are especially appropriate instruments, as they allow the differentiation of inundated from non-inundated areas regardless of levels of solar illumination or frequency of cloud cover in regions experiencing substantial rainy seasons. In the current study we present the longest synthetic aperture radar-based time series of flood and inundation information derived for the Mekong Delta that has been analyzed for this region so far. We employed a total of 60 Envisat ASAR Wide Swath Mode data sets at a spatial resolution of 150 meters, acquired during the years 2007–2011, to facilitate a thorough understanding of the flood regime in the Mekong Delta. The Mekong Delta in southern Vietnam comprises 13 provinces and is home to 18 million inhabitants. Extreme dry seasons from late December to May and wet seasons from June to December characterize people’s rural life. In this study, we show which areas of the delta are frequently affected by floods and which regions remain dry all year round. Furthermore, we present which areas are flooded at which frequency and elucidate the patterns of flood progression over the course of the rainy season. In this context, we also examine the impact of dykes on floodwater emergence and assess the relationship between retrieved flood occurrence patterns and land use. In addition, the advantages and shortcomings of ENVISAT ASAR-WSM based flood mapping are discussed. The results contribute to a comprehensive understanding of Mekong Delta flood dynamics in an environment where the flow regime is influenced by the Mekong River, overland water flow, anthropogenic floodwater control, as well as the tides.

  19. Coalescent-Based Analyses of Genomic Sequence Data Provide a Robust Resolution of Phylogenetic Relationships among Major Groups of Gibbons

    Science.gov (United States)

    Shi, Cheng-Min; Yang, Ziheng

    2018-01-01

    The phylogenetic relationships among extant gibbon species remain unresolved despite numerous efforts using morphological, behavioral, and genetic data and the sequencing of whole genomes. A major challenge in reconstructing the gibbon phylogeny is the radiative speciation process, which resulted in extremely short internal branches in the species phylogeny and extensive incomplete lineage sorting, with marked gene-tree heterogeneity across the genome. Here, we analyze two genomic-scale data sets, with ∼10,000 putative noncoding and exonic loci, respectively, to estimate the species tree for the major groups of gibbons. We used the Bayesian full-likelihood method bpp under the multispecies coalescent model, which naturally accommodates incomplete lineage sorting and uncertainties in the gene trees. For comparison, we included three heuristic coalescent-based methods (mp-est, SVDQuartets, and astral) as well as concatenation. From both data sets, we infer the phylogeny for the four extant gibbon genera to be (Hylobates, (Nomascus, (Hoolock, Symphalangus))). We used simulation guided by the real data to evaluate the accuracy of the methods used. Astral, while not as efficient as bpp, performed well in estimation of the species tree even in the presence of excessive incomplete lineage sorting. Concatenation, mp-est and SVDQuartets were unreliable when the species tree contains very short internal branches. A likelihood ratio test of gene flow suggests a small amount of migration from Hylobates moloch to H. pileatus, while cross-genera migration is absent or rare. Our results highlight the utility of coalescent-based methods in addressing challenging species tree problems characterized by short internal branches and rampant gene tree-species tree discordance. PMID:29087487

  20. Hydraulic characterisation of iron-oxide-coated sand and gravel based on nuclear magnetic resonance relaxation mode analyses

    Directory of Open Access Journals (Sweden)

    S. Costabel

    2018-03-01

    Full Text Available The capability of nuclear magnetic resonance (NMR) relaxometry to characterise hydraulic properties of iron-oxide-coated sand and gravel was evaluated in a laboratory study. Past studies have shown that the presence of paramagnetic iron oxides and large pores in coarse sand and gravel disturbs the otherwise linear relationship between relaxation time and pore size. Consequently, the commonly applied empirical approaches fail when deriving hydraulic quantities from NMR parameters. Recent research demonstrates that higher relaxation modes must be taken into account to relate the size of a large pore to its NMR relaxation behaviour in the presence of significant paramagnetic impurities at its pore wall. We performed NMR relaxation experiments with water-saturated natural and reworked sands and gravels, coated with natural and synthetic ferric oxides (goethite, ferrihydrite), and show that the impact of the higher relaxation modes increases significantly with increasing iron content. Since the investigated materials exhibit narrow pore size distributions, and can thus be described by a virtual bundle of capillaries with identical apparent pore radius, recently presented inversion approaches allow for estimation of a unique solution yielding the apparent capillary radius from the NMR data. We found the NMR-based apparent radii to correspond well to the effective hydraulic radii estimated from the grain size distributions of the samples for the entire range of observed iron contents. Consequently, they can be used to estimate the hydraulic conductivity using the well-known Kozeny–Carman equation without any calibration that is otherwise necessary when predicting hydraulic conductivities from NMR data. Our future research will focus on the development of relaxation time models that consider pore size distributions. Furthermore, we plan to establish a measurement system based on borehole NMR for localising iron clogging and controlling its remediation

  1. Hydraulic characterisation of iron-oxide-coated sand and gravel based on nuclear magnetic resonance relaxation mode analyses

    Science.gov (United States)

    Costabel, Stephan; Weidner, Christoph; Müller-Petke, Mike; Houben, Georg

    2018-03-01

    The capability of nuclear magnetic resonance (NMR) relaxometry to characterise hydraulic properties of iron-oxide-coated sand and gravel was evaluated in a laboratory study. Past studies have shown that the presence of paramagnetic iron oxides and large pores in coarse sand and gravel disturbs the otherwise linear relationship between relaxation time and pore size. Consequently, the commonly applied empirical approaches fail when deriving hydraulic quantities from NMR parameters. Recent research demonstrates that higher relaxation modes must be taken into account to relate the size of a large pore to its NMR relaxation behaviour in the presence of significant paramagnetic impurities at its pore wall. We performed NMR relaxation experiments with water-saturated natural and reworked sands and gravels, coated with natural and synthetic ferric oxides (goethite, ferrihydrite), and show that the impact of the higher relaxation modes increases significantly with increasing iron content. Since the investigated materials exhibit narrow pore size distributions, and can thus be described by a virtual bundle of capillaries with identical apparent pore radius, recently presented inversion approaches allow for estimation of a unique solution yielding the apparent capillary radius from the NMR data. We found the NMR-based apparent radii to correspond well to the effective hydraulic radii estimated from the grain size distributions of the samples for the entire range of observed iron contents. Consequently, they can be used to estimate the hydraulic conductivity using the well-known Kozeny-Carman equation without any calibration that is otherwise necessary when predicting hydraulic conductivities from NMR data. Our future research will focus on the development of relaxation time models that consider pore size distributions. 
Furthermore, we plan to establish a measurement system based on borehole NMR for localising iron clogging and controlling its remediation in the gravel pack of
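
For the bundle-of-identical-capillaries model invoked above, the final step can be sketched as a Hagen-Poiseuille permeability k = φr²/8 converted to conductivity K = kρg/μ. This is a simplified stand-in for the Kozeny-Carman form used in the paper, and the radius and porosity values are illustrative:

```python
RHO_WATER = 998.0   # density of water, kg/m^3
G = 9.81            # gravitational acceleration, m/s^2
MU_WATER = 1.0e-3   # dynamic viscosity of water, Pa*s

def conductivity_from_pore_radius(r_apparent, porosity):
    """Hydraulic conductivity (m/s) of a bundle of identical capillaries:
    permeability k = phi * r^2 / 8, then K = k * rho * g / mu."""
    permeability = porosity * r_apparent ** 2 / 8.0
    return permeability * RHO_WATER * G / MU_WATER

# Hypothetical NMR-derived apparent radius of 50 um at 30 % porosity
print(f"{conductivity_from_pore_radius(50e-6, 0.3):.2e} m/s")
```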

  2. Fast multichannel analyser

    Energy Technology Data Exchange (ETDEWEB)

    Berry, A; Przybylski, M M; Sumner, I [Science Research Council, Daresbury (UK). Daresbury Lab.]

    1982-10-01

    A fast multichannel analyser (MCA) capable of sampling at a rate of 10^7 s^-1 has been developed. The instrument is based on an 8-bit parallel-encoding analogue-to-digital converter (ADC) reading into a fast histogramming random access memory (RAM) system, giving 256 channels of 64k count capacity. The prototype unit is in CAMAC format.
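
A software model of the histogramming scheme just described, assuming each 8-bit ADC sample directly indexes one of the 256 channels and that a channel saturates at its 64k count capacity (the function is an illustration, not the instrument's firmware):

```python
def histogram_256(samples, depth=64 * 1024):
    """Accumulate 8-bit ADC samples into 256 saturating channels."""
    channels = [0] * 256
    for s in samples:
        if 0 <= s <= 255 and channels[s] < depth:
            channels[s] += 1  # channel stops counting at 64k capacity
    return channels

hist = histogram_256([7, 7, 255, 0])
print(hist[7], hist[255], hist[0])  # 2 1 1
```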

  3. A fast multichannel analyser

    International Nuclear Information System (INIS)

    Berry, A.; Przybylski, M.M.; Sumner, I.

    1982-01-01

    A fast multichannel analyser (MCA) capable of sampling at a rate of 10^7 s^-1 has been developed. The instrument is based on an 8-bit parallel-encoding analogue-to-digital converter (ADC) reading into a fast histogramming random access memory (RAM) system, giving 256 channels of 64k count capacity. The prototype unit is in CAMAC format. (orig.)

  4. Two Model-Based Methods for Policy Analyses of Fine Particulate Matter Control in China: Source Apportionment and Source Sensitivity

    Science.gov (United States)

    Li, X.; Zhang, Y.; Zheng, B.; Zhang, Q.; He, K.

    2013-12-01

    Anthropogenic emissions have been controlled in recent years in China to mitigate fine particulate matter (PM2.5) pollution. Recent studies show that sulfur dioxide (SO2)-only control cannot reduce total PM2.5 levels efficiently. Other species such as nitrogen oxides, ammonia, black carbon, and organic carbon may be equally important during particular seasons. Furthermore, each species is emitted from several anthropogenic sectors (e.g., industry, power plants, transportation, residential and agriculture). On the other hand, the contribution of one emission sector to PM2.5 represents the contributions of all species in that sector. In this work, two model-based methods are used to identify the emission sectors and areas most influential for PM2.5. The first method is source apportionment (SA) based on the Particulate Source Apportionment Technology (PSAT) available in the Comprehensive Air Quality Model with extensions (CAMx), driven by meteorological predictions of the Weather Research and Forecasting (WRF) model. The second method is source sensitivity (SS) based on an adjoint integration technique (AIT) available in the GEOS-Chem model. The SA method attributes simulated PM2.5 concentrations to each emission group, while the SS method calculates their sensitivity to each emission group, accounting for the non-linear relationship between PM2.5 and its precursors. Despite their differences, the complementary nature of the two methods enables a complete analysis of source-receptor relationships to support emission control policies. Our objectives are to quantify the contributions of each emission group/area to PM2.5 in the receptor areas and to intercompare results from the two methods to gain a comprehensive understanding of the role of emission sources in PM2.5 formation. The results will be compared in terms of the magnitudes and rankings of SS or SA of emitted species and emission groups/areas. GEOS-Chem with AIT is applied over East Asia at a horizontal grid
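
The complementary nature of the two methods can be illustrated with a toy nonlinear response; the coefficients and the brute-force 100 % emission cut below are purely illustrative (the study itself uses PSAT in CAMx for apportionment and an adjoint in GEOS-Chem for sensitivity):

```python
def pm25(e_so2, e_nh3):
    """Toy PM2.5 response (arbitrary units); the cross term mimics a
    nonlinear precursor interaction."""
    return 2.0 * e_so2 + 1.0 * e_nh3 + 0.5 * e_so2 * e_nh3

base = pm25(10.0, 10.0)
# Brute-force "zero-out" contribution of each source
cut_so2 = base - pm25(0.0, 10.0)
cut_nh3 = base - pm25(10.0, 0.0)
print(base, cut_so2, cut_nh3)   # 80.0 70.0 60.0
print(cut_so2 + cut_nh3 > base) # True: with nonlinearity, zero-out
                                # contributions need not sum to the total
```

This overlap is exactly why SA (which forces contributions to sum to the simulated concentration) and SS (which measures the response to perturbations) can rank sources differently, and why comparing the two is informative.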

  5. Digital immunohistochemistry platform for the staining variation monitoring based on integration of image and statistical analyses with laboratory information system.

    Science.gov (United States)

    Laurinaviciene, Aida; Plancoulaine, Benoit; Baltrusaityte, Indra; Meskauskas, Raimundas; Besusparis, Justinas; Lesciute-Krilaviciene, Daiva; Raudeliunas, Darius; Iqbal, Yasir; Herlin, Paulette; Laurinavicius, Arvydas

    2014-01-01

    Digital immunohistochemistry (IHC) is one of the most promising applications brought by new-generation image analysis (IA). While conventional IHC staining quality is monitored by semi-quantitative visual evaluation of tissue controls, IA may require more sensitive measurement. We designed an automated system to digitally monitor IHC multi-tissue controls, based on SQL-level integration of the laboratory information system (LIS) with image and statistical analysis tools. Consecutive sections of a TMA containing 10 cores of breast cancer tissue were used as tissue controls in routine Ki67 IHC testing. The Ventana slide label barcode ID was sent to the LIS to register the serial section sequence. The slides were stained and scanned (Aperio ScanScope XT), and IA was performed by the Aperio/Leica Colocalization and Genie Classifier/Nuclear algorithms. SQL-based integration ensured automated statistical analysis of the IA data by the SAS Enterprise Guide project. Factor analysis and plot visualizations were performed to explore slide-to-slide variation of the Ki67 IHC staining results in the control tissue. Slide-to-slide intra-core IHC staining analysis revealed rather significant variation of the variables reflecting the sample size, while Brown and Blue Intensity were relatively stable. To further investigate this variation, the IA results from the 10 cores were aggregated to minimize tissue-related variance. Factor analysis revealed an association between the variables reflecting the sample size detected by IA and Blue Intensity. Since the main feature to be extracted from the tissue controls was staining intensity, we further explored the variation of the intensity variables in the individual cores. MeanBrownBlue Intensity ((Brown+Blue)/2) and DiffBrownBlue Intensity (Brown-Blue) were introduced to better contrast the absolute intensity and the colour balance variation in each core; relevant factor scores were extracted. Finally, tissue-related factors of IHC staining variance were
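
The two derived intensity variables named in the abstract are simple linear contrasts; a minimal sketch with hypothetical per-core intensity readings:

```python
def intensity_contrasts(brown, blue):
    """MeanBrownBlue = (Brown + Blue)/2 tracks absolute staining intensity;
    DiffBrownBlue = Brown - Blue tracks the colour balance."""
    return (brown + blue) / 2.0, brown - blue

# Hypothetical mean optical intensities for one TMA core
print(intensity_contrasts(120.0, 80.0))  # (100.0, 40.0)
```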

  6. Preliminary assessment of late quaternary vegetation and climate of southeastern Utah based on analyses of packrat middens

    International Nuclear Information System (INIS)

    Betancourt, J.L.; Biggar, N.

    1985-06-01

    Packrat midden sequences from two caves (elevations 1585 and 2195 m; 5200 and 7200 ft) southwest of the Abajo Mountains in southeast Utah record vegetation changes that are attributed to climatic changes occurring during the last 13,000 years. These data are useful in assessing potential future climates at proposed nuclear waste sites in the area. Paleoclimates are reconstructed by defining modern elevational analogs for the vegetation assemblages identified in the middens. Based on the midden record, a climate most extreme from the present occurred prior to approximately 10,000 years before present (BP), when mean annual temperature was probably 3 to 4 °C (5.5 to 7 °F) cooler than present. However, cooling could not have exceeded 5 °C (9 °F) at 1585 m (5200 ft). Accompanying mean annual precipitation is estimated to have been from 35 to 140% greater than at present, with rainfall concentrated in the winter months. Vegetational changes beginning approximately 10,000 years BP are attributed to increased summer and mean annual temperatures, a decreasing frequency of spring freezes, and a shift from winter- to summer-dominant rainfall. Greater effective moisture than present is inferred at both cave sites from approximately 8000 to 4000 years BP. Modern flora was present at both sites by about 2000 years BP.

  7. NMR-based metabonomic analyses of the effects of ultrasmall superparamagnetic particles of iron oxide (USPIO) on macrophage metabolism

    Science.gov (United States)

    Feng, Jianghua; Zhao, Jing; Hao, Fuhua; Chen, Chang; Bhakoo, Kishore; Tang, Huiru

    2011-05-01

    The metabonomic changes in murine RAW264.7 macrophage-like cell line induced by ultrasmall superparamagnetic particles of iron oxides (USPIO) have been investigated, by analyzing both the cells and culture media, using high-resolution NMR in conjunction with multivariate statistical methods. Upon treatment with USPIO, macrophage cells showed a significant decrease in the levels of triglycerides, essential amino acids such as valine, isoleucine, and choline metabolites together with an increase of glycerophospholipids, tyrosine, phenylalanine, lysine, glycine, and glutamate. Such cellular responses to USPIO were also detectable in compositional changes of cell media, showing an obvious depletion of the primary nutrition molecules, such as glucose and amino acids and the production of end-products of glycolysis, such as pyruvate, acetate, and lactate and intermediates of TCA cycle such as succinate and citrate. At 48 h treatment, there was a differential response to incubation with USPIO in both cell metabonome and medium components, indicating that USPIO are phagocytosed and released by macrophages. Furthermore, information on cell membrane modification can be derived from the changes in choline-like metabolites. These results not only suggest that NMR-based metabonomic methods have sufficient sensitivity to identify the metabolic consequences of murine RAW264.7 macrophage-like cell line response to USPIO in vitro, but also provide useful information on the effects of USPIO on cellular metabolism.

  8. Molecular phylogeography of the brown bear (Ursus arctos) in Northeastern Asia based on analyses of complete mitochondrial DNA sequences.

    Science.gov (United States)

    Hirata, Daisuke; Mano, Tsutomu; Abramov, Alexei V; Baryshnikov, Gennady F; Kosintsev, Pavel A; Vorobiev, Alexandr A; Raichev, Evgeny G; Tsunoda, Hiroshi; Kaneko, Yayoi; Murata, Koichi; Fukui, Daisuke; Masuda, Ryuichi

    2013-07-01

    To further elucidate the migration history of the brown bears (Ursus arctos) on Hokkaido Island, Japan, we analyzed the complete mitochondrial DNA (mtDNA) sequences of 35 brown bears from Hokkaido, the southern Kuril Islands (Etorofu and Kunashiri), Sakhalin Island, and the Eurasian Continent (continental Russia, Bulgaria, and Tibet), and those of four polar bears. Based on these sequences, we reconstructed the maternal phylogeny of the brown bear and estimated divergence times to investigate the timing of brown bear migrations, especially in northeastern Eurasia. Our gene tree showed the mtDNA haplotypes of all 73 brown and polar bears to be divided into eight divergent lineages. The brown bear on Hokkaido was divided into three lineages (central, eastern, and southern). The Sakhalin brown bear grouped with eastern European and western Alaskan brown bears. Etorofu and Kunashiri brown bears were closely related to eastern Hokkaido brown bears and could have diverged from the eastern Hokkaido lineage after formation of the channel between Hokkaido and the southern Kuril Islands. Tibetan brown bears diverged early in the eastern lineage. Southern Hokkaido brown bears were closely related to North American brown bears.

  9. Considerations on the age of the Bambui Group (MG, Brazil) based on isotopic analyses of Sr and Pb

    International Nuclear Information System (INIS)

    Couto, J.G.P.; Cordani, U.G.; Kawashita, K.

    1981-01-01

    Based on radiometric ages, the Bambui Group deposition time is related to the end of the Precambrian. However, the ages determined and released through scientific journals are not in agreement (600-1350 m.y.), and many doubts about the geochronological picture of this important lithostratigraphic unit remained for a long time. As a result of the work developed by Metamig, CPGeo (IG-USP) and IPEN (SP), Rb/Sr and Pb/Pb isotopic determinations were done on 31 rock samples and 17 galenas collected from the Bambui Basin in Minas Gerais State. The Rb/Sr ages of 590 m.y. for the Pirapora Formation, 620 m.y. for the Tres Marias Formation, and 640 m.y. for the Paraopeba Formation situated in the stable area are linked to sedimentation processes. In the Paracatu region the age of 680 m.y. found for the Paraopeba Formation is related to metamorphic events. The lead isotopic ratios from the galenas suggest an isotopic evolution in two stages. The first ended with the lead separation from the mantle and its incorporation into the crust during events of the Transamazonic Cycle. The second ended when the lead was incorporated into the galenas and seems to be related to one or more events of the Brazilian Cycle. (Author) [pt
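
The Rb/Sr ages quoted rest on the standard isochron relation slope = e^(λt) − 1. A hedged sketch; the decay constant shown is the conventional value of that era, and the slope is hypothetical:

```python
import math

LAMBDA_RB87 = 1.42e-11  # 87Rb decay constant per year (older conventional value)

def isochron_age(slope):
    """Rb/Sr isochron age in years from the 87Sr/86Sr vs 87Rb/86Sr slope:
    slope = exp(lambda * t) - 1, so t = ln(1 + slope) / lambda."""
    return math.log(1.0 + slope) / LAMBDA_RB87

# Hypothetical slope consistent with an age of roughly 600 m.y.
print(round(isochron_age(0.00856) / 1e6))  # 600
```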

  10. Analysing the future of Broad-Based Black Economic Empowerment through the lens of small and medium enterprises

    Directory of Open Access Journals (Sweden)

    Angela Pike

    2018-06-01

    Full Text Available Orientation: The current Broad-Based Black Economic Empowerment (BBBEE) legislation imposes direct obstacles on small and medium enterprises (SMEs) in South Africa (SA). Thus, the perceptions of SMEs on the future of BBBEE elucidate the effect of the legislation on the economy and its operating industries. Research purpose: The study aimed to understand the future of BBBEE and its effect on the SA economy and operating industries through the perceptions of SMEs. Motivation for the study: The study’s objective provided new insights and a profound understanding of BBBEE and its influence on the economy and operating industries. Research design, approach and method: The research followed a qualitative discipline with the use of semi-structured interviews to collect the empirical data. The study consisted of 22 participants, with one participant excluded because of identified omissions. Main findings: The findings showed that BBBEE was promoting tender corruption and economic strain. Thus, the participants emphasised a restructured BBBEE model for the future. Practical and managerial implications: The findings invite policymakers to restructure the current BBBEE legislation so that it promotes equality. Furthermore, SMEs can relate to the industry effects and implement strategies to manage such effects on their businesses. Contribution or value-add: The findings contribute new research insights into the future of BBBEE.

  11. Modelling software failures of digital I and C in probabilistic safety analyses based on the TELEPERM® XS operating experience

    International Nuclear Information System (INIS)

    Jockenhoevel-Barttfeld, Mariana; Taurines, Andre; Baeckstroem, Ola; Holmberg, Jan-Erik; Porthin, Markus; Tyrvaeinen, Tero

    2015-01-01

    Digital instrumentation and control (I and C) systems appear as upgrades in existing nuclear power plants (NPPs) and in new plant designs. In order to assess the impact of digital system failures, quantifiable reliability models are needed, along with data for digital systems that are compatible with existing probabilistic safety assessments (PSA). The paper focuses on the modelling of software failures of digital I and C systems in probabilistic assessments. An analysis of software faults, failures and effects is presented to derive relevant failure modes of system and application software for the PSA. The estimations of software failure probabilities are based on an analysis of the operating experience of TELEPERM® XS (TXS). For the assessment of application software failures, the analysis combines the TXS operating experience at the application-function level with conservative engineering judgments. Failure probabilities of failure to actuate on demand and of spurious actuation are estimated for a typical reactor protection application. Moreover, the paper gives guidelines for the modelling of software failures in the PSA. The strategy presented in this paper is generic and can be applied to different software platforms and their applications.
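
The abstract does not state its estimator, but failure-on-demand probabilities from sparse operating experience are commonly obtained as a Bayesian posterior mean with a Jeffreys prior; the sketch below assumes that standard approach, and the demand count is hypothetical:

```python
def demand_failure_probability(failures, demands):
    """Posterior mean of the failure-on-demand probability under a
    Jeffreys Beta(0.5, 0.5) prior: (k + 0.5) / (n + 1)."""
    return (failures + 0.5) / (demands + 1.0)

# Hypothetical: no software failures observed in 10,000 recorded demands
print(f"{demand_failure_probability(0, 10_000):.1e}")  # 5.0e-05
```

The Jeffreys prior keeps the estimate non-zero even with no observed failures, which is why it is favoured for rare-event reliability data.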

  12. NMR-based metabonomic analyses of the effects of ultrasmall superparamagnetic particles of iron oxide (USPIO) on macrophage metabolism

    International Nuclear Information System (INIS)

    Feng Jianghua; Zhao Jing; Hao Fuhua; Chen Chang; Bhakoo, Kishore; Tang, Huiru

    2011-01-01

    The metabonomic changes in murine RAW264.7 macrophage-like cell line induced by ultrasmall superparamagnetic particles of iron oxides (USPIO) have been investigated, by analyzing both the cells and culture media, using high-resolution NMR in conjunction with multivariate statistical methods. Upon treatment with USPIO, macrophage cells showed a significant decrease in the levels of triglycerides, essential amino acids such as valine, isoleucine, and choline metabolites together with an increase of glycerophospholipids, tyrosine, phenylalanine, lysine, glycine, and glutamate. Such cellular responses to USPIO were also detectable in compositional changes of cell media, showing an obvious depletion of the primary nutrition molecules, such as glucose and amino acids and the production of end-products of glycolysis, such as pyruvate, acetate, and lactate and intermediates of TCA cycle such as succinate and citrate. At 48 h treatment, there was a differential response to incubation with USPIO in both cell metabonome and medium components, indicating that USPIO are phagocytosed and released by macrophages. Furthermore, information on cell membrane modification can be derived from the changes in choline-like metabolites. These results not only suggest that NMR-based metabonomic methods have sufficient sensitivity to identify the metabolic consequences of murine RAW264.7 macrophage-like cell line response to USPIO in vitro, but also provide useful information on the effects of USPIO on cellular metabolism.

  13. Vibrational algorithms for quantitative crystallographic analyses of hydroxyapatite-based biomaterials: II, application to decayed human teeth.

    Science.gov (United States)

    Adachi, Tetsuya; Pezzotti, Giuseppe; Yamamoto, Toshiro; Ichioka, Hiroaki; Boffelli, Marco; Zhu, Wenliang; Kanamura, Narisato

    2015-05-01

    A systematic investigation, based on highly spectrally resolved Raman spectroscopy, was undertaken to research the efficacy of vibrational assessments in locating chemical and crystallographic fingerprints for the characterization of dental caries and the early detection of non-cavitated carious lesions. Raman results published by other authors have indicated possible approaches for this method. However, they conspicuously lacked physical insight at the molecular scale and, thus, the rigor necessary to prove the efficacy of this spectroscopy method. After solving basic physical challenges in a companion paper, we apply them here in the form of newly developed Raman algorithms for practical dental research. Relevant differences in mineral crystallite (average) orientation and texture distribution were revealed for diseased enamel at different stages compared with healthy mineralized enamel. Clear spectroscopy features could be directly translated in terms of a rigorous and quantitative classification of crystallography and chemical characteristics of diseased enamel structures. The Raman procedure enabled us to trace back otherwise invisible characteristics in early caries, in the translucent zone (i.e., the advancing front of the disease) and in the body of lesion of cavitated caries.

  14. Integration of HPLC-based fingerprint and quantitative analyses for differentiating botanical species and geographical growing origins of Rhizoma coptidis.

    Science.gov (United States)

    Lv, Xiumei; Li, Yan; Tang, Ce; Zhang, Yi; Zhang, Jing; Fan, Gang

    2016-12-01

    Rhizoma coptidis is a broadly used traditional Chinese medicine (TCM). The investigation of the influence of species and geographical origins on the phytochemicals of R. coptidis is crucial for its reasonable application and quality control. An effective method was developed to systematically study the phytochemical variations of the rhizomes of three Coptis species (Ranunculaceae) (Coptis chinensis Franch., Coptis deltoidea C.Y. Cheng et Hsiao and Coptis teeta Wall.) and of one species (i.e., C. chinensis) obtained from both Daodi and non-Daodi production regions. The three species had significant differences in their phytochemicals. The rhizome of C. chinensis contained more epiberberine (13.52 ± 2.65 mg/g), palmatine (18.20 ± 2.89 mg/g), coptisine (23.32 ± 4.27 mg/g) and columbamine (4.89 ± 1.16 mg/g), whereas the rhizomes of C. deltoidea and C. teeta showed the highest levels of jatrorrhizine (8.52 ± 1.36 mg/g) and berberine (81.06 ± 4.83 mg/g), respectively. Moreover, the rhizome of C. chinensis from three Daodi production regions (Shizhu, Lichuan and Emeishan) contained more alkaloids than those from three non-Daodi production regions (Mianyang, Shifang and Zhenping). It is necessary to use the three R. coptidis species differentially in TCM clinical practice. Daodi C. chinensis medicinal materials have better quality than most non-Daodi ones, and so they should be preferred for TCM prescription. The combination of HPLC-based fingerprint analysis and quantification of multiple ingredients with statistical analysis provided an effective approach for species discrimination and quality evaluation of R. coptidis.

  15. Characterizing spatial heterogeneity based on the b-value and fractal analyses of the 2015 Nepal earthquake sequence

    Science.gov (United States)

    Nampally, Subhadra; Padhy, Simanchal; Dimri, Vijay P.

    2018-01-01

    The nature of the spatial distribution of heterogeneities in the source area of the 2015 Nepal earthquake is characterized based on the seismic b-value and fractal analysis of its aftershocks. The earthquake size distribution of the aftershocks gives a b-value of 1.11 ± 0.08, possibly representing the highly heterogeneous and low-stress state of the region. The aftershocks exhibit a fractal structure characterized by a spectrum of generalized dimensions Dq varying from D2 = 1.66 to D22 = 0.11. The existence of a fractal structure suggests that the spatial distribution of aftershocks is not a random phenomenon but self-organizes into a critical state, exhibiting a scale-independent structure governed by power-law scaling, where a small perturbation in stress is sufficient to trigger aftershocks. To estimate the bias in fractal dimensions resulting from finite data size, we compared the multifractal spectra of the real data and of random simulations. On comparison, we found that the lower limit of bias in D2 is 0.44. The similarity of their multifractal spectra suggests a lack of long-range correlation in the data, which is only weakly multifractal, or monofractal with a single correlation dimension D2. The minimum number of events required for a multifractal analysis with an acceptable error is discussed. We also tested for a possible correlation between changes in D2 and the energy released during the earthquakes. The values of D2 rise during the two largest earthquakes (M > 7.0) in the sequence. The b- and D2-values are related by D2 = 1.45b, which corresponds to intermediate to large earthquakes. Our results provide useful constraints on the spatial distribution of b- and D2-values, which are useful for seismic hazard assessment in the aftershock area of a large earthquake.
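    The b-value estimate and the reported D2 = 1.45b relation lend themselves to a quick numerical sketch. The catalogue below is hypothetical, and the estimator is the standard Aki maximum-likelihood formula (magnitude-bin correction omitted for brevity); only the 1.45 factor comes from this record.

```python
import math

def b_value(magnitudes, mc):
    """Aki (1965) maximum-likelihood b-value for events at or above the
    completeness magnitude mc (magnitude-bin correction omitted)."""
    mags = [m for m in magnitudes if m >= mc]
    mean_mag = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_mag - mc)

# Toy aftershock catalogue (hypothetical magnitudes, complete above Mc = 4.0).
catalogue = [4.0, 4.1, 4.3, 4.2, 4.6, 5.0, 4.4, 4.8, 4.1, 4.5]
b = b_value(catalogue, mc=4.0)

# Empirical relation quoted in the abstract for intermediate to large events.
d2 = 1.45 * b
```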

  16. Porosity and permeability determination of organic-rich Posidonia shales based on 3-D analyses by FIB-SEM microscopy

    Science.gov (United States)

    Grathoff, Georg H.; Peltz, Markus; Enzmann, Frieder; Kaufhold, Stephan

    2016-07-01

    The goal of this study is to better understand porosity and permeability in shales in order to improve the modelling of fluid and gas flow related to shale diagenesis. Two samples (WIC and HAD) were investigated, both mid-Jurassic organic-rich Posidonia shales from the Hils area, central Germany, of different maturity (WIC R0 0.53 % and HAD R0 1.45 %). Images were collected by focused ion beam (FIB) milling coupled with scanning electron microscopy (SEM). Avizo and GeoDict were used for image and data analysis. Porosity was calculated from segmented 3-D FIB-based images, and permeability was simulated with a Navier-Stokes-Brinkman solver on the segmented images. Results show that the quantity and distribution of pore clusters and pores (≥ 40 nm) are similar. The largest pores are located within carbonates and clay minerals, whereas the smallest pores are within the matured organic matter. The orientation of the pores, calculated as pore paths, showed minor directional differences between the samples. Neither sample shows continuous connectivity of pore clusters along the x, y and z axes on the scale of 10 to 20 micrometers, but both show connectivity on the micrometer scale. The volume of organic matter in the studied volume is representative of the total organic carbon (TOC) in the samples. The organic matter does show axis connectivity in the x, y and z directions. With increasing maturity the porosity in organic matter increases from close to 0 to more than 5 %. These pores are small and, in the large organic particles, have little connection to the mineral matrix. Continuous pore size distributions are compared with mercury intrusion porosimetry (MIP) data. Differences between the two methods are caused by the resolution limits of the FIB-SEM and by the development of small pores during the maturation of the organic matter. Calculations show no permeability when only visible pores are considered, due to the lack of axis connectivity. Adding the organic matter with a
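    The two image-based quantities at the core of this record, porosity of a segmented volume and axis connectivity of the pore network, can be sketched on a toy voxel grid. This is an illustrative stand-in for the Avizo/GeoDict workflow, not the authors' code; the tiny grid and the 6-connectivity rule are assumptions.

```python
from collections import deque

def porosity(vox):
    """Pore-voxel fraction of a segmented 3-D image (True = pore)."""
    flat = [v for plane in vox for row in plane for v in row]
    return sum(flat) / len(flat)

def spans_x(vox):
    """True if a 6-connected pore cluster touches both x-faces,
    i.e. the pore network percolates along the x axis."""
    nx, ny, nz = len(vox), len(vox[0]), len(vox[0][0])
    # Breadth-first search seeded from all pore voxels on the x = 0 face.
    queue = deque((0, j, k) for j in range(ny) for k in range(nz) if vox[0][j][k])
    seen = set(queue)
    while queue:
        i, j, k = queue.popleft()
        if i == nx - 1:
            return True
        for di, dj, dk in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            n = (i + di, j + dj, k + dk)
            if (0 <= n[0] < nx and 0 <= n[1] < ny and 0 <= n[2] < nz
                    and n not in seen and vox[n[0]][n[1]][n[2]]):
                seen.add(n)
                queue.append(n)
    return False

# Tiny synthetic volume: a straight pore channel running along x.
grid = [[[j == 1 and k == 1 for k in range(3)]
         for j in range(3)] for i in range(3)]
```

On this grid the porosity is 3 pore voxels out of 27, and the channel percolates along x.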

  17. Analysing the Hydraulic Actuator-based Knee Unit Kinematics and Correlating the Numerical Results and Walking Human Knee Joint Behavior

    Directory of Open Access Journals (Sweden)

    K. A. Trukhanov

    2014-01-01

    Full Text Available State-of-the-art machinery enables people who have lost a lower limb to continue their previous life despite the loss. International companies working in this area pursue the minimization of the problems amputation causes, and research aimed at an optimal design of the artificial knee joint is under way. The task of this work was to derive analytical relationships for the changing kinematic parameters of human walking on a flat surface, such as the angle of the knee joint and the knee moment, to determine the reduced load on the knee actuator (A), and to compare the obtained results with experimental data. As the actuator in the created design, the article proposes a controlled shock absorber based on a hydraulic cylinder. The knee unit is a kinematic two-link mechanism: one link performs rotational motion, and the other moves rotationally and translationally to provide the rotation of the first. When studying the dynamics of the hydraulic actuator, the coordinate x (or ρ) of the piston position is chosen as the generalized coordinate, while in the study of link movements the angle β is preferable. Experimental data for estimating the knee joint angle, speed, acceleration, torque and power in the knee joint were obtained for a human with a body weight of 57.6 kg walking on a flat surface, and are taken from the published works of foreign authors. A trigonometric approximation was used for fitting the experimental data. The resulting dependence of the reduced load on the stroke of the actuator is necessary to perform the synthesis of the actuator. The criterion for linear mechanisms mentioned in D.N. Popov's work is advisable as a possible optimization criterion for the actuator. The results obtained are as follows: 1. The kinematics of the linkage mechanism is described using relationships between its geometrical parameters, namely the cylinder piston stroke x (or ρ) and the link angle β. 2. The obtained polynomials of kinematic relationships allow a synthesis of

  18. Late Frasnian-Famennian climates based on palynomorph analyses and the question of the Late Devonian glaciations

    Science.gov (United States)

    Streel, Maurice; Caputo, Mário V.; Loboziak, Stanislas; Melo, José Henrique G.

    2000-11-01

    Palynomorph distribution in Euramerica and western Gondwana, from the latest Givetian to the latest Famennian, may be explained, to some extent, by climatic changes. Detailed miospore stratigraphy accurately dates the successive steps of these changes. The interpretation is built on three postulates, which are discussed: Euramerica lay at slightly lower latitudes than generally accepted in most paleomagnetic reconstructions; a conodont time scale is accepted as the most widely used subdivision of time; and Late Devonian sea-level fluctuations were mainly governed by glacio-eustasy. The Frasnian-Famennian time scale is also evaluated. The comparison, based on conodont correlations, between Givetian and most Frasnian miospore assemblages from northern and southern Euramerica, respectively, demonstrates a high taxonomic diversity in the equatorial belt and a marked difference between the supposed equatorial and (sub)tropical vegetations. In contrast, a similar vegetation pattern, and therefore probably compatible climatic conditions, extended from tropical to subpolar areas. A rather hot climate culminated during the latest Frasnian, when equatorial miospore assemblages reached their maximum width. Miospore diversity also shows a rather clear global Late Frasnian minimum, which is recorded again during the Early and Middle Famennian but only in low-latitude regions, while in high latitudes very cold climates without perennial snow may explain the scarcity of miospores, and hence of vegetation. The conspicuous Early and Middle Famennian latitudinal gradient of the vegetation seems to attenuate towards the Late and Latest Famennian, but this might above all be the result of the development of cosmopolitan coastal lowland vegetations (downstream swamps) depending more on moisture and equable local microclimates than on the probably adverse climates of distant hinterland areas. During that time, periods of cold climate without perennial snow cover and with rare vegetation may

  19. Comparative physical-chemical characterization of encapsulated lipid-based isotretinoin products assessed by particle size distribution and thermal behavior analyses

    Energy Technology Data Exchange (ETDEWEB)

    Guimaraes, Carla Aiolfi, E-mail: carlaaiolfi@usp.br [Department of Pharmacy, Faculty of Pharmaceutical Sciences, University of Sao Paulo, Sao Paulo, SP 05508-000 (Brazil); Menaa, Farid [Department of Dermatology, School of Medicine Wuerzburg, Wuerzburg 97080 (Germany); Fluorotronics, Inc., 1425 Russ Bvld, San Diego Technology Incubator, San Diego, CA 92101 (United States); Menaa, Bouzid, E-mail: bouzid.menaa@gmail.com [Fluorotronics, Inc., 1425 Russ Bvld, San Diego Technology Incubator, San Diego, CA 92101 (United States); Quenca-Guillen, Joyce S. [Department of Pharmacy, Faculty of Pharmaceutical Sciences, University of Sao Paulo, Sao Paulo, SP 05508-000 (Brazil); Matos, Jivaldo do Rosario [Department of Fundamental Chemistry, Institute of Chemistry, University of Sao Paulo, Sao Paulo, SP 05508-000 (Brazil); Mercuri, Lucildes Pita [Department of Exact and Earth Sciences, Federal University of Sao Paulo, Diadema, SP 09972-270 (Brazil); Braz, Andre Borges [Department of Engineering of Mines and Oil, Polytechnical School, University of Sao Paulo, SP 05508-900 (Brazil); Rossetti, Fabia Cristina [Department of Pharmaceutical Sciences, Faculty of Pharmaceutical Sciences of Ribeirao Preto, University of Sao Paulo, Ribeirao Preto, SP 14015-120 (Brazil); Kedor-Hackmann, Erika Rosa Maria; Santoro, Maria Ines Rocha Miritello [Department of Pharmacy, Faculty of Pharmaceutical Sciences, University of Sao Paulo, Sao Paulo, SP 05508-000 (Brazil)

    2010-06-10

    Isotretinoin is the drug of choice for the management of severe recalcitrant nodular acne. Nevertheless, some of its physical-chemical properties are still poorly known. Hence, the aim of our study was to comparatively evaluate the particle size distribution (PSD) and characterize the thermal behavior of the three encapsulated isotretinoin products in oil suspension (one reference and two generics) commercialized in Brazil. Here, we show that the PSD, estimated by laser diffraction and by polarized light microscopy, differed between the generics and the reference product. However, the thermal behavior of the three products, determined by thermogravimetry (TGA), differential thermal analysis (DTA) and differential scanning calorimetry (DSC), displayed no significant differences, and all three were more thermostable than the isotretinoin standard used as internal control. Thus, our study suggests that PSD analyses of isotretinoin lipid-based formulations should be routinely performed in order to improve their quality and bioavailability.
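    Laser-diffraction PSD results are commonly summarized by volume-weighted percentile diameters. A minimal sketch, assuming binned volume fractions; all bin edges and fractions below are hypothetical, not data from this study.

```python
def d_percentile(bin_edges, vol_frac, p):
    """Diameter below which a fraction p (0-1) of the particle volume
    lies, interpolated linearly within each size bin."""
    cum = 0.0
    for lo, hi, f in zip(bin_edges, bin_edges[1:], vol_frac):
        if cum + f >= p:
            return lo + (hi - lo) * (p - cum) / f
        cum += f
    return bin_edges[-1]

# Hypothetical laser-diffraction volume fractions per size bin (µm).
edges = [1, 2, 5, 10, 20, 50]
fracs = [0.05, 0.20, 0.35, 0.30, 0.10]
d10, d50, d90 = (d_percentile(edges, fracs, p) for p in (0.1, 0.5, 0.9))
```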

  20. Localisation of nursery areas based on comparative analyses of the horizontal and vertical distribution patterns of juvenile Baltic cod (Gadus morhua)

    DEFF Research Database (Denmark)

    Nielsen, J. Rasmus; Lundgren, Bo; Kristensen, Kasper

    2013-01-01

    Baltic cod are determined, and their nursery areas are localised according to the environmental factors affecting them. Comparative statistical analyses of biological, hydrographic and hydroacoustic data are carried out based on standard ICES demersal trawl surveys and special integrated trawl and acoustic research surveys. Horizontal distribution maps for the 2001–2010 cohorts of juvenile cod are further generated by applying a statistical log-Gaussian Cox process model to the standard trawl survey data. The analyses indicate size-dependent horizontal and distinct vertical and diurnal distribution in deep-sea localities down to 100 m depth and at oxygen concentrations between 2 and 4 ml O2 l−1. The vertical, diurnally stratified and repeated trawling and hydroacoustic target strength-depth distributions obtained from the special surveys show juvenile cod concentrations in frontal zone water layers

  1. Implementation of analyses based on social media data for marketing purposes in academic and scientific organizations in practice – opportunities and limitations

    Directory of Open Access Journals (Sweden)

    Magdalena Grabarczyk-Tokaj

    2013-12-01

    Full Text Available The article focuses on the practical use of analyses based on data collected in social media for institutions' communication and marketing purposes. The subject is discussed from the perspective of Digital Darwinism: a situation in which technologies and new means of communication develop significantly faster than the knowledge and digital skills of the organizations eager to implement them. To diminish the negative consequences of Digital Darwinism, institutions can broaden their knowledge with analyses of data from cyberspace to optimize operations, and can engage in ongoing dialogue and cooperation with prosumers to face dynamic changes in trends, technologies and society. Information acquired from user-generated content in social media can be employed as a guideline in planning, running and evaluating communication and marketing activities. The article presents examples of tools and solutions that can be implemented in practice to support actions taken by institutions.

  2. Laser Beam Focus Analyser

    DEFF Research Database (Denmark)

    Nielsen, Peter Carøe; Hansen, Hans Nørgaard; Olsen, Flemming Ove

    2007-01-01

    The quantitative and qualitative description of laser beam characteristics is important for process implementation and optimisation. In particular, a need for quantitative characterisation of beam diameter was identified when using fibre lasers for micro manufacturing. Here the beam diameter limits the obtainable features in direct laser machining as well as heat-affected zones in welding processes. This paper describes the development of a measuring unit capable of analysing the beam shape and diameter of lasers to be used in manufacturing processes. The analyser is based on the principle of a rotating mechanical wire being swept through the laser beam at varying Z-heights. The reflected signal is analysed and the resulting beam profile determined. The development comprised the design of a flexible fixture capable of providing both rotation and Z-axis movement, and control software including data capture

  3. PALEO-CHANNELS OF SINGKAWANG WATERS WEST KALIMANTAN AND ITS RELATION TO THE OCCURRENCES OF SUB-SEABOTTOM GOLD PLACERS BASED ON STRATA BOX SEISMIC RECORD ANALYSES

    Directory of Open Access Journals (Sweden)

    Hananto Kurnio

    2017-07-01

    Full Text Available Strata box seismic records were used to analyze sub-seabottom paleochannels in Singkawang Waters, West Kalimantan. Based on the analyses, the distribution and patterns of the paleochannels can be identified. Paleochannels in the northern part of the study area are interpreted as a continuation of Recent coastal rivers; in the southern part, the pattern radiates around the cone-shaped morphology of the islands, especially Kabung and Lemukutan Islands. The paleochannels of the study area belong to the northwest Sunda Shelf systems that terminate in the South China Sea. A sequence stratigraphy study was carried out to better understand the sedimentary sequences in the paleochannels; this study is also capable of identifying placer deposits within the channels. The area meets the criteria for gold placer occurrence: primary gold sources exist in the Sintang Intrusives, and intense chemical and physical weathering liberated gold grains from their source rocks. Gravity transport involving water media, together with stable bedrock and surface conditions, allows the offshore area of Singkawang to fulfill the requirements for gold placer accumulation. Chemical and physical weathering processes from the Oligocene to the Recent, spanning approximately 36 million years, may have produced gold placer accumulations on the seafloor. Based on grain size analyses, the study area consists of 43.4% sand, 54.3% silt and 2.3% clay. Petrographic examination of the sample shows about 0.2% gold grains.

  4. Mechanical analyses on the digital behaviour of the Tokay gecko (Gekko gecko) based on a multi-level directional adhesion model

    OpenAIRE

    Wu, Xuan; Wang, Xiaojie; Mei, Tao; Sun, Shaoming

    2015-01-01

    This paper proposes a multi-level hierarchical model for the adhesive system of the Tokay gecko (Gekko gecko) and analyses the digital behaviour of G. gecko at the macro/meso scale. The model describes the structures of G. gecko's adhesive system from the nano-level spatulae to the sub-millimetre-level lamella. The G. gecko seta is modelled as an inextensible fibril based on Euler's elastica theorem. Considering the side contact of the spatular pads of the seta on the flat and rigid subst...

  5. Boron analyses in the reactor coolant system of French PWR by acid-base titration ([B]) and ICP-MS (10B atomic %): key to NPP safety

    International Nuclear Information System (INIS)

    Jouvet, Fabien; Roux, Sylvie; Carabasse, Stephanie; Felgines, Didier

    2012-09-01

    Boron is widely used in nuclear power plants, especially in EDF pressurized water reactors, to control the neutron flux in the reactor coolant system and thereby the fission reaction. Boron analysis is thus a major safety factor, enabling operators to guarantee permanent control of the reactor. Two kinds of analyses of boron species carried out by EDF, recently upgraded to meet new method-validation standards and developed to enhance measurement quality by reducing uncertainties, are discussed in this paper: acid-base titration of boron, and boron isotopic composition by inductively coupled plasma mass spectrometry (ICP-MS). (authors)

  6. The CM SAF SSM/I-based total column water vapour climate data record: methods and evaluation against re-analyses and satellite

    Directory of Open Access Journals (Sweden)

    M. Schröder

    2013-03-01

    Full Text Available The European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT Satellite Application Facility on Climate Monitoring (CM SAF aims at the provision and sound validation of well documented Climate Data Records (CDRs in sustained and operational environments. In this study, a total column water vapour path (WVPA climatology from CM SAF is presented and inter-compared to water vapour data records from various data sources. Based on homogenised brightness temperatures from the Special Sensor Microwave Imager (SSM/I, a climatology of WVPA has been generated within the Hamburg Ocean–Atmosphere Fluxes and Parameters from Satellite (HOAPS framework. Within a research and operation transition activity the HOAPS data and operation capabilities have been successfully transferred to the CM SAF where the complete HOAPS data and processing schemes are hosted in an operational environment. An objective analysis for interpolation, namely kriging, has been applied to the swath-based WVPA retrievals from the HOAPS data set. The resulting climatology consists of daily and monthly mean fields of WVPA over the global ice-free ocean. The temporal coverage ranges from July 1987 to August 2006. After a comparison to the precursor product the CM SAF SSM/I-based climatology has been comprehensively compared to different types of meteorological analyses from the European Centre for Medium-Range Weather Forecasts (ECMWF-ERA40, ERA INTERIM and operational analyses and from the Japan Meteorological Agency (JMA–JRA. This inter-comparison shows an overall good agreement between the climatology and the analyses, with daily absolute biases generally smaller than 2 kg m−2. The absolute value of the bias to JRA and ERA INTERIM is typically smaller than 0.5 kg m−2. For the period 1991–2006, the root mean square error (RMSE for both reanalyses is approximately 2 kg m−2. As SSM/I WVPA and radiances are assimilated into JMA and all ECMWF analyses and
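    The bias and RMSE figures quoted above reduce to simple statistics over collocated grid-point values. A minimal sketch with hypothetical WVPA values; the function name and the numbers are illustrative, not part of the HOAPS/CM SAF processing chain.

```python
import math

def bias_and_rmse(product, reference):
    """Mean bias and RMSE of a satellite product against a reanalysis,
    both given as flat lists of collocated grid-point values (kg m-2)."""
    diffs = [p - r for p, r in zip(product, reference)]
    bias = sum(diffs) / len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return bias, rmse

# Hypothetical collocated daily WVPA values (kg m-2).
ssmi = [21.0, 34.5, 28.2, 40.1, 15.3]
era = [20.5, 35.0, 27.8, 41.0, 15.0]
bias, rmse = bias_and_rmse(ssmi, era)
```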

  7. Analyses of the influencing factors of soil microbial functional gene diversity in tropical rainforest based on GeoChip 5.0.

    Science.gov (United States)

    Cong, Jing; Liu, Xueduan; Lu, Hui; Xu, Han; Li, Yide; Deng, Ye; Li, Diqiang; Zhang, Yuguang

    2015-09-01

    To examine soil microbial functional gene diversity and its causative factors in tropical rainforests, we profiled soils with a microarray-based metagenomic tool, GeoChip 5.0. We found high microbial functional gene diversity and distinct soil microbial metabolic potential for biogeochemical processes in the tropical rainforest. Soil available nitrogen was the factor most strongly associated with soil microbial functional gene structure. Here, we mainly describe in detail the experimental design, the data processing, and the soil biogeochemical analyses attached to the study, which was published in BMC Microbiology in 2015; the raw data have been deposited in NCBI's Gene Expression Omnibus (accession number GSE69171).

  8. Phylogenetic analyses of Vitis (Vitaceae) based on complete chloroplast genome sequences: effects of taxon sampling and phylogenetic methods on resolving relationships among rosids.

    Science.gov (United States)

    Jansen, Robert K; Kaittanis, Charalambos; Saski, Christopher; Lee, Seung-Bum; Tomkins, Jeffrey; Alverson, Andrew J; Daniell, Henry

    2006-04-09

    The Vitaceae (grape) is an economically important family of angiosperms whose phylogenetic placement is currently unresolved. Recent phylogenetic analyses based on one to several genes have suggested several alternative placements of this family, including sister to Caryophyllales, asterids, Saxifragales, Dilleniaceae, or to the rest of the rosids, though support for these different results has been weak. There has been recent interest in using complete chloroplast genome sequences for resolving phylogenetic relationships among angiosperms. These studies have clarified relationships among several major lineages, but they have also emphasized the importance of taxon sampling and the effects of different phylogenetic methods for obtaining accurate phylogenies. We sequenced the complete chloroplast genome of Vitis vinifera and used these data to assess relationships among 27 angiosperms, including nine taxa of rosids. The Vitis vinifera chloroplast genome is 160,928 bp in length, including a pair of inverted repeats of 26,358 bp that are separated by small and large single-copy regions of 19,065 bp and 89,147 bp, respectively. The gene content and order of Vitis are identical to many other unrearranged angiosperm chloroplast genomes, including tobacco. Phylogenetic analyses using maximum parsimony and maximum likelihood were performed on DNA sequences of 61 protein-coding genes for two datasets with 28 or 29 taxa, including eight or nine taxa from four of the seven currently recognized major clades of rosids. Parsimony and likelihood phylogenies of both data sets provide strong support for the placement of Vitaceae as sister to the remaining rosids. However, the position of the Myrtales and support for the monophyly of the eurosid I clade differ between the two data sets and the two methods of analysis. In parsimony analyses, the inclusion of Gossypium is necessary to obtain trees that support the monophyly of the eurosid I clade. However, maximum likelihood analyses place
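    The reported region sizes can be checked against the total genome length, since a plastome has the quadripartite structure LSC + IRa + SSC + IRb; the numbers below are exactly those from the abstract.

```python
# Region sizes reported for the Vitis vinifera chloroplast genome (bp).
inverted_repeat = 26_358   # one copy of the IR; the genome carries two
small_single_copy = 19_065
large_single_copy = 89_147

# Quadripartite plastome structure: LSC + IRa + SSC + IRb.
total = large_single_copy + small_single_copy + 2 * inverted_repeat
print(total)  # 160928, matching the reported genome length of 160,928 bp
```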

  9. Phylogenetic analyses of Vitis (Vitaceae) based on complete chloroplast genome sequences: effects of taxon sampling and phylogenetic methods on resolving relationships among rosids

    Directory of Open Access Journals (Sweden)

    Alverson Andrew J

    2006-04-01

    Full Text Available Abstract Background The Vitaceae (grape) is an economically important family of angiosperms whose phylogenetic placement is currently unresolved. Recent phylogenetic analyses based on one to several genes have suggested several alternative placements of this family, including sister to Caryophyllales, asterids, Saxifragales, Dilleniaceae, or to the rest of the rosids, though support for these different results has been weak. There has been recent interest in using complete chloroplast genome sequences for resolving phylogenetic relationships among angiosperms. These studies have clarified relationships among several major lineages, but they have also emphasized the importance of taxon sampling and the effects of different phylogenetic methods for obtaining accurate phylogenies. We sequenced the complete chloroplast genome of Vitis vinifera and used these data to assess relationships among 27 angiosperms, including nine taxa of rosids. Results The Vitis vinifera chloroplast genome is 160,928 bp in length, including a pair of inverted repeats of 26,358 bp that are separated by small and large single-copy regions of 19,065 bp and 89,147 bp, respectively. The gene content and order of Vitis are identical to many other unrearranged angiosperm chloroplast genomes, including tobacco. Phylogenetic analyses using maximum parsimony and maximum likelihood were performed on DNA sequences of 61 protein-coding genes for two datasets with 28 or 29 taxa, including eight or nine taxa from four of the seven currently recognized major clades of rosids. Parsimony and likelihood phylogenies of both data sets provide strong support for the placement of Vitaceae as sister to the remaining rosids. However, the position of the Myrtales and support for the monophyly of the eurosid I clade differ between the two data sets and the two methods of analysis. In parsimony analyses, the inclusion of Gossypium is necessary to obtain trees that support the monophyly of the eurosid I clade

  10. SuperTRI: A new approach based on branch support analyses of multiple independent data sets for assessing reliability of phylogenetic inferences.

    Science.gov (United States)

    Ropiquet, Anne; Li, Blaise; Hassanin, Alexandre

    2009-09-01

    Supermatrix and supertree are two methods for constructing a phylogenetic tree from multiple data sets. However, these methods are not a panacea, as conflicting signals between data sets can lead to misinterpretation of the evolutionary history of taxa. In particular, the supermatrix approach is expected to be misleading if the species-tree signal is not dominant after combination of the data sets. Moreover, most current supertree methods suffer from two limitations: (i) they ignore or misinterpret secondary (non-dominant) phylogenetic signals of the different data sets; and (ii) the logical basis of node robustness measures is unclear. To overcome these limitations, we propose a new approach, called SuperTRI, which is based on branch support analyses of the independent data sets, and in which the reliability of the nodes is assessed using three measures: the supertree bootstrap percentage and two values calculated from the separate analyses, the mean branch support (mean bootstrap percentage or mean posterior probability) and the reproducibility index. The SuperTRI approach is tested on a data matrix including seven genes for 82 taxa of the family Bovidae (Mammalia, Ruminantia), and the results are compared to those found with the supermatrix approach. The phylogenetic analyses of the supermatrix and of the independent data sets were done using four methods of tree reconstruction: Bayesian inference, maximum likelihood, and unweighted and weighted maximum parsimony. The results indicate, firstly, that the SuperTRI approach is less sensitive to the choice of phylogenetic method; secondly, that it is more accurate in interpreting the relationships among taxa; and thirdly, that interesting conclusions on introgression and radiation can be drawn from the comparisons between the SuperTRI and supermatrix analyses.
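    The two per-node measures computed from the separate analyses can be sketched as follows. The exact SuperTRI definitions are in the paper; this toy version assumes the reproducibility index is the fraction of data sets recovering the node above a support threshold, and all numbers are hypothetical.

```python
def node_measures(supports, threshold=50.0):
    """Per-node summaries across independent data sets, in the spirit of
    SuperTRI (definitions simplified here): mean branch support over the
    analyses that recover the node, and a reproducibility index equal to
    the fraction of all data sets recovering it with support >= threshold.
    `supports` holds one bootstrap percentage per data set, or None when
    the node is absent from that analysis."""
    recovered = [s for s in supports if s is not None]
    mean_support = sum(recovered) / len(recovered) if recovered else 0.0
    repro = sum(1 for s in recovered if s >= threshold) / len(supports)
    return mean_support, repro

# Hypothetical bootstrap percentages for one clade across seven genes.
per_gene = [92.0, 88.0, None, 75.0, 60.0, None, 95.0]
mean_bp, ri = node_measures(per_gene)
```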

  11. Understanding ageing in older Australians: The contribution of the Dynamic Analyses to Optimise Ageing (DYNOPTA) project to the evidence base and policy

    Science.gov (United States)

    Anstey, Kaarin J; Bielak, Allison AM; Birrell, Carole L; Browning, Colette J; Burns, Richard A; Byles, Julie; Kiley, Kim M; Nepal, Binod; Ross, Lesley A; Steel, David; Windsor, Timothy D

    2014-01-01

    Aim: To describe the Dynamic Analyses to Optimise Ageing (DYNOPTA) project and illustrate its contributions to understanding ageing through innovative methodology and investigations of outcomes based on the project themes. DYNOPTA provides a platform and technical expertise that may be used to combine other national and international datasets. Method: The DYNOPTA project has pooled and harmonized data from nine Australian longitudinal studies to create the largest available longitudinal dataset (N = 50,652) on ageing in Australia. Results: A range of findings have resulted from the study to date, including methodological advances, prevalence rates of disease and disability, and mapped trajectories of ageing with and without increasing morbidity. DYNOPTA also forms the basis of a microsimulation model that will provide projections of the future costs of disease and disability for the baby boomer cohort. Conclusion: DYNOPTA contributes significantly to the Australian evidence base on ageing, informing key social and health policy domains. PMID:22032767

  12. The Influence of Study-Level Inference Models and Study Set Size on Coordinate-Based fMRI Meta-Analyses

    Directory of Open Access Journals (Sweden)

    Han Bossier

    2018-01-01

    Full Text Available Given the increasing number of neuroimaging studies, there is a growing need to summarize published results. Coordinate-based meta-analyses use the locations of statistically significant local maxima, possibly with the associated effect sizes, to aggregate studies. In this paper, we investigate the influence of key characteristics of a coordinate-based meta-analysis on (1) the balance between false and true positives and (2) the activation reliability of the outcome of a coordinate-based meta-analysis. More particularly, we consider the influence of the chosen group-level model at the study level [fixed effects, ordinary least squares (OLS), or mixed effects models], the type of coordinate-based meta-analysis [Activation Likelihood Estimation (ALE), which only uses peak locations, versus fixed effects and random effects meta-analysis, which take into account both peak location and height], and the number of studies included in the analysis (from 10 to 35). To do this, we apply a resampling scheme on a large dataset (N = 1,400) to create a test condition and compare this with an independent evaluation condition. The test condition corresponds to subsampling participants into studies and combining these using meta-analyses. The evaluation condition corresponds to a high-powered group analysis. We observe the best performance when using mixed effects models in individual studies combined with a random effects meta-analysis. Moreover, the performance increases with the number of studies included in the meta-analysis. When peak height is not taken into consideration, we show that the popular ALE procedure is a good alternative in terms of the balance between type I and II errors. However, it requires more studies compared to other procedures in terms of activation reliability. Finally, we discuss the differences, interpretations, and limitations of our results.
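    The random effects meta-analysis the authors favour pools study-level effects while estimating between-study variance. A minimal DerSimonian-Laird sketch with hypothetical effect sizes; the real analysis operates on peak locations and heights, not on this toy input.

```python
def random_effects_meta(effects, variances):
    """DerSimonian-Laird random-effects pooling of study effect sizes.
    Returns the pooled effect and the between-study variance tau^2."""
    k = len(effects)
    w = [1.0 / v for v in variances]
    sw = sum(w)
    # Fixed-effect mean and Cochran's Q heterogeneity statistic.
    fe = sum(wi * y for wi, y in zip(w, effects)) / sw
    q = sum(wi * (y - fe) ** 2 for wi, y in zip(w, effects))
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)
    # Re-weight with the between-study variance added to each study.
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_star, effects)) / sum(w_star)
    return pooled, tau2

# Hypothetical per-study effect sizes and within-study variances.
ys = [0.40, 0.55, 0.20, 0.70]
vs = [0.04, 0.05, 0.03, 0.06]
pooled, tau2 = random_effects_meta(ys, vs)
```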

  13. Building-related symptoms among U.S. office workers and risk factors for moisture and contamination: Preliminary analyses of U.S. EPA BASE Data

    Energy Technology Data Exchange (ETDEWEB)

    Mendell, Mark J.; Cozen, Myrna

    2002-09-01

    The authors assessed relationships between health symptoms in office workers and risk factors related to moisture and contamination, using data collected from a representative sample of U.S. office buildings in the U.S. EPA BASE study. Methods: Analyses assessed associations between three types of weekly, work-related symptoms (lower respiratory, mucous membrane, and neurologic) and risk factors for moisture or contamination in these office buildings. Multivariate logistic regression models were used to estimate the strength of associations for these risk factors as odds ratios (ORs) adjusted for personal-level potential confounding variables related to demographics, health, job, and workspace. A number of risk factors were significantly associated (i.e., 95% confidence limits excluded 1.0) with small to moderate increases in one or more symptom outcomes. Significantly elevated ORs for mucous membrane symptoms were associated with the following risk factors: presence of a humidification system in good condition versus none (OR = 1.4); air handler inspection annually versus daily (OR = 1.6); current water damage in the building (OR = 1.2); and less than daily vacuuming of the study space (OR = 1.2). Significantly elevated ORs for lower respiratory symptoms were associated with: air handler inspection annually versus daily (OR = 2.0); air handler inspection less than daily but at least semi-annually (OR = 1.6); less than daily cleaning of offices (OR = 1.7); and less than daily vacuuming of the study space (OR = 1.4). Only two statistically significant risk factors for neurologic symptoms were identified: presence of any humidification system versus none (OR = 1.3); and less than daily vacuuming of the study space (OR = 1.3). Dirty cooling coils, dirty or poorly draining drain pans, and standing water near outdoor air intakes, evaluated by inspection, were not identified as risk factors in these analyses, despite predictions based on previous findings elsewhere, except that very
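
    The adjusted ORs and the "confidence limits excluded 1.0" significance criterion above come from exponentiating logistic regression coefficients. A minimal sketch of that conversion; the beta and standard-error values shown are hypothetical, not taken from the BASE analyses.

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient (beta) and its standard
    error into an odds ratio with a 95% Wald confidence interval."""
    or_ = math.exp(beta)
    lo = math.exp(beta - z * se)
    hi = math.exp(beta + z * se)
    return or_, lo, hi

# A risk factor is flagged as statistically significant when the
# interval excludes 1.0 (hypothetical beta = 0.47, SE = 0.12):
or_, lo, hi = odds_ratio_ci(0.47, 0.12)
significant = lo > 1.0 or hi < 1.0
```

    Note the asymmetry of the interval on the OR scale: it is symmetric around beta on the log-odds scale, so after exponentiation the upper arm is wider than the lower one.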

  14. Material analyses of foam-based SiC FCI after dynamic testing in PbLi in MaPLE loop at UCLA

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, Maria, E-mail: maria.gonzalez@ciemat.es [LNF-CIEMAT, Avda Complutense, 40, 28040 Madrid (Spain); Rapisarda, David; Ibarra, Angel [LNF-CIEMAT, Avda Complutense, 40, 28040 Madrid (Spain); Courtessole, Cyril; Smolentsev, Sergey; Abdou, Mohamed [Fusion Science and Technology Center, UCLA (United States)

    2016-11-01

    Highlights: • Samples from a foam-based SiC FCI were analyzed by examining their SEM microstructure and elemental composition. • After dynamic experiments in flowing hot PbLi, liquid metal ingress was confirmed, caused by infiltration through local defects in the protective inner CVD layer. • No direct evidence of corrosion/erosion was observed; these defects could be related to the manufacturing process. - Abstract: Foam-based SiC flow channel inserts (FCIs) developed and manufactured by Ultramet, USA are currently under testing in flowing hot lead-lithium (PbLi) alloy in the MaPLE loop at UCLA to address chemical/physical compatibility and to assess the MHD pressure drop reduction. UCLA has finished the first experimental series, in which a single uninterrupted long-term (∼6500 h) test was performed on a 30-cm FCI segment in a magnetic field up to 1.8 T at a temperature of 300 °C and maximum flow velocities of ∼15 cm/s. After finishing the experiments, the FCI sample was extracted from the host stainless steel duct and cut into slices. A few of them have been analyzed at CIEMAT as part of the joint collaborative effort on the development of the DCLL blanket concept in the EU and the US. The initial inspection of the slices using optical microscopic analysis at UCLA showed significant PbLi ingress into the bulk FCI material, which resulted in degradation of the insulating properties of the FCI. Current material analyses at CIEMAT are based on advanced techniques, including characterization of FCI samples by FESEM to study PbLi ingress, imaging of cross sections, composition analysis by EDX, and crack inspection. These analyses suggest that the ingress was caused by local defects in the protective inner CVD layer that might have been originally present in the FCI or might have occurred during testing.

  15. A new internet-based tool for reporting and analysing patient-reported outcomes and the feasibility of repeated data collection from patients with myeloproliferative neoplasms.

    Science.gov (United States)

    Brochmann, Nana; Zwisler, Ann-Dorthe; Kjerholt, Mette; Flachs, Esben Meulengracht; Hasselbalch, Hans Carl; Andersen, Christen Lykkegaard

    2016-04-01

    An Internet-based tool for reporting and analysing patient-reported outcomes (PROs) has been developed. The tool enables merging PROs with blood test results and allows for computation of treatment responses. Data may be visualized by graphical analysis and may be exported for downstream statistical processing. The aim of this study was to investigate whether patients with myeloproliferative neoplasms (MPNs) were willing and able to use the tool and fill out questionnaires regularly. Participants were recruited from the outpatient clinic at the Department of Haematology, Roskilde University Hospital, Denmark. The validated questionnaires used were the European Organisation for Research and Treatment of Cancer Quality of Life Questionnaire-Core 30, the Myeloproliferative Neoplasm Symptom Assessment Form, the Brief Fatigue Inventory, and the Short Form 36 Health Survey. Questionnaires were filled out for ≥6 months, online or on paper according to participant preference. Regularity of questionnaire submission was investigated, and participant acceptance was evaluated by focus-group interviews. Of 135 invited patients, 118 (87%) accepted participation. One hundred and seven participants (91%) preferred to use the Internet-based tool. Of the 118 enrolled participants, 104 (88%) submitted PROs regularly for ≥6 months. The focus-group interviews revealed that the Internet-based tool was well accepted. The Internet-based approach and regular collection of PROs are well accepted, with a high participation rate, persistency, and adherence in a population of MPN patients. The plasticity of the platform allows for adaptation to patients with other medical conditions.

  16. Ground-Based VIS/NIR Reflectance Spectra of 25143 Itokawa: What Hayabusa will See and How Ground-Based Data can Augment Analyses

    Science.gov (United States)

    Vilas, Faith; Abell, P. A.; Jarvis, K. S.

    2004-01-01

    Planning for the arrival of the Hayabusa spacecraft at asteroid 25143 Itokawa includes consideration of the expected spectral information to be obtained using the AMICA and NIRS instruments. The rotationally-resolved spatial coverage of the asteroid that we have obtained with ground-based telescopic spectrophotometry in the visible and near-infrared can be utilized here to address expected spacecraft data. We use spectrophotometry to simulate the types of data that Hayabusa will receive with the NIRS and AMICA instruments, and will demonstrate them here. The NIRS will cover a wavelength range starting at 0.85 micrometers and have a dispersion per element of 250 Angstroms. Thus, we are limited in coverage of the 1.0 micrometer and 2.0 micrometer mafic silicate absorption features. The ground-based reflectance spectra of Itokawa show a large component of olivine in its surface material, and the 2.0 micrometer feature is shallow. Determining the olivine to pyroxene abundance ratio is critically dependent on the attributes of the 1.0 and 2.0 micrometer features. With a cut-off near 2.1 micrometers, the longer edge of the 2.0 micrometer feature will not be obtained by NIRS. Reflectance spectra obtained using ground-based telescopes can be used to determine the regional composition around space-based spectral observations, and possibly augment the longer wavelength spectral attributes. Similarly, the shorter wavelength end of the 1.0 micrometer absorption feature will be partially lost to the NIRS. The AMICA filters mimic the ECAS filters and have wavelength coverage overlapping with the NIRS spectral range. We demonstrate how merging photometry from AMICA will extend the spectral coverage of the NIRS. Lessons learned from earlier spacecraft missions to asteroids should also be considered.

  17. Development of SI-traceable C-peptide certified reference material NMIJ CRM 6901-a using isotope-dilution mass spectrometry-based amino acid analyses.

    Science.gov (United States)

    Kinumi, Tomoya; Goto, Mari; Eyama, Sakae; Kato, Megumi; Kasama, Takeshi; Takatsu, Akiko

    2012-07-01

    A certified reference material (CRM) is a higher-order calibration material used to enable a traceable analysis. This paper describes the development of a C-peptide CRM (NMIJ CRM 6901-a) by the National Metrology Institute of Japan using two independent methods for amino acid analysis based on isotope-dilution mass spectrometry. C-peptide is a 31-mer peptide that is used for the evaluation of β-cell function of the pancreas in clinical testing. This CRM is a lyophilized synthetic peptide having the human C-peptide sequence, and it contains deamidated and pyroglutamylated forms of C-peptide. By adding (1.00 ± 0.01) g of water to the vial containing the CRM, the C-peptide solution in 10 mM phosphate-buffered saline (pH 6.6) is reconstituted. We assigned two certified values that represent the concentrations of total C-peptide (the mixture of C-peptide, deamidated C-peptide, and pyroglutamylated C-peptide) and of C-peptide. The certified concentration of total C-peptide was determined by two amino acid analyses using pre-column derivatization liquid chromatography-mass spectrometry and hydrophilic chromatography-mass spectrometry following acid hydrolysis. The certified concentration of C-peptide was determined by multiplying the concentration of total C-peptide by the ratio of the relative area of C-peptide to that of total C-peptide measured by liquid chromatography. The certified value of C-peptide, (80.7 ± 5.0) mg/L, represents the concentration of the specific entity of C-peptide; on the other hand, the certified value of total C-peptide, (81.7 ± 5.1) mg/L, can be used for analyses that do not differentiate deamidated and pyroglutamylated C-peptide from C-peptide itself, such as amino acid analyses and immunochemical assays.
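
    The final value-assignment step described in the record is a single multiplication, which can be illustrated directly. The relative-area value below is a hypothetical figure inferred from the ratio of the two certified concentrations, not a number stated in the record.

```python
def certified_c_peptide(total_mg_per_l, relative_area):
    """Certified C-peptide concentration = certified total C-peptide
    concentration x relative LC peak area of intact C-peptide."""
    return total_mg_per_l * relative_area

# The certified total of 81.7 mg/L and the certified C-peptide value of
# 80.7 mg/L imply a relative area of roughly 80.7 / 81.7, about 0.988:
implied_area = 80.7 / 81.7
c_peptide = certified_c_peptide(81.7, implied_area)
```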

  18. Systematic review of model-based analyses reporting the cost-effectiveness and cost-utility of cardiovascular disease management programs.

    Science.gov (United States)

    Maru, Shoko; Byrnes, Joshua; Whitty, Jennifer A; Carrington, Melinda J; Stewart, Simon; Scuffham, Paul A

    2015-02-01

    The reported cost-effectiveness of cardiovascular disease management programs (CVD-MPs) is highly variable, potentially leading to different funding decisions. This systematic review evaluates published modeled analyses to compare study methods and quality. Articles were included if an incremental cost-effectiveness ratio (ICER) or cost-utility ratio (ICUR) was reported, the intervention was a multi-component intervention designed to manage or prevent a cardiovascular disease condition, and it addressed all domains specified in the American Heart Association Taxonomy for Disease Management. Nine articles (reporting 10 clinical outcomes) were included. Eight cost-utility and two cost-effectiveness analyses targeted hypertension (n=4), coronary heart disease (n=2), coronary heart disease plus stroke (n=1), heart failure (n=2), and hyperlipidemia (n=1). Study perspectives included the healthcare system (n=5), societal and fund holders (n=1), a third-party payer (n=3), or were not explicitly stated (n=1). All analyses were modeled based on interventions of one to two years' duration. Time horizons ranged from two years (n=1) and 10 years (n=1) to lifetime (n=8). Model structures included Markov models (n=8), 'decision analytic models' (n=1), or were not explicitly stated (n=1). Considerable variation was observed in clinical and economic assumptions and reporting practices. Of all ICERs/ICURs reported, including those of subgroups (n=16), four were above a US$50,000 acceptability threshold, six were below, and six were dominant. The majority of CVD-MPs were reported to have favorable economic outcomes, but 25% came at an unacceptably high cost for the outcomes achieved. Use of standardized reporting tools should increase transparency and help inform what drives the cost-effectiveness of CVD-MPs. © The European Society of Cardiology 2014.
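
    The three-way classification used above (above threshold, below threshold, dominant) follows from how an ICER is computed from incremental cost and incremental effectiveness. A generic illustration, assuming the US$50,000 threshold mentioned in the record; it is not the review's actual decision rule implementation.

```python
def icer_verdict(d_cost, d_effect, threshold=50_000.0):
    """Classify an intervention by its incremental cost-effectiveness
    ratio (ICER) against a willingness-to-pay threshold.

    d_cost:   incremental cost vs. comparator (US$)
    d_effect: incremental effectiveness (e.g., QALYs gained)
    """
    if d_effect > 0 and d_cost <= 0:
        return "dominant"        # cheaper and more effective; no ratio needed
    if d_effect <= 0 and d_cost >= 0:
        return "dominated"       # costlier and no more effective
    icer = d_cost / d_effect     # US$ per unit of effectiveness gained
    return "below threshold" if icer <= threshold else "above threshold"
```

    The dominance branches matter: an ICER computed from a negative cost difference is meaningless as a ratio, which is why dominant programs are reported as a separate category rather than with a numeric ICER.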

  19. Analyses of the influencing factors of soil microbial functional gene diversity in tropical rainforest based on GeoChip 5.0

    Directory of Open Access Journals (Sweden)

    Jing Cong

    2015-09-01

    Full Text Available To examine soil microbial functional gene diversity and its causative factors in tropical rainforests, we profiled soils with a microarray-based metagenomic tool named GeoChip 5.0. We found high microbial functional gene diversity and distinct soil microbial metabolic potential for biogeochemical processes in the tropical rainforest. Soil available nitrogen was the factor most strongly associated with soil microbial functional gene structure. Here, we mainly describe in detail the experimental design, the data processing, and the soil biogeochemical analyses attached to the study, which was published in BMC Microbiology in 2015; the raw data have been deposited in NCBI's Gene Expression Omnibus (accession number GSE69171).

  20. Effect of a novel motion correction algorithm (SSF) on the image quality of coronary CTA with intermediate heart rates: Segment-based and vessel-based analyses

    Energy Technology Data Exchange (ETDEWEB)

    Li, Qianwen, E-mail: qianwen18@126.com; Li, Pengyu, E-mail: lipyu818@gmail.com; Su, Zhuangzhi, E-mail: suzhuangzhi@xwh.ccmu.edu.cn; Yao, Xinyu, E-mail: 314985151@qq.com; Wang, Yan, E-mail: wy19851121@126.com; Wang, Chen, E-mail: fskwangchen@gmail.com; Du, Xiangying, E-mail: duxying_xw@163.com; Li, Kuncheng, E-mail: kuncheng.li@gmail.com

    2014-11-15

    Highlights: • SSF provided better image quality than single-sector and bi-sector reconstruction at intermediate heart rates (65–75 bpm). • Evidence for extending prospective ECG-triggered coronary CTA with SSF to an expanded heart rate range. • Information about the inconsistent effectiveness of SSF among the segments of the coronary artery. - Abstract: Purpose: To evaluate the effect of SnapShot Freeze (SSF) reconstruction at an intermediate heart-rate (HR) range (65–75 bpm) and compare this method with single-sector reconstruction and bi-sector reconstruction on segmental and vessel bases in retrospective coronary computed tomography angiography (CCTA). Materials and methods: Retrospective electrocardiogram-gated CCTA was performed on 37 consecutive patients with HRs between 65 and 75 bpm using a 64-row CT scanner. Retrospective single-sector reconstruction, bi-sector reconstruction, and SSF were performed for each patient. Multi-phase single-sector reconstruction was performed to select the optimal phase. SSF and bi-sector images were also reconstructed at the optimal phase. The images were interpreted in an intent-to-diagnose fashion by two experienced readers using a 5-point scale, with 3 points considered diagnostically acceptable. Image quality among the three reconstruction groups was compared on per-patient, per-vessel, and per-segment bases. Results: The average HR of the enrolled patients was 69.4 ± 2.7 bpm. A total of 111 vessels and 481 coronary segments were assessed. SSF provided significantly higher interpretability of the coronary segments than bi-sector reconstruction. The qualified and excellent rates of SSF (97.9% and 82.3%) were significantly higher than those of single-sector (92.9% and 66.3%) and bi-sector (90.9% and 64.7%) reconstructions. The image quality score (IQS) using SSF was also significantly higher than those of single-sector and bi-sector reconstructions both on per-patient and per-vessel bases. On per

  1. Differentiation of Toxocara canis and Toxocara cati based on PCR-RFLP analyses of rDNA-ITS and mitochondrial cox1 and nad1 regions.

    Science.gov (United States)

    Mikaeili, Fattaneh; Mathis, Alexander; Deplazes, Peter; Mirhendi, Hossein; Barazesh, Afshin; Ebrahimi, Sepideh; Kia, Eshrat Beigom

    2017-09-26

    The definitive genetic identification of Toxocara species is currently based on PCR/sequencing. The objective of the present study was to design and conduct an in silico polymerase chain reaction-restriction fragment length polymorphism (PCR-RFLP) method for identification of Toxocara species. In silico analyses using the DNASIS and NEBcutter software were performed with rDNA internal transcribed spacer (ITS), and mitochondrial cox1 and nad1 sequences obtained in our previous studies, along with relevant sequences deposited in GenBank. Consequently, RFLP profiles were designed, and all isolates of T. canis and T. cati collected from dogs and cats in different geographical areas of Iran were investigated with the RFLP method using some of the identified suitable enzymes. The in silico analyses predicted that, on the cox1 gene, only the MboII enzyme is appropriate for PCR-RFLP to reliably distinguish the two species. No suitable enzyme for PCR-RFLP on the nad1 gene was identified that yields the same pattern for all isolates of a species. DNASIS software showed that there are 241 suitable restriction enzymes for the differentiation of T. canis from T. cati based on ITS sequences. The RsaI, MvaI, and SalI enzymes were selected to evaluate the reliability of the in silico PCR-RFLP. The sizes of restriction fragments obtained by PCR-RFLP of all samples consistently matched the expected RFLP patterns. The ITS sequences are usually conserved, and the PCR-RFLP approach targeting the ITS sequence is recommended for the molecular differentiation of Toxocara species; it can provide a reliable tool for identification purposes, particularly at the larval and egg stages.
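
    At its core, an in silico RFLP prediction of the kind described reduces to locating an enzyme's recognition sites in a sequence and computing the resulting fragment lengths. A minimal sketch for a single enzyme, using RsaI (recognition site GTAC, cutting between T and A); the example sequence is invented for illustration, not a Toxocara ITS sequence.

```python
def digest(seq, site="GTAC", cut_offset=2):
    """Predict restriction fragment lengths for one enzyme by scanning
    a sequence for its recognition site.

    RsaI recognizes GTAC and cuts GT^AC, i.e. cut_offset=2 bases into
    the site. Returns fragment lengths in 5'-to-3' order.
    """
    seq = seq.upper()
    cuts, start = [], 0
    while True:
        i = seq.find(site, start)
        if i == -1:
            break
        cuts.append(i + cut_offset)   # absolute cut position
        start = i + 1                 # continue past this site
    begins = [0] + cuts
    ends = cuts + [len(seq)]
    return [e - b for b, e in zip(begins, ends)]
```

    Two species are distinguishable by an enzyme precisely when their sequences give different fragment-length lists; software such as NEBcutter automates this search across many enzymes at once.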

  2. High-resolution monitoring of marine protists based on an observation strategy integrating automated on-board filtration and molecular analyses

    Science.gov (United States)

    Metfies, Katja; Schroeder, Friedhelm; Hessel, Johanna; Wollschläger, Jochen; Micheller, Sebastian; Wolf, Christian; Kilias, Estelle; Sprong, Pim; Neuhaus, Stefan; Frickenhaus, Stephan; Petersen, Wilhelm

    2016-11-01

    Information on recent biomass distribution and biogeography of photosynthetic marine protists with adequate temporal and spatial resolution is urgently needed to better understand the consequences of environmental change for marine ecosystems. Here we introduce and review a molecular-based observation strategy for high-resolution assessment of these protists in space and time. It is the result of extensive technology developments, adaptations, and evaluations documented in a number of publications, together with the results of the recently completed field testing introduced in this paper. The observation strategy is organized at four different levels. At level 1, samples are collected at high spatiotemporal resolution using the remotely controlled automated filtration system AUTOFIM. Resulting samples can either be preserved for later laboratory analyses, or directly subjected to molecular surveillance of key species aboard the ship via an automated biosensor system or quantitative polymerase chain reaction (level 2). Preserved samples are analyzed at the next observational levels in the laboratory (levels 3 and 4). At level 3 this involves molecular fingerprinting methods for a quick and reliable overview of differences in protist community composition. Finally, selected samples can be used to generate a detailed analysis of taxonomic protist composition via the latest next generation sequencing technology (NGS) at level 4. An overall integrated dataset of the results based on the different analyses provides comprehensive information on the diversity and biogeography of protists, including all related size classes. At the same time, the cost of the observation is optimized with respect to analysis effort and time.

  3. Clinical Research That Matters: Designing Outcome-Based Research for Older Adults to Qualify for Systematic Reviews and Meta-Analyses.

    Science.gov (United States)

    Lee, Jeannie K; Fosnight, Susan M; Estus, Erica L; Evans, Paula J; Pho, Victoria B; Reidt, Shannon; Reist, Jeffrey C; Ruby, Christine M; Sibicky, Stephanie L; Wheeler, Janel B

    2018-01-01

    Though older adults are more sensitive to the effects of medications than their younger counterparts, they are often excluded from manufacturer-based clinical studies. Practice-based research is a practical method to identify medication-related effects in older patients. This research also highlights the role of a pharmacist in improving care in this population. A single study rarely has strong enough evidence to change geriatric practice, unless it is a large-scale, multisite, randomized controlled trial that specifically targets older adults. It is important to design studies that may be used in systematic reviews or meta-analyses that build a stronger evidence base. Recent literature has documented a gap in advanced pharmacist training pertaining to research skills. In this paper, we hope to fill some of the educational gaps related to research in older adults. We define best practices when deciding on the type of study, inclusion and exclusion criteria, design of the intervention, how outcomes are measured, and how results are reported. Well-designed studies increase the pool of available data to further document the important role that pharmacists have in optimizing care of older patients.

  4. A methodology for eliciting, representing, and analysing stakeholder knowledge for decision making on complex socio-ecological systems: from cognitive maps to agent-based models.

    Science.gov (United States)

    Elsawah, Sondoss; Guillaume, Joseph H A; Filatova, Tatiana; Rook, Josefine; Jakeman, Anthony J

    2015-03-15

    This paper aims to contribute to developing better ways for incorporating essential human elements in decision making processes for modelling of complex socio-ecological systems. It presents a step-wise methodology for integrating perceptions of stakeholders (qualitative) into formal simulation models (quantitative) with the ultimate goal of improving understanding and communication about decision making in complex socio-ecological systems. The methodology integrates cognitive mapping and agent-based modelling. It cascades through a sequence of qualitative/soft and numerical methods comprising: (1) interviews to elicit mental models; (2) cognitive maps to represent and analyse individual and group mental models; (3) time-sequence diagrams to chronologically structure the decision making process; (4) an all-encompassing conceptual model of decision making; and (5) a computational (in this case agent-based) model. We apply the proposed methodology (labelled ICTAM) in a case study of viticulture irrigation in South Australia. Finally, we use strengths-weaknesses-opportunities-threats (SWOT) analysis to reflect on the methodology. Results show that the methodology leverages the use of cognitive mapping to capture the richness of decision making and mental models, and provides a combination of divergent and convergent analysis methods leading to the construction of an agent-based model. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. The Japanese Society of Pathology Guidelines on the handling of pathological tissue samples for genomic research: Standard operating procedures based on empirical analyses.

    Science.gov (United States)

    Kanai, Yae; Nishihara, Hiroshi; Miyagi, Yohei; Tsuruyama, Tatsuhiro; Taguchi, Kenichi; Katoh, Hiroto; Takeuchi, Tomoyo; Gotoh, Masahiro; Kuramoto, Junko; Arai, Eri; Ojima, Hidenori; Shibuya, Ayako; Yoshida, Teruhiko; Akahane, Toshiaki; Kasajima, Rika; Morita, Kei-Ichi; Inazawa, Johji; Sasaki, Takeshi; Fukayama, Masashi; Oda, Yoshinao

    2018-02-01

    Genome research using appropriately collected pathological tissue samples is expected to yield breakthroughs in the development of biomarkers and identification of therapeutic targets for diseases such as cancers. In this connection, the Japanese Society of Pathology (JSP) has developed "The JSP Guidelines on the Handling of Pathological Tissue Samples for Genomic Research" based on an abundance of data from empirical analyses of tissue samples collected and stored under various conditions. Tissue samples should be collected from appropriate sites within surgically resected specimens, without disturbing the features on which pathological diagnosis is based, while avoiding bleeding or necrotic foci. They should be collected as soon as possible after resection: at the latest within about 3 h of storage at 4°C. Preferably, snap-frozen samples should be stored in liquid nitrogen (about -180°C) until use. When intending to use genomic DNA extracted from formalin-fixed paraffin-embedded tissue, 10% neutral buffered formalin should be used. Insufficient fixation and overfixation must both be avoided. We hope that pathologists, clinicians, clinical laboratory technicians and biobank operators will come to master the handling of pathological tissue samples based on the standard operating procedures in these Guidelines to yield results that will assist in the realization of genomic medicine. © 2018 The Authors. Pathology International published by Japanese Society of Pathology and John Wiley & Sons Australia, Ltd.

  6. Improving correlations between MODIS aerosol optical thickness and ground-based PM 2.5 observations through 3D spatial analyses

    Science.gov (United States)

    Hutchison, Keith D.; Faruqui, Shazia J.; Smith, Solar

    The Center for Space Research (CSR) continues to focus on developing methods to improve correlations between satellite-based aerosol optical thickness (AOT) values and ground-based air pollution observations made at continuous ambient monitoring sites (CAMS) operated by the Texas Commission on Environmental Quality (TCEQ). Strong correlations and improved understanding of the relationships between satellite and ground observations are needed to formulate reliable real-time predictions of air quality using data accessed from the moderate resolution imaging spectroradiometer (MODIS) at the CSR direct-broadcast ground station. In this paper, improvements in these correlations are demonstrated first as a result of the evolution in the MODIS retrieval algorithms. Further improvement is then shown using procedures that compensate for differences in horizontal spatial scales between the nominal 10-km MODIS AOT products and CAMS point measurements. Finally, airborne light detection and ranging (lidar) observations, collected during the Texas Air Quality Study of 2000, are used to examine aerosol profile concentrations, which may vary greatly between aerosol classes as a result of the sources, chemical composition, and meteorological conditions that govern transport processes. Further improvement in correlations is demonstrated with this limited dataset using insights into aerosol profile information inferred from the vertical motion vectors in a trajectory-based forecast model. Analyses are ongoing to verify these procedures on a variety of aerosol classes using data collected by the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) lidar.
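
    The satellite-to-ground correlations discussed above are, at their core, Pearson correlations over collocated AOT/PM2.5 pairs. A minimal sketch of that computation; the actual study's spatial averaging and collocation procedures are more involved, and the sample values below are invented.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired series,
    e.g. collocated satellite AOT retrievals and ground PM2.5 values."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical collocated pairs (AOT, PM2.5 in ug/m^3):
r = pearson_r([0.10, 0.25, 0.40, 0.55], [8.0, 15.0, 24.0, 31.0])
```

    Compensating for spatial-scale mismatch (e.g. averaging CAMS readings within each 10-km MODIS pixel) changes which pairs enter this computation, which is how the procedures described above raise the correlation without touching the formula itself.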

  7. Exploitation of FTA cartridges for the sampling, long-term storage, and DNA-based analyses of plant-parasitic nematodes.

    Science.gov (United States)

    Marek, Martin; Zouhar, Miloslav; Douda, Ondřej; Maňasová, Marie; Ryšánek, Pavel

    2014-03-01

    The use of DNA-based analyses in molecular plant nematology research has dramatically increased over recent decades. Therefore, the development and adaptation of simple, robust, and cost-effective DNA purification procedures are required to address these contemporary challenges. The solid-phase-based approach developed by Flinders Technology Associates (FTA) has been shown to be a powerful technology for the preparation of DNA from different biological materials, including blood, saliva, plant tissues, and various human and plant microbial pathogens. In this work, we demonstrate, for the first time, that this FTA-based technology is a valuable, low-cost, and time-saving approach for the sampling, long-term archiving, and molecular analysis of plant-parasitic nematodes. Despite the complex structure and anatomical organization of the multicellular bodies of nematodes, we report the successful and reliable DNA-based analysis of nematode high-copy and low-copy genes using the FTA technology. This was achieved by applying nematodes to the FTA cards either in the form of a suspension of individuals, as intact or pestle-crushed nematodes, or by the direct mechanical printing of nematode-infested plant tissues. We further demonstrate that the FTA method is also suitable for the so-called "one-nematode-assay", in which the target DNA is typically analyzed from a single individual nematode. More surprisingly, a time-course experiment showed that nematode DNA can be detected specifically in the FTA-captured samples many years after initial sampling occurs. Collectively, our data clearly demonstrate the applicability and the robustness of this FTA-based approach for molecular research and diagnostics concerning phytonematodes; this research includes economically important species such as the stem nematode (Ditylenchus dipsaci), the sugar beet nematode (Heterodera schachtii), and the Northern root-knot nematode (Meloidogyne hapla).

  8. Biomass feedstock analyses

    Energy Technology Data Exchange (ETDEWEB)

    Wilen, C.; Moilanen, A.; Kurkela, E. [VTT Energy, Espoo (Finland). Energy Production Technologies

    1996-12-31

    The overall objectives of the project 'Feasibility of electricity production from biomass by pressurized gasification systems' within the EC Research Programme JOULE II were to evaluate the potential of advanced power production systems based on biomass gasification and to study the technical and economic feasibility of these new processes with different types of biomass feed stocks. This report was prepared as part of this R and D project. The objectives of this task were to perform fuel analyses of potential woody and herbaceous biomasses with specific regard to the gasification properties of the selected feed stocks. The analyses of 15 Scandinavian and European biomass feed stocks included density, proximate and ultimate analyses, trace compounds, and ash composition and fusion behaviour in oxidizing and reducing atmospheres. The wood-derived fuels, such as whole-tree chips, forest residues, bark and, to some extent, willow, can be expected to have good gasification properties. Difficulties caused by ash fusion and sintering in straw combustion and gasification are generally known. The ash and alkali metal contents of the European biomasses harvested in Italy resembled those of the Nordic straws, and it is expected that they behave to a great extent like straw in gasification. No direct relation between the ash fusion behaviour (determined according to the standard method) and, for instance, the alkali metal content was found in the laboratory determinations. A more profound characterisation of the fuels would require gasification experiments in a thermobalance and a PDU (process development unit) rig. (orig.) (10 refs.)

  9. Design premises for a KBS-3V repository based on results from the safety assessment SR-Can and some subsequent analyses

    Energy Technology Data Exchange (ETDEWEB)

    2009-11-15

    deterioration over the assessment period. The basic approach for prescribing such margins is to consider whether the design assessed in the SR-Can Main report was sufficient to result in safety. In cases where this design would imply too strict requirements, and in cases where the SR-Can design was judged inadequate or not sufficiently analysed in the SR-Can report, some additional analyses have been undertaken to provide a better basis for setting the design premises. The resulting design premises constitute design constraints, which, if all fulfilled, form a good basis for demonstrating repository safety, according to the analyses in SR-Can and subsequent analyses. Some of the design premises may be modified in future stages of SKB's programme, as a result of analyses based on more detailed site data and a more developed understanding of processes of importance for long-term safety. Furthermore, a different balance between design requirements may result in the same level of safety. This report presents one technically reasonable balance, whereas future development and evaluations may result in other balances being deemed more optimal. It should also be noted that in developing the reference design, the production reports should give credible evidence that the final product after construction and quality control fulfils the specifications of the reference design. To cover uncertainties in production and quality control that may be difficult to quantify in detail at the present design stage, the developer of the reference design usually needs to consider a margin to the conditions that would verify the design premises, but whether there is a need for such margins lies outside the scope of the current document. The term 'withstand' is used in this document in descriptions of load cases on repository components. The statement that a component withstands a particular load means that it upholds its related safety function when exposed to the load in question. For example, if the

  10. Design premises for a KBS-3V repository based on results from the safety assessment SR-Can and some subsequent analyses

    International Nuclear Information System (INIS)

    2009-11-01

    deterioration over the assessment period. The basic approach for prescribing such margins is to consider whether the design assessed in the SR-Can Main report was sufficient to result in safety. In cases where this design would imply too strict requirements, and in cases where the SR-Can design was judged inadequate or not sufficiently analysed in the SR-Can report, some additional analyses have been undertaken to provide a better basis for setting the design premises. The resulting design premises constitute design constraints, which, if all fulfilled, form a good basis for demonstrating repository safety, according to the analyses in SR-Can and subsequent analyses. Some of the design premises may be modified in future stages of SKB's programme, as a result of analyses based on more detailed site data and a more developed understanding of processes of importance for long-term safety. Furthermore, a different balance between design requirements may result in the same level of safety. This report presents one technically reasonable balance, whereas future development and evaluations may result in other balances being deemed more optimal. It should also be noted that in developing the reference design, the production reports should give credible evidence that the final product after construction and quality control fulfils the specifications of the reference design. To cover uncertainties in production and quality control that may be difficult to quantify in detail at the present design stage, the developer of the reference design usually needs to consider a margin to the conditions that would verify the design premises, but whether there is a need for such margins lies outside the scope of the current document. The term 'withstand' is used in this document in descriptions of load cases on repository components. The statement that a component withstands a particular load means that it upholds its related safety function when exposed to the load in question. For example, if the canister is said to

  11. A Visualization Tool to Analyse Usage of Web-Based Interventions: The Example of Positive Online Weight Reduction (POWeR)

    Science.gov (United States)

    Smith, Emily; Bradbury, Katherine; Morrison, Leanne; Dennison, Laura; Michaelides, Danius; Yardley, Lucy

    2015-01-01

    Background Attrition is a significant problem in Web-based interventions. Consequently, this research aims to identify the relation between Web usage and benefit from such interventions. A visualization tool has been developed that enables researchers to more easily examine large datasets on intervention usage that can be difficult to make sense of using traditional descriptive or statistical techniques alone. Objective This paper demonstrates how the visualization tool was used to explore patterns in participants’ use of a Web-based weight management intervention, termed "positive online weight reduction (POWeR)." We also demonstrate how the visualization tool can be used to perform subsequent statistical analyses of the association between usage patterns, participant characteristics, and intervention outcome. Methods The visualization tool was used to analyze data from 132 participants who had accessed at least one session of the POWeR intervention. Results There was a drop in usage of optional sessions after participants had accessed the initial, core POWeR sessions, but many users nevertheless continued to complete goal and weight reviews. The POWeR tools relating to the food diary and steps diary were reused most often. Differences in participant characteristics and usage of other intervention components were identified between participants who did and did not choose to access optional POWeR sessions (in addition to the initial core sessions) or reuse the food and steps diaries. Reuse of the steps diary and the getting support tools was associated with greater weight loss. Conclusions The visualization tool provided a quick and efficient method for exploring patterns of Web usage, which enabled further analyses of whether different usage patterns were associated with participant characteristics or differences in intervention outcome. Further usage of visualization techniques is recommended to (1) make sense of large datasets more quickly and efficiently; (2

  12. iTRAQ-Based Proteomics Analyses of Sterile/Fertile Anthers from a Thermo-Sensitive Cytoplasmic Male-Sterile Wheat with Aegilops kotschyi Cytoplasm

    Directory of Open Access Journals (Sweden)

    Gaoming Zhang

    2018-05-01

    Full Text Available A “two-line hybrid system” was previously developed based on thermo-sensitive cytoplasmic male sterility in Aegilops kotschyi (K-TCMS), which can be used in wheat breeding. The K-TCMS line exhibits complete male sterility and can be used to produce hybrid wheat seeds during the normal wheat-growing season; it propagates via self-pollination at high temperatures. Isobaric tags for relative and absolute quantification (iTRAQ)-based quantitative proteome and bioinformatics analyses of the TCMS line KTM3315A were conducted under different fertility conditions to understand the mechanisms of fertility conversion in the pollen development stages. In total, 4639 proteins were identified; the differentially abundant proteins that increased/decreased in plants with differing fertility were mainly involved in energy metabolism, starch and sucrose metabolism, phenylpropanoid biosynthesis, protein synthesis, translation, folding, and degradation. Compared with the sterile condition, many of the proteins related to energy and phenylpropanoid metabolism increased during the anther development stage. Thus, we suggest that energy and phenylpropanoid metabolism pathways are important for fertility conversion in K-TCMS wheat. These findings provide valuable insights into the proteins involved in anther and pollen development, thereby helping to further understand the mechanism of TCMS in wheat.

  13. Clinical map document based on XML (cMDX): document architecture with mapping feature for reporting and analysing prostate cancer in radical prostatectomy specimens

    Directory of Open Access Journals (Sweden)

    Bettendorf Olaf

    2010-11-01

    Full Text Available Abstract Background The pathology report of radical prostatectomy specimens plays an important role in clinical decisions and the prognostic evaluation in Prostate Cancer (PCa). The anatomical schema is a helpful tool to document PCa extension for clinical and research purposes. To achieve electronic documentation and analysis, an appropriate documentation model for anatomical schemas is needed. For this purpose we developed cMDX. Methods The document architecture of cMDX was designed according to Open Packaging Conventions by separating the whole data into template data and patient data. Analogue custom XML elements were considered to harmonize the graphical representation (e.g. tumour extension) with the textual data (e.g. histological patterns). The graphical documentation was based on the four-layer visualization model that forms the interaction between different custom XML elements. Sensible personal data were encrypted with a 256-bit cryptographic algorithm to avoid misuse. In order to assess the clinical value, we retrospectively analysed the tumour extension in 255 patients after radical prostatectomy. Results The pathology report with cMDX can represent pathological findings of the prostate in schematic styles. Such reports can be integrated into the hospital information system. "cMDX" documents can be converted into different data formats like text, graphics and PDF. Supplementary tools like cMDX Editor and an analyser tool were implemented. The graphical analysis of 255 prostatectomy specimens showed that PCa were mostly localized in the peripheral zone (Mean: 73% ± 25). 54% of PCa showed a multifocal growth pattern. Conclusions cMDX can be used for routine histopathological reporting of radical prostatectomy specimens and provide data for scientific analysis.

  14. Clinical map document based on XML (cMDX): document architecture with mapping feature for reporting and analysing prostate cancer in radical prostatectomy specimens.

    Science.gov (United States)

    Eminaga, Okyaz; Hinkelammert, Reemt; Semjonow, Axel; Neumann, Joerg; Abbas, Mahmoud; Koepke, Thomas; Bettendorf, Olaf; Eltze, Elke; Dugas, Martin

    2010-11-15

    The pathology report of radical prostatectomy specimens plays an important role in clinical decisions and the prognostic evaluation in Prostate Cancer (PCa). The anatomical schema is a helpful tool to document PCa extension for clinical and research purposes. To achieve electronic documentation and analysis, an appropriate documentation model for anatomical schemas is needed. For this purpose we developed cMDX. The document architecture of cMDX was designed according to Open Packaging Conventions by separating the whole data into template data and patient data. Analogue custom XML elements were considered to harmonize the graphical representation (e.g. tumour extension) with the textual data (e.g. histological patterns). The graphical documentation was based on the four-layer visualization model that forms the interaction between different custom XML elements. Sensible personal data were encrypted with a 256-bit cryptographic algorithm to avoid misuse. In order to assess the clinical value, we retrospectively analysed the tumour extension in 255 patients after radical prostatectomy. The pathology report with cMDX can represent pathological findings of the prostate in schematic styles. Such reports can be integrated into the hospital information system. "cMDX" documents can be converted into different data formats like text, graphics and PDF. Supplementary tools like cMDX Editor and an analyser tool were implemented. The graphical analysis of 255 prostatectomy specimens showed that PCa were mostly localized in the peripheral zone (Mean: 73% ± 25). 54% of PCa showed a multifocal growth pattern. cMDX can be used for routine histopathological reporting of radical prostatectomy specimens and provide data for scientific analysis.
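The separation of template data from patient data described above can be illustrated with a short sketch. The element and attribute names below (anatomicalSchema, finding, and so on) are hypothetical, since the actual cMDX schema is not reproduced in the abstract; the example only shows the general idea of keeping the reusable schema and the case-specific findings in distinct subtrees of one XML document.

```python
import xml.etree.ElementTree as ET

# Build a cMDX-like document; all names here are illustrative assumptions.
root = ET.Element("cMDX")

# Template data: the reusable anatomical schema shared across reports.
template = ET.SubElement(root, "template")
schema = ET.SubElement(template, "anatomicalSchema", organ="prostate")
ET.SubElement(schema, "layer", id="1", role="outline")

# Patient data: case-specific findings kept separate from the template.
patient = ET.SubElement(root, "patient")
ET.SubElement(patient, "finding", type="tumourExtension",
              zone="peripheral", growthPattern="multifocal")

# Serialize the whole document; sensitive fields would be encrypted
# before this step in the real system (256-bit algorithm per the abstract).
document = ET.tostring(root, encoding="unicode")
```

Because the template subtree carries no personal data, it can be shipped and versioned independently, while only the patient subtree needs encryption and access control.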

  15. Simulation of the Impact of New Aircraft- and Satellite-Based Ocean Surface Wind Measurements on H*Wind Analyses and Numerical Forecasts

    Science.gov (United States)

    Miller, Timothy; Atlas, Robert; Black, Peter; Chen, Shuyi; Hood, Robbie; Johnson, James; Jones, Linwood; Ruf, Chris; Uhlhorn, Eric; Krishnamurti, T. N.; et al.

    2009-01-01

    The Hurricane Imaging Radiometer (HIRAD) is a new airborne microwave remote sensor for hurricane observations that is currently under development by NASA Marshall Space Flight Center, NOAA Hurricane Research Division, the University of Central Florida and the University of Michigan. HIRAD is being designed to enhance the realtime airborne ocean surface winds observation capabilities of NOAA and USAF Weather Squadron hurricane hunter aircraft using the operational airborne Stepped Frequency Microwave Radiometer (SFMR). Unlike SFMR, which measures wind speed and rain rate along the ground track directly beneath the aircraft, HIRAD will provide images of the surface wind and rain field over a wide swath (approx. 3 x the aircraft altitude). The present paper describes a set of Observing System Simulation Experiments (OSSEs) in which measurements from the new instrument as well as those from existing instruments (air, surface, and space-based) are simulated from the output of a detailed numerical model, and those results are used to construct H*Wind analyses. The H*Wind analysis, a product of the Hurricane Research Division of NOAA's Atlantic Oceanographic and Meteorological Laboratory, brings together wind measurements from a variety of observation platforms into an objective analysis of the distribution of wind speeds in a tropical cyclone. This product is designed to improve understanding of the extent and strength of the wind field, and to improve the assessment of hurricane intensity. See http://www.aoml.noaa.gov/hrd/data_sub/wind.html. Evaluations will be presented on the impact of the HIRAD instrument on H*Wind analyses, both in terms of adding it to the full suite of current measurements, as well as using it to replace instrument(s) that may not be functioning at the future time the HIRAD instrument is implemented.
Also shown will be preliminary results of numerical weather prediction OSSEs in which the impact of the addition of HIRAD observations to the initial state

  16. Simulation of the Impact of New Aircraft-and Satellite-based Ocean Surface Wind Measurements on Wind Analyses and Numerical Forecasts

    Science.gov (United States)

    Miller, Timothy; Atlas, Robert; Black, Peter; Chen, Shuyi; Jones, Linwood; Ruf, Chris; Uhlhorn, Eric; Gamache, John; Amarin, Ruba; El-Nimri, Salem; et al.

    2010-01-01

    The Hurricane Imaging Radiometer (HIRAD) is a new airborne microwave remote sensor for hurricane observations that is currently under development by NASA Marshall Space Flight Center, NOAA Hurricane Research Division, the University of Central Florida and the University of Michigan. HIRAD is being designed to enhance the realtime airborne ocean surface winds observation capabilities of NOAA and USAF Weather Squadron hurricane hunter aircraft currently using the operational airborne Stepped Frequency Microwave Radiometer (SFMR). Unlike SFMR, which measures wind speed and rain rate along the ground track directly beneath the aircraft, HIRAD will provide images of the surface wind and rain field over a wide swath (approx. 3 x the aircraft altitude). The present paper describes a set of Observing System Simulation Experiments (OSSEs) in which measurements from the new instrument as well as those from existing instruments (air, surface, and space-based) are simulated from the output of a detailed numerical model, and those results are used to construct H*Wind analyses, a product of the Hurricane Research Division of NOAA's Atlantic Oceanographic and Meteorological Laboratory. Evaluations will be presented on the impact of the HIRAD instrument on H*Wind analyses, both in terms of adding it to the full suite of current measurements, as well as using it to replace instrument(s) that may not be functioning at the future time the HIRAD instrument is implemented. Also shown will be preliminary results of numerical weather prediction OSSEs in which the impact of the addition of HIRAD observations to the initial state on numerical forecasts of the hurricane intensity and structure is assessed.

  17. Health technologies for the improvement of chronic disease management: a review of the Medical Advisory Secretariat evidence-based analyses between 2006 and 2011.

    Science.gov (United States)

    Nikitovic, M; Brener, S

    2013-01-01

    As part of ongoing efforts to improve the Ontario health care system, a mega-analysis examining the optimization of chronic disease management in the community was conducted by Evidence Development and Standards, Health Quality Ontario (previously known as the Medical Advisory Secretariat [MAS]). The purpose of this report was to identify health technologies previously evaluated by MAS that may be leveraged in efforts to optimize chronic disease management in the community. The Ontario Health Technology Assessment Series and field evaluations conducted by MAS and its partners between January 1, 2006, and December 31, 2011. Technologies related to at least 1 of 7 disease areas of interest (type 2 diabetes, coronary artery disease, atrial fibrillation, chronic obstructive pulmonary disease, congestive heart failure, stroke, and chronic wounds) or that may greatly impact health services utilization were reviewed. Only technologies with a moderate to high quality of evidence and associated with a clinically or statistically significant improvement in disease management were included. Technologies related to other topics in the mega-analysis on chronic disease management were excluded. Evidence-based analyses were reviewed, and outcomes of interest were extracted. Outcomes of interest included hospital utilization, mortality, health-related quality of life, disease-specific measures, and economic analysis measures. Eleven analyses were included and summarized. Technologies fell into 3 categories: those with evidence for the cure of chronic disease, those with evidence for the prevention of chronic disease, and those with evidence for the management of chronic disease. The impact on patient outcomes and hospitalization rates of new health technologies in chronic disease management is often overlooked. This analysis demonstrates that health technologies can reduce the burden of illness; improve patient outcomes; reduce resource utilization intensity; be cost

  18. Network class superposition analyses.

    Directory of Open Access Journals (Sweden)

    Carl A B Pearson

    Full Text Available Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈ 10^30 for the yeast cell cycle process), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses.
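The superposition matrix T described above can be made concrete with a toy example. The three-node rule sets below are invented stand-ins for a real network class (the yeast cell cycle class of ≈ 10^30 networks is far too large to enumerate); the sketch only illustrates how T is assembled member by member and how its diagonal yields the expected number of point attractors.

```python
import itertools

N = 3                                  # nodes; state space has 2**N states
STATES = list(itertools.product([0, 1], repeat=N))
idx = {s: i for i, s in enumerate(STATES)}

def step(state, rules):
    """Synchronous update: apply one boolean rule per node."""
    return tuple(int(rule(state)) for rule in rules)

# Toy stand-in for a network class: three hypothetical rule sets
# assumed consistent with the same observed primary function.
ensemble = [
    [lambda s: s[1], lambda s: s[0] and not s[2], lambda s: s[0]],
    [lambda s: s[1], lambda s: s[0],              lambda s: s[0]],
    [lambda s: s[1] or s[2], lambda s: s[0],      lambda s: s[1]],
]

# T is the transition-by-transition superposition over the class:
# T[i][j] = fraction of member networks that send state i to state j.
M = len(STATES)
T = [[0.0] * M for _ in range(M)]
for rules in ensemble:
    for s in STATES:
        T[idx[s]][idx[step(s, rules)]] += 1.0 / len(ensemble)

# T[i][i] is the fraction of members for which state i is a fixed point,
# so the expected number of point attractors per member is the trace.
expected_point_attractors = sum(T[i][i] for i in range(M))
```

Each row of T sums to 1 by construction, and the same matrix can drive Derrida-plot and entropy calculations without ever revisiting the individual member networks.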

  19. Mechanical analyses on the digital behaviour of the Tokay gecko (Gekko gecko) based on a multi-level directional adhesion model.

    Science.gov (United States)

    Wu, Xuan; Wang, Xiaojie; Mei, Tao; Sun, Shaoming

    2015-07-08

    This paper proposes a multi-level hierarchical model for the Tokay gecko (Gekko gecko) adhesive system and analyses the digital behaviour of G. gecko at the macro/meso-level scale. The model describes the structures of the G. gecko adhesive system from the nano-level spatulae to the sub-millimetre-level lamella. The G. gecko seta is modelled as an inextensible fibril based on Euler's elastica theory. Considering the side contact of the spatular pads of the seta on a flat and rigid substrate, the directional adhesion behaviour of the seta has been investigated. The lamella-induced attachment and detachment have been modelled to simulate the active digital hyperextension (DH) and the digital gripping (DG) phenomena. The results suggest that a tiny angular displacement within 0.25° of the lamellar proximal end is necessary, within which a fast transition from attachment to detachment or vice versa is induced. The active DH helps release the torque to induce setal non-sliding detachment, while the DG helps apply torque to make the setal adhesion stable. The lamella plays a key role in saving energy during detachment to adapt to its habitat and provides another adhesive function which differs from the friction-dependent setal adhesion system controlled by the dynamics of the G. gecko body.

  20. A phylogenetic re-appraisal of the family Liagoraceae sensu lato (Nemaliales, Rhodophyta) based on sequence analyses of two plastid genes and postfertilization development.

    Science.gov (United States)

    Lin, Showe-Mei; Rodríguez-Prieto, Conxi; Huisman, John M; Guiry, Michael D; Payri, Claude; Nelson, Wendy A; Liu, Shao-Lun

    2015-06-01

    The marine red algal family Liagoraceae sensu lato is shown to be polyphyletic based on analyses of a combined rbcL and psaA data set and the pattern of carposporophyte development. Fifteen of eighteen genera analyzed formed a monophyletic lineage that included the genus Liagora. Nemalion did not cluster with Liagoraceae sensu stricto, and Nemaliaceae is reinstated, characterized morphologically by the formation of the primary gonimolobes by longitudinal divisions of the gonimoblast initial. Yamadaella and Liagoropsis, previously placed in the Dermonemataceae, are shown to be independent lineages and are recognized as two new families Yamadaellaceae and Liagoropsidaceae. Yamadaellaceae is characterized by two gonimoblast initials cut off bilaterally from the fertilized carpogonium and diffusely spreading gonimoblast filaments. Liagoropsidaceae is characterized by at least three gonimoblast initials cut off by longitudinal septa from the fertilized carpogonium. In contrast, Liagoraceae sensu stricto is characterized by a single gonimoblast initial cut off transversely or diagonally from the fertilized carpogonium. Reproductive features, such as diffuse gonimoblasts and unfused carpogonial branches following postfertilization, appear to have evolved on more than one occasion in the Nemaliales and are therefore not taxonomically diagnostic at the family level, although they may be useful in recognizing genera. © 2015 Phycological Society of America.

  1. Population differentiation of the shore crab Carcinus maenas (Brachyura: Portunidae) on the southwest English coast based on genetic and morphometric analyses

    Directory of Open Access Journals (Sweden)

    Inês C. Silva

    2010-08-01

    Full Text Available Carcinus maenas has a planktonic larval phase which can potentially disperse over large distances. Consequently, larval transport is expected to play an important role in promoting gene flow and determining population structure. In the present study, population structuring on the southwest coast of England was analysed using molecular and morphometric approaches. Variation at eight microsatellite loci suggested that the individuals sampled within this region comprise a single genetic population and that gene flow among them is not restricted. Nevertheless, the FST values estimated across loci for all populations suggested that the Tamar population was significantly different from the Exe, Camel and Torridge populations. This differentiation is not explained by isolation by distance, and coastal hydrological events that are apparently influencing larval flux might be the cause of this pattern. Morphometric analysis was also performed. Analysis of carapace and chela shape variation using landmark-based geometric morphometrics revealed extensive morphological variability, as the multivariate analysis of variance showed significant morphometric differences among geographic groups for both sexes. Thus, the morphological differentiation found may be a plastic response to habitat-specific selection pressures.

  2. Agent-based and phylogenetic analyses reveal how HIV-1 moves between risk groups: injecting drug users sustain the heterosexual epidemic in Latvia

    Science.gov (United States)

    Graw, Frederik; Leitner, Thomas; Ribeiro, Ruy M.

    2012-01-01

    Injecting drug users (IDU) are a driving force for the spread of HIV-1 in Latvia and other Baltic States, accounting for a majority of cases. However, in recent years, heterosexual cases have increased disproportionately. It is unclear how the changes in incidence patterns in Latvia can be explained, and how important IDU are for the heterosexual sub-epidemic. We introduce a novel epidemic model and use phylogenetic analyses in parallel to examine the spread of HIV-1 in Latvia between 1987 and 2010. Using a hybrid framework with a mean-field description for the susceptible population and an agent-based model for the infecteds, we track infected individuals and follow transmission histories dynamically formed during the simulation. The agent-based simulations and the phylogenetic analysis show that more than half of the heterosexual transmissions in Latvia were caused by IDU, which sustain the heterosexual epidemic. Indeed, we find that heterosexual clusters are characterized by short transmission chains with up to 63% of the chains dying out after the first introduction. In the simulations, the distribution of transmission chain sizes follows a power law distribution, which is confirmed by the phylogenetic data. Our models indicate that frequent introductions reduced the extinction probability of an autonomously spreading heterosexual HIV-1 epidemic, which now has the potential to dominate the spread of the overall epidemic in the future. Furthermore, our model shows that social heterogeneity of the susceptible population can explain the shift in HIV-1 incidence in Latvia over the course of the epidemic. Thus, the decrease in IDU incidence may be due to local heterogeneities in transmission, rather than the implementation of control measures. Increases in susceptibles, through social or geographic movement of IDU, could lead to a boost in HIV-1 infections in this risk group. Targeting individuals that bridge social groups would help prevent further spread of the
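The short transmission chains and frequent extinctions described above are characteristic of a subcritical branching process. The sketch below is not the paper's hybrid agent-based model; it is a minimal Galton-Watson stand-in with an assumed effective reproduction number of 0.8, illustrating why most introductions die out with the index case.

```python
import math
import random

rng = random.Random(42)

def poisson(lam):
    """Knuth's algorithm for Poisson-distributed counts (stdlib only)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def chain_size(r_eff, max_size=10_000):
    """Total size of one transmission chain in a Galton-Watson process
    where each case infects Poisson(r_eff) new cases."""
    active, total = 1, 1
    while active and total < max_size:
        offspring = sum(poisson(r_eff) for _ in range(active))
        total += offspring
        active = offspring
    return total

# Subcritical spread (r_eff < 1): most introductions fail to take off.
sizes = [chain_size(0.8) for _ in range(2000)]
frac_no_spread = sum(s == 1 for s in sizes) / len(sizes)  # ~exp(-0.8) ≈ 0.45
```

With r_eff below 1 every chain goes extinct with probability 1, so sustained heterosexual incidence in such a regime requires repeated introductions, which is the mechanism the abstract attributes to the IDU sub-epidemic.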

  3. Metagenome-based diversity analyses suggest a strong locality signal for bacterial communities associated with oyster aquaculture farms in Ofunato Bay

    KAUST Repository

    Kobiyama, Atsushi

    2018-04-30

    Ofunato Bay, in Japan, is the home of buoy-and-rope-type oyster aquaculture activities. Since the oysters filter suspended materials and excrete organic matter into the seawater, bacterial communities residing in its vicinity may show dynamic changes depending on the oyster culture activities. We employed a shotgun metagenomic technique to study bacterial communities near oyster aquaculture facilities at the center of the bay (KSt. 2) and compared the results with those of two other localities far from the station, one to the northeast (innermost bay, KSt. 1) and the other to the southwest (bay entrance, KSt. 3). Seawater samples were collected every month from January to December 2015 from the surface (1 m) and deeper (8 or 10 m) layers of the three locations, and the sequentially filtered fraction on 0.2-μm membranes was sequenced on an Illumina MiSeq system. The acquired reads were uploaded to MG-RAST for KEGG functional abundance analysis, while taxonomic analyses at the phylum and genus levels were performed using MEGAN after parsing the BLAST output. Discrimination analyses were then performed using the ROC-AUC value of the cross validation, targeting the depth (shallow or deep), locality [(KSt. 1 + KSt. 2) vs. KSt. 3; (KSt. 1 + KSt. 3) vs. KSt. 2; or (KSt. 2 + KSt. 3) vs. KSt. 1] and seasonality (12 months). The matrix discrimination analysis on the adjacent two consecutive seasons by ROC-AUC, which was based on the datasets that originated from different depths, localities and months, showed the strongest discrimination signal on the taxonomy matrix at the phylum level for the datasets from July to August compared with those from September to June, while the KEGG matrix showed the strongest signal for the datasets from March to June compared with those from July to February. Then, the locality combination was subjected to the same ROC-AUC discrimination analysis, resulting in significant differences between KSt. 2 and KSt. 1 + KSt. 3

  4. Metagenome-based diversity analyses suggest a strong locality signal for bacterial communities associated with oyster aquaculture farms in Ofunato Bay

    KAUST Repository

    Kobiyama, Atsushi; Ikeo, Kazuho; Reza, Md. Shaheed; Rashid, Jonaira; Yamada, Yuichiro; Ikeda, Yuri; Ikeda, Daisuke; Mizusawa, Nanami; Sato, Shigeru; Ogata, Takehiko; Jimbo, Mitsuru; Kudo, Toshiaki; Kaga, Shinnosuke; Watanabe, Shiho; Naiki, Kimiaki; Kaga, Yoshimasa; Mineta, Katsuhiko; Bajic, Vladimir B.; Gojobori, Takashi; Watabe, Shugo

    2018-01-01

    Ofunato Bay, in Japan, is the home of buoy-and-rope-type oyster aquaculture activities. Since the oysters filter suspended materials and excrete organic matter into the seawater, bacterial communities residing in its vicinity may show dynamic changes depending on the oyster culture activities. We employed a shotgun metagenomic technique to study bacterial communities near oyster aquaculture facilities at the center of the bay (KSt. 2) and compared the results with those of two other localities far from the station, one to the northeast (innermost bay, KSt. 1) and the other to the southwest (bay entrance, KSt. 3). Seawater samples were collected every month from January to December 2015 from the surface (1 m) and deeper (8 or 10 m) layers of the three locations, and the sequentially filtered fraction on 0.2-μm membranes was sequenced on an Illumina MiSeq system. The acquired reads were uploaded to MG-RAST for KEGG functional abundance analysis, while taxonomic analyses at the phylum and genus levels were performed using MEGAN after parsing the BLAST output. Discrimination analyses were then performed using the ROC-AUC value of the cross validation, targeting the depth (shallow or deep), locality [(KSt. 1 + KSt. 2) vs. KSt. 3; (KSt. 1 + KSt. 3) vs. KSt. 2; or (KSt. 2 + KSt. 3) vs. KSt. 1] and seasonality (12 months). The matrix discrimination analysis on the adjacent two consecutive seasons by ROC-AUC, which was based on the datasets that originated from different depths, localities and months, showed the strongest discrimination signal on the taxonomy matrix at the phylum level for the datasets from July to August compared with those from September to June, while the KEGG matrix showed the strongest signal for the datasets from March to June compared with those from July to February. Then, the locality combination was subjected to the same ROC-AUC discrimination analysis, resulting in significant differences between KSt. 2 and KSt. 1 + KSt. 3
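The ROC-AUC values used in the discrimination analyses above can be computed without any library via the rank-based (Mann-Whitney) formulation. The locality scores below are invented for illustration; only the auc function itself follows the standard definition.

```python
def auc(pos_scores, neg_scores):
    """Rank-based AUC: the probability that a randomly chosen sample from
    the positive class scores higher than one from the negative class,
    with ties counting half (the Mann-Whitney U statistic, normalized)."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical per-sample feature (e.g. one taxon's relative abundance)
# for two locality groups; these numbers are made up.
inner_bay = [0.82, 0.75, 0.91, 0.66]
bay_entrance = [0.41, 0.55, 0.38, 0.70]
score = auc(inner_bay, bay_entrance)   # 1.0 = perfect separation, 0.5 = none
```

An AUC near 1.0 (or near 0.0, for the reversed labeling) signals strong locality discrimination for that feature, which is the criterion the study applies matrix-wide.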

  5. The effectiveness of home-based HIV counseling and testing on reducing stigma and risky sexual behavior among adults and adolescents: A systematic review and meta-analyses.

    Science.gov (United States)

    Feyissa, Garumma Tolu; Lockwood, Craig; Munn, Zachary

    2015-07-17

    -analysis software provided by Joanna Briggs Institute. Effect sizes were calculated using a fixed effects model. Where the findings could not be pooled using meta-analyses, results were presented in narrative form. Nine studies were included in this review, five of them reporting on stigma and related outcomes, three on sexual behavior and four on clinical outcomes. Meta-analysis indicated that the risk of observing any stigmatizing behavior in the community was 16% (RR = 0.84, 95% CI 0.79 to 0.89) lower among the participants exposed to home-based HCT when compared to the risk among those participants not exposed to home-based HCT. The risk of experiencing any stigmatizing behavior by HIV-positive patients was 37% (RR 0.63, 95% CI 0.45 to 0.88) lower among the intervention population compared to the risk among the control population. The risk of intimate partner violence was 34% (RR 0.66, 95% CI 0.49 to 0.89) lower among participants exposed to home-based HCT when compared to the risk among participants in the control arm. Compared to the control arm, the risk of reporting more than one sexual partner was 58% (RR 0.42, 95% CI 0.31 to 0.58) lower among participants exposed to home-based HCT. The risk of having any casual sexual partner in the past three months was 51% (RR 0.49, 95% CI 0.40 to 0.59) lower among the population exposed to home-based HCT when compared to the risk among those participants not exposed to home-based HCT. The risk of ever having been forced into sex was 20% (RR 0.80, 95% CI 0.56 to 1.14) lower among participants exposed to home-based HCT when compared to the risk among the control arm; however, this result was not statistically significant, and the wide confidence interval indicates that the risk estimate was imprecise. Home-based HCT is protective against intimate partner violence, stigmatizing behavior, having multiple sexual partners, and having casual sexual partners. 
The low quality of studies included makes it difficult to formulate clear
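
Fixed-effects pooling of risk ratios like those above is inverse-variance weighting on the log scale; a sketch with hypothetical inputs (each study given as its RR and reported 95% CI, from which the standard error is back-calculated):

```python
import math

def pooled_rr(rrs, cis, z=1.96):
    """Fixed-effect inverse-variance pooling of risk ratios on the
    log scale; each 95% CI (lo, hi) implies SE = (ln hi - ln lo) / (2z)."""
    logs = [math.log(r) for r in rrs]
    ses = [(math.log(hi) - math.log(lo)) / (2 * z) for lo, hi in cis]
    weights = [1.0 / se ** 2 for se in ses]          # w_i = 1 / SE_i^2
    pooled = sum(w * l for w, l in zip(weights, logs)) / sum(weights)
    se_p = (1.0 / sum(weights)) ** 0.5               # SE of pooled estimate
    return (math.exp(pooled),
            math.exp(pooled - z * se_p),
            math.exp(pooled + z * se_p))
```

Pooling two studies with identical estimates leaves the RR unchanged but narrows the confidence interval, which is the point of the exercise.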

  6. EEG analyses with SOBI.

    Energy Technology Data Exchange (ETDEWEB)

    Glickman, Matthew R.; Tang, Akaysha (University of New Mexico, Albuquerque, NM)

    2009-02-01

    The motivating vision behind Sandia's MENTOR/PAL LDRD project has been that of systems which use real-time psychophysiological data to support and enhance human performance, both individually and of groups. Relevant and significant psychophysiological data being a necessary prerequisite to such systems, this LDRD has focused on identifying and refining such signals. The project has focused in particular on EEG (electroencephalogram) data as a promising candidate signal because it (potentially) provides a broad window on brain activity with relatively low cost and logistical constraints. We report here on two analyses performed on EEG data collected in this project using the SOBI (Second Order Blind Identification) algorithm to identify two independent sources of brain activity: one in the frontal lobe and one in the occipital. The first study looks at directional influences between the two components, while the second study looks at inferring gender based upon the frontal component.

  7. Regional variation of flow duration curves in the eastern United States: Process-based analyses of the interaction between climate and landscape properties

    Science.gov (United States)

    Chouaib, Wafa; Caldwell, Peter V.; Alila, Younes

    2018-04-01

    This paper advances the physical understanding of the regional variation of the flow duration curve (FDC). It provides a process-based analysis of the interaction between climate and landscape properties to explain disparities in FDC shapes. We used (i) long-term measured flow and precipitation data over 73 catchments from the eastern US, and (ii) the Sacramento model (SAC-SMA), calibrated to simulate soil moisture and flow-component FDCs. The catchment classification based on storm characteristics pointed to the effect of catchment landscape properties on precipitation variability and consequently on FDC shapes. The landscape properties effect was pronounced, such that low values of the slope of the FDC (SFDC), hinting at limited flow variability, were present in regions of high precipitation variability, whereas in regions with low precipitation variability the SFDCs were larger. The topographic index distribution, at the catchment scale, indicated that saturation excess overland flow mitigated the flow variability under conditions of low elevations with large soil moisture storage capacity and high infiltration rates. The SFDCs increased due to the predominant subsurface stormflow in catchments at high elevations with limited soil moisture storage capacity and low infiltration rates. Our analyses also highlighted the major role of soil infiltration rates on the FDC despite the impact of the predominant runoff generation mechanism and catchment elevation. In conditions of slow infiltration rates in soils of large moisture storage capacity (at low elevations) and predominant saturation excess, the SFDCs were larger. On the other hand, the SFDCs decreased in catchments of prevalent subsurface stormflow and poorly drained soils of small soil moisture storage capacity. The analysis of the flow-component FDCs demonstrated that the interflow contribution to the response was highest in catchments with large values of the slope of the FDC. 
The surface flow
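
A flow duration curve itself is just the ranked-flow exceedance relation; a minimal construction sketch (Weibull plotting position assumed, flows hypothetical):

```python
def flow_duration_curve(flows):
    """Exceedance probability vs. discharge: flows ranked largest
    first, Weibull plotting position p = rank / (n + 1)."""
    n = len(flows)
    ranked = sorted(flows, reverse=True)
    return [(rank / (n + 1), q) for rank, q in enumerate(ranked, start=1)]
```

The slope of this curve between two exceedance probabilities (the SFDC discussed above) then summarizes flow variability for a catchment.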

  8. RNA-sequencing-based transcriptome and biochemical analyses of steroidal saponin pathway in a complete set of Allium fistulosum—A. cepa monosomic addition lines

    Science.gov (United States)

    Abdelrahman, Mostafa; El-Sayed, Magdi; Sato, Shusei; Hirakawa, Hideki; Ito, Shin-ichi; Tanaka, Keisuke; Mine, Yoko; Sugiyama, Nobuo; Suzuki, Minoru; Yamauchi, Naoki

    2017-01-01

    The genus Allium is a rich source of steroidal saponins, and its medicinal properties have been attributed to these bioactive compounds. The saponin compounds with diverse structures play a pivotal role in Allium’s defense mechanism. Despite numerous studies on the occurrence and chemical structure of steroidal saponins, their biosynthetic pathway in Allium species is poorly understood. The monosomic addition lines (MALs) of the Japanese bunching onion (A. fistulosum, FF) with an extra chromosome from the shallot (A. cepa Aggregatum group, AA) are powerful genetic resources that enable us to understand many physiological traits of Allium. In the present study, we were able to isolate and identify Alliospiroside A saponin compound in A. fistulosum with extra chromosome 2A from shallot (FF2A) and its role in the defense mechanism against Fusarium pathogens. Furthermore, to gain molecular insight into the Allium saponin biosynthesis pathway, high-throughput RNA-Seq of the root, bulb, and leaf of AA, MALs, and FF was carried out using Illumina's HiSeq 2500 platform. An open access Allium Transcript Database (Allium TDB, http://alliumtdb.kazusa.or.jp) was generated based on RNA-Seq data. The resulting assembled transcripts were functionally annotated, revealing 50 unigenes involved in saponin biosynthesis. Differential gene expression (DGE) analyses of AA and MALs as compared with FF (as a control) revealed a strong up-regulation of the saponin downstream pathway, including cytochrome P450, glycosyltransferase, and beta-glucosidase in chromosome 2A. An understanding of the saponin compounds and biosynthesis-related genes would facilitate the development of plants with unique saponin content and, subsequently, improved disease resistance. PMID:28800607

  9. RNA-sequencing-based transcriptome and biochemical analyses of steroidal saponin pathway in a complete set of Allium fistulosum-A. cepa monosomic addition lines.

    Science.gov (United States)

    Abdelrahman, Mostafa; El-Sayed, Magdi; Sato, Shusei; Hirakawa, Hideki; Ito, Shin-Ichi; Tanaka, Keisuke; Mine, Yoko; Sugiyama, Nobuo; Suzuki, Yutaka; Yamauchi, Naoki; Shigyo, Masayoshi

    2017-01-01

    The genus Allium is a rich source of steroidal saponins, and its medicinal properties have been attributed to these bioactive compounds. The saponin compounds with diverse structures play a pivotal role in Allium's defense mechanism. Despite numerous studies on the occurrence and chemical structure of steroidal saponins, their biosynthetic pathway in Allium species is poorly understood. The monosomic addition lines (MALs) of the Japanese bunching onion (A. fistulosum, FF) with an extra chromosome from the shallot (A. cepa Aggregatum group, AA) are powerful genetic resources that enable us to understand many physiological traits of Allium. In the present study, we were able to isolate and identify Alliospiroside A saponin compound in A. fistulosum with extra chromosome 2A from shallot (FF2A) and its role in the defense mechanism against Fusarium pathogens. Furthermore, to gain molecular insight into the Allium saponin biosynthesis pathway, high-throughput RNA-Seq of the root, bulb, and leaf of AA, MALs, and FF was carried out using Illumina's HiSeq 2500 platform. An open access Allium Transcript Database (Allium TDB, http://alliumtdb.kazusa.or.jp) was generated based on RNA-Seq data. The resulting assembled transcripts were functionally annotated, revealing 50 unigenes involved in saponin biosynthesis. Differential gene expression (DGE) analyses of AA and MALs as compared with FF (as a control) revealed a strong up-regulation of the saponin downstream pathway, including cytochrome P450, glycosyltransferase, and beta-glucosidase in chromosome 2A. An understanding of the saponin compounds and biosynthesis-related genes would facilitate the development of plants with unique saponin content and, subsequently, improved disease resistance.

  10. Unique honey bee (Apis mellifera) hive component-based communities as detected by a hybrid of phospholipid fatty-acid and fatty-acid methyl ester analyses.

    Directory of Open Access Journals (Sweden)

    Kirk J Grubbs

    Full Text Available Microbial communities (microbiomes) are associated with almost all metazoans, including the honey bee Apis mellifera. Honey bees are social insects, maintaining complex hive systems composed of a variety of integral components including bees, comb, propolis, honey, and stored pollen. Given that the different components within hives can be physically separated and are nutritionally variable, we hypothesize that unique microbial communities may occur within the different microenvironments of honey bee colonies. To explore this hypothesis and to provide further insights into the microbiome of honey bees, we use a hybrid of fatty acid methyl ester (FAME) and phospholipid-derived fatty acid (PLFA) analysis to produce broad, lipid-based microbial community profiles of stored pollen, adults, pupae, honey, empty comb, and propolis for 11 honey bee hives. Averaging component lipid profiles by hive, we show that, in decreasing order, lipid markers representing fungi, Gram-negative bacteria, and Gram-positive bacteria have the highest relative abundances within honey bee colonies. Our lipid profiles reveal the presence of viable microbial communities in each of the six hive components sampled, with overall microbial community richness varying from lowest to highest in honey, comb, pupae, pollen, adults and propolis, respectively. Finally, microbial community lipid profiles were more similar when compared by component than by hive, location, or sampling year. Specifically, we found that individual hive components typically exhibited several dominant lipids and that these dominant lipids differ between components. Principal component and two-way clustering analyses both support significant grouping of lipids by hive component. Our findings indicate that in addition to the microbial communities present in individual workers, honey bee hives have resident microbial communities associated with different colony components.

  11. Unique honey bee (Apis mellifera) hive component-based communities as detected by a hybrid of phospholipid fatty-acid and fatty-acid methyl ester analyses.

    Science.gov (United States)

    Grubbs, Kirk J; Scott, Jarrod J; Budsberg, Kevin J; Read, Harry; Balser, Teri C; Currie, Cameron R

    2015-01-01

    Microbial communities (microbiomes) are associated with almost all metazoans, including the honey bee Apis mellifera. Honey bees are social insects, maintaining complex hive systems composed of a variety of integral components including bees, comb, propolis, honey, and stored pollen. Given that the different components within hives can be physically separated and are nutritionally variable, we hypothesize that unique microbial communities may occur within the different microenvironments of honey bee colonies. To explore this hypothesis and to provide further insights into the microbiome of honey bees, we use a hybrid of fatty acid methyl ester (FAME) and phospholipid-derived fatty acid (PLFA) analysis to produce broad, lipid-based microbial community profiles of stored pollen, adults, pupae, honey, empty comb, and propolis for 11 honey bee hives. Averaging component lipid profiles by hive, we show that, in decreasing order, lipid markers representing fungi, Gram-negative bacteria, and Gram-positive bacteria have the highest relative abundances within honey bee colonies. Our lipid profiles reveal the presence of viable microbial communities in each of the six hive components sampled, with overall microbial community richness varying from lowest to highest in honey, comb, pupae, pollen, adults and propolis, respectively. Finally, microbial community lipid profiles were more similar when compared by component than by hive, location, or sampling year. Specifically, we found that individual hive components typically exhibited several dominant lipids and that these dominant lipids differ between components. Principal component and two-way clustering analyses both support significant grouping of lipids by hive component. Our findings indicate that in addition to the microbial communities present in individual workers, honey bee hives have resident microbial communities associated with different colony components.

  12. Comparison of Adjuvant Radiation Therapy Alone and Chemotherapy Alone in Surgically Resected Low-Grade Gliomas: Survival Analyses of 2253 Cases from the National Cancer Data Base.

    Science.gov (United States)

    Wu, Jing; Neale, Natalie; Huang, Yuqian; Bai, Harrison X; Li, Xuejun; Zhang, Zishu; Karakousis, Giorgos; Huang, Raymond; Zhang, Paul J; Tang, Lei; Xiao, Bo; Yang, Li

    2018-04-01

    It is becoming increasingly common to incorporate chemotherapy (CT) with radiotherapy (RT) in the treatment of low-grade gliomas (LGGs) after surgical resection. However, there is a lack of literature comparing survival of patients who underwent RT or CT alone. The U.S. National Cancer Data Base was used to identify patients with histologically confirmed, World Health Organization grade 2 gliomas who received either RT alone or CT alone after surgery from 2004 to 2013. Overall survival (OS) was evaluated by Kaplan-Meier analysis, multivariable Cox proportional hazard regression, and propensity-score-matched analysis. In total, 2253 patients with World Health Organization grade 2 gliomas were included, of whom 1466 (65.1%) received RT alone and 787 (34.9%) CT alone. The median OS was 98.9 months for the RT alone group and 125.8 months for the CT alone group. On multivariable analysis, CT alone was associated with a significant OS benefit compared with RT alone (hazard ratio [HR], 0.405; 95% confidence interval, 0.277-0.592; P < 0.001). On subgroup analyses, the survival advantage of CT alone over RT alone persisted across all age groups, and for the subtotal resection and biopsy groups, but not in the gross total resection group. In propensity-score-matched analysis, CT alone still showed significantly improved OS compared with RT alone (HR, 0.612; 95% confidence interval, 0.506-0.741; P < 0.001). Our results suggest that CT alone was independently associated with longer OS compared with RT alone in patients with LGGs who underwent surgery. Copyright © 2018 Elsevier Inc. All rights reserved.
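
The Kaplan-Meier OS estimates reported above come from the standard product-limit estimator; a minimal sketch (follow-up times in months and event indicators are hypothetical, not the NCDB data):

```python
def kaplan_meier(times, events):
    """Product-limit estimator. times: follow-up durations;
    events: 1 = death observed, 0 = censored.
    Returns (time, S(t)) pairs at each event time."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    s, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = censored = 0
        while i < len(data) and data[i][0] == t:   # group ties at time t
            deaths += data[i][1]
            censored += 1 - data[i][1]
            i += 1
        if deaths:
            s *= 1.0 - deaths / at_risk            # survival drops only at events
            curve.append((t, s))
        at_risk -= deaths + censored               # censored leave the risk set
    return curve
```

The median OS quoted in the abstract is the first time at which S(t) falls to 0.5; group comparisons on top of such curves are what the Cox regression and propensity-score matching formalize.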

  13. The impact of ancestral heath management on soils and landscapes. A reconstruction based on paleoecological analyses of soil records in the middle and southeast Netherlands.

    Science.gov (United States)

    van Mourik, Jan; Doorenbosch, Marieke

    2016-04-01

    The evolution of heathlands during the Holocene has been registered in various soil records. Paleoecological analyses of these records make it possible to reconstruct the changing economic and cultural management of heaths and the consequences for landscape and soils. Heaths are characteristic components of cultural landscape mosaics on sandy soils in the Netherlands. The natural habitat of heather species was moorland. At first, natural events like forest fires and storms caused small-scale forest degradation; later, forest degradation accelerated due to cultural activities like forest grazing, wood cutting and shifting cultivation. Heather plants invaded degraded forest soils, and heaths developed. People learned to use the heaths for economic and cultural purposes. The impact of the heath management on landscape and soils was registered in soil records of barrows, drift sand sequences and plaggic Anthrosols. Based on pollen diagrams of such records we could reconstruct that heaths were developed and used for cattle grazing before the Bronze Age. During the Late Neolithic, the Bronze Age and Iron Age, people created the barrow landscape on the ancestral heaths. After the Iron Age people probably continued with cattle grazing on the heaths and plaggic agriculture until the Early Middle Ages. After 1000 AD two events affected the heaths. First, deforestation for the sale of wood resulted in the first regional extension of sand drifting and heath degradation. After that, the introduction of the deep stable economy and heath sod digging resulted in acceleration of the rise of plaggic horizons, severe heath degradation and the second extension of sand drifting. At the end of the 19th century the heath lost its economic value due to the introduction of chemical fertilizers. The heaths were transformed into 'new' arable fields and forests and due to deep ploughing most soil archives were destroyed. 
Since 1980 AD, the remaining relicts of the ancestral heaths are

  14. Family-based Association Analyses of Imputed Genotypes Reveal Genome-Wide Significant Association of Alzheimer’s disease with OSBPL6, PTPRG and PDCL3

    Science.gov (United States)

    Herold, Christine; Hooli, Basavaraj V.; Mullin, Kristina; Liu, Tian; Roehr, Johannes T; Mattheisen, Manuel; Parrado, Antonio R.; Bertram, Lars; Lange, Christoph; Tanzi, Rudolph E.

    2015-01-01

    The genetic basis of Alzheimer's disease (AD) is complex and heterogeneous. Over 200 highly penetrant pathogenic variants in the genes APP, PSEN1 and PSEN2 cause a subset of early-onset familial Alzheimer's disease (EOFAD). On the other hand, susceptibility to late-onset forms of AD (LOAD) is indisputably associated with the ε4 allele in the gene APOE, and more recently with variants in more than two dozen additional genes identified in large-scale genome-wide association study (GWAS) and meta-analysis reports. Taken together, however, although the heritability of AD is estimated to be as high as 80%, a large proportion of the underlying genetic factors still remain to be elucidated. In this study we performed a systematic family-based genome-wide association and meta-analysis on close to 15 million imputed variants from three large collections of AD families (~3,500 subjects from 1,070 families). Using a multivariate phenotype combining affection status and onset age, meta-analysis of the association results revealed three single nucleotide polymorphisms (SNPs) that achieved genome-wide significance for association with AD risk: rs7609954 in the gene PTPRG (P-value = 3.98·10⁻⁸), rs1347297 in the gene OSBPL6 (P-value = 4.53·10⁻⁸), and rs1513625 near PDCL3 (P-value = 4.28·10⁻⁸). In addition, rs72953347 in OSBPL6 (P-value = 6.36·10⁻⁷) and two SNPs in the gene CDKAL1 showed marginally significant association with LOAD (rs10456232, P-value: 4.76·10⁻⁷; rs62400067, P-value: 3.54·10⁻⁷). In summary, family-based GWAS meta-analysis of imputed SNPs revealed novel genomic variants in (or near) PTPRG, OSBPL6, and PDCL3 that influence risk for AD with genome-wide significance. PMID:26830138
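
As an illustration of combining evidence across cohorts (the study's actual multivariate family-based statistic is more involved), Stouffer's weighted Z-method is a common meta-analysis device; the weights here are hypothetical stand-ins for, e.g., square-root sample sizes:

```python
from statistics import NormalDist

def stouffer(pvals, weights=None):
    """Stouffer's weighted Z-method for one-sided P-values:
    Z = sum(w_i * z_i) / sqrt(sum(w_i^2)), with z_i = Phi^-1(1 - p_i)."""
    nd = NormalDist()
    if weights is None:
        weights = [1.0] * len(pvals)
    zs = [nd.inv_cdf(1.0 - p) for p in pvals]
    z = sum(w * zi for w, zi in zip(weights, zs))
    z /= sum(w * w for w in weights) ** 0.5
    return 1.0 - nd.cdf(z)
```

Concordant sub-threshold signals across cohorts can thus combine to cross a stringent genome-wide significance cutoff that no single cohort reaches.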

  15. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

    performance difficult. Likewise, a demonstration of the magnitude of conservatisms in the dose estimates that result from conservative inputs is difficult to determine. To respond to these issues, the DOE explored the significance of uncertainties and the magnitude of conservatisms in the SSPA Volumes 1 and 2 (BSC 2001 [DIRS 155950]; BSC 2001 [DIRS 154659]). The three main goals of this report are: (1) To briefly summarize and consolidate the discussion of much of the work that has been done over the past few years to evaluate, clarify, and improve the representation of uncertainties in the TSPA and performance projections for a potential repository. This report does not contain any new analyses of those uncertainties, but it summarizes in one place the main findings of that work. (2) To develop a strategy for how uncertainties may be handled in the TSPA and supporting analyses and models to support a License Application, should the site be recommended. It should be noted that the strategy outlined in this report is based on current information available to DOE. The strategy may be modified pending receipt of additional pertinent information, such as the Yucca Mountain Review Plan. (3) To discuss issues related to communication about uncertainties, and propose some approaches the DOE may use in the future to improve how it communicates uncertainty in its models and performance assessments to decision-makers and to technical audiences

  16. Seismic fragility analyses

    International Nuclear Information System (INIS)

    Kostov, Marin

    2000-01-01

    In the last two decades there has been an increasing number of probabilistic seismic risk assessments performed. The basic ideas of the procedure for performing a Probabilistic Safety Analysis (PSA) of critical structures (NUREG/CR-2300, 1983) could be used also for normal industrial and residential buildings, dams or other structures. The general formulation of the risk assessment procedure applied in this investigation is presented in Franzini, et al., 1984. The probability of failure of a structure for an expected lifetime (for example 50 years) can be obtained from the annual frequency of failure, β_E, determined by the relation: β_E = ∫ [dβ(x)/dx] P(f|x) dx. Here β(x) is the annual frequency of exceedance of load level x (for example, the variable x may be peak ground acceleration), and P(f|x) is the conditional probability of structure failure at a given seismic load level x. The problem leads to the assessment of the seismic hazard β(x) and the fragility P(f|x). The seismic hazard curves are obtained by probabilistic seismic hazard analysis. The fragility curves are obtained after the response of the structure is defined as probabilistic and its capacity and the associated uncertainties are assessed. Finally the fragility curves are combined with the seismic loading to estimate the frequency of failure for each critical scenario. The frequency of failure due to a seismic event is represented by the scenario with the highest frequency. The tools usually applied for probabilistic safety analyses of critical structures could relatively easily be adapted to ordinary structures. The key problems are the seismic hazard definition and the fragility analyses. The fragility could be derived either from scaling procedures or by generation. Both approaches have been presented in the paper. After the seismic risk (in terms of failure probability) is assessed there are several approaches for risk reduction. Generally the methods could be classified in two groups. The
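
Once hazard and fragility curves are in hand, the convolution β_E = ∫ |dβ(x)/dx| P(f|x) dx can be evaluated numerically; a sketch with a hypothetical exponential hazard and a brittle (step) fragility, not curves from any real site:

```python
import math

def annual_failure_freq(hazard, fragility, x_lo, x_hi, n=20000):
    """Midpoint-rule evaluation of beta_E = integral |d beta/dx| P(f|x) dx.
    hazard(x): annual frequency of exceeding load level x (decreasing in x);
    fragility(x): conditional failure probability P(f|x)."""
    dx = (x_hi - x_lo) / n
    total = 0.0
    for i in range(n):
        x = x_lo + (i + 0.5) * dx
        slope = (hazard(x + dx) - hazard(x - dx)) / (2 * dx)  # central difference
        total += abs(slope) * fragility(x) * dx
    return total

# Hypothetical curves: exponential hazard, step fragility at x = 0.5.
beta_e = annual_failure_freq(lambda x: math.exp(-x),
                             lambda x: 1.0 if x >= 0.5 else 0.0,
                             0.0, 10.0)
```

For this step fragility the integral reduces to the hazard evaluated at the capacity threshold, which is a useful sanity check on the numerics.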

  17. Quantitative Analyses about Market- and Prevalence-Based Needs for Adapted Physical Education Teachers in the Public Schools in the United States

    Science.gov (United States)

    Zhang, Jiabei

    2011-01-01

    The purpose of this study was to analyze quantitative needs for more adapted physical education (APE) teachers based on both market- and prevalence-based models. The market-based need for more APE teachers was examined based on APE teacher positions funded, while the prevalence-based need for additional APE teachers was analyzed based on students…

  18. The impact of ancestral heath management on soils and landscapes: a reconstruction based on paleoecological analyses of soil records in the central and southeastern Netherlands

    Science.gov (United States)

    Doorenbosch, Marieke; van Mourik, Jan M.

    2016-07-01

    The evolution of heathlands during the Holocene has been registered in various soil records. Paleoecological analyses of these records enable reconstruction of the changing economic and cultural management of heaths and the consequences for landscape and soils. Heaths are characteristic components of cultural landscape mosaics on sandy soils in the Netherlands. The natural habitat of heather species was moorland. At first, natural events like forest fires and storms caused small-scale forest degradation; on top of that, forest degradation accelerated due to cultural activities like forest grazing, wood cutting, and shifting cultivation. Heather plants invaded degraded forest soils, and heaths developed. People learned to use the heaths for economic and cultural purposes. The impact of the heath management on landscape and soils was registered in soil records of barrows, drift sand sequences, and plaggic Anthrosols. Based on pollen diagrams of such records we could reconstruct that heaths were developed and used for cattle grazing before the Bronze Age. During the late Neolithic, the Bronze Age, and Iron Age, people created the barrow landscape on the ancestral heaths. After the Iron Age, people probably continued with cattle grazing on the heaths and plaggic agriculture until the early Middle Ages. Severe forest degradation by the production of charcoal for smelting iron during the Iron Age till the 6th-7th century and during the 11th-13th century for the trade of wood resulted in extensive sand drifting, a threat to the valuable heaths. The introduction of the deep stable economy and heath sod digging in the course of the 18th century resulted in acceleration of the rise of plaggic horizons, severe heath degradation, and again extension of sand drifting. At the end of the 19th century heath lost its economic value due to the introduction of chemical fertilizers. 
The heaths were transformed into "new" arable fields and forests, and due to deep ploughing

  19. The Potential to Forgo Social Welfare Gains through Overreliance on Cost-Effectiveness/Cost-Utility Analyses in the Evidence Base for Public Health

    International Nuclear Information System (INIS)

    Cohen, D.R.; Patel, N.

    2010-01-01

    Economic evaluations of clinical treatments most commonly take the form of cost-effectiveness or cost-utility analyses. This is appropriate since the main, and sometimes the only, benefit of such interventions is increased health. The majority of economic evaluations in public health, however, have also been assessed using these techniques when arguably cost-benefit analyses would in many cases have been more appropriate, given their ability to take account of non-health benefits as well. An examination of the non-health benefits from a sample of studies featured in a recent review of economic evaluations in public health illustrates how over-focusing on cost-effectiveness/cost-utility analyses may lead to forgoing potential social welfare gains from programmes in public health. Prior to evaluation, programmes should be considered in terms of the potential importance of non-health benefits, and where these are considerable they would be better evaluated by more inclusive economic evaluation techniques.

  20. The role of safety analyses in site selection. Some personal observations based on the experience from the Swiss site selection process

    Energy Technology Data Exchange (ETDEWEB)

    Zuidema, Piet [Nagra, Wettingen (Switzerland)

    2015-07-01

    In Switzerland, the site selection process according to the ''Sectoral Plan for Deep Geological Repositories'' (BFE 2008) has been underway since 2008. This process takes place in three stages. In stage 1, geological siting regions (six for the L/ILW repository and three for the HLW repository) have been identified; in stage 2, sites for the surface facilities have been identified for all siting regions in close co-operation with the siting regions, and a narrowing down of the number of siting regions based on geological criteria will take place. In stage 3 the sites for a general license application are selected and the general license applications will be submitted, which will eventually lead to the siting decision for both repository types. In the Swiss site selection process, safety has the highest priority. Many factors affect safety, and thus a whole range of safety-related issues are considered in the identification and screening of siting possibilities. Besides dose calculations, a range of quantitative and qualitative issues are considered. Dose calculations are performed in all three stages of the site selection process. In stage 1 generic safety calculations were made to develop criteria to be used for the identification of potential siting regions. In stage 2, dose calculations are made for comparing the different siting regions according to a procedure prescribed in detail by the regulator. Combined with qualitative evaluations this will lead to a narrowing down of the number of siting regions to at least two siting regions for each repository type. In stage 3 full safety cases will be prepared as part of the documentation for the general license applications. Besides the dose calculations, many other issues related to safety are analyzed in a quantitative and qualitative manner. These consider the 13 criteria defined in the Sectoral Plan and the corresponding indicators. The features analyzed cover the following broad themes: efficiency of

  1. The role of safety analyses in site selection. Some personal observations based on the experience from the Swiss site selection process

    International Nuclear Information System (INIS)

    Zuidema, Piet

    2015-01-01

    In Switzerland, the site selection process according to the ''Sectoral Plan for Deep Geological Repositories'' (BFE 2008) has been underway since 2008. This process takes place in three stages. In stage 1, geological siting regions (six for the L/ILW repository and three for the HLW repository) have been identified; in stage 2, sites for the surface facilities have been identified for all siting regions in close co-operation with the siting regions, and a narrowing down of the number of siting regions based on geological criteria will take place. In stage 3 the sites for a general license application are selected and the general license applications will be submitted, which will eventually lead to the siting decision for both repository types. In the Swiss site selection process, safety has the highest priority. Many factors affect safety, and thus a whole range of safety-related issues are considered in the identification and screening of siting possibilities. Besides dose calculations, a range of quantitative and qualitative issues are considered. Dose calculations are performed in all three stages of the site selection process. In stage 1 generic safety calculations were made to develop criteria to be used for the identification of potential siting regions. In stage 2, dose calculations are made for comparing the different siting regions according to a procedure prescribed in detail by the regulator. Combined with qualitative evaluations this will lead to a narrowing down of the number of siting regions to at least two siting regions for each repository type. In stage 3 full safety cases will be prepared as part of the documentation for the general license applications. Besides the dose calculations, many other issues related to safety are analyzed in a quantitative and qualitative manner. These consider the 13 criteria defined in the Sectoral Plan and the corresponding indicators. The features analyzed cover the following broad themes: efficiency of

  2. Heat integration options based on pinch and exergy analyses of a thermosolar and heat pump in a fish tinning industrial process

    International Nuclear Information System (INIS)

    Quijera, José Antonio; García, Araceli; Alriols, María González; Labidi, Jalel

    2013-01-01

    Thermosolar technology is gradually being introduced into industrial activities. To reach high energy efficiency, thermosolar systems can be coupled to heat pump technology, combining more efficient conventional and renewable energy supply for processes. Their integration in complex processes can be improved systematically through well-established analytical tools such as pinch and exergy analyses. This work presents a methodological procedure for analysing different options for the heat integration of solar thermal and heat pump technologies in a tuna fish tinning process. The plant is located in a climatic zone where diffuse irradiation contributes more energy to the process than beam irradiation does. Pinch and exergy analyses are applied in a low- and middle-temperature context, where the process demands large amounts of hot water and middle-pressure steam. To recover internal heat, pinch analysis makes it possible to understand the complexity of the heat exchange network of the process and to define thermal targets for energy optimization. Exergy analysis quantifies how the quality of energy degrades as it is used in the process under the different integration options. Both analytical tools, in combination with economic variables, provide a powerful methodological procedure for finding the most favourable heat integration and thereby support technological decision making and the design phase. - Highlights: ► Integration of solar thermal energy in a batch canning process was assessed. ► Pinch and exergy analyses were used to determine the optimal energy supply configuration. ► Combination of heat pump and solar thermal energy improves the energy efficiency and reduces fossil fuel consumption
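The targeting step of pinch analysis mentioned in this record is usually carried out with the problem table (heat cascade) algorithm. The sketch below is a minimal generic implementation; the stream data in the usage example are illustrative and are not taken from the tinning plant described here.

```python
def problem_table(hot, cold, dt_min):
    """Problem table algorithm for pinch-analysis energy targeting.

    hot, cold: lists of (T_supply, T_target, CP) with temperatures in
    degrees C and heat-capacity flow rate CP in kW/K.
    Returns (Q_hot_min, Q_cold_min), the minimum utility targets in kW.
    """
    # Shift hot streams down and cold streams up by dt_min/2.
    shifted = [(ts - dt_min / 2, tt - dt_min / 2, cp, "hot") for ts, tt, cp in hot]
    shifted += [(ts + dt_min / 2, tt + dt_min / 2, cp, "cold") for ts, tt, cp in cold]
    bounds = sorted({t for ts, tt, _, _ in shifted for t in (ts, tt)}, reverse=True)
    cascade, residual = [0.0], 0.0
    for hi, lo in zip(bounds, bounds[1:]):
        net_cp = 0.0
        for ts, tt, cp, kind in shifted:
            t_low, t_high = min(ts, tt), max(ts, tt)
            if t_low <= lo and t_high >= hi:   # stream spans this interval
                net_cp += cp if kind == "hot" else -cp
        residual += net_cp * (hi - lo)         # heat surplus cascades downward
        cascade.append(residual)
    q_hot = -min(cascade)                      # minimum hot utility
    q_cold = cascade[-1] + q_hot               # minimum cold utility
    return q_hot, q_cold
```

For example, one hot stream (180 to 80 degrees C, CP = 1 kW/K), one cold stream (60 to 170 degrees C, CP = 1 kW/K) and dt_min = 10 K give targets of 10 kW hot utility and 0 kW cold utility.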

  3. Assessment of anti-inflammatory and anti-arthritic properties of Acmella uliginosa (Sw.) Cass. based on experiments in arthritic rat models and qualitative GC/MS analyses.

    Directory of Open Access Journals (Sweden)

    Subhashis Paul

    2016-09-01

    of AU and AV showed the best recovery potential in all the studied parameters, confirming the synergistic efficacy of the herbal formulation. GC/MS analyses revealed the presence of at least 5 anti-inflammatory compounds, including 9-octadecenoic acid (Z)-, phenylmethyl ester, astaxanthin, α-N-normethadol and fenretinide, which have reported anti-inflammatory/anti-arthritic properties. Conclusion: Our findings indicated that the crude flower homogenate of AU contains potential anti-inflammatory compounds which could be used as an anti-inflammatory/anti-arthritic medication. [J Complement Med Res 2016; 5(3): 257-262]

  4. The validity of using ROC software for analysing visual grading characteristics data: an investigation based on the novel software VGC analyzer

    International Nuclear Information System (INIS)

    Hansson, Jonny; Maansson, Lars Gunnar; Baath, Magnus

    2016-01-01

    The purpose of the present work was to investigate the validity of using single-reader-adapted receiver operating characteristics (ROC) software for the analysis of visual grading characteristics (VGC) data. VGC data from four published VGC studies on the optimisation of X-ray examinations, previously analysed using ROCFIT, were reanalysed using recently developed software dedicated to VGC analysis (VGC Analyzer), and the outcomes [the mean and 95% confidence interval (CI) of the area under the VGC curve (AUC_VGC) and the p-value] were compared. The studies included both paired and non-paired data and were reanalysed for both the fixed-reader and the random-reader situations. The results showed good agreement between the software packages for the mean AUC_VGC. For non-paired data, wider CIs were obtained with VGC Analyzer than previously reported, whereas for paired data, the previously reported CIs were similar or even broader. Similar observations were made for the p-values. The results indicate that the use of single-reader-adapted ROC software such as ROCFIT for analysing non-paired VGC data may lead to an increased risk of committing Type I errors, especially in the random-reader situation. On the other hand, the use of ROC software for the analysis of paired VGC data may lead to an increased risk of committing Type II errors, especially in the fixed-reader situation. (authors)
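The point estimate of the area under the VGC curve compared in this record can be obtained non-parametrically from the rating distributions of the two conditions. The sketch below is an illustrative trapezoidal computation on a five-point ordinal scale; it is not the fitting procedure of VGC Analyzer or ROCFIT, which use model-based estimation and also produce CIs and p-values.

```python
import numpy as np

def vgc_auc(ratings_a, ratings_b, scale=(1, 2, 3, 4, 5)):
    """Non-parametric (trapezoidal) area under the VGC curve.

    Operating points are the proportions of ratings at or above each
    threshold for the reference condition A (x-axis) and the evaluated
    condition B (y-axis); the curve is anchored at (0, 0) and (1, 1).
    AUC_VGC = 0.5 indicates equal rated image quality.
    """
    a, b = np.asarray(ratings_a), np.asarray(ratings_b)
    thresholds = sorted(scale, reverse=True)
    x = [0.0] + [float(np.mean(a >= t)) for t in thresholds]
    y = [0.0] + [float(np.mean(b >= t)) for t in thresholds]
    # Trapezoidal rule over the piecewise-linear VGC curve.
    return sum(0.5 * (y[i] + y[i + 1]) * (x[i + 1] - x[i]) for i in range(len(x) - 1))
```

Identical rating distributions give AUC_VGC = 0.5, while a condition rated uniformly higher pushes the value toward 1.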

  5. Biomonitoring in a clean and a multi-contaminated estuary based on biomarkers and chemical analyses in the endobenthic worm Nereis diversicolor

    Energy Technology Data Exchange (ETDEWEB)

    Durou, Cyril [CNRS, Universite de Nantes, Pole Mer et Littoral, SMAB, 2 rue de la Houssiniere, BP 92208, F-44322 Nantes Cedex 3 (France) and Institut de Biologie et Ecologie Appliquees, CEREA, Universite Catholique de l' Ouest, 44 rue Rabelais, 49008 Angers Cedex 01 (France)]. E-mail: cyril.durou@uco.fr; Poirier, Laurence [CNRS, Universite de Nantes, Pole Mer et Littoral, SMAB, 2 rue de la Houssiniere, BP 92208, F-44322 Nantes Cedex 3 (France); Amiard, Jean-Claude [CNRS, Universite de Nantes, Pole Mer et Littoral, SMAB, 2 rue de la Houssiniere, BP 92208, F-44322 Nantes Cedex 3 (France); Budzinski, Helene [CNRS UMR 5472, LPTC, Universite de Bordeaux I, 33405 Talence (France); Gnassia-Barelli, Mauricette [UMR INRA UNSA 1112 ROSE, Faculte des Sciences, BP 71, 06108 Nice Cedex 2 (France); Lemenach, Karyn [CNRS UMR 5472, LPTC, Universite de Bordeaux I, 33405 Talence (France); Peluhet, Laurent [CNRS UMR 5472, LPTC, Universite de Bordeaux I, 33405 Talence (France); Mouneyrac, Catherine [CNRS, Universite de Nantes, Pole Mer et Littoral, SMAB, 2 rue de la Houssiniere, BP 92208, F-44322 Nantes Cedex 3 (France); Institut de Biologie et Ecologie Appliquees, CEREA, Universite Catholique de l' Ouest, 44 rue Rabelais, 49008 Angers Cedex 01 (France); Romeo, Michele [UMR INRA UNSA 1112 ROSE, Faculte des Sciences, BP 71, 06108 Nice Cedex 2 (France); Amiard-Triquet, Claude [CNRS, Universite de Nantes, Pole Mer et Littoral, SMAB, 2 rue de la Houssiniere, BP 92208, F-44322 Nantes Cedex 3 (France)

    2007-07-15

    Relationships between biochemical and physiological biomarkers (acetylcholinesterase [AChE], catalase, and glutathione S-transferase [GST] activities, thiobarbituric acid reactive substances, glycogen, lipids and proteins) and accumulated concentrations of contaminants (polychlorinated biphenyls [PCBs], polycyclic aromatic hydrocarbons and metals) were examined in the keystone species Nereis diversicolor. The chemical analyses of worms and sediments allowed the designation of the Seine estuary and the Authie estuary as a polluted and a relatively clean site, respectively. Worms from the Seine estuary exhibited higher GST and lower AChE activities. Generally, larger worms had higher concentrations of energy reserves. Principal component analyses clearly highlighted intersite differences: in the first factorial plane, GST activities and chemical concentrations were inversely related to concentrations of energy reserves; in the second, PCB concentrations and AChE activity were inversely related. Depleted levels of energy reserves could be a consequence of combating toxicants and might predict effects at higher levels of biological organization. The use of GST and AChE activities and energy reserve concentrations as biomarkers is validated in the field in this keystone species. - The use of N. diversicolor as a biomonitor of environmental quality via the measurement of biomarkers and accumulated concentrations of contaminants is validated in the field.

  6. Head and neck tumours: combined MRI assessment based on IVIM and TIC analyses for the differentiation of tumours of different histological types

    International Nuclear Information System (INIS)

    Sumi, Misa; Nakamura, Takashi

    2014-01-01

    We evaluated the combined use of intravoxel incoherent motion (IVIM) and time-signal intensity curve (TIC) analyses to diagnose head and neck tumours. We compared perfusion-related parameters (PP) and molecular diffusion values (D) determined from IVIM theory and TIC profiles among 92 tumours with different histologies. IVIM parameters (f and D values) and TIC profiles in combination were distinct among the different types of head and neck tumours, including squamous cell carcinomas (SCCs), lymphomas, malignant salivary gland tumours, Warthin's tumours, pleomorphic adenomas and schwannomas. A multiparametric approach using both IVIM parameters and TIC profiles differentiated between benign and malignant tumours with 97 % accuracy and diagnosed different tumour types with 89 % accuracy. Combined use of IVIM parameters and TIC profiles has high efficacy in diagnosing head and neck tumours. (orig.)
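The IVIM parameters named in this record (perfusion fraction f and molecular diffusion D) come from the bi-exponential signal model S(b) = S0·(f·exp(-b·D*) + (1-f)·exp(-b·D)). A common simplified estimator, sketched below, is the segmented fit: at high b-values the pseudo-diffusion term has decayed, so a log-linear fit yields D and f. This is an illustrative method, not necessarily the fitting procedure used by the authors.

```python
import numpy as np

def ivim_segmented_fit(b, s, b_thresh=200.0):
    """Segmented IVIM fit on diffusion-weighted signals s(b).

    Assumes S(b) = S0*(f*exp(-b*D_star) + (1 - f)*exp(-b*D)); for
    b >= b_thresh the fast (pseudo-diffusion) term is negligible, so
    ln S is approximately ln(S0*(1 - f)) - b*D.
    Returns (f, D) with b in s/mm^2 and D in mm^2/s.
    """
    b, s = np.asarray(b, float), np.asarray(s, float)
    s0 = s[b == 0][0] if np.any(b == 0) else s[np.argmin(b)]
    hi = b >= b_thresh
    slope, intercept = np.polyfit(b[hi], np.log(s[hi]), 1)
    d = -slope                         # molecular diffusion coefficient D
    f = 1.0 - np.exp(intercept) / s0   # perfusion (fast) fraction f
    return f, d
```

On noiseless synthetic data the estimator recovers f and D to within a small bias that shrinks as b_thresh increases.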

  7. Co-combustion characteristics and blending optimization of tobacco stem and high-sulfur bituminous coal based on thermogravimetric and mass spectrometry analyses.

    Science.gov (United States)

    Zhang, Kaihua; Zhang, Kai; Cao, Yan; Pan, Wei-ping

    2013-03-01

    Despite much research on the co-combustion of tobacco stem and high-sulfur coal, their optimal blending ratio has not been effectively established. This study investigated the combustion profiles of tobacco stem, high-sulfur bituminous coal and their blends by thermogravimetric analysis. Ignition and burnout performance, heat release performance, and gaseous pollutant emissions were also studied by thermogravimetric and mass spectrometry analyses. The results indicated that the combustion of tobacco stem was more complicated than that of high-sulfur bituminous coal, mainly because its fixed carbon burned in two portions, one burning early and the other burning late. Ignition and burnout performance, heat release performance, and gaseous pollutant emissions of the blends showed variable trends as the tobacco stem content increased. Taking the above three factors into account, a blending ratio of 0-20% tobacco stem content is conservatively proposed as the optimum range for blending. Copyright © 2012 Elsevier Ltd. All rights reserved.

  8. New nuclear data group constant sets for fusion reactor nuclear analyses based on JENDL-4.0 and FENDL-3.0

    International Nuclear Information System (INIS)

    Konno, Chikara; Ohta, Masayuki; Kwon, Saerom; Ochiai, Kentaro; Sato, Satoshi

    2015-01-01

    We have produced new nuclear data group constant sets from JENDL-4.0 and FENDL-3.0 for fusion reactor nuclear analyses: FUSION-J40-175 and FUSION-F30-175 (40 materials, 175 neutron groups, 42 gamma groups), and FUSION-J40-42 and FUSION-F30-42 (40 materials, 42 neutron groups, 21 gamma groups). MATXS files of JENDL-4.0 and FENDL-3.0 were newly produced with the NJOY2012 code. FUSION-J40-175, FUSION-J40-42, FUSION-F30-175 and FUSION-F30-42 were then produced with the TRANSX code. KERMA factors, DPA and gas production cross-section data were also prepared from the MATXS files with TRANSX. Test calculations were carried out in order to validate these group constant sets and revealed no problems. (author)

  9. Towards the Development of Clinical Measures for Spinal Cord Injury Based on the International Classification of Functioning, Disability and Health With Rasch Analyses

    DEFF Research Database (Denmark)

    Ballert, Carolina S; Stucki, Gerold; Biering-Sørensen, Fin

    2014-01-01

    OBJECTIVES: To determine whether the International Classification of Functioning, Disability and Health (ICF) categories relevant to spinal cord injury (SCI) can be integrated in clinical measures and to obtain insights to guide their future operationalization. Specific aims are to find out whether... in specialized centers within 15 countries from 2006 through 2008. SETTING: Secondary data analysis. PARTICIPANTS: Adults (N=1048) with SCI from the early postacute and long-term living context. INTERVENTIONS: Not applicable. MAIN OUTCOME MEASURES: Two unidimensional Rasch analyses: one for the ICF categories... RESULTS: ...item dependency was observed between ICF categories of the same chapters. Group effects for age and sex were observed only to a small extent. CONCLUSIONS: The validity of ICF categories to develop measures of functioning in SCI for clinical practice and research is to some extent supported. Model...

  10. Determinants of the over-anticoagulation response during warfarin initiation therapy in Asian patients based on population pharmacokinetic-pharmacodynamic analyses.

    Science.gov (United States)

    Ohara, Minami; Takahashi, Harumi; Lee, Ming Ta Michael; Wen, Ming-Shien; Lee, Tsong-Hai; Chuang, Hui-Ping; Luo, Chen-Hui; Arima, Aki; Onozuka, Akiko; Nagai, Rui; Shiomi, Mari; Mihara, Kiyoshi; Morita, Takashi; Chen, Yuan-Tsong

    2014-01-01

    To clarify pharmacokinetic-pharmacodynamic (PK-PD) factors associated with the over-anticoagulation response in Asians during warfarin induction therapy, population PK-PD analyses were conducted in an attempt to predict the time-courses of the plasma S-warfarin concentration, Cp(S), and coagulation and anti-coagulation (INR) responses. In 99 Chinese patients we analyzed the relationships between dose and Cp(S) to estimate the clearance of S-warfarin, CL(S), and that between Cp(S) and the normal prothrombin concentration (NPT) as a coagulation marker for estimation of IC50. We also analyzed the non-linear relationship between NPT inhibition and the increase in INR to derive the non-linear index λ. Population analyses accurately predicted the time-courses of Cp(S), NPT and INR. Multivariate analysis showed that CYP2C9*3 mutation and body surface area were predictors of CL(S), that VKORC1 and CYP4F2 polymorphisms were predictors of IC50, and that baseline NPT was a predictor of λ. CL(S) and λ were significantly lower in patients with INR≥4 than in those with INR<4 (190 mL/h vs 265 mL/h, P<0.01 and 3.2 vs 3.7, P<0.01, respectively). Finally, logistic regression analysis revealed that CL(S), ALT and hypertension contributed significantly to INR≥4. All these results indicate that factors associated with the reduced metabolic activity of warfarin represented by CL(S), might be critical determinants of the over-anticoagulation response during warfarin initiation in Asians. ClinicalTrials.gov NCT02065388.
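The record names an IC50 linking Cp(S) to NPT suppression and a non-linear index λ linking NPT inhibition to the INR rise, but does not give the model equations. The sketch below uses a generic inhibitory Emax link for NPT and an assumed power-law link for INR purely for illustration; both functional forms are assumptions, not necessarily the authors' exact PK-PD model.

```python
def npt_response(cp, npt0, ic50):
    """Generic inhibitory Emax model (assumed form): plasma S-warfarin
    concentration cp suppresses the normal prothrombin concentration NPT
    from its baseline npt0, with half-maximal inhibition at ic50."""
    return npt0 * (1.0 - cp / (ic50 + cp))

def inr_from_npt(npt, npt0, lam):
    """Assumed power-law link between NPT depletion and the INR rise;
    lam plays the role of the non-linear index described in the record
    (reported around 3.2-3.7)."""
    return (npt0 / npt) ** lam
```

With these forms, a concentration equal to IC50 halves NPT, and halving NPT raises INR by a factor of 2**λ, which makes the sensitivity of INR to λ easy to see.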

  11. Molecular- and cultivation-based analyses of microbial communities in oil field water and in microcosms amended with nitrate to control H₂S production

    Energy Technology Data Exchange (ETDEWEB)

    Kumaraswamy, Raji; Ebert, Sara; Fedorak, Phillip M.; Foght, Julia M. [Alberta Univ., Edmonton, AB (Canada). Biological Sciences; Gray, Murray R. [Alberta Univ., Edmonton, AB (Canada). Chemical and Materials Engineering

    2011-03-15

    Nitrate injection into oil fields is an alternative to biocide addition for controlling sulfide production ('souring') caused by sulfate-reducing bacteria (SRB). This study examined the suitability of several cultivation-dependent and cultivation-independent methods to assess potential microbial activities (sulfidogenesis and nitrate reduction) and the impact of nitrate amendment on oil field microbiota. Microcosms containing produced waters from two Western Canadian oil fields exhibited sulfidogenesis that was inhibited by nitrate amendment. Most probable number (MPN) and fluorescent in situ hybridization (FISH) analyses of uncultivated produced waters showed low cell numbers (≤10³ MPN/ml) dominated by SRB (>95% relative abundance). MPN analysis also detected nitrate-reducing sulfide-oxidizing bacteria (NRSOB) and heterotrophic nitrate-reducing bacteria (HNRB) at numbers too low to be detected by FISH or denaturing gradient gel electrophoresis (DGGE). In microcosms containing produced water fortified with sulfate, near-stoichiometric concentrations of sulfide were produced. FISH analyses of the microcosms after 55 days of incubation revealed that Gammaproteobacteria increased from undetectable levels to 5-20% abundance, resulting in a decreased proportion of Deltaproteobacteria (50-60% abundance). DGGE analysis confirmed the presence of Delta- and Gammaproteobacteria and also detected Bacteroidetes. When sulfate-fortified produced waters were amended with nitrate, sulfidogenesis was inhibited and Deltaproteobacteria decreased to levels undetectable by FISH, with a concomitant increase in Gammaproteobacteria from below detection to 50-60% abundance. DGGE analysis of these microcosms yielded sequences of Gamma- and Epsilonproteobacteria related to presumptive HNRB and NRSOB (Halomonas, Marinobacterium, Marinobacter, Pseudomonas and Arcobacter), thus supporting chemical data indicating that nitrate-reducing bacteria out-compete SRB when nitrate is

  12. Determinants of the over-anticoagulation response during warfarin initiation therapy in Asian patients based on population pharmacokinetic-pharmacodynamic analyses.

    Directory of Open Access Journals (Sweden)

    Minami Ohara

    Full Text Available To clarify pharmacokinetic-pharmacodynamic (PK-PD) factors associated with the over-anticoagulation response in Asians during warfarin induction therapy, population PK-PD analyses were conducted in an attempt to predict the time-courses of the plasma S-warfarin concentration, Cp(S), and coagulation and anti-coagulation (INR) responses. In 99 Chinese patients we analyzed the relationships between dose and Cp(S) to estimate the clearance of S-warfarin, CL(S), and that between Cp(S) and the normal prothrombin concentration (NPT) as a coagulation marker for estimation of IC50. We also analyzed the non-linear relationship between NPT inhibition and the increase in INR to derive the non-linear index λ. Population analyses accurately predicted the time-courses of Cp(S), NPT and INR. Multivariate analysis showed that CYP2C9*3 mutation and body surface area were predictors of CL(S), that VKORC1 and CYP4F2 polymorphisms were predictors of IC50, and that baseline NPT was a predictor of λ. CL(S) and λ were significantly lower in patients with INR≥4 than in those with INR<4 (190 mL/h vs 265 mL/h, P<0.01 and 3.2 vs 3.7, P<0.01, respectively). Finally, logistic regression analysis revealed that CL(S), ALT and hypertension contributed significantly to INR≥4. All these results indicate that factors associated with the reduced metabolic activity of warfarin, represented by CL(S), might be critical determinants of the over-anticoagulation response during warfarin initiation in Asians. ClinicalTrials.gov NCT02065388.

  13. Contesting Citizenship: Comparative Analyses

    DEFF Research Database (Denmark)

    Siim, Birte; Squires, Judith

    2007-01-01

    importance of particularized experiences and multiple inequality agendas). These developments shape the way citizenship is both practiced and analysed. Mapping neat citizenship models onto distinct nation-states and evaluating these in relation to formal equality is no longer an adequate approach.... Comparative citizenship analyses need to be considered in relation to multiple inequalities and their intersections and to multiple governance and trans-national organising. This, in turn, suggests that comparative citizenship analysis needs to consider new spaces in which struggles for equal citizenship occur...

  14. NOAA's National Snow Analyses

    Science.gov (United States)

    Carroll, T. R.; Cline, D. W.; Olheiser, C. M.; Rost, A. A.; Nilsson, A. O.; Fall, G. M.; Li, L.; Bovitz, C. T.

    2005-12-01

    NOAA's National Operational Hydrologic Remote Sensing Center (NOHRSC) routinely ingests all of the electronically available, real-time, ground-based, snow data; airborne snow water equivalent data; satellite areal extent of snow cover information; and numerical weather prediction (NWP) model forcings for the coterminous U.S. The NWP model forcings are physically downscaled from their native 13 km² spatial resolution to a 1 km² resolution for the CONUS. The downscaled NWP forcings drive an energy-and-mass-balance snow accumulation and ablation model at a 1 km² spatial resolution and at a 1 hour temporal resolution for the country. The ground-based, airborne, and satellite snow observations are assimilated into the snow model's simulated state variables using a Newtonian nudging technique. The principal advantages of the assimilation technique are: (1) approximate balance is maintained in the snow model, (2) physical processes are easily accommodated in the model, and (3) asynoptic data are incorporated at the appropriate times. The snow model is reinitialized with the assimilated snow observations to generate a variety of snow products that combine to form NOAA's NOHRSC National Snow Analyses (NSA). The NOHRSC NSA incorporate all of the information necessary and available to produce a "best estimate" of real-time snow cover conditions at 1 km² spatial resolution and 1 hour temporal resolution for the country. The NOHRSC NSA consist of a variety of daily, operational, products that characterize real-time snowpack conditions including: snow water equivalent, snow depth, surface and internal snowpack temperatures, surface and blowing snow sublimation, and snowmelt for the CONUS.
The products are generated and distributed in a variety of formats including: interactive maps, time-series, alphanumeric products (e.g., mean areal snow water equivalent on a hydrologic basin-by-basin basis), text and map discussions, map animations, and quantitative gridded products
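The Newtonian nudging described in this record relaxes the modeled state toward observations where they exist. A minimal sketch of one nudging update follows; the scalar weight stands in for the combined nudging coefficient G·dt/τ, and the state here is a generic gridded field such as snow water equivalent (the operational NOHRSC formulation is more elaborate).

```python
import numpy as np

def nudge(state, obs, obs_mask, weight=0.5):
    """One Newtonian-nudging (relaxation) step.

    Implements x <- x + w*(obs - x) at grid cells where an observation
    exists (obs_mask True); unobserved cells are left to the model.
    weight, in (0, 1], plays the role of G*dt/tau.
    """
    x = np.asarray(state, float).copy()
    m = np.asarray(obs_mask, bool)
    x[m] += weight * (np.asarray(obs, float)[m] - x[m])
    return x
```

Repeated application pulls observed cells geometrically toward the observed values while leaving unobserved cells untouched, which is why approximate balance is preserved in the model.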

  15. A STUDY TO ANALYSE THE EFFICACY OF MODIFIED PILATES BASED EXERCISES AND THERAPEUTIC EXERCISES IN INDIVIDUALS WITH CHRONIC NON SPECIFIC LOW BACK PAIN: A RANDOMIZED CONTROLLED TRIAL

    OpenAIRE

    U. Albert Anand; P. Mariet Caroline; B. Arun; G. Lakshmi Gomathi

    2014-01-01

    Background: Chronic low back pain is an expensive and difficult condition to treat. Low back pain is the most common musculoskeletal symptom, experienced by 85% of individuals at some point in their lifetime. One of the interventions widely used by physiotherapists in the treatment of chronic non-specific low back pain (CNLBP) is exercise therapy based upon the Pilates principles. Objective: The purpose of the study was to find out the effect of Modified Pilates based exercises for patients with ...

  16. Risk analysis of fuel pontoons (Risico-analyse brandstofpontons)

    NARCIS (Netherlands)

    Uijt de Haag P; Post J; LSO

    2001-01-01

    To determine the risks posed by fuel pontoons in a marina, a generic risk analysis was carried out. A reference system was defined, consisting of a concrete fuel pontoon with a relatively large capacity and throughput. It is assumed that the pontoon is located in a

  17. Fracture analyses of WWER reactor pressure vessels

    International Nuclear Information System (INIS)

    Sievers, J.; Liu, X.

    1997-01-01

    In the paper, the methodology of fracture assessment based on finite element (FE) calculations is first described and compared with simplified methods. The FE-based methodology was verified by analyses of large-scale thermal shock experiments in the framework of the international comparative study FALSIRE (Fracture Analyses of Large Scale Experiments) organized by GRS and ORNL. Furthermore, selected results from fracture analyses of different WWER-type RPVs with postulated cracks under different loading transients are presented. 11 refs, 13 figs, 1 tab

  18. Fracture analyses of WWER reactor pressure vessels

    Energy Technology Data Exchange (ETDEWEB)

    Sievers, J; Liu, X [Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Koeln (Germany)

    1997-09-01

    In the paper, the methodology of fracture assessment based on finite element (FE) calculations is first described and compared with simplified methods. The FE-based methodology was verified by analyses of large-scale thermal shock experiments in the framework of the international comparative study FALSIRE (Fracture Analyses of Large Scale Experiments) organized by GRS and ORNL. Furthermore, selected results from fracture analyses of different WWER-type RPVs with postulated cracks under different loading transients are presented. 11 refs, 13 figs, 1 tab.

  19. Crystal analyser-based X-ray phase contrast imaging in the dark field: implementation and evaluation using excised tissue specimens

    International Nuclear Information System (INIS)

    Ando, Masami; Sunaguchi, Naoki; Wu, Yanlin; Do, Synho; Sung, Yongjin; Gupta, Rajiv; Louissaint, Abner; Yuasa, Tetsuya; Ichihara, Shu

    2014-01-01

    We demonstrate the soft tissue discrimination capability of X-ray dark-field imaging (XDFI) using a variety of human tissue specimens. The experimental setup for XDFI comprises an X-ray source, an asymmetrically cut Bragg-type monochromator-collimator (MC), a Laue-case angle analyser (LAA) and a CCD camera. The specimen is placed between the MC and the LAA. For the light source, we used the beamline BL14C on a 2.5-GeV storage ring in the KEK Photon Factory, Tsukuba, Japan. In the eye specimen, phase contrast images from XDFI were able to discriminate soft-tissue structures, such as the iris, separated by aqueous humour on both sides, which have nearly equal absorption. Superiority of XDFI in imaging soft tissue was further demonstrated with a diseased iliac artery containing atherosclerotic plaque and breast samples with benign and malignant tumours. XDFI on breast tumours discriminated between the normal and diseased terminal duct lobular unit and between invasive and in-situ cancer. X-ray phase, as detected by XDFI, has superior contrast over absorption for soft tissue processes such as atherosclerotic plaque and breast cancer. (orig.)

  20. CORSEN, a new software dedicated to microscope-based 3D distance measurements: mRNA-mitochondria distance, from single-cell to population analyses.

    Science.gov (United States)

    Jourdren, Laurent; Delaveau, Thierry; Marquenet, Emelie; Jacq, Claude; Garcia, Mathilde

    2010-07-01

    Recent improvements in microscopy technology allow detection of single molecules of RNA, but tools for large-scale automatic analyses of particle distributions are lacking. An increasing number of imaging studies emphasize the importance of mRNA localization in the definition of cell territory or the biogenesis of cell compartments. CORSEN is a new tool dedicated to three-dimensional (3D) distance measurements from imaging experiments especially developed to access the minimal distance between RNA molecules and cellular compartment markers. CORSEN includes a 3D segmentation algorithm allowing the extraction and the characterization of the cellular objects to be processed--surface determination, aggregate decomposition--for minimal distance calculations. CORSEN's main contribution lies in exploratory statistical analysis, cell population characterization, and high-throughput assays that are made possible by the implementation of a batch process analysis. We highlighted CORSEN's utility for the study of relative positions of mRNA molecules and mitochondria: CORSEN clearly discriminates mRNA localized to the vicinity of mitochondria from those that are translated on free cytoplasmic polysomes. Moreover, it quantifies the cell-to-cell variations of mRNA localization and emphasizes the necessity for statistical approaches. This method can be extended to assess the evolution of the distance between specific mRNAs and other cellular structures in different cellular contexts. CORSEN was designed for the biologist community with the concern to provide an easy-to-use and highly flexible tool that can be applied for diverse distance quantification issues.
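The core measurement CORSEN automates, the minimal 3D distance between each RNA particle and the nearest point of a cellular compartment, reduces to a nearest-neighbour query between two point sets. The brute-force sketch below illustrates the computation (CORSEN itself adds segmentation, surface extraction and batch statistics on top of this); a k-d tree would scale better for large object counts.

```python
import numpy as np

def min_distances(points_a, points_b):
    """Minimal 3D Euclidean distance from each point in A (e.g. mRNA
    spots) to any point in B (e.g. voxels of a mitochondrial surface).

    points_a: (n, 3) array, points_b: (m, 3) array; returns shape (n,).
    Brute force via broadcasting: an (n, m) distance matrix reduced
    with min over the B axis.
    """
    a = np.asarray(points_a, float)[:, None, :]   # (n, 1, 3)
    b = np.asarray(points_b, float)[None, :, :]   # (1, m, 3)
    return np.sqrt(((a - b) ** 2).sum(-1)).min(axis=1)
```

Pooling these per-particle minimal distances across many cells is what enables the population-level statistics (and cell-to-cell variability estimates) the record emphasizes.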

  1. Comparison Based on Exergetic Analyses of Two Hot Air Engines: A Gamma Type Stirling Engine and an Open Joule Cycle Ericsson Engine

    Directory of Open Access Journals (Sweden)

    Houda Hachem

    2015-10-01

    Full Text Available In this paper, a comparison of exergetic models of two hot air engines (a Gamma type Stirling prototype with a maximum output mechanical power of 500 W and an Ericsson hot air engine with a maximum power of 300 W) is made. Referring to previous energetic analyses, exergetic models are set up in order to quantify the exergy destruction and the efficiencies of each type of engine. The repartition of the exergy fluxes in each part of the two engines is determined and represented in Sankey diagrams, using dimensionless exergy fluxes. The results show that both engines destroy a similar proportion of the exergy flux supplied by the hot source. The compression cylinders generate the highest exergy destruction, whereas the expansion cylinders generate the lowest. The regenerator of the Stirling engine increases the exergy resource at the inlet of the expansion cylinder; a similar preheater between the exhaust air and the compressed air transferred to the hot heat exchanger might also be set up in the Ericsson engine.
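The dimensionless Sankey split described in this record follows from the exergy content of heat, Ex = Q·(1 − T0/T), and a second-law efficiency relating mechanical output to the hot-source exergy flux. The sketch below shows that textbook accounting only; the temperatures and powers in the test are illustrative, not the prototypes' data.

```python
def heat_exergy(q, t_source, t0=293.15):
    """Exergy (maximum work) content of heat flux q [W] supplied at
    t_source [K], relative to the dead-state temperature t0 [K]."""
    return q * (1.0 - t0 / t_source)

def exergy_balance(q_hot, t_hot, work_out, t0=293.15):
    """Dimensionless split of the hot-source exergy flux, as plotted in
    a Sankey diagram: second-law efficiency (recovered as work) and the
    remainder, lumping destruction and losses."""
    ex_in = heat_exergy(q_hot, t_hot, t0)
    eta_ii = work_out / ex_in          # second-law (exergetic) efficiency
    return eta_ii, 1.0 - eta_ii        # recovered vs destroyed/lost fraction
```

For instance, 1 kW of heat at twice the dead-state temperature carries 500 W of exergy, so 150 W of shaft work corresponds to a second-law efficiency of 0.3 with 0.7 of the exergy flux destroyed or lost.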

  2. Towards the development of clinical measures for spinal cord injury based on the International Classification of Functioning, Disability and Health with Rasch analyses.

    Science.gov (United States)

    Ballert, Carolina S; Stucki, Gerold; Biering-Sørensen, Fin; Cieza, Alarcos

    2014-09-01

    To determine whether the International Classification of Functioning, Disability and Health (ICF) categories relevant to spinal cord injury (SCI) can be integrated in clinical measures and to obtain insights to guide their future operationalization. Specific aims are to find out whether the ICF categories relevant to SCI fit a Rasch model taking into consideration the dimensionality found in previous investigations, local item dependencies, or differential item functioning. All second-level ICF categories collected in the Development of ICF Core Sets for SCI project in specialized centers within 15 countries from 2006 through 2008. Secondary data analysis. Adults (N=1048) with SCI from the early postacute and long-term living context. Not applicable. Two unidimensional Rasch analyses: one for the ICF categories from body functions and body structures components and another for the ICF categories from the activities and participation component. Results support good reliability and targeting of the ICF categories in both dimensions. In each dimension, few ICF categories were subject to misfit. Local item dependency was observed between ICF categories of the same chapters. Group effects for age and sex were observed only to a small extent. The validity of ICF categories to develop measures of functioning in SCI for clinical practice and research is to some extent supported. Model adjustments were suggested to further improve their operationalization and psychometrics. Copyright © 2014 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
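The Rasch analyses in this record fit ICF categories as items on a unidimensional scale. The core of the (dichotomous) Rasch model is a single logistic relation between person ability and item difficulty, sketched below for illustration; the study itself used the full analysis machinery (fit statistics, local dependency and DIF checks), not just this probability function.

```python
import math

def rasch_prob(theta, delta):
    """Dichotomous Rasch model: probability that a person with ability
    theta endorses/passes an item (here: an ICF category) of difficulty
    delta, P = exp(theta - delta) / (1 + exp(theta - delta))."""
    return 1.0 / (1.0 + math.exp(-(theta - delta)))
```

When ability matches difficulty the probability is exactly 0.5, which is what makes the logit scale a common metric for persons and items.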

  3. Static and free-vibration analyses of cracks in thin-shell structures based on an isogeometric-meshfree coupling approach

    Science.gov (United States)

    Nguyen-Thanh, Nhon; Li, Weidong; Zhou, Kun

    2018-03-01

    This paper develops a coupling approach which integrates the meshfree method and isogeometric analysis (IGA) for static and free-vibration analyses of cracks in thin-shell structures. In this approach, the domain surrounding the cracks is represented by the meshfree method while the rest domain is meshed by IGA. The present approach is capable of preserving geometry exactness and high continuity of IGA. The local refinement is achieved by adding the nodes along the background cells in the meshfree domain. Moreover, the equivalent domain integral technique for three-dimensional problems is derived from the additional Kirchhoff-Love theory to compute the J-integral for the thin-shell model. The proposed approach is able to address the problems involving through-the-thickness cracks without using additional rotational degrees of freedom, which facilitates the enrichment strategy for crack tips. The crack tip enrichment effects and the stress distribution and displacements around the crack tips are investigated. Free vibrations of cracks in thin shells are also analyzed. Numerical examples are presented to demonstrate the accuracy and computational efficiency of the coupling approach.

  4. Crystal analyser-based X-ray phase contrast imaging in the dark field: implementation and evaluation using excised tissue specimens

    Energy Technology Data Exchange (ETDEWEB)

    Ando, Masami [RIST, Tokyo University of Science, Noda, Chiba (Japan); Sunaguchi, Naoki [Gunma University, Graduate School of Engineering, Kiryu, Gunma (Japan); Wu, Yanlin [The Graduate University for Advanced Studies, Department of Materials Structure Science, School of High Energy Accelerator Science, Tsukuba, Ibaraki (Japan); Do, Synho; Sung, Yongjin; Gupta, Rajiv [Massachusetts General Hospital and Harvard Medical School, Department of Radiology, Boston, MA (United States); Louissaint, Abner [Massachusetts General Hospital and Harvard Medical School, Department of Pathology, Boston, MA (United States); Yuasa, Tetsuya [Yamagata University, Faculty of Engineering, Yonezawa, Yamagata (Japan); Ichihara, Shu [Nagoya Medical Center, Department of Pathology, Nagoya, Aichi (Japan)

    2014-02-15

    We demonstrate the soft tissue discrimination capability of X-ray dark-field imaging (XDFI) using a variety of human tissue specimens. The experimental setup for XDFI comprises an X-ray source, an asymmetrically cut Bragg-type monochromator-collimator (MC), a Laue-case angle analyser (LAA) and a CCD camera. The specimen is placed between the MC and the LAA. For the light source, we used the beamline BL14C on a 2.5-GeV storage ring in the KEK Photon Factory, Tsukuba, Japan. In the eye specimen, phase contrast images from XDFI were able to discriminate soft-tissue structures, such as the iris, separated by aqueous humour on both sides, which have nearly equal absorption. Superiority of XDFI in imaging soft tissue was further demonstrated with a diseased iliac artery containing atherosclerotic plaque and breast samples with benign and malignant tumours. XDFI on breast tumours discriminated between the normal and diseased terminal duct lobular unit and between invasive and in-situ cancer. X-ray phase, as detected by XDFI, has superior contrast over absorption for soft tissue processes such as atherosclerotic plaque and breast cancer. (orig.)

  5. Peak-flow frequency analyses and results based on data through water year 2011 for selected streamflow-gaging stations in or near Montana: Chapter C in Montana StreamStats

    Science.gov (United States)

    Sando, Steven K.; McCarthy, Peter M.; Dutton, DeAnn M.

    2016-04-05

    Chapter C of this Scientific Investigations Report documents results from a study by the U.S. Geological Survey, in cooperation with the Montana Department of Transportation and the Montana Department of Natural Resources, to provide an update of statewide peak-flow frequency analyses and results for Montana. The purpose of this report chapter is to present peak-flow frequency analyses and results for 725 streamflow-gaging stations in or near Montana based on data through water year 2011. The 725 streamflow-gaging stations included in this study represent nearly all streamflow-gaging stations in Montana (plus some from adjacent states or Canadian Provinces) that have at least 10 years of peak-flow records through water year 2011. For 29 of the 725 streamflow-gaging stations, peak-flow frequency analyses and results are reported for both unregulated and regulated conditions. Thus, peak-flow frequency analyses and results are reported for a total of 754 analyses. Estimates of peak-flow magnitudes for 66.7-, 50-, 42.9-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities are reported. These annual exceedance probabilities correspond to 1.5-, 2-, 2.33-, 5-, 10-, 25-, 50-, 100-, 200-, and 500-year recurrence intervals.
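The correspondence between annual exceedance probability (AEP) and recurrence interval quoted in this abstract is simply the reciprocal relation T = 1/AEP. A minimal sketch (function name is illustrative):

```python
def recurrence_interval(aep_percent):
    """Convert an annual exceedance probability in percent
    to a recurrence interval in years (T = 1 / AEP)."""
    return 100.0 / aep_percent

# The AEPs reported in the study and their recurrence intervals;
# e.g. a 1-percent AEP corresponds to the "100-year flood".
aeps = [66.7, 50, 42.9, 20, 10, 4, 2, 1, 0.5, 0.2]
intervals = [round(recurrence_interval(a), 2) for a in aeps]
```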

  6. Spectroscopic analyses on interaction of Amantadine-Salicylaldehyde, Amantadine-5-Chloro-Salicylaldehyde and Amantadine-o-Vanillin Schiff-Bases with bovine serum albumin (BSA)

    Science.gov (United States)

    Wang, Zhiqiu; Gao, Jingqun; Wang, Jun; Jin, Xudong; Zou, Mingming; Li, Kai; Kang, Pingli

    2011-12-01

    In this work, three Tricyclo[3.3.1.1(3,7)]decane-1-amine (Amantadine) Schiff bases, Amantadine-Salicylaldehyde (AS), Amantadine-5-Chloro-Salicylaldehyde (AS-5-C) and Amantadine-o-Vanillin (AS-o-V), were synthesized by a direct heating reflux method in ethanol solution and characterized by infrared spectroscopy and elemental analysis. Fluorescence quenching was used to study the interaction of these Amantadine Schiff bases (AS, AS-5-C and AS-o-V) with bovine serum albumin (BSA). From the fluorescence quenching calculations, the bimolecular quenching constant (Kq), apparent quenching constant (KSV), effective binding constant (KA) and corresponding dissociation constant (KD), binding site number (n) and binding distance (r) were obtained. The results show that these Amantadine Schiff bases can evidently bind to BSA molecules, with the binding strength order AS < AS-5-C = AS-o-V. Synchronous fluorescence spectroscopy reveals that these Amantadine Schiff bases adopt different ways of binding to BSA molecules: AS and AS-5-C are more accessible to the tryptophan (Trp) residues than to the tyrosine (Tyr) residues, while AS-o-V is equally close to the Tyr and Trp residues.
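The quenching constants reported above come from the standard Stern-Volmer treatment, a linear fit of F0/F against quencher concentration. A minimal sketch with synthetic data (the concentrations, lifetime, and fitted values below are illustrative, not from the paper):

```python
import numpy as np

# Stern-Volmer relation: F0/F = 1 + Ksv * [Q], with Ksv = kq * tau0.
# Synthetic quenching data (illustrative only).
q = np.array([0.0, 2e-6, 4e-6, 6e-6, 8e-6])   # quencher concentration (mol/L)
f0_over_f = 1.0 + 1.2e5 * q                    # ideal linear response

# Linear least-squares fit; the slope is the Stern-Volmer constant Ksv.
slope, intercept = np.polyfit(q, f0_over_f, 1)
ksv = slope
tau0 = 1e-8            # assumed fluorophore lifetime (s), typical order for Trp in BSA
kq = ksv / tau0        # bimolecular quenching constant (L/mol/s)
```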

  7. Regional variation of flow duration curves in the eastern United States: Process-based analyses of the interaction between climate and landscape properties

    Science.gov (United States)

    Wafa Chouaib; Peter V. Caldwell; Younes Alila

    2018-01-01

    This paper advances the physical understanding of the regional variation of the flow duration curve (FDC). It provides a process-based analysis of the interaction between climate and landscape properties to explain disparities in FDC shapes. We used (i) long-term measured flow and precipitation data over 73 catchments from the eastern US, and (ii) calibrated the...
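A flow duration curve of the kind analysed here is just the ranked flows plotted against exceedance probability. A minimal sketch using the Weibull plotting position (the daily flows are synthetic):

```python
import numpy as np

def flow_duration_curve(flows):
    """Return (exceedance probability, sorted flows) for an FDC.

    Uses the Weibull plotting position p = rank / (n + 1),
    with flows sorted from largest to smallest.
    """
    q = np.sort(np.asarray(flows, dtype=float))[::-1]   # descending
    n = q.size
    p = np.arange(1, n + 1) / (n + 1.0)                 # exceedance probability
    return p, q

# Synthetic daily flows (illustrative only)
p, q = flow_duration_curve([12.0, 3.5, 7.1, 0.9, 25.0, 5.0, 1.8])
# q[0] is the largest flow, exceeded with the smallest probability p[0]
```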

  8. Examining the Impact of a Video Case-Based Mathematics Methods Course on Secondary Pre-Service Teachers' Skills at Analysing Students' Strategies

    Science.gov (United States)

    Martinez, Mara Vanina; Superfine, Alison Castro; Carlton, Theresa; Dasgupta, Chandan

    2015-01-01

    This paper focuses on results from a study conducted with two cohorts of pre-service teachers (PSTs) in a video case-based mathematics methods course at a large Midwestern university in the US. The motivation for this study was to look beyond whether or not PSTs pay attention to mathematical thinking of students, as shown by previous studies when…

  9. Deformation Analyses and Lithologic Characterization in Overpressured Basins Based on Logging While Drilling and Wireline Results from the Gulf of Mexico

    Science.gov (United States)

    Iturrino, G. J.; Pirmez, C.; Moore, J. C.; Reichow, M. K.; Dugan, B. E.; Sawyer, D. E.; Flemings, P. B.; Shipboard Scientific Party, I.

    2005-12-01

    IODP Expedition 308 drilled transects along the Brazos-Trinity IV and Ursa Basins in the western and eastern Gulf of Mexico, respectively, to examine how sedimentation, overpressure, fluid flow, and deformation are coupled in passive margin settings. A total of eight holes were logged using either logging while drilling (LWD) or wireline techniques to evaluate the controls on slope stability, understand the timing of sedimentation and slumping, establish the petrophysical properties of shallow sediments, and provide a better understanding of turbidite systems. Overall, the log responses vary for the different lithostratigraphic units and associated regional seismic reflectors. The data acquired also make bed-to-bed correlation between sites possible, which is valuable for the study of sandy turbidites and studies of regional deformation. The thick sedimentary successions drilled at these basins record the evolution of channel-levee systems composed of low-relief channels that were incapable of confining the turbidity currents, causing an overspill of sand and silt. In addition, mass-transport deposits at shallow depths and transitions between interbedded silt, sand, and mud units are common features identified in many of the downhole logging data. In the Ursa Basin sediments, resistivity-at-the-bit images show significant deformation of the overlying hemipelagic drape and distal turbidites that were drilled in these areas. Numerous dipping beds throughout these intervals, with dips ranging from 5 to 55 degrees, confirm core observations. Steeply deformed beds, with dips as high as 65 degrees, and folded and faulted beds suggest downslope remobilization as mass-transport deposits. Resistivity images also show evidence of these mass-transport deposits where steep dips and folds suggest the presence of overturned beds within a series of cyclic intervals that we interpret as a succession of sand-silt-mud laminae. Preliminary structural analyses suggest that

  10. Insights into hydrologic and hydrochemical processes based on concentration-discharge and end-member mixing analyses in the mid-Merced River Basin, Sierra Nevada, California

    Science.gov (United States)

    Liu, Fengjing; Conklin, Martha H.; Shaw, Glenn D.

    2017-01-01

    Both the concentration-discharge relation and end-member mixing analysis were explored to elucidate the connectivity of hydrologic and hydrochemical processes, using chemical data collected during 2006-2008 at Happy Isles (468 km2), Pohono Bridge (833 km2), and Briceburg (1873 km2) in the snowmelt-fed mid-Merced River basin, augmented by chemical data collected by the USGS during 1990-2014 at Happy Isles. Concentration-discharge (C-Q) relations in streamflow were dominated by a well-defined power law, with exponent magnitudes of 0.02-0.6 and significant R2 values; concentrations were lower on rising than on falling limbs. Concentrations of conservative solutes in streamflow resulted from mixing of two end-members at Happy Isles and Pohono Bridge and three at Briceburg, with relatively constant solute concentrations in the end-members. The fractional contribution of groundwater was higher on rising than falling limbs at all basin scales. The relationship between the fractional contributions of subsurface flow and groundwater and streamflow (F-Q) followed the same relation as C-Q as a result of end-member mixing. The F-Q relation was used as a simple model to simulate subsurface flow and groundwater discharges to Happy Isles from 1990 to 2014 and was successfully validated by solute concentrations measured by the USGS. It was also demonstrated that the consistency of the F-Q and C-Q relations is applicable to other catchments where end-members and the C-Q relationships are well defined, suggesting that hydrologic and hydrochemical processes are strongly coupled and mutually predictable. Combining concentration-discharge and end-member mixing analyses could thus serve as a diagnostic tool to understand streamflow generation and hydrochemical controls in catchment hydrologic studies.
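The power-law C-Q relation described above, C = a·Q^b, is conventionally fitted by linear regression in log-log space. A minimal sketch with synthetic data (the exponent-magnitude range 0.02-0.6 is from the abstract; the numbers below are not):

```python
import numpy as np

# Power-law concentration-discharge relation: C = a * Q**b,
# fitted as log C = log a + b * log Q.
Q = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])   # discharge (synthetic)
C = 3.0 * Q ** -0.3                               # dilution-type response

b, log_a = np.polyfit(np.log(Q), np.log(C), 1)
a = np.exp(log_a)

# R^2 of the log-log fit
resid = np.log(C) - (log_a + b * np.log(Q))
r2 = 1.0 - resid.var() / np.log(C).var()
```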

  11. Biochemical and genetic analyses of the oomycete Pythium insidiosum provide new insights into clinical identification and urease-based evolution of metabolism-related traits

    Directory of Open Access Journals (Sweden)

    Theerapong Krajaejun

    2018-06-01

    The oomycete microorganism, Pythium insidiosum, causes the life-threatening infectious condition, pythiosis, in humans and animals worldwide. Affected individuals typically endure surgical removal of the infected organ(s). Detection of P. insidiosum by the established microbiological, immunological, or molecular methods is not feasible in non-reference laboratories, resulting in delayed diagnosis. Biochemical assays have been used to characterize P. insidiosum, some of which could aid in the clinical identification of this organism. Although hydrolysis of maltose and sucrose has been proposed as the key biochemical feature for discriminating P. insidiosum from other oomycetes and fungi, this technique requires a more rigorous evaluation involving a wider selection of P. insidiosum strains. Here, we evaluated 10 routinely available biochemical assays for characterization of 26 P. insidiosum strains isolated from different hosts and geographic origins. Initial assessment revealed diverse biochemical characteristics across the P. insidiosum strains tested. Failure to hydrolyze sugars was observed, especially in slow-growing strains. Because hydrolysis of maltose and sucrose varied among strains, use of these biochemical assays for identification of P. insidiosum should be treated with caution. The ability of P. insidiosum to hydrolyze urea is our focus, because this metabolic process relies on the enzyme urease, an important virulence factor of other pathogens. The ability to hydrolyze urea varied among P. insidiosum strains and was not associated with growth rates. Genome analyses demonstrated that urease- and urease accessory protein-encoding genes are present in both urea-hydrolyzing and non-urea-hydrolyzing strains of P. insidiosum. Urease genes are phylogenetically conserved in P. insidiosum and related oomycetes, while the presence of urease accessory protein-encoding genes is markedly diverse in these organisms. In summary, we dissected

  12. Taxonomic revision of Gelidium tsengii and Gelidium honghaiwanense sp. nov. (Gelidiales, Rhodophyta) from China based upon molecular and morphological data analyses

    Science.gov (United States)

    Wang, Xulei; Xia, Bangmei; Bottalico, Antonella; Wang, Guangce

    2017-11-01

    The taxonomic relationship between the Chinese Gelidium tsengii and Gelidium johnstonii has been ambiguous. For almost 20 years they were regarded as distinct taxa, until in 2002 G. johnstonii was considered a misapplied name for G. tsengii. In this study, herbarium specimens initially attributed to G. tsengii and fresh G. tsengii specimens were used to address these taxonomic issues. In phylogenetic studies, G. tsengii from Dayawan, China, near the type locality of G. tsengii, and G. johnstonii from Sonora, Mexico, the type locality of G. johnstonii, formed a monophyletic group with maximum support in rbcL and COI gene analyses, indicating that they are genetically identical. In morphological studies, G. tsengii was similar to G. johnstonii in branching pattern, inner structures and fructiferous organs. Consequently, we consider that the semi-circular outline of G. tsengii can no longer be treated as a discriminating feature. G. johnstonii has priority of publication, and according to the International Code of Botanical Nomenclature, G. tsengii is proposed as a synonym of G. johnstonii. Gelidium honghaiwanense sp. nov. is described from Guangdong, China on the basis of morphological and molecular data. Its vegetative structures are characterized by a flattened upright frond, branches regularly pinnate or alternate two to three times, and clavate ultimate branchlets. For reproductive structures, the tetrasporangial sori lie in the apical part of the branches and the tetrasporangial branchlets are distichously distributed along second-order branches. The present study clarifies the relationship between G. tsengii and G. johnstonii from Guangdong and adds a new Gelidium species to the Chinese algal flora.

  13. Analysing the Concepts of Vengeance and Hono(u)r in Shakespeare's Hamlet and Sumarokov's Gamlet: A Corpus-based Approach to Literature

    Directory of Open Access Journals (Sweden)

    Irina Keshabyan

    2009-12-01

    The present paper carries out a structural and lexical analysis of two contrasting plays, Shakespeare's Hamlet and Sumarokov's Gamlet, in a specific linguistic domain. We attempt to gain insight into two essential content words, vengeance and hono(u)r, together with their derivatives and related words, through quantitative analysis of these words and qualitative analysis of their collocates and concordances. A collocational approach is used to analyse and compare the ways the authors perceive the concepts of vengeance and hono(u)r. In general, the findings indicate important similarities and/or differences between the structures of the plays per act and between both texts' basic contents in relation to the two topics of vengeance and hono(u)r.
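The collocate counting that underlies this kind of corpus analysis can be sketched with a simple window-based counter (the function and window size below are illustrative, not the authors' actual tooling):

```python
from collections import Counter

def collocates(tokens, node, window=4):
    """Count words co-occurring with `node` within +/- `window` tokens."""
    counts = Counter()
    for i, tok in enumerate(tokens):
        if tok == node:
            lo = max(0, i - window)
            span = tokens[lo:i] + tokens[i + 1:i + 1 + window]
            counts.update(span)
    return counts

# A line from Hamlet as toy input; "be" is the node word.
text = "o from this time forth my thoughts be bloody or be nothing worth"
cooc = collocates(text.split(), "be", window=2)
```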

  14. SDMtoolbox 2.0: the next generation Python-based GIS toolkit for landscape genetic, biogeographic and species distribution model analyses

    Directory of Open Access Journals (Sweden)

    Jason L. Brown

    2017-12-01

    SDMtoolbox 2.0 is a software package for spatial studies of ecology, evolution, and genetics. The release of SDMtoolbox 2.0 allows researchers to use the most current ArcGIS and MaxEnt software, and reduces the amount of time that would be spent developing common solutions. The central aim of this software is to automate complicated and repetitive spatial analyses in an intuitive graphical user interface. One core tenet facilitates careful parameterization of species distribution models (SDMs) to maximize each model’s discriminatory ability and minimize overfitting. This includes careful processing of occurrence data, environmental data, and model parameterization. The program directly interfaces with MaxEnt, one of the most powerful and widely used species distribution modeling software programs, although SDMtoolbox 2.0 is not limited to species distribution modeling or restricted to modeling in MaxEnt. Many of the SDM pre- and post-processing tools have ‘universal’ analogs for use with any modeling software. The current version contains a total of 79 scripts that harness the power of ArcGIS for macroecology, landscape genetics, and evolutionary studies. For example, these tools allow for biodiversity quantification (such as species richness or corrected weighted endemism), generation of least-cost paths and corridors among shared haplotypes, assessment of the significance of spatial randomizations, and enforcement of dispersal limitations of SDMs projected into future climates, to name only a few of the functions contained in SDMtoolbox 2.0. Lastly, dozens of generalized tools exist for batch processing and conversion of GIS data types or formats, which are broadly useful to any ArcMap user.

  15. Voxel-based statistical analysis of cerebral blood flow using Tc-99m ECD brain SPECT in patients with traumatic brain injury: group and individual analyses.

    Science.gov (United States)

    Shin, Yong Beom; Kim, Seong-Jang; Kim, In-Ju; Kim, Yong-Ki; Kim, Dong-Soo; Park, Jae Heung; Yeom, Seok-Ran

    2006-06-01

    Statistical parametric mapping (SPM) was applied to brain perfusion single photon emission computed tomography (SPECT) images of patients with traumatic brain injury (TBI) to investigate regional cerebral abnormalities compared to age-matched normal controls. Thirteen patients with TBI who underwent brain perfusion SPECT were included in this study (10 males, three females; mean age 39.8 +/- 18.2 years, range 21-74). SPM2 software implemented in MATLAB 5.3 was used for spatial pre-processing and analysis and to determine the quantitative differences between TBI patients and age-matched normal controls. Three large voxel clusters of significantly decreased cerebral blood perfusion were found in patients with TBI. The largest cluster comprised areas including the medial frontal gyrus (voxel number 3642, peak Z-values = 4.31 and 4.27, p = 0.000) in both hemispheres. The second largest cluster comprised areas including the cingulate gyrus and anterior cingulate gyrus of the left hemisphere (voxel number 381, peak Z-values = 3.67 and 3.62, p = 0.000). Other clusters were the parahippocampal gyrus (voxel number 173, peak Z-value = 3.40, p = 0.000) and hippocampus (voxel number 173, peak Z-value = 3.23, p = 0.001) in the left hemisphere. The false discovery rate (FDR) was less than 0.04. In this study, group and individual analyses with SPM2 could clearly identify the perfusion abnormalities of brain SPECT in patients with TBI. Group analysis with SPM2 showed a hypoperfusion pattern in areas including the medial frontal gyrus of both hemispheres and the cingulate gyrus, anterior cingulate gyrus, parahippocampal gyrus and hippocampus of the left hemisphere compared to age-matched normal controls. The left parahippocampal gyrus and left hippocampus were additional hypoperfusion areas. However, these findings deserve further investigation in a larger number of patients to allow better validation of objective SPM analysis in patients with TBI.
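The false discovery rate control mentioned above (FDR < 0.04) is typically implemented with the Benjamini-Hochberg step-up procedure. A minimal sketch, independent of SPM2 (the p-values below are synthetic, not from the study):

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Return a boolean mask of p-values rejected at FDR level q
    using the Benjamini-Hochberg step-up procedure."""
    p = np.asarray(pvals, dtype=float)
    n = p.size
    order = np.argsort(p)
    ranked = p[order]
    # Find the largest k with p_(k) <= (k/n) * q; reject all smaller p-values.
    below = ranked <= (np.arange(1, n + 1) / n) * q
    reject = np.zeros(n, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])
        reject[order[:k + 1]] = True
    return reject

# Synthetic cluster-level p-values (illustrative only)
mask = benjamini_hochberg([0.0001, 0.0008, 0.001, 0.04, 0.2], q=0.05)
```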

  16. SDMtoolbox 2.0: the next generation Python-based GIS toolkit for landscape genetic, biogeographic and species distribution model analyses.

    Science.gov (United States)

    Brown, Jason L; Bennett, Joseph R; French, Connor M

    2017-01-01

    SDMtoolbox 2.0 is a software package for spatial studies of ecology, evolution, and genetics. The release of SDMtoolbox 2.0 allows researchers to use the most current ArcGIS and MaxEnt software, and reduces the amount of time that would be spent developing common solutions. The central aim of this software is to automate complicated and repetitive spatial analyses in an intuitive graphical user interface. One core tenet facilitates careful parameterization of species distribution models (SDMs) to maximize each model's discriminatory ability and minimize overfitting. This includes careful processing of occurrence data, environmental data, and model parameterization. The program directly interfaces with MaxEnt, one of the most powerful and widely used species distribution modeling software programs, although SDMtoolbox 2.0 is not limited to species distribution modeling or restricted to modeling in MaxEnt. Many of the SDM pre- and post-processing tools have 'universal' analogs for use with any modeling software. The current version contains a total of 79 scripts that harness the power of ArcGIS for macroecology, landscape genetics, and evolutionary studies. For example, these tools allow for biodiversity quantification (such as species richness or corrected weighted endemism), generation of least-cost paths and corridors among shared haplotypes, assessment of the significance of spatial randomizations, and enforcement of dispersal limitations of SDMs projected into future climates, to name only a few of the functions contained in SDMtoolbox 2.0. Lastly, dozens of generalized tools exist for batch processing and conversion of GIS data types or formats, which are broadly useful to any ArcMap user.

  17. BiodivERsA project VineDivers: Analysing interlinkages between soil biota and biodiversity-based ecosystem services in vineyards across Europe

    Science.gov (United States)

    Zaller, Johann G.; Winter, Silvia; Strauss, Peter; Querner, Pascal; Kriechbaum, Monika; Pachinger, Bärbel; Gómez, José A.; Campos, Mercedes; Landa, Blanca; Popescu, Daniela; Comsa, Maria; Iliescu, Maria; Tomoiaga, Liliana; Bunea, Claudiu-Ioan; Hoble, Adela; Marghitas, Liviu; Rusu, Teodor; Lora, Ángel; Guzmán, Gema; Bergmann, Holger

    2015-04-01

    Essential ecosystem services provided by viticultural landscapes result from diverse communities of above- and belowground organisms and their interactions. For centuries, traditional viticulture was part of a multifunctional agricultural system including low-input grasslands and fruit trees, resulting in high functional biodiversity. In recent decades, however, the intensification and mechanisation of vineyard management have separated production and conservation areas. As a result of management intensification, including frequent tilling and/or the use of pesticides, several ecosystem services are affected, leading to high rates of soil erosion, degradation of soil structure and fertility, contamination of groundwater and high levels of agricultural inputs. In this transdisciplinary BiodivERsA project we will examine to what extent vineyards under different management intensities affect the activity and diversity of soil biota (e.g. earthworms, collembola, soil microorganisms) and how this feeds back on aboveground biodiversity (e.g. weeds, pollinators). We will also investigate ecosystem services associated with soil faunal activity and biodiversity, such as soil structure, the formation of stable soil aggregates, water infiltration and soil erosion, as well as grape quality. These effects will become increasingly important as more extreme precipitation events are predicted with climate change. The socio-economic part of the project will investigate the role of diversely structured, species-rich viticultural landscapes as a cultural heritage providing aesthetic value for human well-being and recreation. The project objectives will be analysed at plot, field (vineyard) and landscape scales in vineyards located in Spain, France, Romania and Austria. A detailed engagement and dissemination plan for stakeholders at the different governance levels will accompany the scientific research and contribute to the implementation of best-practice recommendations for policy and farmers.

  18. Fungal and Prokaryotic Activities in the Marine Subsurface Biosphere at Peru Margin and Canterbury Basin Inferred from RNA-Based Analyses and Microscopy.

    Science.gov (United States)

    Pachiadaki, Maria G; Rédou, Vanessa; Beaudoin, David J; Burgaud, Gaëtan; Edgcomb, Virginia P

    2016-01-01

    The deep sedimentary biosphere, extending hundreds of meters below the seafloor, harbors unexpected diversity of Bacteria, Archaea, and microbial eukaryotes. Far less is known about microbial eukaryotes in subsurface habitats, although several studies have indicated that fungi dominate microbial eukaryotic communities, and fungal molecular signatures (of both yeasts and filamentous forms) have been detected in samples as deep as 1740 mbsf. Here, we compare and contrast fungal ribosomal RNA gene signatures and whole-community metatranscriptomes present in sediment core samples from 6 and 95 mbsf from Peru Margin site 1229A and from 12 and 345 mbsf from Canterbury Basin site U1352. The metatranscriptome analyses reveal higher relative expression of amino acid and peptide transporters in the less nutrient-rich Canterbury Basin sediments compared to the nutrient-rich Peru Margin, and higher expression of motility genes in the Peru Margin samples. Higher expression of genes associated with metal transporters and with antibiotic resistance and production was detected in Canterbury Basin sediments. A poly-A-focused metatranscriptome produced for the Canterbury Basin sample from 345 mbsf provides further evidence for active fungal communities in the subsurface, in the form of fungal-associated transcripts for metabolic and cellular processes, cell and membrane functions, and catalytic activities. Fungal communities at comparable depths at the two geographically separated locations appear dominated by distinct taxa. Differences in taxonomic composition and in the expression of genes associated with particular metabolic activities may be a function of sediment organic content as well as oceanic province. Microscopic analysis of Canterbury Basin sediment samples from 4 and 403 mbsf produced visualizations of septate fungal filaments, branching fungi, conidiogenesis, and spores. These images provide another important line of evidence supporting the occurrence and activity of fungi in