WorldWideScience

Sample records for analyser based spectromicroscope

  1. Spectromicroscopic insights for rational design of redox-based memristive devices

    Science.gov (United States)

    Baeumer, Christoph; Schmitz, Christoph; Ramadan, Amr H. H.; Du, Hongchu; Skaja, Katharina; Feyer, Vitaliy; Müller, Philipp; Arndt, Benedikt; Jia, Chun-Lin; Mayer, Joachim; de Souza, Roger A.; Schneider, Claus Michael; Waser, Rainer; Dittmann, Regina

    2015-10-01

    The demand for highly scalable, low-power devices for data storage and logic operations is strongly stimulating research into resistive switching as a novel concept for future non-volatile memory devices. To meet technological requirements, it is imperative to have a set of material design rules based on fundamental material physics, but deriving such rules is proving challenging. Here, we elucidate both switching mechanism and failure mechanism in the valence-change model material SrTiO3, and on this basis we derive a design rule for failure-resistant devices. Spectromicroscopy reveals that the resistance change during device operation and failure is indeed caused by nanoscale oxygen migration resulting in localized valence changes between Ti4+ and Ti3+. While fast reoxidation typically results in retention failure in SrTiO3, local phase separation within the switching filament stabilizes the retention. Mimicking this phase separation by intentionally introducing retention-stabilization layers with slow oxygen transport improves retention times considerably.

  2. A flange-on electron spectromicroscope with spherical deflector analyzer—simultaneous imaging of reciprocal and real spaces

    International Nuclear Information System (INIS)

    An instrumental realization of the idea for an electron emission spectromicroscope based on the newly developed imaging energy filter called α–SDA (Spherical Deflector Analyzer) is reported. Its compact design enables the realization of the flange-on spectromicroscope concept. It is equipped with two independent energy-selective imaging channels: one for real-space and another for reciprocal-space visualization. These images can be acquired quasi-simultaneously by software-based switching of the energy filter potentials. An electron gun located inside the immersion objective lens allows a new kind of sample illumination by high-energy primary electrons and thus opens a new application field for electron spectromicroscopy under laboratory conditions. - Highlights: ► A novel flange-on electron emission spectromicroscope has been developed. ► It allows quasi-simultaneous observation of real and reciprocal images at two screens. ► It utilizes a new technique for sample illumination with high-energy electrons.

  3. Spectromicroscopic characterisation of the formation of complex interfaces

    OpenAIRE

    Maier, Florian C.

    2011-01-01

    Within the framework of this thesis, the mechanisms of growth and reorganisation of surfaces within the first few layers, which are the basis for the fabrication of high-quality thin films and interfaces, were investigated. Two model systems, PTCDA/Ag(111) and CdSe/ZnSe quantum dots (QD), were chosen to study such processes in detail and, at the same time, to demonstrate the power and improvements of the aberration-corrected spectromicroscope SMART [1]. The measurements benefit especially from the en...

  4. Spectro-microscopic measurements of carbonaceous aerosol aging in Central California

    Directory of Open Access Journals (Sweden)

    R. C. Moffet

    2013-04-01

    Full Text Available Carbonaceous aerosols are responsible for large uncertainties in climate models, degraded visibility, and adverse health effects. The Carbonaceous Aerosols and Radiative Effects Study (CARES) was designed to study carbonaceous aerosols in the natural environment of the Central Valley, California, and learn more about their atmospheric formation and aging. This paper presents results from spectro-microscopic measurements of carbonaceous particles collected during CARES at the time of a pollution accumulation event (27–29 June 2010), when in situ measurements indicated an increase in the organic carbon content of aerosols as the Sacramento urban plume aged. Computer controlled scanning electron microscopy coupled with an energy dispersive X-ray detector (CCSEM/EDX) and scanning transmission X-ray microscopy coupled with near edge X-ray absorption spectroscopy (STXM/NEXAFS) were used to probe the chemical composition and morphology of individual particles. It was found that the mass of organic carbon on individual particles increased through condensation of secondary organic aerosol. STXM/NEXAFS indicated that the number fraction of homogeneous organic particles lacking inorganic inclusions (greater than ~50 nm diameter) increased with plume age, as did the organic mass per particle. Comparison of the CARES spectro-microscopic data set with a similar dataset obtained in Mexico City during the MILAGRO campaign showed that individual particles in Mexico City contained twice as much carbon as those sampled during CARES. The number fraction of soot particles at the Mexico City urban site (30%) was larger than at the CARES urban site (10%), and the most aged samples from CARES contained fewer carbon–carbon double bonds. Differences between carbonaceous particles in Mexico City and California result from different sources, photochemical conditions, gas phase reactants, and secondary organic aerosol precursors. The detailed results provided by these spectro-microscopic…

  5. Spectro-microscopic measurements of carbonaceous aerosol aging in Central California

    Directory of Open Access Journals (Sweden)

    R. C. Moffet

    2013-10-01

    Full Text Available Carbonaceous aerosols are responsible for large uncertainties in climate models, degraded visibility, and adverse health effects. The Carbonaceous Aerosols and Radiative Effects Study (CARES) was designed to study carbonaceous aerosols in the natural environment of the Central Valley, California, and learn more about their atmospheric formation and aging. This paper presents results from spectro-microscopic measurements of carbonaceous particles collected during CARES at the time of a pollution accumulation event (27–29 June 2010), when in situ measurements indicated an increase in the organic carbon content of aerosols as the Sacramento urban plume aged. Computer-controlled scanning electron microscopy coupled with an energy dispersive X-ray detector (CCSEM/EDX) and scanning transmission X-ray microscopy coupled with near-edge X-ray absorption spectroscopy (STXM/NEXAFS) were used to probe the chemical composition and morphology of individual particles. It was found that the mass of organic carbon on individual particles increased through condensation of secondary organic aerosol. STXM/NEXAFS indicated that the number fraction of homogeneous organic particles lacking inorganic inclusions (greater than ~50 nm equivalent circular diameter) increased with plume age, as did the organic mass per particle. Comparison of the CARES spectro-microscopic dataset with a similar dataset obtained in Mexico City during the MILAGRO campaign showed that fresh particles in Mexico City contained three times as much carbon as those sampled during CARES. The number fraction of soot particles at the Mexico City urban site (ranging from 16.6 to 47.3%) was larger than at the CARES urban site (13.4–15.7%), and the most aged samples from CARES contained fewer carbon–carbon double bonds. Differences between carbonaceous particles in Mexico City and California result from different sources, photochemical conditions, gas phase reactants, and secondary organic aerosol…

  6. Masonry: Task Analyses. Competency-Based Education.

    Science.gov (United States)

    Henrico County Public Schools, Glen Allen, VA. Virginia Vocational Curriculum Center.

    These task analyses are designed to be used in combination with the "Trade and Industrial Education Service Area Resource" in order to implement competency-based education in the masonry program in Virginia. The task analysis document contains the task inventory, suggested task sequence lists, and content outlines for the secondary courses Masonry…

  7. Cosmetology: Task Analyses. Competency-Based Education.

    Science.gov (United States)

    Henrico County Public Schools, Glen Allen, VA. Virginia Vocational Curriculum Center.

    These task analyses are designed to be used in combination with the "Trade and Industrial Education Service Area Resource" in order to implement competency-based education in the cosmetology program in Virginia. The task analysis document contains the task inventory, suggested task sequence lists, and content outlines for the secondary courses…

  8. Understanding Human Error Based on Automated Analyses

    Data.gov (United States)

    National Aeronautics and Space Administration — This is a report on a continuing study of automated analyses of experiential textual reports to gain insight into the causal factors of human errors in aviation...

  9. Training Residential Staff to Conduct Trial-Based Functional Analyses

    Science.gov (United States)

    Lambert, Joseph M.; Bloom, Sarah E.; Kunnavatana, S. Shanun; Collins, Shawnee D.; Clay, Casey J.

    2013-01-01

    We taught 6 supervisors of a residential service provider for adults with developmental disabilities to train 9 house managers to conduct trial-based functional analyses. Effects of the training were evaluated with a nonconcurrent multiple baseline. Results suggest that house managers can be trained to conduct trial-based functional analyses with…

  10. In service monitoring based on fatigue analyses, possibilities and limitations

    International Nuclear Information System (INIS)

    German LWR reactors are equipped with monitoring systems which are to enable a comparison of real transients with load case catalogues and fatigue catalogues for fatigue analyses. The information accuracy depends on the accuracy of measurements, on the consideration of parameters influencing fatigue (medium, component surface, component size, etc.), and on the accuracy of the load analyses. The contribution attempts a critical evaluation, also in view of the fact that real fatigue damage is often impossible to quantify on the basis of fatigue analyses at a later stage. The effects of the consideration or non-consideration of various influencing factors are discussed, as well as the consequences of the scatter of material characteristics on which the analyses are based. Possible measures to be taken in operational monitoring are derived. (orig.)

  11. A bromine-based dichroic X-ray polarization analyser

    CERN Document Server

    Collins, S P; Brown, S D; Thompson, P

    2001-01-01

    We have demonstrated the advantages offered by dichroic X-ray polarization filters for linear polarization analysis, and describe such a device, based on a dibromoalkane/urea inclusion compound. The polarizer has been successfully tested by analysing the polarization of magnetic diffraction from holmium.

  12. The Increasing Importance of Gene-Based Analyses

    Science.gov (United States)

    Cirulli, Elizabeth T.

    2016-01-01

    In recent years, genome and exome sequencing studies have implicated a plethora of new disease genes with rare causal variants. Here, I review 150 exome sequencing studies that claim to have discovered that a disease can be caused by different rare variants in the same gene, and I determine whether their methods followed the current best-practice guidelines in the interpretation of their data. Specifically, I assess whether studies appropriately assess controls for rare variants throughout the entire gene or implicated region as opposed to only investigating the specific rare variants identified in the cases, and I assess whether studies present sufficient co-segregation data for statistically significant linkage. I find that the proportion of studies performing gene-based analyses has increased with time, but that even in 2015 fewer than 40% of the reviewed studies used this method, and only 10% presented statistically significant co-segregation data. Furthermore, I find that the genes reported in these papers are explaining a decreasing proportion of cases as the field moves past most of the low-hanging fruit, with 50% of the genes from studies in 2014 and 2015 having variants in fewer than 5% of cases. As more studies focus on genes explaining relatively few cases, the importance of performing appropriate gene-based analyses is increasing. It is becoming increasingly important for journal editors and reviewers to require stringent gene-based evidence to avoid an avalanche of misleading disease gene discovery papers. PMID:27055023

  13. Microstructural and compositional analyses of GaN-based nanostructures

    Energy Technology Data Exchange (ETDEWEB)

    Pretorius, Angelika; Mueller, Knut; Rosenauer, Andreas [Section Electron Microscopy, Institute of Solid State Physics, University of Bremen, Otto-Hahn-Allee 1, 28359 Bremen (Germany); Schmidt, Thomas; Falta, Jens [Section Surface Physics, Institute of Solid State Physics, University of Bremen, Otto-Hahn-Allee 1, 28359 Bremen (Germany); Aschenbrenner, Timo; Yamaguchi, Tomohiro; Dartsch, Heiko; Hommel, Detlef [Section Semiconductor Epitaxy, Institute of Solid State Physics, University of Bremen, Otto-Hahn-Allee 1, 28359 Bremen (Germany); Kuebel, Christian [Institute of Nanotechnology, Karlsruher Institute of Technology, Hermann-von-Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen (Germany)

    2011-08-15

    Composition and microstructure of GaN-based island structures and distributed Bragg reflectors (DBRs) were investigated with transmission electron microscopy (TEM). We analysed free-standing InGaN islands and islands capped with GaN. Growth of the islands performed by molecular beam epitaxy (MBE) and metal organic vapour phase epitaxy (MOVPE) resulted in different microstructures. The islands grown by MBE were plastically relaxed. Cap layer deposition resulted in a rapid dissolution of the islands already at early stages of cap layer growth. These findings are confirmed by grazing-incidence X-ray diffraction (GIXRD). In contrast, the islands grown by MOVPE relax only elastically. Strain state analysis (SSA) revealed that the indium concentration increases towards the tips of the islands. For an application as quantum dots, the islands must be embedded into DBRs. Structure and composition of AlyGa1-yN/GaN Bragg reflectors on top of an AlGaN buffer layer and InxAl1-xN/GaN Bragg reflectors on top of a GaN buffer layer were investigated. Specifically, structural defects such as threading dislocations (TDs) and inversion domains (IDs) were studied, and we investigated thicknesses, interfaces and interface roughnesses of the layers. As the peak reflectivities of the investigated DBRs do not reach the theoretical predictions, possible reasons are discussed. (Copyright 2011 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  14. Analyser-based x-ray imaging for biomedical research

    International Nuclear Information System (INIS)

    Analyser-based imaging (ABI) is one of several phase-contrast x-ray imaging techniques being pursued at synchrotron radiation facilities. With advancements in compact source technology, there is a possibility that ABI will become a clinical imaging modality. This paper presents the history of ABI as it has developed from its laboratory source to synchrotron imaging. The fundamental physics of phase-contrast imaging is presented both in a general sense and specifically for ABI. The technology is dependent on the use of perfect crystal monochromator optics. The theory of the x-ray optics is developed and presented in a way that will allow optimization of the imaging for specific biomedical systems. The advancement of analytical algorithms to produce separate images of the sample absorption, refraction angle map and small-angle x-ray scattering is detailed. Several detailed applications to biomedical imaging are presented to illustrate the broad range of systems and body sites studied preclinically to date: breast, cartilage and bone, soft tissue and organs. Ultimately, the application of ABI in clinical imaging will depend partly on the availability of compact sources with sufficient x-ray intensity comparable with that of the current synchrotron environment. (paper)

  15. Analysing Janusz Kaminski's cinematography based on selected movies

    OpenAIRE

    Rumanóczki, Péter

    2015-01-01

    The aim of this thesis was to analyse the artistic and technical points of the work of Polish cinematographer Janusz Kaminski. Three movies were selected from his filmography, all of them directed by Steven Spielberg. The movies were analysed with the intention of finding the trademarks that Kaminski used during his career. In addition, the technical details of making movies, such as what a crew does and what equipment it uses, were studied.

  16. Array-based GNSS Ionospheric Sensing: Estimability and Precision Analyses

    Science.gov (United States)

    Teunissen, Peter

    2016-04-01

    PJG Teunissen (1,2), A Khodabandeh (1) and B Zhang (1); (1) GNSS Research Centre, Curtin University, Perth, Australia; (2) Geoscience and Remote Sensing, Delft University of Technology, The Netherlands. Introduction: The Global Navigation Satellite Systems (GNSS) have proved to be an effective means of measuring the Earth's ionosphere. The well-known geometry-free linear combinations of the GNSS data serve as the input of an external ionospheric model to capture both the spatial and temporal characteristics of the ionosphere. Next to the slant ionospheric delays experienced by the GNSS antennas, the geometry-free combinations also contain additional unknown delays that are caused by the presence of the carrier-phase ambiguous cycles and/or the code instrumental delays. That the geometry-free combinations, without an external ionospheric model, cannot unbiasedly determine the slant ionospheric delays reveals the lack of information content in the GNSS data. Motivation and objectives: With the advent of modernized multi-frequency signals, one is confronted with many different combinations of the GNSS data that are capable of sensing the ionosphere. Owing to such diversity and the lack of information content in the GNSS data, various estimable ionospheric delays of different interpretations (and of different precision) can therefore be formed. How such estimable ionospheric delays should be interpreted and the extent to which they contribute to the precision of the unbiased slant ionosphere are the topics of this contribution. Approach and results: In this contribution, we apply S-system theory to study the estimability and precision of the estimable slant ionospheric delays that are measured by the multi-frequency GNSS data. Two different S-systems are presented, leading to two different estimable parameters of different precision: 1) the phase-driven ionospheric delays and 2) the code-driven ionospheric delays
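
    For orientation, the dual-frequency geometry-free combination referred to in this abstract can be written in standard textbook notation (general GNSS background, not material taken from this contribution):

        \phi_{GF} = \phi_1 - \phi_2 = (\mu_2 - 1)\,\iota + \lambda_1 N_1 - \lambda_2 N_2
        p_{GF}    = p_2 - p_1       = (\mu_2 - 1)\,\iota + d_{GF}, \qquad \mu_j = f_1^2 / f_j^2

    where \iota is the slant ionospheric delay on the first frequency, N_j are the carrier-phase ambiguities and d_GF collects the receiver and satellite code instrumental delays. Because \iota never appears without these extra unknowns, only biased (S-basis-dependent) ionospheric parameters are estimable from the data alone, which is the estimability question the contribution addresses.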

  17. A Web-based Tool Combining Different Type Analyses

    DEFF Research Database (Denmark)

    Henriksen, Kim Steen; Gallagher, John Patrick

    2006-01-01

    both, and they can be goal-dependent or goal-independent. We describe a prototype tool that can be accessed from a web browser, allowing various type analyses to be run. The first goal of the tool is to allow the analysis results to be examined conveniently by clicking on points in the original program...... minimal "domain model" of the program with respect to the corresponding pre-interpretation, which can give more precise information than the original descriptive type....

  18. An Apple II -based bidimensional pulse height analyser

    International Nuclear Information System (INIS)

    The implementation of a pulse height analyser function in an Apple II microcomputer using minimal purpose built hardware is described. Except for a small interface module the system consists of two suites of software, one giving a conventional one dimensional analysis on a span of 1024 channels, and the other a two dimensional analysis on a 128 x 128 image format. Using the recently introduced ACCELERATOR coprocessor card the system performs with a dead time per event of less than 50 μs. Full software facilities are provided for display, storage and processing of the data using standard Applesoft BASIC. (author)
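
    The core software task of such an analyser, accumulating a 1024-channel pulse-height histogram from digitized detector pulses, can be sketched in a few lines (an illustrative Python sketch only; the original system used Applesoft BASIC and purpose-built interface hardware):

        import numpy as np

        # Hypothetical stream of digitized pulse amplitudes from a 10-bit ADC (0..1023).
        rng = np.random.default_rng(0)
        pulses = rng.normal(loc=512, scale=40, size=100_000).clip(0, 1023).astype(int)

        # One-dimensional pulse height analysis: count events per channel.
        spectrum, _ = np.histogram(pulses, bins=1024, range=(0, 1024))
        print(spectrum.argmax(), spectrum.max())  # peak channel and its counts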

  19. Kernel based eigenvalue-decomposition methods for analysing ham

    DEFF Research Database (Denmark)

    Christiansen, Asger Nyman; Nielsen, Allan Aasbjerg; Møller, Flemming;

    2010-01-01

    Every consumer wants fresh ham, and the way we decide whether the meat is fresh or not is by looking at the color. The producers of ham want a long shelf life, meaning they want the ham to look fresh for a long time. The Danish company Danisco is therefore trying to develop optimal storing conditions and to find useful additives that hinder the color from changing rapidly. To be able to prove which methods of storing and which additives work, Danisco wants to monitor the development of the color of meat in a slice of ham as a function of time, environment and ingredients. We have chosen to use multi… …methods, such as PCA, MAF or MNF. We therefore investigated the applicability of kernel-based versions of these transformations. This meant implementing the kernel-based methods and developing new theory, since kernel-based MAF and MNF are not yet described in the literature. The traditional methods only…
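
    As a rough illustration of the kernel-based transformations discussed above, the following sketch applies kernel PCA to hypothetical multispectral pixel data using scikit-learn (the kernel MAF/MNF variants developed in the work are not shown here):

        import numpy as np
        from sklearn.decomposition import KernelPCA

        # Hypothetical multispectral image of a ham slice: 50x50 pixels, 10 spectral
        # bands, flattened to an (n_pixels, n_bands) matrix.
        rng = np.random.default_rng(1)
        X = rng.random((50 * 50, 10))

        # Kernel PCA with an RBF kernel; the leading components summarise the dominant
        # (possibly non-linear) colour variation across pixels.
        kpca = KernelPCA(n_components=3, kernel="rbf", gamma=1.0)
        scores = kpca.fit_transform(X)
        print(scores.shape)  # (2500, 3)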

  20. Exploring clinical associations using '-omics' based enrichment analyses.

    Directory of Open Access Journals (Sweden)

    David A Hanauer

    Full Text Available BACKGROUND: The vast amounts of clinical data collected in electronic health records (EHR) are analogous to the data explosion from the "-omics" revolution. In the EHR, clinicians often maintain patient-specific problem summary lists which are used to provide a concise overview of significant medical diagnoses. We hypothesized that by tapping into the collective wisdom generated by hundreds of physicians entering problems into the EHR we could detect significant associations among diagnoses that are not described in the literature. METHODOLOGY/PRINCIPAL FINDINGS: We employed an analytic approach originally developed for detecting associations between sets of gene expression data, called Molecular Concept Map (MCM), to find significant associations among the 1.5 million clinical problem summary list entries in 327,000 patients from our institution's EHR. An odds ratio (OR) and p-value were calculated for each association. A subset of the 750,000 associations found were explored using the MCM tool. Expected associations were confirmed and recently reported but poorly known associations were uncovered. Novel associations which may warrant further exploration were also found. Examples of expected associations included non-insulin dependent diabetes mellitus and various diagnoses such as retinopathy, hypertension, and coronary artery disease. A recently reported association included irritable bowel and vulvodynia (OR 2.9, p = 5.6x10^-4). Associations that are currently unknown or very poorly known included those between granuloma annulare and osteoarthritis (OR 4.3, p = 1.1x10^-4) and pyloric stenosis and ventricular septal defect (OR 12.1, p = 2.0x10^-3). CONCLUSIONS/SIGNIFICANCE: Computer programs developed for analyses of "-omic" data can be successfully applied to the area of clinical medicine. The results of the analysis may be useful for hypothesis generation as well as supporting clinical care by reminding clinicians of likely problems associated with a…
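
    As a small illustration of the association statistic reported above, the odds ratio and p-value for one diagnosis pair can be computed from a 2x2 co-occurrence table (hypothetical counts, not data from the study):

        import numpy as np
        from scipy.stats import fisher_exact

        # Hypothetical 2x2 table of patient counts:
        # rows = diagnosis A present / absent, columns = diagnosis B present / absent.
        table = np.array([[40, 960],
                          [300, 326_000]])

        odds_ratio, p_value = fisher_exact(table)
        print(f"OR = {odds_ratio:.1f}, p = {p_value:.2g}")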

  1. Design and Analyses of a MEMS Based Resonant Magnetometer.

    Science.gov (United States)

    Ren, Dahai; Wu, Lingqi; Yan, Meizhi; Cui, Mingyang; You, Zheng; Hu, Muzhi

    2009-01-01

    A novel design of a MEMS torsional resonant magnetometer based on Lorentz force is presented and fabricated. The magnetometer consists of a silicon resonator, torsional beam, excitation coil, capacitance plates and glass substrate. Working in a resonant condition, the sensor's vibration amplitude is converted into the sensing capacitance change, which reflects the outside magnetic flux-density. Based on the simulation, the key structure parameters are optimized and the air damping effect is estimated. The test results of the prototype are in accordance with the simulation results of the designed model. The resolution of the magnetometer can reach 30 nT. The test results indicate its sensitivity of more than 400 mV/μT when operating in a 10 Pa vacuum environment. PMID:22399981

  2. Design and Analyses of a MEMS Based Resonant Magnetometer

    Directory of Open Access Journals (Sweden)

    Dahai Ren

    2009-09-01

    Full Text Available A novel design of a MEMS torsional resonant magnetometer based on Lorentz force is presented and fabricated. The magnetometer consists of a silicon resonator, torsional beam, excitation coil, capacitance plates and glass substrate. Working in a resonant condition, the sensor’s vibration amplitude is converted into the sensing capacitance change, which reflects the outside magnetic flux-density. Based on the simulation, the key structure parameters are optimized and the air damping effect is estimated. The test results of the prototype are in accordance with the simulation results of the designed model. The resolution of the magnetometer can reach 30 nT. The test results indicate its sensitivity of more than 400 mV/μT when operating in a 10 Pa vacuum environment.

  3. Mass spectrometry-based proteomic analyses of contact lens deposition

    OpenAIRE

    Green-Church, Kari B.; Nichols, Jason J.

    2008-01-01

    Purpose The purpose of this report is to describe the contact lens deposition proteome associated with two silicone hydrogel contact lenses and care solutions using a mass spectrometry-based approach. Methods This was a randomized, controlled, examiner-masked crossover clinical trial that included 48 participants. Lenses and no-rub care solutions evaluated included galyfilcon A (Acuvue Advance, Vistakon Inc., Jacksonville, FL), lotrafilcon B (O2 Optix, CIBA Vision Inc., Duluth, GA), AQuify (...

  4. Phylogenomic analyses of bat subordinal relationships based on transcriptome data.

    Science.gov (United States)

    Lei, Ming; Dong, Dong

    2016-01-01

    Bats, order Chiroptera, are one of the largest monophyletic clades in mammals. Based on morphology and behaviour, bats were once differentiated into two suborders, Megachiroptera and Microchiroptera. Recently, researchers proposed alternative views of chiropteran classification (suborders Yinpterochiroptera and Yangochiroptera) based on morphological, molecular and fossil evidence. Since genome-scale data can significantly increase the number of informative characters for analysis, transcriptome RNA-seq data for 12 bat taxa were generated in an attempt to resolve bat subordinal relationships at the genome level. Phylogenetic reconstructions were conducted using up to 1470 orthologous genes and 634,288 aligned sites. We found strong support for the Yinpterochiroptera-Yangochiroptera classification. Next, we built expression distance matrices for each species and reconstructed gene expression trees. The tree is highly consistent with the sequence-based phylogeny. We also examined the influence of taxa sampling on the performance of phylogenetic methods, and found that the topology is robust to sampling. A relaxed molecular clock estimates the divergence between Yinpterochiroptera and Yangochiroptera around 63 million years ago. The most recent common ancestor of Yinpterochiroptera, corresponding to the split between Rhinolophoidea and Pteropodidae (Old World Fruit bats), is estimated to have occurred 60 million years ago. Our work provides a valuable resource to further explore the evolutionary relationships within bats. PMID:27291671

  5. DNA-energetics-based analyses suggest additional genes in prokaryotes

    Indian Academy of Sciences (India)

    Garima Khandelwal; Jalaj Gupta; B Jayaram

    2012-07-01

    We present here a novel methodology for predicting new genes in prokaryotic genomes on the basis of inherent energetics of DNA. Regions of higher thermodynamic stability were identified, which were filtered based on already known annotations to yield a set of potentially new genes. These were then processed for their compatibility with the stereo-chemical properties of proteins and tripeptide frequencies of proteins in Swissprot data, which results in a reliable set of new genes in a genome. Quite surprisingly, the methodology identifies new genes even in well-annotated genomes. Also, the methodology can handle genomes of any GC-content, size and number of annotated genes.

  6. Design of the storage location based on the ABC analyses

    Science.gov (United States)

    Jemelka, Milan; Chramcov, Bronislav; Kříž, Pavel

    2016-06-01

    The paper focuses on process efficiency and saving storage costs. Maintaining inventory through a putaway strategy takes personnel time and costs money. The aim is to control inventory in the best way. The ABC classification, based on Vilfredo Pareto's theory, is used for the design of the warehouse layout. The new design of storage locations reduces the travel distance of fork-lift trucks and total costs, and it increases inventory process efficiency. The suggested solutions and evaluation of achieved results are described in detail. Proposed solutions were realized in real warehouse operation.
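
    The underlying ABC classification can be illustrated with a short sketch (hypothetical stock-keeping-unit values; the 80%/95% cumulative-value cut-offs are the conventional Pareto-style choice, not values taken from the paper):

        import numpy as np

        # Hypothetical annual usage value per SKU (units moved x unit cost).
        values = np.array([12000, 8000, 5000, 2500, 1500, 900, 600, 300, 150, 50], dtype=float)

        order = np.argsort(values)[::-1]                  # rank SKUs by value, descending
        cum_share = np.cumsum(values[order]) / values.sum()

        # A items carry roughly the top 80% of value, B the next 15%, C the rest.
        classes = np.where(cum_share <= 0.80, "A", np.where(cum_share <= 0.95, "B", "C"))
        for sku, cls in zip(order, classes):
            print(f"SKU {sku}: class {cls}")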

  7. Hilbert transform based analyses on ship-rocking signals

    CERN Document Server

    Huang, Wei; Kang, Deyong; Chen, Zhi

    2015-01-01

    Ship rocking is a crucial factor which affects the accuracy of ocean-based flight vehicle measurement. Here we have analyzed four groups of ship-rocking time series in horizontal and vertical directions utilizing a Hilbert-transform-based method from statistical physics. Our method gives a way to construct an analytic signal on the two-dimensional plane from a one-dimensional time series. The analytic signal shares the complete properties of the original time series. From the analytic signal of a time series, we have found some information about the original time series which is often hidden from the view of the conventional methods. The analytic signals of interest usually evolve very smoothly on the complex plane. In addition, the phase of the analytic signal usually moves linearly in time. From the auto-correlation and cross-correlation functions of the original signals, as well as the instantaneous amplitudes and phase increments of the analytic signals, we have found that the ship-rocking in horizontal directi...
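
    A minimal sketch of the analytic-signal construction described above, using SciPy's Hilbert transform on a hypothetical roll-angle record (illustrative only, not the authors' data or code):

        import numpy as np
        from scipy.signal import hilbert

        # Hypothetical ship-rocking record: a slow roll oscillation plus noise, sampled at 10 Hz.
        fs = 10.0
        t = np.arange(0.0, 600.0, 1.0 / fs)
        x = 2.0 * np.sin(2 * np.pi * 0.08 * t) + 0.3 * np.random.default_rng(2).standard_normal(t.size)

        z = hilbert(x)                      # analytic signal x + i*H[x]
        amplitude = np.abs(z)               # instantaneous amplitude (envelope)
        phase = np.unwrap(np.angle(z))      # instantaneous phase
        increments = np.diff(phase)         # roughly constant if the phase grows linearly in time
        print(amplitude.mean(), increments.mean())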

  8. Activity-based analyses lead to better decision making.

    Science.gov (United States)

    Player, S

    1998-08-01

    Activity-based costing (ABC) and activity-based management (ABM) are cost-management tools that are relatively new to the healthcare industry. ABC is used for strategic decision making. It assesses the costs associated with specific activities and resources and links those costs to specific internal and external customers of the healthcare enterprise (e.g., patients, service lines, and physician groups) to determine the costs associated with each customer. This cost information then can be adjusted to account for anticipated changes and to predict future costs. ABM, on the other hand, supports operations by focusing on the causes of costs and how costs can be reduced. It assesses cost drivers that directly affect the cost of a product or service, and uses performance measures to evaluate the financial or nonfinancial benefit an activity provides. By identifying each cost driver and assessing the value the element adds to the healthcare enterprise, ABM provides a basis for selecting areas that can be changed to reduce costs. PMID:10182280

  9. Quantitative metagenomic analyses based on average genome size normalization

    DEFF Research Database (Denmark)

    Frank, Jeremy Alexander; Sørensen, Søren Johannes

    2011-01-01

    Over the past quarter-century, microbiologists have used DNA sequence information to aid in the characterization of microbial communities. During the last decade, this has expanded from single genes to microbial community genomics, or metagenomics, in which the gene content of an environment can...... provide not just a census of the community members but direct information on metabolic capabilities and potential interactions among community members. Here we introduce a method for the quantitative characterization and comparison of microbial communities based on the normalization of metagenomic data...... by estimating average genome sizes. This normalization can relieve comparative biases introduced by differences in community structure, number of sequencing reads, and sequencing read lengths between different metagenomes. We demonstrate the utility of this approach by comparing metagenomes from two different...
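
    A minimal sketch of the normalization idea, assuming the simple form "scale gene counts by the number of genome equivalents sequenced" (the paper's exact procedure may differ):

        import numpy as np

        # Hypothetical data: raw hit counts for three gene families in two metagenomes,
        # total base pairs sequenced, and estimated average genome size (bp) per community.
        counts = np.array([[120.0, 45.0, 300.0],   # metagenome 1
                           [ 80.0, 90.0, 150.0]])  # metagenome 2
        total_bp = np.array([2.0e9, 1.2e9])
        avg_genome_size = np.array([4.5e6, 2.8e6])

        # Genome equivalents: how many "average genomes" worth of sequence each metagenome contains.
        genome_equivalents = total_bp / avg_genome_size

        # Abundance per genome equivalent is comparable across communities that differ in
        # structure, number of reads and average genome size.
        normalized = counts / genome_equivalents[:, None]
        print(normalized)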

  10. Single-cell-based sensors and synchrotron FTIR spectroscopy: a hybrid system towards bacterial detection.

    Science.gov (United States)

    Veiseh, Mandana; Veiseh, Omid; Martin, Michael C; Bertozzi, Carolyn; Zhang, Miqin

    2007-09-30

    Microarrays of single macrophage cell-based sensors were developed and demonstrated for potential real-time bacterium detection by synchrotron FTIR microscopy. The cells were patterned on gold electrodes of silicon oxide substrates by a surface engineering technique, in which the gold electrodes were immobilized with fibronectin to mediate cell adhesion and the silicon oxide background was passivated with polyethylene glycol (PEG) to resist protein adsorption and cell adhesion. Cell morphology and IR spectra of single, double, and triple cells on gold electrodes exposed to lipopolysaccharide (LPS) of different concentrations were compared to reveal the detection capability of this cell-based sensing platform. The single-cell-based system was found to generate the most significant and consistent IR spectrum shifts upon exposure to LPS, thus providing the highest detection sensitivity. Changes in cell morphology and IR shifts upon cell exposure to LPS were found to be dependent on the LPS concentration and exposure time, which established a method for the identification of LPS concentration and infected cell population. The possibility of using this single-cell system with conventional IR spectroscopy, as well as its limitations, was investigated by comparing IR spectra of single-cell arrays with gold electrode surface areas of 25, 100, and 400 μm2 using both synchrotron and conventional FTIR spectromicroscopes. This cell-based platform may potentially provide real-time, label-free, and rapid bacterial detection, allow for high-throughput statistical analyses, and offer portability. PMID:17560777

  11. SOCR Analyses – an Instructional Java Web-based Statistical Analysis Toolkit

    OpenAIRE

    Chu, Annie; Cui, Jenny; Ivo D. Dinov

    2009-01-01

    The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses i...

  12. A New Optimization Method for Centrifugal Compressors Based on 1D Calculations and Analyses

    OpenAIRE

    Pei-Yuan Li; Chu-Wei Gu; Yin Song

    2015-01-01

    This paper presents an optimization design method for centrifugal compressors based on one-dimensional calculations and analyses. It consists of two parts: (1) centrifugal compressor geometry optimization based on one-dimensional calculations and (2) matching optimization of the vaned diffuser with an impeller based on the required throat area. A low pressure stage centrifugal compressor in a MW level gas turbine is optimized by this method. One-dimensional calculation results show that D3/D...

  13. A Framework for Analysing Textbooks Based on the Notion of Abstraction

    Science.gov (United States)

    Yang, Kai-Lin

    2013-01-01

    Abstraction is a key adaptive mechanism of human cognition and an essential process in the personal construction of mathematical knowledge. Based on the notion of abstraction, this paper aims to conceptualise a framework for analysing textbooks. First, I search for the meaning of abstraction from a constructive-empirical and a dialectic…

  14. Use of Tree-Based Regression in the Analyses of L2 Reading Test Items

    Science.gov (United States)

    Gao, Lingyun; Rogers, W. Todd

    2011-01-01

    The purpose of this study was to explore whether the results of Tree Based Regression (TBR) analyses, informed by a validated cognitive model, would enhance the interpretation of item difficulties in terms of the cognitive processes involved in answering the reading items included in two forms of the Michigan English Language Assessment Battery…

  15. Real-time Bacterial Detection by Single Cell Based Sensors UsingSynchrotron FTIR Spectromicroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Veiseh, Mandana; Veiseh, Omid; Martin, Michael C.; Bertozzi, Carolyn; Zhang, Miqin

    2005-08-10

    Microarrays of single macrophage cell-based sensors were developed and demonstrated for real-time bacterium detection by synchrotron FTIR microscopy. The cells were patterned on gold-SiO2 substrates via a surface engineering technique by which the gold electrodes were immobilized with fibronectin to mediate cell adhesion and the silicon oxide background was passivated with PEG to resist protein adsorption and cell adhesion. Cellular morphology and IR spectra of single, double, and triple cells on gold electrodes exposed to lipopolysaccharide (LPS) of different concentrations were compared to reveal the detection capabilities of these biosensors. The single-cell based sensors were found to generate the most significant IR wave number variation and thus provide the highest detection sensitivity. Changes in morphology and IR spectrum for single cells exposed to LPS were found to be time- and concentration-dependent and correlated with each other very well. FTIR spectra from single cell arrays of gold electrodes with surface areas of 25 μm2, 100 μm2, and 400 μm2 were acquired using both synchrotron and conventional FTIR spectromicroscopes to study the sensitivity of detection. The results indicated that the developed single-cell platform can be used with conventional FTIR spectromicroscopy. This technique provides real-time, label-free, and rapid bacterial detection, and may allow for statistical and high-throughput analyses, and portability.

  16. A Server-Client-Based Graphical Development Environment for Physics Analyses (VISPA)

    International Nuclear Information System (INIS)

    The Visual Physics Analysis (VISPA) project provides a graphical development environment for data analysis. It addresses the typical development cycle of (re-)designing, executing, and verifying an analysis. We present the new server-client-based web application of the VISPA project to perform physics analyses via a standard internet browser. This enables individual scientists to work with a large variety of devices including touch screens, and teams of scientists to share, develop, and execute analyses on a server via the web interface.

  17. SURFACE SUBSIDENCE ANALYSES BASED ON THE PRINCIPLE OF EXPANSION AND RECONSOLIDATION OF THE BROKEN ROCK STRATA

    Institute of Scientific and Technical Information of China (English)

    王悦汉; 缪协兴

    1997-01-01

    The results of experimental studies on the characteristics of broken rock expansion and reconsolidation were briefly introduced in this paper, and the surface subsidence coefficient under critical mining conditions was also analysed based on the principle of expansion and reconsolidation of the broken rock strata; an equation to calculate the corresponding surface subsidence was finally derived. This calculation method can be used to calculate more accurately the convergence of consolidated rocks in the broken zone of the working face. In addition, case analyses using the introduced calculation method were conducted and satisfactory results were obtained.

  18. Plant trials of an on-stream iron ore analyser based on pair production

    International Nuclear Information System (INIS)

    An on-stream iron ore analyser, called Ironscan, has been developed in collaboration with Hamersley Iron Pty Limited for measuring the iron content of iron ore on conveyor belts. The analyser is based on pair production and irradiates the ore with gamma rays from a 226Ra source. A big advantage of the analyser is that it can be mounted under existing conveyor belts with minimal modifications to the conveyor structure, and the presence of steel reinforcement cables in the belt does not interfere once the analyser has been correctly calibrated. After dynamic trials of a laboratory prototype on a small conveyor facility at Port Melbourne to demonstrate the viability of the method for lumps (-30 + 6 mm particle size) and fines (-6 mm particle size), an industrial prototype was manufactured by Mineral Control Instrumentation Limited in Adelaide (now the Commonwealth Scientific and Industrial Research Organisation licensee) and extensively tested at the Hamersley Iron operations in Dampier and Mount Tom Price, Western Australia. At Dampier, it was installed on the main shiploading conveyor to assess its performance on the normal lumps and fines that are exported. The root mean square (RMS) deviation between single Ironscan measurements and conventional chemical analyses was about 0.4% Fe. The analyser was then evaluated on -150 mm ore from the primary crusher at Mount Tom Price. While it was clear that ore grade could be measured even at this coarse particle size, it was very difficult to obtain comparable chemical analyses and there was insufficient data to estimate the RMS deviation. (author). 6 refs, 7 figs, 1 tab

  19. Analysing task design and students' responses to context-based problems through different analytical frameworks

    Science.gov (United States)

    Broman, Karolina; Bernholt, Sascha; Parchmann, Ilka

    2015-05-01

    Background: Context-based learning approaches are used to enhance students' interest in, and knowledge about, science. According to different empirical studies, students' interest is improved by applying these more non-conventional approaches, while effects on learning outcomes are less coherent. Hence, further insights are needed into the structure of context-based problems in comparison to traditional problems, and into students' problem-solving strategies. Therefore, a suitable framework is necessary, both for the analysis of tasks and strategies. Purpose: The aim of this paper is to explore traditional and context-based tasks as well as students' responses to exemplary tasks to identify a suitable framework for future design and analyses of context-based problems. The paper discusses different established frameworks and applies the Higher-Order Cognitive Skills/Lower-Order Cognitive Skills (HOCS/LOCS) taxonomy and the Model of Hierarchical Complexity in Chemistry (MHC-C) to analyse traditional tasks and students' responses. Sample: Upper secondary students (n=236) at the Natural Science Programme, i.e. possible future scientists, are investigated to explore learning outcomes when they solve chemistry tasks, both more conventional as well as context-based chemistry problems. Design and methods: A typical chemistry examination test has been analysed, first the test items in themselves (n=36), and thereafter 236 students' responses to one representative context-based problem. Content analysis using HOCS/LOCS and MHC-C frameworks has been applied to analyse both quantitative and qualitative data, allowing us to describe different problem-solving strategies. Results: The empirical results show that both frameworks are suitable to identify students' strategies, mainly focusing on recall of memorized facts when solving chemistry test items. Almost all test items were also assessing lower order thinking. The combination of frameworks with the chemistry syllabus has been…

  20. Estimation of effective block conductivities based on discrete network analyses using data from the Aespoe site

    International Nuclear Information System (INIS)

    Numerical continuum codes may be used for assessing the role of regional groundwater flow in far-field safety analyses of a nuclear waste repository at depth. The focus of this project is to develop and evaluate one method based on Discrete Fracture Network (DFN) models to estimate block-scale permeability values for continuum codes. Data from the Aespoe HRL and surrounding area are used. 57 refs, 76 figs, 15 tabs

  1. Comparing Surface-Based and Volume-Based Analyses of Functional Neuroimaging Data in Patients with Schizophrenia

    OpenAIRE

    Anticevic, Alan; Dierker, Donna L.; Gillespie, Sarah K.; Repovs, Grega; Csernansky, John G.; Van Essen, David C.; Deanna M Barch

    2008-01-01

    A major challenge in functional neuroimaging is to cope with individual variability in cortical structure and function. Most analyses of cortical function compensate for variability using affine or low-dimensional nonlinear volume-based registration (VBR) of individual subjects to an atlas, which does not explicitly take into account the geometry of cortical convolutions. A promising alternative is to use surface-based registration (SBR), which capitalizes on explicit surface representations ...

  2. Physical characterization of biomass-based pyrolysis liquids. Application of standard fuel oil analyses

    Energy Technology Data Exchange (ETDEWEB)

    Oasmaa, A.; Leppaemaeki, E.; Koponen, P.; Levander, J.; Tapola, E. [VTT Energy, Espoo (Finland). Energy Production Technologies

    1997-12-31

    The main purpose of the study was to test the applicability of standard fuel oil methods developed for petroleum-based fuels to pyrolysis liquids. In addition, research on sampling, homogeneity, stability, miscibility and corrosivity was carried out. The standard methods have been tested for several different pyrolysis liquids. Recommendations on sampling, sample size and small modifications of standard methods are presented. In general, most of the methods can be used as such, but the accuracy of the analysis can be improved by minor modifications. Fuel oil analyses not suitable for pyrolysis liquids have been identified. Homogeneity of the liquids is the most critical factor in accurate analysis. The presence of air bubbles may disturb several analyses. Sample preheating and prefiltration should be avoided when possible. The former may cause changes in the composition and structure of the pyrolysis liquid. The latter may remove part of the organic material along with the particles. The size of the sample should be determined on the basis of the homogeneity and the water content of the liquid. The basic analyses of the Technical Research Centre of Finland (VTT) include water, pH, solids, ash, Conradson carbon residue, heating value, CHN, density, viscosity, pour point, flash point, and stability. Additional analyses are carried out when needed. (orig.) 53 refs.

  3. Methodology for Web Services Adoption Based on Technology Adoption Theory and Business Process Analyses

    Institute of Scientific and Technical Information of China (English)

    AN Liping; YAN Jianyuan; TONG Lingyun

    2008-01-01

    Web services use an emerging service-oriented architecture for distributed computing. Many organizations are either in the process of adopting web services technology or evaluating this option for incorporation into their enterprise information architectures. Implementation of this new technology requires careful assessment of the needs and capabilities of an organization to formulate adoption strategies. This paper presents a methodology for web services adoption based on technology adoption theory and business process analyses. The methodology suggests that strategies, business areas, and functions within an organization should be considered based on the existing organizational information technology status during the process of adopting web services to support the business needs and requirements.

  4. A Python-based Post-processing Toolset For Seismic Analyses

    CERN Document Server

    Brasier, Steve

    2014-01-01

    This paper discusses the design and implementation of a Python-based toolset to aid in assessing the response of the UK's Advanced Gas Reactor nuclear power stations to earthquakes. The seismic analyses themselves are carried out with a commercial Finite Element solver, but understanding the raw model output this produces requires customised post-processing and visualisation tools. Extending the existing tools had become increasingly difficult and a decision was made to develop a new, Python-based toolset. This comprises of a post-processing framework (aftershock) which includes an embedded Python interpreter, and a plotting package (afterplot) based on numpy and matplotlib. The new toolset had to be significantly more flexible and easier to maintain than the existing code-base, while allowing the majority of development to be carried out by engineers with little training in software development. The resulting architecture will be described with a focus on exploring how the design drivers were met and the suc...
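
    The plotting side of such a toolset boils down to numpy/matplotlib calls of the following kind (a generic sketch; aftershock and afterplot themselves are the paper's internal tools and are not reproduced here):

        import numpy as np
        import matplotlib.pyplot as plt

        # Hypothetical acceleration time history extracted from a finite-element seismic run.
        t = np.linspace(0.0, 30.0, 3001)                     # 30 s at 100 Hz
        acc = 0.3 * np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.1 * t)

        print(f"peak acceleration: {np.abs(acc).max():.3f} g")

        fig, ax = plt.subplots()
        ax.plot(t, acc)
        ax.set_xlabel("time [s]")
        ax.set_ylabel("acceleration [g]")
        fig.savefig("time_history.png")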

  5. A New Optimization Method for Centrifugal Compressors Based on 1D Calculations and Analyses

    Directory of Open Access Journals (Sweden)

    Pei-Yuan Li

    2015-05-01

    Full Text Available This paper presents an optimization design method for centrifugal compressors based on one-dimensional calculations and analyses. It consists of two parts: (1) centrifugal compressor geometry optimization based on one-dimensional calculations and (2) matching optimization of the vaned diffuser with an impeller based on the required throat area. A low pressure stage centrifugal compressor in a MW level gas turbine is optimized by this method. One-dimensional calculation results show that D3/D2 is too large in the original design, resulting in the low efficiency of the entire stage. Based on the one-dimensional optimization results, the geometry of the diffuser has been redesigned. The outlet diameter of the vaneless diffuser has been reduced, and the original single stage diffuser has been replaced by a tandem vaned diffuser. After optimization, the entire stage pressure ratio is increased by approximately 4%, and the efficiency is increased by approximately 2%.

  6. Data on phylogenetic analyses of gazelles (genus Gazella) based on mitochondrial and nuclear intron markers.

    Science.gov (United States)

    Lerp, Hannes; Klaus, Sebastian; Allgöwer, Stefanie; Wronski, Torsten; Pfenninger, Markus; Plath, Martin

    2016-06-01

    The data provided is related to the article "Phylogenetic analyses of gazelles reveal repeated transitions of key ecological traits and provide novel insights into the origin of the genus Gazella" [1]. The data is based on 48 tissue samples of all nine extant species of the genus Gazella, namely Gazella gazella, Gazella arabica, Gazella bennettii, Gazella cuvieri, Gazella dorcas, Gazella leptoceros, Gazella marica, Gazella spekei, and Gazella subgutturosa, and four related taxa (Saiga tatarica, Antidorcas marsupialis, Antilope cervicapra and Eudorcas rufifrons). It comprises alignments of sequences of a cytochrome b data set and of six nuclear intron markers. For the latter, new primers were designed based on cattle and sheep genomes. Based on these alignments, phylogenetic trees were inferred using Bayesian Inference and Maximum Likelihood methods. Furthermore, ancestral character states (inferred with BayesTraits 1.0) and ancestral ranges based on a Dispersal-Extinction-Cladogenesis model were estimated, and the results files were stored within this article. PMID:27054158

  7. FY01 Supplemental Science and Performance Analysis: Volume 1,Scientific Bases and Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Bodvarsson, G.S.; Dobson, David

    2001-05-30

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S&ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S&ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S&ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054 [DIRS 124754

  8. FY01 Supplemental Science and Performance Analysis: Volume 1, Scientific Bases and Analyses

    International Nuclear Information System (INIS)

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S and ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S and ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S and ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054 [DIRS 124754

  9. Comparison between CARIBIC aerosol samples analysed by accelerator-based methods and optical particle counter measurements

    Directory of Open Access Journals (Sweden)

    B. G. Martinsson

    2014-04-01

    Full Text Available Inter-comparison of results from two kinds of aerosol systems in the CARIBIC (Civil Aircraft for the Regular Investigation of the atmosphere Based on an Instrument Container) passenger aircraft based observatory, operating during intercontinental flights at 9–12 km altitude, is presented. Aerosol from the lowermost stratosphere (LMS), the extra-tropical upper troposphere (UT) and the tropical mid troposphere (MT) was investigated. Aerosol particle volume concentration measured with an optical particle counter (OPC) is compared with analytical results for the sum of masses of all major and several minor constituents from aerosol samples collected with an impactor. Analyses were undertaken with the accelerator-based methods particle-induced X-ray emission (PIXE) and particle elastic scattering analysis (PESA). Data from 48 flights during one year are used, leading to a total of 106 individual comparisons. The ratios of the particle volume from the OPC to the total mass from the analyses were within a relatively narrow interval in 84% of the cases. Data points outside this interval are connected with inlet-related effects in clouds, large variability in aerosol composition, particle size distribution effects and some cases of non-ideal sampling. Overall, the comparison of these two CARIBIC measurements based on vastly different methods shows good agreement, implying that the chemical and size information can be combined in studies of the MT/UT/LMS aerosol.

  10. The modulation of spatial congruency by object-based attention: analysing the "locus" of the modulation.

    Science.gov (United States)

    Luo, Chunming; Lupiáñez, Juan; Funes, María Jesús; Fu, Xiaolan

    2011-12-01

    Earlier studies have demonstrated that spatial cueing differentially reduces stimulus-stimulus congruency (e.g., spatial Stroop) interference but not stimulus-response congruency (e.g., Simon; e.g., Lupiáñez & Funes, 2005). This spatial cueing modulation over spatial Stroop seems to be entirely attributable to object-based attention (e.g., Luo, Lupiáñez, Funes, & Fu, 2010). In the present study, two experiments were conducted to further explore whether the cueing modulation of spatial Stroop is object based and/or space based and to analyse the "locus" of this modulation. In Experiment 1, we found that the cueing modulation over spatial Stroop is entirely object based, independent of stimulus-response congruency. In Experiment 2, we observed that the modulation of object-based attention over the spatial Stroop only occurred at a short cue-target interval (i.e., stimulus onset asynchrony; SOA), whereas the stimulus-response congruency effect was not modulated either by object-based or by location-based attentional cueing. The overall pattern of results suggests that the spatial cueing modulation over spatial Stroop arises from object-based attention and occurs at the perceptual stage of processing. PMID:21923623

  11. Differentiating LGBT individuals in substance abuse treatment: analyses based on sexuality and drug preference.

    Science.gov (United States)

    Cochran, Bryan N; Peavy, K Michelle; Santa, Annesa Flentje

    2007-01-01

    In a prior study (Cochran & Cauce, 2006), LGBT individuals seeking treatment demonstrated greater substance use severity, more psychosocial stressors, and increased use of psychiatric services when compared to their heterosexual counterparts. That study, like others in the field of LGBT research, collapsed LGBT individuals into a single category and did not examine individual differences within this category. The present study utilizes the same sample of LGBT clients (N = 610); however, an exploratory cluster analysis was conducted, based on drug preference, to determine which subcategories exist within this unique sample. In a subsequent set of analyses, the sample was divided based on sexuality to determine if there were differences between these groups on psychosocial functioning variables. Results indicated three distinct clusters, which differed in both demographic characteristics and severity of substance use problems. Groups based on sexuality differed in terms of primary problem substance, as well as psychosocial variables. Implications for treatment of these subgroups are discussed. PMID:19835042

  12. A Game-based Corpus for Analysing the Interplay between Game Context and Player Experience

    DEFF Research Database (Denmark)

    Shaker, Noor; Yannakakis, Georgios N.; Asteriadis, Stylianos;

    2011-01-01

    Recognizing players' affective state while playing video games has been the focus of many recent research studies. In this paper we describe the process that has been followed to build a corpus based on game events and recorded video sessions from human players while playing Super Mario Bros. We...... present different types of information that have been extracted from game context, player preferences and perception of the game, as well as user features, automatically extracted from video recordings. We run a number of initial experiments to analyse players' behavior while playing video games as a case...

  13. Genome-based comparative analyses of Antarctic and temperate species of Paenibacillus.

    Directory of Open Access Journals (Sweden)

    Melissa Dsouza

    Full Text Available Antarctic soils represent a unique environment characterised by extremes of temperature, salinity, elevated UV radiation, low nutrient and low water content. Despite the harshness of this environment, members of 15 bacterial phyla have been identified in soils of the Ross Sea Region (RSR). However, the survival mechanisms and ecological roles of these phyla are largely unknown. The aim of this study was to investigate whether strains of Paenibacillus darwinianus owe their resilience to substantial genomic changes. For this, genome-based comparative analyses were performed on three P. darwinianus strains, isolated from gamma-irradiated RSR soils, together with nine temperate, soil-dwelling Paenibacillus spp. The genome of each strain was sequenced to over 1,000-fold coverage, then assembled into contigs totalling approximately 3 Mbp per genome. Based on the occurrence of essential, single-copy genes, genome completeness was estimated at approximately 88%. Genome analysis revealed between 3,043-3,091 protein-coding sequences (CDSs), primarily associated with two-component systems, sigma factors, transporters, sporulation and genes induced by cold-shock, oxidative and osmotic stresses. These comparative analyses provide an insight into the metabolic potential of P. darwinianus, revealing potential adaptive mechanisms for survival in Antarctic soils. However, a large proportion of these mechanisms were also identified in temperate Paenibacillus spp., suggesting that these mechanisms are beneficial for growth and survival in a range of soil environments. These analyses have also revealed that the P. darwinianus genomes contain significantly fewer CDSs and have a lower paralogous content. Notwithstanding the incompleteness of the assemblies, the large differences in genome sizes, determined by the number of genes in paralogous clusters and the CDS content, are indicative of genome content scaling. Finally, these sequences are a resource for further

  14. Wavelet-based spatial comparison technique for analysing and evaluating two-dimensional geophysical model fields

    Directory of Open Access Journals (Sweden)

    S. Saux Picart

    2011-11-01

    Full Text Available Complex numerical models of the Earth's environment, based around 3-D or 4-D time and space domains, are routinely used for applications including climate predictions, weather forecasts, fishery management and environmental impact assessments. Quantitatively assessing the ability of these models to accurately reproduce geographical patterns at a range of spatial and temporal scales has always been a difficult problem to address. However, this is crucial if we are to rely on these models for decision making. Satellite data are potentially the only observational dataset able to cover the large spatial domains analysed by many types of geophysical models. Consequently optical wavelength satellite data is beginning to be used to evaluate model hindcast fields of terrestrial and marine environments. However, these satellite data invariably contain regions of occluded or missing data due to clouds, further complicating or impacting on any comparisons with the model. A methodology has recently been developed to evaluate precipitation forecasts using radar observations. It allows model skill to be evaluated at a range of spatial scales and rain intensities. Here we extend the original method to allow its generic application to a range of continuous and discontinuous geophysical data fields, and therefore allowing its use with optical satellite data. This is achieved through two major improvements to the original method: (i) all thresholds are determined based on the statistical distribution of the input data, so no a priori knowledge about the model fields being analysed is required, and (ii) occluded data can be analysed without impacting on the metric results. The method can be used to assess a model's ability to simulate geographical patterns over a range of spatial scales. We illustrate how the method provides a compact and concise way of visualising the degree of agreement between spatial features in two datasets. The application of the new method, its
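
    A minimal sketch of the general idea behind such scale-aware comparisons: decompose two co-located 2-D fields with a discrete wavelet transform and score their agreement level by level. It assumes PyWavelets is available; the synthetic fields, the Haar wavelet and the per-scale mean-squared-difference score are illustrative stand-ins, not the authors' exact metric or thresholding scheme.

      # Scale-by-scale comparison of two 2-D fields via a wavelet decomposition.
      # Simplified illustration only, not the paper's skill metric.
      import numpy as np
      import pywt

      def scale_by_scale_mse(model_field, obs_field, wavelet="haar", level=3):
          """Mean-squared differences of wavelet detail coefficients, per scale."""
          dec_m = pywt.wavedec2(model_field, wavelet, level=level)
          dec_o = pywt.wavedec2(obs_field, wavelet, level=level)
          scores = []
          # dec[0] is the coarsest approximation; dec[1:] are (cH, cV, cD) tuples
          for details_m, details_o in zip(dec_m[1:], dec_o[1:]):
              diffs = [np.mean((dm - do) ** 2) for dm, do in zip(details_m, details_o)]
              scores.append(float(np.mean(diffs)))
          return scores  # one value per decomposition level, coarse to fine

      # Example with synthetic 64 x 64 fields
      rng = np.random.default_rng(0)
      model = rng.normal(size=(64, 64))
      obs = model + 0.1 * rng.normal(size=(64, 64))
      print(scale_by_scale_mse(model, obs))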

  15. The Seismic Reliability of Offshore Structures Based on Nonlinear Time History Analyses

    International Nuclear Information System (INIS)

    Regarding the past earthquakes damages to offshore structures, as vital structures in the oil and gas industries, it is important that their seismic design is performed by very high reliability. Accepting the Nonlinear Time History Analyses (NLTHA) as the most reliable seismic analysis method, in this paper an offshore platform of jacket type with the height of 304 feet, having a deck of 96 feet by 94 feet, and weighing 290 million pounds has been studied. At first, some Push-Over Analyses (POA) have been performed to recognize the more critical members of the jacket, based on the range of their plastic deformations. Then NLTHA have been performed by using the 3-component accelerograms of 100 earthquakes, covering a wide range of frequency content, and normalized to three Peak Ground Acceleration (PGA) levels of 0.3 g, 0.65 g, and 1.0 g. By using the results of NLTHA the damage and rupture probabilities of critical members have been studied to assess the reliability of the jacket structure. Regarding that different structural members of the jacket have different effects on the stability of the platform, an "importance factor" has been considered for each critical member based on its location and orientation in the structure, and then the reliability of the whole structure has been obtained by combining the reliability of the critical members, each having its specific importance factor
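
    A minimal sketch of how per-member results could be combined with importance factors into a single structural reliability index. The member names, damage probabilities and weights are hypothetical, and the abstract does not state the exact combination rule used, so this is only one plausible reading (a weighted average).

      # Combining per-member damage probabilities from NLTHA with importance
      # factors into a weighted structural reliability index.
      # Member names, probabilities and weights below are hypothetical.
      members = {
          # member id: (probability of damage at a given PGA level, importance factor)
          "leg_A1":   (0.12, 1.0),
          "brace_B3": (0.25, 0.6),
          "brace_C7": (0.08, 0.4),
      }

      def weighted_structure_reliability(members):
          """Weighted-average member reliability, weights = importance factors."""
          total_weight = sum(w for _, w in members.values())
          weighted_rel = sum((1.0 - p) * w for p, w in members.values())
          return weighted_rel / total_weight

      print(f"structure reliability index: {weighted_structure_reliability(members):.3f}")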

  16. FluxExplorer: A general platform for modeling and analyses of metabolic networks based on stoichiometry

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Stoichiometry-based analyses of metabolic networks have aroused significant interest of systems biology researchers in recent years. It is necessary to develop a more convenient modeling platform on which users can reconstruct their network models using completely graphical operations, and explore them with powerful analyzing modules to get a better understanding of the properties of metabolic systems. Herein, an in silico platform, FluxExplorer, for metabolic modeling and analyses based on stoichiometry has been developed as a publicly available tool for systems biology research. This platform integrates various analytic approaches, including flux balance analysis, minimization of metabolic adjustment, extreme pathways analysis, shadow prices analysis, and singular value decomposition, providing a thorough characterization of the metabolic system. Using a graphic modeling process, metabolic networks can be reconstructed and modified intuitively and conveniently. The inconsistencies of a model with respect to the FBA principles can be proved automatically. In addition, this platform supports systems biology markup language (SBML). FluxExplorer has been applied to rebuild a metabolic network in mammalian mitochondria, producing meaningful results. Generally, it is a powerful and very convenient tool for metabolic network modeling and analysis.
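
    To illustrate the flux balance analysis (FBA) principle that underlies such platforms, here is a toy linear-programming example: maximise a "biomass" flux subject to the steady-state balance S·v = 0 and flux bounds. The three-reaction network is invented for illustration and is unrelated to FluxExplorer's own models or interface.

      # Toy flux balance analysis: maximise the biomass flux of a tiny network.
      import numpy as np
      from scipy.optimize import linprog

      # Reactions: v0 uptake -> A, v1 A -> B, v2 B -> biomass
      S = np.array([
          [ 1, -1,  0],   # metabolite A balance
          [ 0,  1, -1],   # metabolite B balance
      ])
      c = np.array([0.0, 0.0, -1.0])        # linprog minimises, so negate biomass flux
      bounds = [(0, 10), (0, 10), (0, 10)]  # lower/upper flux bounds

      res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
      print("optimal fluxes:", res.x, "biomass flux:", -res.fun)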

  17. Performance Analyses of Renewable and Fuel Power Supply Systems for Different Base Station Sites

    Directory of Open Access Journals (Sweden)

    Josip Lorincz

    2014-11-01

    Full Text Available Base station sites (BSSs) powered with renewable energy sources have gained the attention of cellular operators during the last few years. This is because such “green” BSSs impose significant reductions in the operational expenditures (OPEX) of telecom operators due to the possibility of on-site renewable energy harvesting. In this paper, the green BSSs power supply system parameters detected through remote and centralized real time sensing are presented. An implemented sensing system based on a wireless sensor network enables reliable collection and post-processing analyses of many parameters, such as: total charging/discharging current of power supply system, battery voltage and temperature, wind speed, etc. As an example, yearly sensing results for three different BSS configurations powered by solar and/or wind energy are discussed in terms of renewable energy supply (RES) system performance. In the case of powering those BSS with standalone systems based on a fuel generator, the fuel consumption models expressing interdependence among the generator load and fuel consumption are proposed. This has allowed energy-efficiency comparison of the fuel powered and RES systems, which is presented in terms of the OPEX and carbon dioxide (CO2) reductions. Additionally, approaches based on different BSS air-conditioning systems and the on/off regulation of a daily fuel generator activity are proposed and validated in terms of energy and capital expenditure (CAPEX) savings.
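
    A sketch of the kind of generator fuel consumption model the abstract refers to, in the commonly used linear form F = a·P_rated + b·P_load (litres per hour). The coefficients below are placeholder values of the kind found in the literature, not the models fitted in the paper, which are derived from the site measurements.

      # Linear diesel-generator fuel consumption model (placeholder coefficients).
      def fuel_consumption_lph(p_load_kw, p_rated_kw, a=0.084, b=0.246):
          """Hourly fuel use [L/h] for a generator rated p_rated_kw at load p_load_kw."""
          return a * p_rated_kw + b * p_load_kw

      # Daily fuel for a hypothetical 10 kW generator feeding a 3 kW base station load:
      daily_litres = 24 * fuel_consumption_lph(p_load_kw=3.0, p_rated_kw=10.0)
      print(f"estimated daily fuel use: {daily_litres:.1f} L")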

  18. Molecular systematics of Volvocales (Chlorophyceae, Chlorophyta) based on exhaustive 18S rRNA phylogenetic analyses.

    Science.gov (United States)

    Nakada, Takashi; Misawa, Kazuharu; Nozaki, Hisayoshi

    2008-07-01

    The taxonomy of Volvocales (Chlorophyceae, Chlorophyta) was traditionally based solely on morphological characteristics. However, because recent molecular phylogeny largely contradicts the traditional subordinal and familial classifications, no classification system has yet been established that describes the subdivision of Volvocales in a manner consistent with the phylogenetic relationships. Towards development of a natural classification system at and above the generic level, identification and sorting of hundreds of sequences based on subjective phylogenetic definitions is a significant step. We constructed an 18S rRNA gene phylogeny based on 449 volvocalean sequences collected using exhaustive BLAST searches of the GenBank database. Many chimeric sequences, which can cause fallacious phylogenetic trees, were detected and excluded during data collection. The results revealed 21 strongly supported primary clades within phylogenetically redefined Volvocales. Phylogenetic classification following PhyloCode was proposed based on the presented 18S rRNA gene phylogeny along with the results of previous combined 18S and 26S rRNA and chloroplast multigene analyses. PMID:18430591

  19. Wavelet-based spatial comparison technique for analysing and evaluating two-dimensional geophysical model fields

    Directory of Open Access Journals (Sweden)

    S. Saux Picart

    2012-02-01

    Full Text Available Complex numerical models of the Earth's environment, based around 3-D or 4-D time and space domains, are routinely used for applications including climate predictions, weather forecasts, fishery management and environmental impact assessments. Quantitatively assessing the ability of these models to accurately reproduce geographical patterns at a range of spatial and temporal scales has always been a difficult problem to address. However, this is crucial if we are to rely on these models for decision making. Satellite data are potentially the only observational dataset able to cover the large spatial domains analysed by many types of geophysical models. Consequently optical wavelength satellite data is beginning to be used to evaluate model hindcast fields of terrestrial and marine environments. However, these satellite data invariably contain regions of occluded or missing data due to clouds, further complicating or impacting on any comparisons with the model. This work builds on a published methodology that evaluates precipitation forecasts using radar observations, based on predefined absolute thresholds. It allows model skill to be evaluated at a range of spatial scales and rain intensities. Here we extend the original method to allow its generic application to a range of continuous and discontinuous geophysical data fields, and therefore allowing its use with optical satellite data. This is achieved through two major improvements to the original method: (i) all thresholds are determined based on the statistical distribution of the input data, so no a priori knowledge about the model fields being analysed is required, and (ii) occluded data can be analysed without impacting on the metric results. The method can be used to assess a model's ability to simulate geographical patterns over a range of spatial scales. We illustrate how the method provides a compact and concise way of visualising the degree of agreement between spatial features in two

  20. Quantitative and qualitative validations of a sonication-based DNA extraction approach for PCR-based molecular biological analyses.

    Science.gov (United States)

    Dai, Xiaohu; Chen, Sisi; Li, Ning; Yan, Han

    2016-05-15

    The aim of this study was to comprehensively validate the sonication-based DNA extraction method, with a view to replacing the so-called 'standard DNA extraction method' - the commercial kit method. Microbial cells in digested sludge samples, which contain relatively high amounts of PCR-inhibitory substances such as humic acid and protein, were used as the experimental material. A procedure involving solid/liquid separation of the sludge sample and dilution of both DNA templates and inhibitors, the minimum template amounts required for PCR-based analyses, and in-depth understanding from the bias analysis by pyrosequencing technology were obtained, confirming the suitability of the sonication-based DNA extraction method. PMID:26774955

  1. Neutral particle energy analyser based on time of flight technique for EXTRAP-T2R

    Energy Technology Data Exchange (ETDEWEB)

    Cecconello, M. [Royal Inst. of Tech., Stockholm (Sweden). Div. of Fusion Plasma Physics; Costa, S.; Murari, A.; Barzon, A. [Consorzio RFX, Padova (Italy)

    2001-07-01

    An important and not well-understood problem in the Reversed Field Pinch configuration is the anomalous ion heating. In all Ohmically heated RFPs, the ion temperature has been experimentally observed to be higher than can be accounted for by equilibration of energy from an Ohmically heated electron population. The mechanism driving the ions to such high energies is still debated. Different possible explanations have been investigated: kinetic Alfven waves turbulence, MHD relaxation processes and the excitation of an ion electrostatic wave by supra-thermal electrons. The measurement of the ion temperature is important in order to evaluate confinement. Furthermore, measurements can provide information on the mechanism behind the anomalous ion heating. The ion temperature is calculated from the neutral particles energy spectrum obtained by a neutral particles energy analyser based on the time of flight specifically developed for EXTRAP - T2R and here described in detail.
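
    The basic relation behind such a time-of-flight analyser is that a neutral of mass m covering a flight path L in time t carries kinetic energy E = ½ m (L/t)². The sketch below evaluates this relation for a hydrogen neutral; the 1.0 m flight path and 2.0 µs flight time are made-up example values, not EXTRAP-T2R instrument parameters.

      # Kinetic energy of a neutral particle from its time of flight.
      M_H = 1.6726e-27      # hydrogen (proton) mass, kg
      EV = 1.602e-19        # joules per electronvolt

      def tof_energy_ev(flight_path_m, flight_time_s, mass_kg=M_H):
          """Kinetic energy (eV) inferred from flight path and flight time."""
          v = flight_path_m / flight_time_s
          return 0.5 * mass_kg * v**2 / EV

      # A hydrogen atom crossing a 1.0 m flight path in 2.0 microseconds:
      print(f"{tof_energy_ev(1.0, 2.0e-6):.0f} eV")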

  2. Conformational determination of [Leu]enkephalin based on theoretical and experimental VA and VCD spectral analyses

    DEFF Research Database (Denmark)

    Abdali, Salim; Jalkanen, Karl J.; Cao, X.;

    2004-01-01

    Conformational determination of [Leu]enkephalin in DMSO-d6 is carried out using VA and VCD spectral analyses. Conformational energies, vibrational frequencies and VA and VCD intensities are calculated using DFT at B3LYP/6-31G* level of theory. Comparison between the measured spectra and the...... spectral simulations of the low energy conformations of the neutral species enable conformational determination of the molecule. In an earlier study, based only on VA spectroscopy, the results led to the conclusion that [Leu]enkephalin had only a single β-bend conformation of the neutral species in DMSO-d6...... . The present work shows the importance of using VCD in addition to VA in determining the conformation of chiral molecules and which one of the examined three single β-bend structures is the most stable in DMSO-d6 ....

  3. Deconvoluting complex tissues for expression quantitative trait locus-based analyses

    DEFF Research Database (Denmark)

    Seo, Ji-Heui; Li, Qiyuan; Fatima, Aquila;

    2013-01-01

    eQTL-based analyses in human samples are complicated because of the heterogeneous nature of human tissue. We addressed this issue by devising a method to computationally infer the fraction of cell types in normal human breast tissues. We then applied this method to 13 known breast cancer risk loci......, which we hypothesized were eQTLs. For each risk locus, we took all known transcripts within a 2 Mb interval and performed an eQTL analysis in 100 reduction mammoplasty cases. A total of 18 significant associations were discovered (eight in the epithelial compartment and 10 in the stromal compartment......). This study highlights the ability to perform large-scale eQTL studies in heterogeneous tissues....

  4. Toxicity testing and chemical analyses of recycled fibre-based paper for food contact

    DEFF Research Database (Denmark)

    Binderup, Mona-Lise; Pedersen, Gitte Alsing; Vinggaard, Anne;

    2002-01-01

    Food-contact materials, including paper, have to comply with a basic set of criteria concerning safety. This means that paper for food contact should not give rise to migration of components, which can endanger human health. The objectives of this pilot study were, first, to compare paper of...... different qualities as food-contact materials and to perform a preliminary evaluation of their suitability from a safety point of view, and, second, to evaluate the use of different in vitro toxicity tests for screening of paper and board. Paper produced from three different categories of recycled fibres (B......-D) and a raw material produced from virgin fibres (A) were obtained from industry, and extracts were examined by chemical analyses and diverse in vitro toxicity test systems. The products tested were either based on different raw materials or different treatments were applied. Paper category B was made...

  5. Neutral particle energy analyser based on time of flight technique for EXTRAP-T2R

    International Nuclear Information System (INIS)

    An important and not well-understood problem in the Reversed Field Pinch configuration is the anomalous ion heating. In all Ohmically heated RFPs, the ion temperature has been experimentally observed to be higher than can be accounted for by equilibration of energy from an Ohmically heated electron population. The mechanism driving the ions to such high energies is still debated. Different possible explanations have been investigated: kinetic Alfven waves turbulence, MHD relaxation processes and the excitation of an ion electrostatic wave by supra-thermal electrons. The measurement of the ion temperature is important in order to evaluate confinement. Furthermore, measurements can provide information on the mechanism behind the anomalous ion heating. The ion temperature is calculated from the neutral particles energy spectrum obtained by a neutral particles energy analyser based on the time of flight specifically developed for EXTRAP - T2R and here described in detail

  6. Static and dynamic stress analyses of the prototype high head Francis runner based on site measurement

    Science.gov (United States)

    Huang, X.; Oram, C.; Sick, M.

    2014-03-01

    More efforts are put on hydro-power to balance voltage and frequency within seconds for primary control in modern smart grids. This requires hydraulic turbines to run at off-design conditions, especially at low load or speed-no load. Besides, the tendency of increasing power output and decreasing weight of the turbine runners has also led to the high level vibration problem of the runners, especially high head Francis runners. Therefore, it is important to carry out the static and dynamic stress analyses of prototype high head Francis runners. This paper investigates the static and dynamic stresses on the prototype high head Francis runner based on site measurements and numerical simulations. The site measurements are performed with pressure transducers and strain gauges. Based on the measured results, computational fluid dynamics (CFD) simulations for the flow channel from stay vane to draft tube cone are performed. Static pressure distributions and dynamic pressure pulsations caused by rotor-stator interaction (RSI) are obtained under various operating conditions. With the CFD results, static and dynamic stresses on the runner at different operating points are calculated by means of the finite element method (FEM). The agreement between simulation and measurement is analysed with the linear regression method, which indicates that the numerical result agrees well with that of measurement. Furthermore, the maximum static and dynamic stresses on the runner blade are obtained at various operating points. The relations of the maximum stresses and the power output are discussed in detail. The influences of the boundary conditions on the structural behaviour of the runner are also discussed.
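
    A small sketch of the regression step mentioned above: agreement between simulated and measured stresses can be quantified with an ordinary linear fit, where a slope near 1 and a high r² indicate good agreement. The stress values below are placeholders, not data from the paper.

      # Quantifying simulation/measurement agreement with a linear regression.
      import numpy as np
      from scipy.stats import linregress

      measured_mpa  = np.array([35.0, 52.0, 61.0, 78.0, 90.0])   # strain-gauge stresses (placeholder)
      simulated_mpa = np.array([33.0, 55.0, 63.0, 75.0, 93.0])   # FEM stresses (placeholder)

      fit = linregress(measured_mpa, simulated_mpa)
      print(f"slope={fit.slope:.2f}, intercept={fit.intercept:.1f} MPa, r^2={fit.rvalue**2:.3f}")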

  7. Static and dynamic stress analyses of the prototype high head Francis runner based on site measurement

    International Nuclear Information System (INIS)

    More efforts are put on hydro-power to balance voltage and frequency within seconds for primary control in modern smart grids. This requires hydraulic turbines to run at off-design conditions, especially at low load or speed-no load. Besides, the tendency of increasing power output and decreasing weight of the turbine runners has also led to the high level vibration problem of the runners, especially high head Francis runners. Therefore, it is important to carry out the static and dynamic stress analyses of prototype high head Francis runners. This paper investigates the static and dynamic stresses on the prototype high head Francis runner based on site measurements and numerical simulations. The site measurements are performed with pressure transducers and strain gauges. Based on the measured results, computational fluid dynamics (CFD) simulations for the flow channel from stay vane to draft tube cone are performed. Static pressure distributions and dynamic pressure pulsations caused by rotor-stator interaction (RSI) are obtained under various operating conditions. With the CFD results, static and dynamic stresses on the runner at different operating points are calculated by means of the finite element method (FEM). The agreement between simulation and measurement is analysed with the linear regression method, which indicates that the numerical result agrees well with that of measurement. Furthermore, the maximum static and dynamic stresses on the runner blade are obtained at various operating points. The relations of the maximum stresses and the power output are discussed in detail. The influences of the boundary conditions on the structural behaviour of the runner are also discussed.

  8. Integrated optimization analyses of aerodynamic/stealth characteristics of helicopter rotor based on surrogate model

    Directory of Open Access Journals (Sweden)

    Jiang Xiangwen

    2015-06-01

    Full Text Available Based on the computational fluid dynamics (CFD) method, an electromagnetic high-frequency method and surrogate model optimization techniques, an integration design method for aerodynamic/stealth characteristics has been established for the helicopter rotor. The developed integration design method is composed of three modules: integrated grid generation (the moving-embedded grids for the CFD solver and the blade grids for the radar cross section (RCS) solver are generated by solving Poisson equations and a folding approach), the aerodynamic/stealth solver (the aerodynamic characteristics are simulated by the CFD method based upon the Navier–Stokes equations and the Spalart–Allmaras (S–A) turbulence model, while the stealth characteristics are calculated using a panel edge method combining the methods of physical optics (PO), equivalent currents (MEC) and quasi-stationary (MQS)), and integrated optimization analysis (based upon the surrogate model optimization technique with full factorial design (FFD) and radial basis functions (RBF), integrated optimization analyses of the aerodynamic/stealth characteristics of the rotor are conducted). Firstly, the scattering characteristics of the rotor with different blade-tip swept and twist angles have been analysed, and time–frequency domain grayscale maps showing the strong scattering regions of the rotor have been given. Meanwhile, the effects of swept-tip and twist angles on the aerodynamic characteristics of the rotor have been examined. Furthermore, by choosing a suitable objective function and constraint conditions, a compromise design of swept and twist combinations with high aerodynamic performance and low scattering characteristics has been given at last.
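
    A minimal sketch of the surrogate-model step: fit a radial basis function (RBF) surrogate to responses sampled on a full factorial design of two design variables (swept and twist angles), then query the cheap surrogate on a fine grid. The stand-in objective function, the design-variable ranges and the kernel choice are assumptions for illustration; in the paper the responses would come from the CFD and RCS solvers.

      # RBF surrogate over a full factorial design of two rotor design variables.
      import numpy as np
      from scipy.interpolate import RBFInterpolator

      def expensive_objective(x):
          swept, twist = x[:, 0], x[:, 1]
          return (swept - 20.0) ** 2 / 100.0 + (twist - 8.0) ** 2 / 16.0  # stand-in

      # Full factorial design: 5 swept angles x 5 twist angles
      swept_levels = np.linspace(0.0, 40.0, 5)
      twist_levels = np.linspace(0.0, 16.0, 5)
      design = np.array([[s, t] for s in swept_levels for t in twist_levels])
      responses = expensive_objective(design)

      surrogate = RBFInterpolator(design, responses, kernel="thin_plate_spline")

      # Query the surrogate on a fine grid and pick the best candidate
      grid = np.array([[s, t] for s in np.linspace(0, 40, 41) for t in np.linspace(0, 16, 33)])
      best = grid[np.argmin(surrogate(grid))]
      print("surrogate optimum (swept, twist):", best)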

  9. Microarray analyses and comparisons of upper or lower flanks of rice shoot base preceding gravitropic bending.

    Directory of Open Access Journals (Sweden)

    Liwei Hu

    Full Text Available Gravitropism is a complex process involving a series of physiological pathways. Despite ongoing research, gravitropism sensing and response mechanisms are not well understood. To identify the key transcripts and corresponding pathways in gravitropism, a whole-genome microarray approach was used to analyze transcript abundance in the shoot base of rice (Oryza sativa sp. japonica) at 0.5 h and 6 h after gravistimulation by horizontal reorientation. Between upper and lower flanks of the shoot base, 167 transcripts at 0.5 h and 1202 transcripts at 6 h were discovered to be significantly different in abundance by 2-fold. Among these transcripts, 48 were found to be changed both at 0.5 h and 6 h, while 119 transcripts were only changed at 0.5 h and 1154 transcripts were changed at 6 h in association with gravitropism. MapMan and PageMan analyses were used to identify transcripts significantly changed in abundance. The asymmetric regulation of transcripts related to phytohormones, signaling, RNA transcription, metabolism and cell wall-related categories between upper and lower flanks was demonstrated. Potential roles of the identified transcripts in gravitropism are discussed. Our results suggest that the induction of asymmetrical transcription, likely as a consequence of gravitropic reorientation, precedes gravitropic bending in the rice shoot base.

  10. Secondary Data Analyses of Subjective Outcome Evaluation Data Based on Nine Databases

    Directory of Open Access Journals (Sweden)

    Daniel T. L. Shek

    2012-01-01

    Full Text Available The purpose of this study was to evaluate the effectiveness of the Tier 1 Program of the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) in Hong Kong by analyzing 1,327 school-based program reports submitted by program implementers. In each report, program implementers were invited to write down five conclusions based on an integration of the subjective outcome evaluation data collected from the program participants and program implementers. Secondary data analyses were carried out by aggregating nine databases, with 14,390 meaningful units extracted from 6,618 conclusions. Results showed that most of the conclusions were positive in nature. The findings generally showed that the workers perceived the program and program implementers to be positive, and they also pointed out that the program could promote holistic development of the program participants in societal, familial, interpersonal, and personal aspects. However, difficulties encountered during program implementation (2.15%) and recommendations for improvement were also reported (16.26%). In conjunction with the evaluation findings based on other strategies, the present study suggests that the Tier 1 Program of the Project P.A.T.H.S. is beneficial to the holistic development of the program participants.

  11. A unified set-based test with adaptive filtering for gene-environment interaction analyses.

    Science.gov (United States)

    Liu, Qianying; Chen, Lin S; Nicolae, Dan L; Pierce, Brandon L

    2016-06-01

    In genome-wide gene-environment interaction (GxE) studies, a common strategy to improve power is to first conduct a filtering test and retain only the SNPs that pass the filtering in the subsequent GxE analyses. Inspired by two-stage tests and gene-based tests in GxE analysis, we consider the general problem of jointly testing a set of parameters when only a few are truly from the alternative hypothesis and when filtering information is available. We propose a unified set-based test that simultaneously considers filtering on individual parameters and testing on the set. We derive the exact distribution and approximate the power function of the proposed unified statistic in simplified settings, and use them to adaptively calculate the optimal filtering threshold for each set. In the context of gene-based GxE analysis, we show that although the empirical power function may be affected by many factors, the optimal filtering threshold corresponding to the peak of the power curve primarily depends on the size of the gene. We further propose a resampling algorithm to calculate P-values for each gene given the estimated optimal filtering threshold. The performance of the method is evaluated in simulation studies and illustrated via a genome-wide gene-gender interaction analysis using pancreatic cancer genome-wide association data. PMID:26496228

  12. Textbook Questions in Context-Based and Traditional Chemistry Curricula Analysed from a Content Perspective and a Learning Activities Perspective

    Science.gov (United States)

    Overman, Michelle; Vermunt, Jan D.; Meijer, Paulien C.; Bulte, Astrid M. W.; Brekelmans, Mieke

    2013-01-01

    In this study, questions in context-based and traditional chemistry textbooks were analysed from two perspectives that are at the heart of chemistry curricula reforms: a content perspective and a learning activities perspective. To analyse these textbook questions, we developed an instrument for each perspective. In total, 971 textbook questions…

  13. Validation of a fully autonomous phosphate analyser based on a microfluidic lab-on-a-chip

    DEFF Research Database (Denmark)

    Slater, Conor; Cleary, J.; Lau, K.T.;

    2010-01-01

    This work describes the design of a phosphate analyser that utilises a microfluidic lab-on-a-chip. The analyser contains all the required chemical storage, pumping and electronic components to carry out a complete phosphate assay. The system is self-calibrating and self-cleaning, thus capable of...... long-term operation. This was proven by a bench top calibration of the analyser using standard solutions and also by comparing the analyser's performance to a commercially available phosphate monitor installed at a waste water treatment plant. The output of the microfluidic lab-on-a-chip analyser was...

  14. Bacterial regulon modeling and prediction based on systematic cis regulatory motif analyses

    Science.gov (United States)

    Liu, Bingqiang; Zhou, Chuan; Li, Guojun; Zhang, Hanyuan; Zeng, Erliang; Liu, Qi; Ma, Qin

    2016-03-01

    Regulons are the basic units of the response system in a bacterial cell, and each consists of a set of transcriptionally co-regulated operons. Regulon elucidation is the basis for studying the bacterial global transcriptional regulation network. In this study, we designed a novel co-regulation score between a pair of operons based on accurate operon identification and cis regulatory motif analyses, which can capture their co-regulation relationship much better than other scores. Taking full advantage of this discovery, we developed a new computational framework and built a novel graph model for regulon prediction. This model integrates the motif comparison and clustering and makes the regulon prediction problem substantially more solvable and accurate. To evaluate our prediction, a regulon coverage score was designed based on the documented regulons and their overlap with our prediction; and a modified Fisher Exact test was implemented to measure how well our predictions match the co-expressed modules derived from E. coli microarray gene-expression datasets collected under 466 conditions. The results indicate that our program consistently performed better than others in terms of the prediction accuracy. This suggests that our algorithms substantially improve the state-of-the-art, leading to a computational capability to reliably predict regulons for any bacteria.
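
    A minimal sketch of the graph idea described above: operons are nodes, pairwise co-regulation scores are weighted edges, and candidate regulons fall out as connected components once a score threshold is applied. The operon names and scores are invented, and the paper's actual score is motif-based and its clustering more elaborate, so this only illustrates the general graph-model framing.

      # Operon co-regulation graph and threshold-based regulon candidates.
      import networkx as nx

      coreg_scores = {
          ("opA", "opB"): 0.91,
          ("opB", "opC"): 0.87,
          ("opC", "opD"): 0.22,
          ("opD", "opE"): 0.78,
      }

      def predict_regulons(scores, threshold=0.5):
          g = nx.Graph()
          for (u, v), s in scores.items():
              if s >= threshold:
                  g.add_edge(u, v, weight=s)
          return [sorted(c) for c in nx.connected_components(g)]

      print(predict_regulons(coreg_scores))   # e.g. [['opA', 'opB', 'opC'], ['opD', 'opE']]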

  15. Risk-based analyses in support of California hazardous site remediation

    International Nuclear Information System (INIS)

    The California Environmental Enterprise (CEE) is a joint program of the Department of Energy (DOE), Lawrence Livermore National Laboratory, Lawrence Berkeley Laboratory, and Sandia National Laboratories. Its goal is to make DOE laboratory expertise accessible to hazardous site cleanups in the state This support might involve working directly with parties responsible for individual cleanups or it might involve working with the California Environmental Protection Agency to develop tools that would be applicable across a broad range of sites. As part of its initial year's activities, the CEE supported a review to examine where laboratory risk and risk-based systems analysis capabilities might be most effectively applied. To this end, this study draws the following observations. The labs have a clear role in analyses supporting the demonstration and transfer of laboratory characterization or remediation technologies. The labs may have opportunities in developing broadly applicable analysis tools and computer codes for problems such as site characterization or efficient management of resources. Analysis at individual sites, separate from supporting lab technologies or prototyping general tools, may be appropriate only in limited circumstances. In any of these roles, the labs' capabilities extend beyond health risk assessment to the broader areas of risk management and risk-based systems analysis

  16. Comprehensive logic based analyses of Toll-like receptor 4 signal transduction pathway.

    Directory of Open Access Journals (Sweden)

    Mahesh Kumar Padwal

    Full Text Available Among the 13 TLRs in the vertebrate systems, only TLR4 utilizes both Myeloid differentiation factor 88 (MyD88) and Toll/Interleukin-1 receptor (TIR)-domain-containing adapter interferon-β-inducing Factor (TRIF) adaptors to transduce signals triggering host-protective immune responses. Earlier studies on the pathway combined various experimental data in the form of one comprehensive map of TLR signaling. But in the absence of adequate kinetic parameters, quantitative mathematical models that reveal emerging systems level properties and dynamic inter-regulation among the kinases/phosphatases of the TLR4 network are not yet available. So, here we used reaction stoichiometry-based and parameter independent logical modeling formalism to build the TLR4 signaling network model that captured the feedback regulations, interdependencies between signaling kinases and phosphatases and the outcome of simulated infections. The analyses of the TLR4 signaling network revealed 360 feedback loops, 157 negative and 203 positive; of which, 334 loops had the phosphatase PP1 as an essential component. The network elements' interdependency (positive or negative dependencies) in perturbation conditions such as the phosphatase knockout conditions revealed interdependencies between the dual-specific phosphatases MKP-1 and MKP-3 and the kinases in MAPK modules and the role of PP2A in the auto-regulation of Calmodulin kinase-II. Our simulations under the specific kinase or phosphatase gene-deficiency or inhibition conditions corroborated with several previously reported experimental data. The simulations to mimic Yersinia pestis and E. coli infections identified the key perturbation in the network and potential drug targets. Thus, our analyses of TLR4 signaling highlights the role of phosphatases as key regulatory factors in determining the global interdependencies among the network elements; uncovers novel signaling connections; identifies potential drug targets for
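
    To illustrate the feedback-loop counting referred to above, here is a small sketch that enumerates cycles in a signed, directed interaction graph and classifies each loop as positive or negative by the product of its edge signs. The three-node network and its signs are made up for illustration; the paper's TLR4 network is far larger (360 loops reported) and is built with a dedicated logical-modeling formalism.

      # Counting positive and negative feedback loops in a signed interaction graph.
      import networkx as nx

      g = nx.DiGraph()
      g.add_edge("TLR4", "NFkB", sign=+1)
      g.add_edge("NFkB", "MKP1", sign=+1)
      g.add_edge("MKP1", "TLR4", sign=-1)   # phosphatase-mediated negative feedback (illustrative)
      g.add_edge("NFkB", "TLR4", sign=+1)

      positive, negative = 0, 0
      for cycle in nx.simple_cycles(g):
          sign = 1
          for a, b in zip(cycle, cycle[1:] + cycle[:1]):   # walk the closed loop
              sign *= g[a][b]["sign"]
          positive, negative = positive + (sign > 0), negative + (sign < 0)

      print(f"positive loops: {positive}, negative loops: {negative}")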

  17. Linking Proxy-Based and Datum-Based Shorelines on a High-Energy Coastline: Implications for Shoreline Change Analyses

    Science.gov (United States)

    Ruggiero, P.; Kaminsky, G.M.; Gelfenbaum, G.

    2003-01-01

    A primary purpose of this paper is to quantitatively link variously defined and derived shoreline estimates commonly used for shoreline change analysis. Estimates of shoreline mapping and derivation error, natural shoreline variability, and the relationships of horizontally-derived (proxy-based) shorelines to vertical datums (e.g. MHW) are presented. A series of shoreline repeatability and variability experiments as well as data from a beach monitoring program along the high-energy US Pacific Northwest coast, indicate total uncertainty estimates of the horizontal position of proxy-based shorelines to be approximately ± 50-150 m for T-sheets and aerial photography and approximately ± 15 m for datum-based shorelines derived from ground- or air-based topographic surveys. The ability to obtain reliable shoreline change results depends upon both the selected shoreline definition (e.g. horizontal- or feature-based proxy, or datum-based intercept) and the accuracy of the technique used in mapping or interpreting its position. The position of the selected shoreline on the beach profile determines its inherent temporal and spatial variability, an important consideration that has often been overlooked in the scientific literature on shoreline change. Historical shorelines mapped on NOS T-sheets and aerial photos have commonly identified high water line (HWL)-type shorelines, which are shown to be higher on the beach surface than the MHW-datum intercept along coasts subject to wave runup. Analyses of 4.5 years of beach profile data from the southwest Washington coast suggest that both the MHW and HWL-type shorelines have greater natural short-term variability than expected, significantly greater than the variability of shoreline proxies defined farther landward and higher on the beach profile. A model for determining the natural variability of HWL-type shorelines reveals that this short-term variability is the dominant factor in the large total uncertainty values.

  18. Analysing breast cancer microarrays from African Americans using shrinkage-based discriminant analysis

    Directory of Open Access Journals (Sweden)

    Pang Herbert

    2010-10-01

    Full Text Available Abstract Breast cancer tumours among African Americans are usually more aggressive than those found in Caucasian populations. African-American patients with breast cancer also have higher mortality rates than Caucasian women. A better understanding of the disease aetiology of these breast cancers can help to improve and develop new methods for cancer prevention, diagnosis and treatment. The main goal of this project was to identify genes that help differentiate between oestrogen receptor-positive and -negative samples among a small group of African-American patients with breast cancer. Breast cancer microarrays from one of the largest genomic consortiums were analysed using 13 African-American and 201 Caucasian samples with oestrogen receptor status. We used a shrinkage-based classification method to identify genes that were informative in discriminating between oestrogen receptor-positive and -negative samples. Subset analysis and permutation were performed to obtain a set of genes unique to the African-American population. We identified a set of 156 probe sets, which gave a misclassification rate of 0.16 in distinguishing between oestrogen receptor-positive and -negative patients. The biological relevance of our findings was explored through literature-mining techniques and pathway mapping. An independent dataset was used to validate our findings and we found that the top ten genes mapped onto this dataset gave a misclassification rate of 0.15. The described method allows us best to utilise the information available from small sample size microarray data in the context of ethnic minorities.
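
    A small sketch in the spirit of the shrinkage-based classification described above: a nearest shrunken centroid classifier evaluated by a cross-validated misclassification rate. The expression matrix here is random placeholder data, and the paper's exact shrinkage-based discriminant method may differ from scikit-learn's NearestCentroid with a shrink threshold.

      # Shrinkage-based (nearest shrunken centroid) classification, cross-validated.
      import numpy as np
      from sklearn.neighbors import NearestCentroid
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(1)
      X = rng.normal(size=(13, 500))            # 13 samples x 500 probe sets (placeholder)
      y = np.array([1] * 7 + [0] * 6)           # ER-positive vs ER-negative labels (placeholder)
      X[y == 1, :20] += 1.0                     # inject signal into 20 probe sets

      clf = NearestCentroid(shrink_threshold=0.2)
      accuracy = cross_val_score(clf, X, y, cv=3).mean()
      print(f"misclassification rate ~ {1.0 - accuracy:.2f}")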

  19. Ecology of Subglacial Lake Vostok (Antarctica), Based on Metagenomic/Metatranscriptomic Analyses of Accretion Ice

    Directory of Open Access Journals (Sweden)

    Tom D'Elia

    2013-03-01

    Full Text Available Lake Vostok is the largest of the nearly 400 subglacial Antarctic lakes and has been continuously buried by glacial ice for 15 million years. Extreme cold, heat (from possible hydrothermal activity), pressure (from the overriding glacier) and dissolved oxygen (delivered by melting meteoric ice), in addition to limited nutrients and complete darkness, combine to produce one of the most extreme environments on Earth. Metagenomic/metatranscriptomic analyses of ice that accreted over a shallow embayment and over the southern main lake basin indicate the presence of thousands of species of organisms (94% Bacteria, 6% Eukarya, and two Archaea). The predominant bacterial sequences were closest to those from species of Firmicutes, Proteobacteria and Actinobacteria, while the predominant eukaryotic sequences were most similar to those from species of ascomycetous and basidiomycetous Fungi. Based on the sequence data, the lake appears to contain a mixture of autotrophs and heterotrophs capable of performing nitrogen fixation, nitrogen cycling, carbon fixation and nutrient recycling. Sequences closest to those of psychrophiles and thermophiles indicate a cold lake with possible hydrothermal activity. Sequences most similar to those from marine and aquatic species suggest the presence of marine and freshwater regions.

  20. Phylogenetic analyses of some genera in Oedipodidae (Orthoptera: Acridoidea) based on 16S mitochondrial partialgene sequences

    Institute of Scientific and Technical Information of China (English)

    Xiang-Chu Yin; Xin-Jiang Li; Wen-Qiang Wang; Hong Yin; Cheng-Quan Cao; Bao-Hua Ye; Zhan Yin

    2008-01-01

    Based on the 16S mitochondrial partial gene sequences of 29 genera, containing 26 from Oedipodidae and one each from Tanaoceridae, Pyrgomorphidae and Tetrigidae (as outgroups), the homologous sequences were compared and phylogenetic analyses were performed. A phylogenetic tree was inferred by neighbor-joining (NJ). The results of the sequence comparison show that: (i) in a total of 574 bp of Oedipodidae, the number of substituted nucleotides was 265 bp and the average percentages of T, C, A and G were 38.3%, 11.4%, 31.8% and 18.5%, respectively, and the content of A+T (70.1%) was distinctly richer than that of C+G (29.9%); and (ii) the average nucleotide divergence of 16S rDNA sequences among genera of Oedipodidae was 9.0%, among families of Acridoidea was 17.0%, and between superfamilies (Tetrigoidea and Acridoidea) was 23.9%, respectively. The phylogenetic tree indicated: (i) the Oedipodidae was a monophyletic group, which suggested that the taxonomic status of this family was confirmed; (ii) the genus Heteropternis separated from the other Oedipodids first and had another unique sound-producing structure in morphology, which is the type-genus of subfamily Heteropterninae; and (iii) the relative intergeneric relationship within the same continent was closer than that of different continents, and between the Eurasian genera and the African genera, was closer than that between Eurasians and Americans.
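
    A minimal sketch of the neighbour-joining (NJ) step: build a tree from a pairwise distance matrix and root it with the outgroup. The four taxa and the distances are invented, loosely echoing the intergeneric and intersuperfamily divergences quoted above; they are not values from the study.

      # Neighbour-joining tree from a small, made-up distance matrix (Biopython).
      from Bio.Phylo.TreeConstruction import DistanceMatrix, DistanceTreeConstructor
      from Bio import Phylo

      names = ["Heteropternis", "Oedipoda", "Locusta", "Tetrix_outgroup"]
      dm = DistanceMatrix(names, matrix=[
          [0.0],
          [0.09, 0.0],
          [0.10, 0.08, 0.0],
          [0.24, 0.23, 0.24, 0.0],
      ])

      tree = DistanceTreeConstructor().nj(dm)
      tree.root_with_outgroup("Tetrix_outgroup")
      Phylo.draw_ascii(tree)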

  1. Historical Weathering Based on Chemical Analyses of Two Spodosols in Southern Sweden

    International Nuclear Information System (INIS)

    Chemical weathering losses were calculated for two conifer stands in relation to ongoing studies on liming effects and ash amendments on chemical status, soil solution chemistry and soil genesis. Weathering losses were based on elemental depletion trends in soil profiles since deglaciation and exposure to the weathering environment. Gradients in total geochemical composition were assumed to reflect alteration over time. Study sites were Horroed and Hassloev in southern Sweden. Both Horroed and Hassloev sites are located on sandy loamy Weichselian till at an altitude of 85 and 190 m a.s.l., respectively. Aliquots from volume determined samples from a number of soil levels were fused with lithium metaborate, dissolved in HNO3, and analysed by ICP-AES. Results indicated highest cumulative weathering losses at Hassloev. The weathering losses for the elements are in the following order: Si > Al > K > Na > Ca > Mg. Total annual losses for Ca+Mg+K+Na, expressed in mmolc m-2 yr-1, amounted to c. 28 and 58 at Horroed and Hassloev, respectively. Variations between study sites could not be explained by differences in bulk density, geochemistry or mineralogy. The accumulated weathering losses since deglaciation were larger in the uppermost 15 cm than in deeper B horizons for most elements studied.

  2. PASMet: a web-based platform for prediction, modelling and analyses of metabolic systems.

    Science.gov (United States)

    Sriyudthsak, Kansuporn; Mejia, Ramon Francisco; Arita, Masanori; Hirai, Masami Yokota

    2016-07-01

    PASMet (Prediction, Analysis and Simulation of Metabolic networks) is a web-based platform for proposing and verifying mathematical models to understand the dynamics of metabolism. The advantages of PASMet include user-friendliness and accessibility, which enable biologists and biochemists to easily perform mathematical modelling. PASMet offers a series of user-functions to handle the time-series data of metabolite concentrations. The functions are organised into four steps: (i) Prediction of a probable metabolic pathway and its regulation; (ii) Construction of mathematical models; (iii) Simulation of metabolic behaviours; and (iv) Analysis of metabolic system characteristics. Each function contains various statistical and mathematical methods that can be used independently. Users who may not have enough knowledge of computing or programming can easily and quickly analyse their local data without software downloads, updates or installations. Users only need to upload their files in comma-separated values (CSV) format or enter their model equations directly into the website. Once the time-series data or mathematical equations are uploaded, PASMet automatically performs computation on server-side. Then, users can interactively view their results and directly download them to their local computers. PASMet is freely available with no login requirement at http://pasmet.riken.jp/ from major web browsers on Windows, Mac and Linux operating systems. PMID:27174940

  3. Analysing the operative experience of basic surgical trainees in Ireland using a web-based logbook

    LENUS (Irish Health Repository)

    Lonergan, Peter E

    2011-09-25

    Abstract Background There is concern about the adequacy of operative exposure in surgical training programmes, in the context of changing work practices. We aimed to quantify the operative exposure of all trainees on the National Basic Surgical Training (BST) programme in Ireland and compare the results with arbitrary training targets. Methods Retrospective analysis of data obtained from a web-based logbook (http://www.elogbook.org) for all general surgery and orthopaedic training posts between July 2007 and June 2009. Results 104 trainees recorded 23,918 operations between two 6-month general surgery posts. The most common general surgery operation performed was simple skin excision with trainees performing an average of 19.7 (± 9.9) over the 2-year training programme. Trainees most frequently assisted with cholecystectomy with an average of 16.0 (± 11.0) per trainee. Comparison of trainee operative experience to arbitrary training targets found that 2-38% of trainees achieved the targets for 9 emergency index operations and 24-90% of trainees achieved the targets for 8 index elective operations. 72 trainees also completed a 6-month post in orthopaedics and recorded 7,551 operations. The most common orthopaedic operation that trainees performed was removal of metal, with an average of 2.90 (± 3.27) per trainee. The most common orthopaedic operation that trainees assisted with was total hip replacement, with an average of 10.46 (± 6.21) per trainee. Conclusions A centralised web-based logbook provides valuable data to analyse training programme performance. Analysis of logbooks raises concerns about operative experience at junior trainee level. The provision of adequate operative exposure for trainees should be a key performance indicator for training programmes.

  4. Deciphering Clostridium tyrobutyricum Metabolism Based on the Whole-Genome Sequence and Proteome Analyses

    Science.gov (United States)

    Lee, Joungmin; Jang, Yu-Sin; Han, Mee-Jung; Kim, Jin Young

    2016-01-01

    ABSTRACT Clostridium tyrobutyricum is a Gram-positive anaerobic bacterium that efficiently produces butyric acid and is considered a promising host for anaerobic production of bulk chemicals. Due to limited knowledge on the genetic and metabolic characteristics of this strain, however, little progress has been made in metabolic engineering of this strain. Here we report the complete genome sequence of C. tyrobutyricum KCTC 5387 (ATCC 25755), which consists of a 3.07-Mbp chromosome and a 63-kbp plasmid. The results of genomic analyses suggested that C. tyrobutyricum produces butyrate from butyryl-coenzyme A (butyryl-CoA) through acetate reassimilation by CoA transferase, differently from Clostridium acetobutylicum, which uses the phosphotransbutyrylase-butyrate kinase pathway; this was validated by reverse transcription-PCR (RT-PCR) of related genes, protein expression levels, in vitro CoA transferase assay, and fed-batch fermentation. In addition, the changes in protein expression levels during the course of batch fermentations on glucose were examined by shotgun proteomics. Unlike C. acetobutylicum, the expression levels of proteins involved in glycolytic and fermentative pathways in C. tyrobutyricum did not decrease even at the stationary phase. Proteins related to energy conservation mechanisms, including Rnf complex, NfnAB, and pyruvate-phosphate dikinase that are absent in C. acetobutylicum, were identified. Such features explain why this organism can produce butyric acid to a much higher titer and better tolerate toxic metabolites. This study presenting the complete genome sequence, global protein expression profiles, and genome-based metabolic characteristics during the batch fermentation of C. tyrobutyricum will be valuable in designing strategies for metabolic engineering of this strain. PMID:27302759

  5. Application of Rank(S-r), a maturity index based on chemical analyses of coals

    Energy Technology Data Exchange (ETDEWEB)

    Suggate, R.P. [Inst. for Geology & Nuclear Science, Lower Hutt (New Zealand)

    2002-09-01

    The Rank(S) classification of Suggate, which uses data from standard coal industry chemical analyses, is illustrated on interdependent diagrams with axes of atomic O/C and H/C and of calorific value and volatile matter. The Rank(S-r) scale, which is compensated for coal type, is linear with depth under conditions of linear geothermal gradients, so that the depth value for each unit increase in rank depends on the geothermal gradient at the time of maximum temperature. A general relation is established between Rank(S-r) and the temperature of attainment of rank: Temp. (°C) = 10 × Rank(S-r) + 15. Significant oil generation begins at Rank(S-r) 9-10 and expulsion at Rank(S-r) 11.5-12.5. A clear general relation exists between Rank(S-r) and vitrinite reflectance, but Rank(S-r) is somewhat more accurate over the range from peat to the end of the oil window. In the Paleogene Buller Coalfield, New Zealand, the use of Rank(S-r) is an aid to interpreting geological history. Coals in the Carboniferous Nottinghamshire-Yorkshire Coalfield in England probably contributed hydrocarbons, including oil, to the adjoining East Midlands hydrocarbon fields. Vertical and lateral variations in Rank(S-r) in the coalfield, and the Rank(S-r) values in coal measures at the base of the Permian, require a thick Mesozoic-Tertiary cover. In the oilfields of the Mahakam Delta, Indonesia, and the Taranaki Basin, New Zealand, Rank(S-r)/depth gradients imply surface values that are close to zero where little or no cover has been eroded from above the wells, and inferred temperatures that are reasonably close to present temperatures.
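
    The rank-temperature relation and the oil-window thresholds quoted above can be expressed as a small helper. The staging function below is a simplified reading of the stated ranges (generation from about Rank(S-r) 9-10, expulsion from about 11.5-12.5), not a published classification scheme.

      # Temperature of rank attainment and a simplified oil-window staging.
      def rank_temperature_c(rank_sr):
          """Temperature of rank attainment (deg C): Temp = 10 * Rank(S-r) + 15."""
          return 10.0 * rank_sr + 15.0

      def oil_window_stage(rank_sr):
          if rank_sr < 9.0:
              return "pre-oil generation"
          if rank_sr < 11.5:
              return "significant oil generation"
          if rank_sr <= 12.5:
              return "onset of oil expulsion"
          return "beyond the main expulsion onset"

      for r in (8.0, 10.0, 12.0):
          print(r, rank_temperature_c(r), oil_window_stage(r))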

  6. Molecular Characterization of Five Potyviruses Infecting Korean Sweet Potatoes Based on Analyses of Complete Genome Sequences

    Directory of Open Access Journals (Sweden)

    Hae-Ryun Kwak

    2015-12-01

    Full Text Available Sweet potatoes (Ipomoea batatas L.) are grown extensively in tropical and temperate regions and are important food crops worldwide. In Korea, potyviruses, including Sweet potato feathery mottle virus (SPFMV), Sweet potato virus C (SPVC), Sweet potato virus G (SPVG), Sweet potato virus 2 (SPV2), and Sweet potato latent virus (SPLV), have been detected in sweet potato fields at a high (~95%) incidence. In the present work, complete genome sequences of 18 isolates, representing the five potyviruses mentioned above, were compared with previously reported genome sequences. The complete genomes consisted of 10,081 to 10,830 nucleotides, excluding the poly-A tails. Their genomic organizations were typical of the Potyvirus genus, including one large open reading frame coding for a putative polyprotein. Based on phylogenetic analyses and sequence comparisons, the Korean SPFMV isolates belonged to the strains RC and O with >98% nucleotide sequence identity. Korean SPVC isolates had 99% identity to the Japanese isolate SPVC-Bungo and 70% identity to the SPFMV isolates. The Korean SPVG isolates showed 99% identity to the three previously reported SPVG isolates. Korean SPV2 isolates had 97% identity to the SPV2 GWB-2 isolate from the USA. Korean SPLV isolates had a relatively low (88%) nucleotide sequence identity with the Taiwanese SPLV-TW isolates, and they were phylogenetically distantly related to SPFMV isolates. Recombination analysis revealed that possible recombination events occurred in the P1, HC-Pro and NIa-NIb regions of SPFMV and SPLV isolates, and these regions were identified as hotspots for recombination in the sweet potato potyviruses.

  7. Fatigue analyses of the prototype Francis runners based on site measurements and simulations

    International Nuclear Information System (INIS)

    With the increasing development of solar power and wind power which give an unstable output to the electrical grid, hydropower is required to give a rapid and flexible compensation, and the hydraulic turbines have to operate at off-design conditions frequently. Prototype Francis runners suffer from strong vibrations induced by high pressure pulsations at part load, low part load, speed-no-load and during start-stops and load rejections. Fatigue and damage may be caused by the alternating stress on the runner blades. Therefore, it becomes increasingly important to carry out fatigue analysis and life time assessment of the prototype Francis runners, especially at off-design conditions. This paper presents the fatigue analyses of the prototype Francis runners based on the strain gauge site measurements and numerical simulations. In the case of low part load, speed-no-load and transient events, since the Francis runners are subjected to complex hydraulic loading, which shows a stochastic characteristic, the rainflow counting method is used to obtain the number of cycles for various dynamic amplitude ranges. From middle load to full load, pressure pulsations caused by rotor-stator interaction become the dominant hydraulic excitation of the runners. Forced response analysis is performed to calculate the maximum dynamic stress. The agreement between numerical and experimental stresses is evaluated using the linear regression method. Taking into account the effect of the static stress on the S-N curve, the Miner's rule, a linear cumulative fatigue damage theory, is employed to calculate the damage factors of the prototype Francis runners at various operating conditions. The relative damage factors of the runners at different operating points are compared and discussed in detail.
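    As a rough, hedged illustration of the damage calculation described above, the following Python sketch applies Miner's rule to rainflow-counted stress amplitudes; the Basquin-type S-N curve, its parameters, and the cycle counts are placeholders, not values from the site measurements or simulations.

    ```python
    # Hedged sketch of Miner's-rule damage accumulation from rainflow-counted stress cycles.
    # The S-N curve N = C / S^m and every number below are illustrative assumptions only.

    C = 1.0e12   # hypothetical S-N curve constant
    m = 3.0      # hypothetical S-N curve slope

    def cycles_to_failure(stress_amplitude_mpa: float) -> float:
        """Allowable cycles at a given stress amplitude from the assumed Basquin-type curve."""
        return C / (stress_amplitude_mpa ** m)

    # (stress amplitude in MPa, counted cycles) pairs, as a rainflow count might return them
    rainflow_bins = [(20.0, 5.0e5), (40.0, 2.0e4), (80.0, 300.0)]

    damage = sum(n / cycles_to_failure(s) for s, n in rainflow_bins)
    print(f"Miner damage factor: {damage:.4f}  (failure predicted when the factor reaches 1)")
    ```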

  8. Fatigue analyses of the prototype Francis runners based on site measurements and simulations

    Science.gov (United States)

    Huang, X.; Chamberland-Lauzon, J.; Oram, C.; Klopfer, A.; Ruchonnet, N.

    2014-03-01

    With the increasing development of solar power and wind power which give an unstable output to the electrical grid, hydropower is required to give a rapid and flexible compensation, and the hydraulic turbines have to operate at off-design conditions frequently. Prototype Francis runners suffer from strong vibrations induced by high pressure pulsations at part load, low part load, speed-no-load and during start-stops and load rejections. Fatigue and damage may be caused by the alternating stress on the runner blades. Therefore, it becomes increasingly important to carry out fatigue analysis and life time assessment of the prototype Francis runners, especially at off-design conditions. This paper presents the fatigue analyses of the prototype Francis runners based on the strain gauge site measurements and numerical simulations. In the case of low part load, speed-no-load and transient events, since the Francis runners are subjected to complex hydraulic loading, which shows a stochastic characteristic, the rainflow counting method is used to obtain the number of cycles for various dynamic amplitude ranges. From middle load to full load, pressure pulsations caused by rotor-stator interaction become the dominant hydraulic excitation of the runners. Forced response analysis is performed to calculate the maximum dynamic stress. The agreement between numerical and experimental stresses is evaluated using the linear regression method. Taking into account the effect of the static stress on the S-N curve, the Miner's rule, a linear cumulative fatigue damage theory, is employed to calculate the damage factors of the prototype Francis runners at various operating conditions. The relative damage factors of the runners at different operating points are compared and discussed in detail.

  9. IMPROVING CONTROL ROOM DESIGN AND OPERATIONS BASED ON HUMAN FACTORS ANALYSES OR HOW MUCH HUMAN FACTORS UPGRADE IS ENOUGH?

    International Nuclear Information System (INIS)

    The Jose Cabrera nuclear power plant is a one-loop Westinghouse pressurized water reactor. In the control room, the displays and controls used by operators for the emergency operating procedures are distributed on front and back panels. This configuration contributed to risk in the probabilistic safety assessment where important operator actions are required. This study was undertaken to evaluate the impact of the design on crew performance and plant safety and to develop design improvements. Five potential effects were identified. Then NUREG-0711 [1] programmatic human factors analyses were conducted to systematically evaluate the control room layout to determine if there was evidence of the potential effects. These analyses included operating experience review, PSA review, task analyses, and walkthrough simulations. Based on the results of these analyses, a variety of control room modifications were identified. From the alternatives, a selection was made that provided a reasonable balance between performance, risk and economics, and modifications were made to the plant.

  10. Comparing Surface-Based and Volume-Based Analyses of Functional Neuroimaging Data in Patients with Schizophrenia

    Science.gov (United States)

    Anticevic, Alan; Dierker, Donna L.; Gillespie, Sarah K.; Repovs, Grega; Csernansky, John G.; Van Essen, David C.; Barch, Deanna M.

    2008-01-01

    A major challenge in functional neuroimaging is to cope with individual variability in cortical structure and function. Most analyses of cortical function compensate for variability using affine or low-dimensional nonlinear volume-based registration (VBR) of individual subjects to an atlas, which does not explicitly take into account the geometry of cortical convolutions. A promising alternative is to use surface-based registration (SBR), which capitalizes on explicit surface representations of cortical folding patterns in individual subjects. In this study, we directly compare results from SBR and affine VBR in a study of working memory in healthy controls and patients with schizophrenia (SCZ). Each subject's structural scan was used for cortical surface reconstruction using the SureFit method. fMRI data were mapped directly onto individual cortical surface models, and each hemisphere was registered to the population-average PALS-B12 atlas using landmark-constrained SBR. The precision with which cortical sulci were aligned was much greater for SBR than VBR. SBR produced superior alignment precision across the entire cortex, and this benefit was greater in patients with schizophrenia. We demonstrate that spatial smoothing on the surface provides better resolution and signal preservation than a comparable degree of smoothing in the volume domain. Lastly, the statistical power of functional activation in the working memory task was greater for SBR than for VBR. These results indicate that SBR provides significant advantages over affine VBR when analyzing cortical fMRI activations. Furthermore, these improvements can be even greater in disorders that have associated structural abnormalities. PMID:18434199

  11. Analysing a complex agent-based model using data-mining techniques

    OpenAIRE

    Edmonds, Bruce; Little, Claire; Lessard-Phillips, Laurence; Fieldhouse, Ed

    2014-01-01

    A complex "Data Integration Model" of voter behaviour is described. However it is very complex and hard to analyse. For such a model "thin" samples of the outcomes using classic parameter sweeps are inadequate. In order to get a more holistic picture of its behaviour datamining techniques are applied to the data generated by many runs of the model, each with randomised parameter values.

  12. Systematics of Plant-Pathogenic and Related Streptomyces Species Based on Phylogenetic Analyses of Multiple Gene Loci

    Science.gov (United States)

    The 10 species of Streptomyces implicated as the etiological agents in scab disease of potatoes or soft rot disease of sweet potatoes are distributed among 7 different phylogenetic clades in analyses based on 16S rRNA gene sequences, but high sequence similarity of this gene among Streptomyces speci...

  13. Analysing the factors that influence tag choice based on semiotic analysis and activity theory

    OpenAIRE

    Elhussain, Mariam; Nakata, Keiichi

    2012-01-01

    Social tagging has become very popular around the Internet as well as in research. The main idea behind tagging is to allow users to provide metadata to the web content from their perspective to facilitate categorization and retrieval. There are many factors that influence users' tag choice. Many studies have been conducted to reveal these factors by analysing tagging data. This paper uses two theories to identify these factors, namely the semiotics theory and activity theory. The former trea...

  14. A (not so) dangerous method: pXRF vs. EPMA-WDS analyses of copper-based artefacts

    OpenAIRE

    Orfanou, V.; Rehren, T.

    2014-01-01

    Analysis of metal objects with portable and handheld X-ray fluorescence spectrometry has become increasingly popular in recent years. Here, methodological concerns that apply to non-destructive, surface examination with XRF instruments of ancient metal artefacts are discussed based on the comparative analyses of a set of copper-based objects by means of portable X-ray fluorescence (pXRF) and electron probe microanalyser (EPMA). The analytical investigation aims to explore issues of instrument...

  15. Subjective Outcome Evaluation Based on Secondary Data Analyses: The Project P.A.T.H.S. in Hong Kong

    OpenAIRE

    Shek, Daniel T. L.; Sun, Rachel C. F.

    2010-01-01

    The intent of this study was to evaluate the program effectiveness of the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) (Secondary 1 Curriculum) by analyzing 207 school-based program reports, in which program implementers were invited to write down five conclusions based on an integration of the subjective outcome evaluation data collected from the program participants and program implementers. Secondary data analyses were conducted and 1,855 meaningful ...

  16. Implementation Quality of a Positive Youth Development Program: Cross-Case Analyses Based on Seven Cases in Hong Kong

    OpenAIRE

    Shek, Daniel T. L.; Sun, Rachel C. F.

    2008-01-01

    Cross-case analyses of factors that influence the process and implementation quality of the Tier 1 Program of the Project P.A.T.H.S. based on seven cases were carried out. Systematic and integrative analyses revealed several conclusions. First, several factors related to policy, people, program, process, and place (5 “P”s) were conducive to the successful implementation of the Tier 1 Program in the schools. Second, there were obstacles and difficulties with reference to the 5 “P”s that impede...

  17. Numerical methods of data processing based on singular spectral and metric analyses, and their applications

    International Nuclear Information System (INIS)

    The author has developed new methods and software, based on metric analysis, for function interpolation and for recovering functions of one and several variables. A new integrated circuit and software for detecting hidden anomalies in chaotic time processes, based on singular spectral and wavelet analyses, have also been created.
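    To make the singular-spectrum idea concrete, here is a minimal, hedged Python sketch of an SSA-style decomposition used as an anomaly indicator; the synthetic series, window length, and retained rank are illustrative assumptions, and this is not the author's specific algorithm.

    ```python
    import numpy as np

    # Minimal SSA-style sketch: embed the series in a trajectory (Hankel) matrix, take an SVD,
    # reconstruct the leading components, and inspect the residual for anomalies.
    # Series, window length and retained rank are illustrative only.

    rng = np.random.default_rng(3)
    t = np.arange(500)
    series = np.sin(2 * np.pi * t / 50) + 0.2 * rng.standard_normal(t.size)
    series[300] += 3.0                              # an injected "hidden anomaly"

    L = 60                                          # embedding window length
    K = series.size - L + 1
    trajectory = np.column_stack([series[i:i + L] for i in range(K)])

    U, s, Vt = np.linalg.svd(trajectory, full_matrices=False)
    rank = 2                                        # keep the leading (smooth) components
    smooth_traj = (U[:, :rank] * s[:rank]) @ Vt[:rank]

    # Anti-diagonal averaging back to a series of the original length.
    recon = np.zeros(series.size)
    counts = np.zeros(series.size)
    for k in range(K):
        recon[k:k + L] += smooth_traj[:, k]
        counts[k:k + L] += 1
    recon /= counts

    residual = series - recon
    print("largest residual at index", int(np.argmax(np.abs(residual))))
    ```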

  18. The comparison of classification analyses on the public opinion focus of biofuels based on twitter

    Science.gov (United States)

    Wu, Shianghau; Guo, Jiannjong

    2016-02-01

    The study combined text mining and classification methods to investigate biofuel-related tweets on a social network in order to understand the focus of public opinion. The contribution of the study includes the following two points. First, the study used text mining to explore the content of biofuel-related tweets on the social network Twitter and identified the main keywords together with their frequencies. Second, the study applied the Back Propagation Neural Network (BPN), the random forests model, and two kinds of hybrid-algorithm classification analyses and compared the classification results.

  19. Benchmark analyses of criticality calculation codes based on the evaluated dissolver-type criticality experiment systems

    International Nuclear Information System (INIS)

    Criticality calculation codes/code systems MCNP, MVP, SCALE and JACS, which are currently typically used in Japan for nuclear criticality safety evaluation, were benchmarked for so-called dissolver-type systems, i.e., fuel rod arrays immersed in fuel solution. The benchmark analyses were made for the evaluated critical experiments published in the International Criticality Safety Benchmark Evaluation Project (ICSBEP) Handbook: one evaluation representing five critical configurations from a heterogeneous core of low-enriched uranium dioxide at the Japan Atomic Energy Research Institute and two evaluations representing 16 critical configurations from heterogeneous cores of mixed uranium and plutonium dioxides (MOXs) at the Battelle Pacific Northwest Laboratories of the U.S.A. The results of the analyses showed that the minimum values of the neutron multiplication factor obtained with MCNP, MVP, SCALE and JACS were 0.993, 0.990, 0.993 and 0.972, respectively, values that are 2% to 4% larger than the maximum permissible multiplication factor of 0.95. (author)

  20. Analyse - technologies; Analyse - technologies

    Energy Technology Data Exchange (ETDEWEB)

    Roudil, D.; Chevalier, M.; Cormont, Ph.; Viala, F.; Kopp, Ch.; Peillet, O.; Chatroux, D.; Lausenaz, Y.; Villard, J.F.; Bruel, L.; Berhouet, F.; Chartier, F.; Aubert, M.; Blanchet, P.; Steiner, F.; Puech, M.H.; Bienvenu, Ph.; Noire, M.H.; Bouzon, C.; Schrive, L

    1999-07-01

    In this chapter of the DCC 1999 scientific report, the following theoretical studies are detailed: emulsions characterization by ultrasonics, high resolution wavelength meter, optimization methodology for diffractive and hybrid optic system, reliability for fast switches in power electronics, study of cesium isolation in irradiated fuels, chemical optodes based on evanescent wave absorption, radionuclides (zirconium 93 and molybdenum 93) determination in irradiated fuels processing effluents, study of viscous liquid ultrafiltration using supercritical CO2 fluid. (A.L.B.)

  1. Activity Based Learning in a Freshman Global Business Course: Analyses of Preferences and Demographic Differences

    Science.gov (United States)

    Levine, Mark F.; Guy, Paul W.

    2007-01-01

    The present study investigates pre-business students' reaction to Activity Based Learning in a lower division core required course entitled Introduction to Global Business in the business curriculum at California State University Chico. The study investigates students' preference for Activity Based Learning in comparison to a more traditional…

  2. Scaling and Spectral Analyses Based on Spatial Correlation Functions of Urban Form

    CERN Document Server

    Chen, Yanguang

    2012-01-01

    Urban form has been empirically demonstrated to be of scaling invariance and can be described with fractal geometry. However, the valid range of fractal dimension and the relationships between various fractal indicators of cities are not yet revealed in theory. Especially, systematic methods of spatial analysis have not yet been developed for fractal cities. By mathematical deduction and transformation (e.g. Fourier transform), I find that scaling analysis, spectral analysis, and spatial correlation analysis are all associated with fractal concepts and can be integrated into a new approach to fractal analysis of cities. This method can be termed '3S analyses' of urban form. Using the 3S analysis, I derived a set of fractal parameter equations, by which different fractal parameters of cities can be linked up with one another. Each fractal parameter has its own reasonable range of values. According to the fractal parameter equations, the intersection of the rational ranges of different fractal parameters sugges...
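    As a hedged illustration of the spectral leg of such a "3S" analysis, the sketch below estimates a power-spectrum scaling exponent from a one-dimensional density profile by log-log regression; the synthetic profile stands in for real urban-density data, and the paper's own parameter equations are not reproduced here.

    ```python
    import numpy as np

    # Estimate the spectral scaling exponent beta of a 1-D profile (synthetic stand-in for
    # an urban density transect) from a log-log fit to its periodogram. Illustrative only.

    rng = np.random.default_rng(0)
    profile = np.cumsum(rng.standard_normal(4096))   # Brownian-like placeholder signal

    spectrum = np.abs(np.fft.rfft(profile - profile.mean())) ** 2
    freqs = np.fft.rfftfreq(profile.size, d=1.0)

    mask = freqs > 0                                 # drop the zero frequency before taking logs
    slope, intercept = np.polyfit(np.log(freqs[mask]), np.log(spectrum[mask]), 1)
    print(f"estimated spectral exponent beta ~ {-slope:.2f}")
    ```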

  3. Evaluating the Accuracy of Results for Teacher Implemented Trial-Based Functional Analyses.

    Science.gov (United States)

    Rispoli, Mandy; Ninci, Jennifer; Burke, Mack D; Zaini, Samar; Hatton, Heather; Sanchez, Lisa

    2015-09-01

    Trial-based functional analysis (TBFA) allows for the systematic and experimental assessment of challenging behavior in applied settings. The purposes of this study were to evaluate a professional development package focused on training three Head Start teachers to conduct TBFAs with fidelity during ongoing classroom routines. To assess the accuracy of the TBFA results, the effects of a function-based intervention derived from the TBFA were compared with the effects of a non-function-based intervention. Data were collected on child challenging behavior and appropriate communication. An A-B-A-C-D design was utilized in which A represented baseline, and B and C consisted of either function-based or non-function-based interventions counterbalanced across participants, and D represented teacher implementation of the most effective intervention. Results showed that the function-based intervention produced greater decreases in challenging behavior and greater increases in appropriate communication than the non-function-based intervention for all three children. PMID:26069219

  4. Analyses of Crime Patterns in NIBRS Data Based on a Novel Graph Theory Clustering Method: Virginia as a Case Study

    Directory of Open Access Journals (Sweden)

    Peixin Zhao

    2014-01-01

    Full Text Available This paper suggests a novel clustering method for analyzing National Incident-Based Reporting System (NIBRS) data; the method includes the determination of correlations between different crime types, the development of a likelihood index for crimes to occur in a jurisdiction, and the clustering of jurisdictions based on crime type. The method was tested using the 2005 assault data from 121 jurisdictions in Virginia as a test case. The analyses of these data show that some crime types are correlated and that some crime parameters are correlated with particular crime types. The analyses also show that certain jurisdictions within Virginia share certain crime patterns. This information assists with constructing a pattern for a specific crime type and can be used to determine whether a jurisdiction may be more likely to see this type of crime occur in its area.
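    The paper's own graph-theory clustering method is not reproduced here; the hedged sketch below only illustrates the two generic steps the abstract mentions (correlating crime types and grouping jurisdictions by crime profile) using ordinary hierarchical clustering on invented counts.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    # Illustrative stand-in for a jurisdictions x crime-types count matrix (121 x 6, random).
    rng = np.random.default_rng(1)
    counts = rng.poisson(lam=20, size=(121, 6)).astype(float)

    # Step 1: correlation between crime types across jurisdictions.
    crime_type_corr = np.corrcoef(counts, rowvar=False)

    # Step 2: group jurisdictions with similar crime-type profiles
    # (standard Ward hierarchical clustering, not the paper's graph-theory method).
    profiles = counts / counts.sum(axis=1, keepdims=True)
    clusters = fcluster(linkage(profiles, method="ward"), t=4, criterion="maxclust")

    print(np.round(crime_type_corr, 2))
    print("cluster sizes:", np.bincount(clusters)[1:])
    ```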

  5. Comparison between CARIBIC Aerosol Samples Analysed by Accelerator-Based Methods and Optical Particle Counter Measurements

    OpenAIRE

    B. G. Martinsson; J. Friberg; Andersson, S M; Weigelt, A; Hermann, M.; D. Assmann; J. Voigtländer; C. A. M. Brenninkmeijer; Velthoven, P. J. F.; Zahn, A.

    2014-01-01

    Inter-comparison of results from two kinds of aerosol systems in the CARIBIC (Civil Aircraft for the Regular Investigation of the atmosphere Based on an Instrument Container) passenger aircraft based observatory, operating during intercontinental flights at 9–12 km altitude, is presented. Aerosol from the lowermost stratosphere (LMS), the extra-tropical upper troposphere (UT) and the tropical mid troposphere (MT) were investigated. Aerosol particle volume concentration measur...

  6. Exploring the World of Agent-Based Simulations: Simple Models, Complex Analyses

    OpenAIRE

    Sanchez, Susan M.; Lucas, Thomas W.

    2002-01-01

    Proceedings of the 2002 Winter Simulation Conference E. Yücesan, C.-H. Chen, J. L. Snowdon, and J. M. Charnes, eds. Agent-based simulations are models where multiple entities sense and stochastically respond to conditions in their local environments, mimicking complex large-scale system behavior. We provide an overview of some important issues in the modeling and analysis of agent-based systems. Examples are drawn from a range of fields: biological modeling, sociologic...

  7. Simultaneous detection of eight immunosuppressive chicken viruses using a GeXP analyser-based multiplex PCR assay

    OpenAIRE

    Zeng, Tingting; Xie, Zhixun; Xie, Liji; Deng, Xianwen; Xie, Zhiqin; Luo, Sisi; Huang, Li; Huang, Jiaoling

    2015-01-01

    Background Immunosuppressive viruses are frequently found as co-infections in the chicken industry, potentially causing serious economic losses. Because traditional molecular biology methods have limited detection ability, a rapid, high-throughput method for the differential diagnosis of these viruses is needed. The objective of this study is to develop a GenomeLab Gene Expression Profiler Analyser-based multiplex PCR method (GeXP-multiplex PCR) for simultaneous detection of eight immunosuppr...

  8. Analyses of Research Topics in the Field of Informetrics Based on the Method of Topic Modeling

    Directory of Open Access Journals (Sweden)

    Sung-Chien Lin

    2014-07-01

    Full Text Available In this study, we used the approach of topic modeling to uncover the possible structure of research topics in the field of Informetrics, to explore the distribution of the topics over years, and to compare the core journals. In order to infer the structure of the topics in the field, the data of the papers published in the Journal of Informetrics and Scientometrics from 2007 to 2013 were retrieved from the Web of Science database as input for the topic modeling approach. The results of this study show that when the number of topics was set to 10, the topic model had the smallest perplexity. Although the data scope and analysis methods differ from previous studies, the topics generated in this study are consistent with results produced by expert analyses. Empirical case studies and measurements of bibliometric indicators were considered important in every year during the whole analytic period, and the field showed increasing stability. Both core journals paid broad attention to all of the topics in the field of Informetrics. The Journal of Informetrics put particular emphasis on the construction and application of bibliometric indicators, while Scientometrics focused on the evaluation and the factors of productivity of countries, institutions, domains, and journals.
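    For readers unfamiliar with the perplexity comparison described above, the following hedged sketch runs scikit-learn's LDA over a toy corpus and several candidate topic numbers; the documents and all settings are placeholders, not the study's data or model.

    ```python
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    # Toy corpus standing in for the retrieved Journal of Informetrics / Scientometrics papers.
    docs = [
        "citation analysis of journal impact indicators",
        "h-index and productivity of research institutions",
        "topic modeling of bibliometric literature",
        "collaboration networks and co-authorship patterns",
        "evaluation of countries research output with bibliometric indicators",
        "text mining of scientific abstracts for topic detection",
    ]

    X = CountVectorizer(stop_words="english").fit_transform(docs)

    # Compare perplexity across candidate topic numbers (the study reports a minimum at 10 topics).
    for k in (2, 5, 10):
        lda = LatentDirichletAllocation(n_components=k, random_state=0).fit(X)
        print(f"k={k:2d}  perplexity={lda.perplexity(X):.1f}")
    ```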

  9. Regional scale synoptic air monitoring for visibility evaluation based on PIXE analyses

    International Nuclear Information System (INIS)

    The western part of the United States is characterized both by major scenic attractions (Grand Canyon, Yellowstone National Park, etc.) and the excellent visibilities necessary to enjoy the vistas. Recent legislation has been enacted to protect such visibilities from degradation associated with energy and resource development, and the U.S. Environmental Protection Agency has been charged with establishing baseline values for both visibility and fine particles. For this purpose, a network of forty air sampling stations has been established in remote locations of the study area (representing about 25% of the contiguous forty-eight states) on a 150 km grid spacing. Each sampler collects coarse and fine (less than 2.5 μm diameter) particles through sequential filtration over two three-day periods per week. Four central stations are operated with a larger variety of meteorological and air sampling instruments, as well as visibility probes, allowing daily samples in numerous size ranges. Many of these instruments were designed around the capabilities of PIXE, resulting in highly quantitative data at major savings in cost for such a large array. Particulate sources are being evaluated through elemental tracer and meteorological trajectory analyses, and the effects on visibility are being studied through statistical methodologies. (orig.)

  10. Comparative UPLC-QTOF-MS-based metabolomics and bioactivities analyses of Garcinia oblongifolia.

    Science.gov (United States)

    Li, Ping; AnandhiSenthilkumar, Harini; Wu, Shi-biao; Liu, Bo; Guo, Zhi-yong; Fata, Jimmie E; Kennelly, Edward J; Long, Chun-lin

    2016-02-01

    Garcinia oblongifolia Champ. ex Benth. (Clusiaceae) is a well-known medicinal plant from southern China, with edible fruits. However, the phytochemistry and bioactivity of the different plant parts of G. oblongifolia have not been studied extensively. Comparative metabolic profiling and bioactivities of the leaf, branch, and fruit of G. oblongifolia were investigated. A total of 40 compounds such as biflavonoids, xanthones, and benzophenones were identified using UPLC-QTOF-MS and MS(E), including 15 compounds reported for the first time from this species. Heatmap analyses found that benzophenones, xanthones, and biflavonoids were predominately found in branches, with benzophenones present in relatively high concentrations in all three plant parts. Xanthones were found to have limited distribution in fruit while biflavonoids were present at only low levels in leaves. In addition, the cytotoxic (MCF-7 breast cancer cell line) and antioxidant (ABTS and DPPH chemical tests) activities of the crude extracts of G. oblongifolia indicate that the branch extract exhibits greater bioactivity than either the leaf or the fruit extracts. Orthogonal partial least squares discriminate analysis was used to find 12 marker compounds, mainly xanthones, from the branches, including well-known antioxidants and cytotoxic agents. These G. oblongifolia results revealed that the variation in metabolite profiles can be correlated to the differences in bioactivity of the three plant parts investigated. This UPLC-QTOF-MS strategy can be useful to identify bioactive constituents expressed differentially in the various plant parts of a single species. PMID:26773895

  11. Safety assessment of historical masonry churches based on pre-assigned kinematic limit analysis, FE limit and pushover analyses

    Energy Technology Data Exchange (ETDEWEB)

    Milani, Gabriele, E-mail: milani@stru.polimi.it; Valente, Marco, E-mail: milani@stru.polimi.it [Department of Architecture, Built Environment and Construction Engineering (ABC), Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milan (Italy)

    2014-10-06

    This study presents some results of a comprehensive numerical analysis of three masonry churches damaged by the recent Emilia-Romagna (Italy) seismic events that occurred in May 2012. The numerical study comprises: (a) pushover analyses conducted with a commercial code, standard nonlinear material models and two different horizontal load distributions; (b) FE kinematic limit analyses performed using non-commercial software based on a preliminary homogenization of the masonry materials and a subsequent limit analysis with triangular elements and interfaces; (c) kinematic limit analyses conducted in agreement with the Italian code and based on the a-priori assumption of preassigned failure mechanisms, where the masonry material is considered unable to withstand tensile stresses. All models are capable of giving information on the active failure mechanism and the base shear at failure, which, if properly made non-dimensional with the weight of the structure, also gives an indication of the horizontal peak ground acceleration causing the collapse of the church. The results obtained from all three models indicate that the collapse is usually due to the activation of partial mechanisms (apse, façade, lateral walls, etc.). Moreover, the horizontal peak ground acceleration associated with the collapse is far lower than that required in that seismic zone by the Italian code for ordinary buildings. These outcomes highlight that structural upgrading interventions would be extremely beneficial for the considerable reduction of the seismic vulnerability of such historical structures.
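    The normalisation mentioned above (base shear at failure divided by the structure weight) can be sketched as follows; the numbers are invented for illustration and are not results for the three churches studied.

    ```python
    # Hedged sketch: the collapse base shear made non-dimensional with the weight gives a
    # collapse multiplier alpha, read as a rough horizontal peak ground acceleration in g.

    G = 9.81  # m/s^2

    def collapse_multiplier(base_shear_at_failure_kn: float, weight_kn: float) -> float:
        """alpha = V_failure / W (dimensionless, interpreted as an acceleration in units of g)."""
        return base_shear_at_failure_kn / weight_kn

    alpha = collapse_multiplier(base_shear_at_failure_kn=350.0, weight_kn=4200.0)  # hypothetical
    print(f"alpha = {alpha:.3f} g  (~{alpha * G:.2f} m/s^2)")
    ```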

  12. Safety assessment of historical masonry churches based on pre-assigned kinematic limit analysis, FE limit and pushover analyses

    International Nuclear Information System (INIS)

    This study presents some results of a comprehensive numerical analysis of three masonry churches damaged by the recent Emilia-Romagna (Italy) seismic events that occurred in May 2012. The numerical study comprises: (a) pushover analyses conducted with a commercial code, standard nonlinear material models and two different horizontal load distributions; (b) FE kinematic limit analyses performed using non-commercial software based on a preliminary homogenization of the masonry materials and a subsequent limit analysis with triangular elements and interfaces; (c) kinematic limit analyses conducted in agreement with the Italian code and based on the a-priori assumption of preassigned failure mechanisms, where the masonry material is considered unable to withstand tensile stresses. All models are capable of giving information on the active failure mechanism and the base shear at failure, which, if properly made non-dimensional with the weight of the structure, also gives an indication of the horizontal peak ground acceleration causing the collapse of the church. The results obtained from all three models indicate that the collapse is usually due to the activation of partial mechanisms (apse, façade, lateral walls, etc.). Moreover, the horizontal peak ground acceleration associated with the collapse is far lower than that required in that seismic zone by the Italian code for ordinary buildings. These outcomes highlight that structural upgrading interventions would be extremely beneficial for the considerable reduction of the seismic vulnerability of such historical structures.

  13. Agent-based modelling as scientific method: a case study analysing primate social behaviour.

    Science.gov (United States)

    Bryson, Joanna J; Ando, Yasushi; Lehmann, Hagen

    2007-09-29

    A scientific methodology in general should provide two things: first, a means of explanation and, second, a mechanism for improving that explanation. Agent-based modelling (ABM) is a method that facilitates exploring the collective effects of individual action selection. The explanatory force of the model is the extent to which an observed meta-level phenomenon can be accounted for by the behaviour of its micro-level actors. This article demonstrates that this methodology can be applied to the biological sciences; agent-based models, like any other scientific hypotheses, can be tested, critiqued, generalized or specified. We review the state of the art for ABM as a methodology for biology and then present a case study based on the most widely published agent-based model in the biological sciences: Hemelrijk's DomWorld, a model of primate social behaviour. Our analysis shows some significant discrepancies between this model and the behaviour of the macaques, the genus used for our analysis. We also demonstrate that the model is not fragile: its other results are still valid and can be extended to compensate for these problems. This robustness is a standard advantage of experiment-based artificial intelligence modelling techniques over analytic modelling. PMID:17434852
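    To show the flavour of the micro-to-macro reasoning discussed above, here is a deliberately minimal agent-based sketch of winner/loser dominance updates; it is not Hemelrijk's DomWorld (no spatial behaviour, no grouping), and the update rule and parameters are assumptions chosen only to illustrate how a hierarchy can emerge from identical agents.

    ```python
    import random

    # Toy agent-based model: agents hold a dominance value, meet in random pairs, and the
    # winner (chosen with probability proportional to relative dominance) gains value from
    # the loser. All rules and numbers are illustrative, not DomWorld's specification.

    random.seed(0)
    dominance = [1.0] * 10          # ten initially identical agents
    STEP = 0.1                      # size of the dominance transfer

    for _ in range(2000):
        i, j = random.sample(range(len(dominance)), 2)
        p_i_wins = dominance[i] / (dominance[i] + dominance[j])
        winner, loser = (i, j) if random.random() < p_i_wins else (j, i)
        transfer = STEP * dominance[loser] / (dominance[winner] + dominance[loser])
        dominance[winner] += transfer
        dominance[loser] = max(0.01, dominance[loser] - transfer)

    # A steep, stable hierarchy emerging from identical agents is the macro-level pattern
    # this micro-level rule is meant to illustrate.
    print(sorted(round(d, 2) for d in dominance))
    ```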

  14. Development of gas chromatographic methods for the analyses of organic carbonate-based electrolytes

    Science.gov (United States)

    Terborg, Lydia; Weber, Sascha; Passerini, Stefano; Winter, Martin; Karst, Uwe; Nowak, Sascha

    2014-01-01

    In this work, novel methods based on gas chromatography (GC) are presented for the investigation of the common organic carbonate-based electrolyte systems used in lithium ion batteries. The methods were developed for flame ionization detection (FID) and mass spectrometric detection (MS). Further, headspace (HS) sampling for the investigation of solid samples such as electrodes is reported. Limits of detection are reported for FID. Finally, the developed methods were applied to the electrolyte systems of commercially available lithium ion batteries as well as to in-house assembled cells.

  15. Variability Abstractions: Trading Precision for Speed in Family-Based Analyses

    DEFF Research Database (Denmark)

    Dimovski, Aleksandar; Brabrand, Claus; Wasowski, Andrzej

    2015-01-01

    Family-based (lifted) data-flow analysis for Software Product Lines (SPLs) is capable of analyzing all valid products (variants) without generating any of them explicitly. It takes as input only the common code base, which encodes all variants of a SPL, and produces analysis results corresponding... transformation that translates any SPL program into an abstracted version of it, such that the analysis of the abstracted SPL coincides with the corresponding abstracted analysis of the original SPL. We implement the transformation in a tool that works on Object-Oriented Java program families, and evaluate the...

  16. G-BASE data conditioning procedures for stream sediment and soil chemical analyses

    OpenAIRE

    T. R. Lister; Johnson, C C

    2005-01-01

    Data conditioning is the process of making data fit for the purpose for which it is to be used and forms a significant component of the G-BASE project. This report is part of a series of manuals to record G-BASE project methodology. For data conditioning this has been difficult, as the applications used for processing data and the way in which data are reported continue to evolve rapidly, and sections of this report have had to be continually updated to reflect this fact. However, the principles of...

  17. Legal Office Procedures: Task Analyses. Competency-Based Education. Review Draft.

    Science.gov (United States)

    Virginia Polytechnic Inst. and State Univ., Blacksburg.

    This task analysis guide is intended to help teachers and administrators develop instructional materials and implement competency-based education in a course on legal office procedures. Section 1 contains a validated task inventory for legal office procedures. For each task, applicable information pertaining to performance and enabling objectives,…

  18. AMPHIDINIUM REVISITED. I. REDEFINITION OF AMPHIDINIUM (DINOPHYCEAE) BASED ON CLADISTIC AND MOLECULAR PHYLOGENETIC ANALYSES

    DEFF Research Database (Denmark)

    Jørgensen, Mårten Flø; Murray, Shauna; Daugbjerg, Niels

    2004-01-01

    minute left-deflected epicones formed a monophyletic clade that included the type species. Amphidinium species with other epicone types were found to be unrelated to this clade. The type species A. operculatum was identified based on general cell shape and size, position of a dark organelle previously...

  19. Dimensionality of the Chinese Perceived Causes of Poverty Scale: Findings Based on Confirmatory Factor Analyses

    Science.gov (United States)

    Shek, Daniel T. L.; Ma, Cecilia Man-Sze

    2009-01-01

    The Chinese Perceived Causes of Poverty Scale (CPCPS) was constructed to assess Chinese people's beliefs about poverty. Four categories of explanations of poverty are covered in this scale: personal problems of poor people, lack of opportunities to escape from poverty, exploitation of poor people, and bad fate. Based on the responses of 1,519…

  20. Analysing a Web-Based E-Commerce Learning Community: A Case Study in Brazil.

    Science.gov (United States)

    Joia, Luiz Antonio

    2002-01-01

    Demonstrates the use of a Web-based participative virtual learning environment for graduate students in Brazil enrolled in an electronic commerce course in a Masters in Business Administration program. Discusses learning communities; computer-supported collaborative work and collaborative learning; influences on student participation; the role of…

  1. Handbook of methods for risk-based analyses of technical specifications

    International Nuclear Information System (INIS)

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements are based on deterministic analysis and engineering judgments. Experiences with plant operation indicate that some elements of the requirements are unnecessarily restrictive, while a few may not be conducive to safety. The US Nuclear Regulatory Commission (USNRC) Office of Research has sponsored research to develop systematic risk-based methods to improve various aspects of TS requirements. This handbook summarizes these risk-based methods. The scope of the handbook includes reliability and risk-based methods for evaluating allowed outage times (AOTs), scheduled or preventive maintenance, action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), and management of plant configurations resulting from outages of systems, or components. For each topic, the handbook summarizes analytic methods with data needs, outlines the insights to be gained, lists additional references, and gives examples of evaluations
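    As a generic, hedged illustration of the kind of allowed outage time (AOT) evaluation such handbooks summarize, the sketch below computes a single-outage incremental risk; the formula is a standard simplification, and every number is invented rather than taken from the handbook.

    ```python
    # Generic risk-based AOT check: incremental core damage probability for one outage,
    # ICDP = (conditional CDF - baseline CDF) * downtime. Illustrative values only.

    HOURS_PER_YEAR = 8760.0

    def incremental_core_damage_probability(cdf_baseline_per_yr: float,
                                            cdf_component_down_per_yr: float,
                                            outage_hours: float) -> float:
        """Risk increase from one outage of the given duration."""
        return (cdf_component_down_per_yr - cdf_baseline_per_yr) * outage_hours / HOURS_PER_YEAR

    icdp = incremental_core_damage_probability(1.0e-5, 8.0e-5, outage_hours=72.0)
    print(f"ICDP for a hypothetical 72 h outage: {icdp:.2e}")
    ```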

  2. Horticulture III, IV, and V. Task Analyses. Competency-Based Education.

    Science.gov (United States)

    Henrico County Public Schools, Glen Allen, VA. Virginia Vocational Curriculum Center.

    This task analysis guide is intended to help teachers and administrators develop instructional materials and implement competency-based education in the horticulture program. Section 1 contains a validated task inventory for horticulture III, IV, and V. For each task, applicable information pertaining to performance and enabling objectives,…

  3. Improving the safety of a body composition analyser based on the PGNAA method.

    Science.gov (United States)

    Miri-Hakimabad, Hashem; Izadi-Najafabadi, Reza; Vejdani-Noghreiyan, Alireza; Panjeh, Hamed

    2007-12-01

    The 252Cf radioisotope and 241Am-Be are intense neutron emitters that are readily encapsulated in compact, portable and sealed sources. Some features such as high flux of neutron emission and reliable neutron spectrum of these sources make them suitable for the prompt gamma neutron activation analysis (PGNAA) method. The PGNAA method can be used in medicine for neutron radiography and body chemical composition analysis. 252Cf and 241Am-Be sources generate not only neutrons but also are intense gamma emitters. Furthermore, the sample in medical treatments is a human body, so it may be exposed to the bombardments of these gamma-rays. Moreover, accumulations of these high-rate gamma-rays in the detector volume cause simultaneous pulses that can be piled up and distort the spectra in the region of interest (ROI). In order to remove these disadvantages in a practical way without being concerned about losing the thermal neutron flux, a gamma-ray filter made of Pb must be employed. The paper suggests a relatively safe body chemical composition analyser (BCCA) machine that uses a spherical Pb shield, enclosing the neutron source. Gamma-ray shielding effects and the optimum radius of the spherical Pb shield have been investigated, using the MCNP-4C code, and compared with the unfiltered case, the bare source. Finally, experimental results demonstrate that an optimised gamma-ray shield for the neutron source in a BCCA can reduce effectively the risk of exposure to the 252Cf and 241Am-Be sources. PMID:18268376

  4. Improving the safety of a body composition analyser based on the PGNAA method

    International Nuclear Information System (INIS)

    The 252Cf radioisotope and 241Am-Be are intense neutron emitters that are readily encapsulated in compact, portable and sealed sources. Some features such as high flux of neutron emission and reliable neutron spectrum of these sources make them suitable for the prompt gamma neutron activation analysis (PGNAA) method. The PGNAA method can be used in medicine for neutron radiography and body chemical composition analysis. 252Cf and 241Am-Be sources generate not only neutrons but also are intense gamma emitters. Furthermore, the sample in medical treatments is a human body, so it may be exposed to the bombardments of these gamma-rays. Moreover, accumulations of these high-rate gamma-rays in the detector volume cause simultaneous pulses that can be piled up and distort the spectra in the region of interest (ROI). In order to remove these disadvantages in a practical way without being concerned about losing the thermal neutron flux, a gamma-ray filter made of Pb must be employed. The paper suggests a relatively safe body chemical composition analyser (BCCA) machine that uses a spherical Pb shield, enclosing the neutron source. Gamma-ray shielding effects and the optimum radius of the spherical Pb shield have been investigated, using the MCNP-4C code, and compared with the unfiltered case, the bare source. Finally, experimental results demonstrate that an optimised gamma-ray shield for the neutron source in a BCCA can reduce effectively the risk of exposure to the 252Cf and 241Am-Be sources

  5. Improving the safety of a body composition analyser based on the PGNAA method

    Energy Technology Data Exchange (ETDEWEB)

    Miri-Hakimabad, Hashem; Izadi-Najafabadi, Reza; Vejdani-Noghreiyan, Alireza; Panjeh, Hamed [FUM Radiation Detection And Measurement Laboratory, Ferdowsi University of Mashhad (Iran, Islamic Republic of)

    2007-12-15

    The 252Cf radioisotope and 241Am-Be are intense neutron emitters that are readily encapsulated in compact, portable and sealed sources. Some features such as high flux of neutron emission and reliable neutron spectrum of these sources make them suitable for the prompt gamma neutron activation analysis (PGNAA) method. The PGNAA method can be used in medicine for neutron radiography and body chemical composition analysis. 252Cf and 241Am-Be sources generate not only neutrons but also are intense gamma emitters. Furthermore, the sample in medical treatments is a human body, so it may be exposed to the bombardments of these gamma-rays. Moreover, accumulations of these high-rate gamma-rays in the detector volume cause simultaneous pulses that can be piled up and distort the spectra in the region of interest (ROI). In order to remove these disadvantages in a practical way without being concerned about losing the thermal neutron flux, a gamma-ray filter made of Pb must be employed. The paper suggests a relatively safe body chemical composition analyser (BCCA) machine that uses a spherical Pb shield, enclosing the neutron source. Gamma-ray shielding effects and the optimum radius of the spherical Pb shield have been investigated, using the MCNP-4C code, and compared with the unfiltered case, the bare source. Finally, experimental results demonstrate that an optimised gamma-ray shield for the neutron source in a BCCA can reduce effectively the risk of exposure to the 252Cf and 241Am-Be sources.

  6. White matter disruption in moderate/severe pediatric traumatic brain injury: Advanced tract-based analyses

    Directory of Open Access Journals (Sweden)

    Emily L. Dennis

    2015-01-01

    Full Text Available Traumatic brain injury (TBI) is the leading cause of death and disability in children and can lead to a wide range of impairments. Brain imaging methods such as DTI (diffusion tensor imaging) are uniquely sensitive to the white matter (WM) damage that is common in TBI. However, higher-level analyses using tractography are complicated by the damage and decreased FA (fractional anisotropy) characteristic of TBI, which can result in premature tract endings. We used the newly developed autoMATE (automated multi-atlas tract extraction) method to identify differences in WM integrity. 63 pediatric patients aged 8–19 years with moderate/severe TBI were examined with cross sectional scanning at one or two time points after injury: a post-acute assessment 1–5 months post-injury and a chronic assessment 13–19 months post-injury. A battery of cognitive function tests was performed in the same time periods. 56 children were examined in the first phase, 28 TBI patients and 28 healthy controls. In the second phase 34 children were studied, 17 TBI patients and 17 controls (27 participants completed both post-acute and chronic phases). We did not find any significant group differences in the post-acute phase. Chronically, we found extensive group differences, mainly for mean and radial diffusivity (MD and RD). In the chronic phase, we found higher MD and RD across a wide range of WM. Additionally, we found correlations between these WM integrity measures and cognitive deficits. This suggests a distributed pattern of WM disruption that continues over the first year following a TBI in children.

  7. Analysing the Relationship between Learning Styles and Navigation Behaviour in Web-Based Educational System

    Directory of Open Access Journals (Sweden)

    Nabila Bousbia

    2010-12-01

    Full Text Available The aim of our research is to automatically deduce the learning style from the analysis of browsing behaviour. To find how to deduce the learning style, we are investigating, in this paper, the relationships between the learner's navigation behaviour and his/her learning style in web-based learning. To explore this relation, we carried out an experiment with 27 students of computer science at the engineering school (ESI-Algeria). The students used a hypermedia course on an e-learning platform. The learners' navigation behaviour is evaluated using a navigation type indicator that we propose and calculate based on trace analysis. The findings are presented with regard to the learning styles measured using the Index of Learning Styles by Felder and Solomon (1996). We conclude with a discussion of these results.

  8. Extended Distance-based Phylogenetic Analyses Applied to 3D Homo Fossil Skull Evolution

    OpenAIRE

    Waddell, Peter J.

    2014-01-01

    This article shows how 3D geometric morphometric data can be analyzed using newly developed distance-based evolutionary tree inference methods, with extensions to planar graphs. Application of these methods to 3D representations of the skullcap (calvaria) of 13 diverse skulls in the genus Homo, ranging from Homo erectus (ergaster) at about 1.6 mya, all the way forward to modern humans, yields a remarkably clear phylogenetic tree. Various evolutionary hypotheses are tested. Results of these te...

  9. PRODUCTION PROCESS DESIGN OF FUNCTIONAL FOOD PRODUCTS BASED ON FUNCTIONAL VALUE ANALYSES

    OpenAIRE

    Pershakova T. V.; Shubina L. N.; Derenkova I. A.; Naumov N. N.

    2015-01-01

    The article substantiates the feasibility of applying the method of functional value analysis to ensure high efficiency in the production of functional food products. It describes a technique for designing food functionality based on the methodology of value analysis, which allows factors such as consumer preferences, nutritional and functional value, and economic and technological indicators to be considered while developing product formulations and technologies. With the example of flour confec...

  10. Using knowledge based micro simulation in analysing the application of legislation

    OpenAIRE

    Svensson, J.S.; Wassink, J.G.J.

    1993-01-01

    The method of knowledge based micro simulation (KBMS) was developed to help determine (ex-ante) the socio-economic consequences of social security legislation. This paper discusses the possibilities of the KBMS method in the process of ex-post evaluation of legislation, namely in monitoring the application of legislation by the (local) administrative organisations. The two possibilities that are discussed are: - the monitoring of actual compliance with legislation by the administrative organis...

  11. Optimised procedure to analyse maillard reaction-associated fluorescence in cereal-based products

    OpenAIRE

    Delgado Andrade, Cristina; Rufián Henares, J. A.; Morales, F. J.

    2008-01-01

    Fluorescent Maillard compounds measurement provides more specific information on the extent of the Maillard reaction than other unspecific tools to monitor the reaction, and is suitable, as the first approach, to assess the nutritional quality of foods as related to protein damage. This work presents an optimised laboratory procedure for the measurement of total fluorescent intermediate compounds (FIC) associated with Maillard reaction, described and evaluated in a cereal-based product. Total...

  12. Energetic and Exergetic Performance Analyses of Solar Dish Based CO2 Combined Cycle

    OpenAIRE

    Mukhopadhyay, Soumitra; Ghosh, Sudip

    2014-01-01

    This paper presents a conceptual configuration of a solar dish based combined cycle power plant with a topping gas turbine block and a bottoming steam turbine cycle coupled through a heat recovery steam generator (HRSG). Carbon dioxide has been considered as the working fluid for the topping cycle and it has been considered in gaseous state all through the cycle. Two-stage compression has been proposed for the carbon dioxide cycle. The conventional GT combustion chamber is replaced by a high-...

  13. UniPrimer: A Web-Based Primer Design Tool for Comparative Analyses of Primate Genomes

    OpenAIRE

    Nomin Batnyam; Jimin Lee; Jungnam Lee; Seung Bok Hong; Sejong Oh; Kyudong Han

    2012-01-01

    Whole genome sequences of various primates have been released due to advanced DNA-sequencing technology. A combination of computational data mining and the polymerase chain reaction (PCR) assay to validate the data is an excellent method for conducting comparative genomics. Thus, designing primers for PCR is an essential procedure for a comparative analysis of primate genomes. Here, we developed and introduced UniPrimer for use in those studies. UniPrimer is a web-based tool that designs PCR-...

  14. Generation and Nonlinear Dynamical Analyses of Fractional-Order Memristor-Based Lorenz Systems

    OpenAIRE

    Huiling Xi; Yuxia Li; Xia Huang

    2014-01-01

    In this paper, four fractional-order memristor-based Lorenz systems with the flux-controlled memristor characterized by a monotone-increasing piecewise linear function, a quadratic nonlinearity, a smooth continuous cubic nonlinearity and a quartic nonlinearity are presented, respectively. The nonlinear dynamics are analyzed by using numerical simulation methods, including phase portraits, bifurcation diagrams, the largest Lyapunov exponent and power spectrum diagrams. Some interesting phenome...

  15. Analysing humanly generated random number sequences: A pattern-based approach

    OpenAIRE

    Gravenor, M B; Schulz, M A; Schmalbach, B; Brugger, P; Witt, K.

    2012-01-01

    In a random number generation task, participants are asked to generate a random sequence of numbers, most typically the digits 1 to 9. Such number sequences are not mathematically random, and both extent and type of bias allow one to characterize the brain's “internal random number generator”. We assume that certain patterns and their variations will frequently occur in humanly generated random number sequences. Thus, we introduce a pattern-based analysis of random number sequences. Twenty he...
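    A hedged, minimal illustration of what a pattern-based count can look like (not the authors' specific pattern inventory): count adjacent "+1" steps in a digit sequence and compare them with the chance expectation.

    ```python
    from collections import Counter
    import random

    # Count adjacent ascending-by-one steps in a sequence of digits 1-9, a pattern humans
    # tend to over-produce, and compare with the expectation for independent uniform digits.
    # The machine-generated sequence below merely stands in for a participant's output.

    random.seed(0)
    sequence = [random.randint(1, 9) for _ in range(300)]

    pairs = Counter(zip(sequence, sequence[1:]))
    ascending_steps = sum(count for (a, b), count in pairs.items() if b == a + 1)

    n_pairs = len(sequence) - 1
    expected = n_pairs * (8 / 81)   # P(next digit = current + 1) for independent uniform digits 1-9
    print(f"observed +1 steps: {ascending_steps}, expected under independence: {expected:.1f}")
    ```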

  16. Analysing Humanly Generated Random Number Sequences: A Pattern-Based Approach

    OpenAIRE

    Schulz, Marc-André; Schmalbach, Barbara; Brugger, Peter; Witt, Karsten

    2012-01-01

    In a random number generation task, participants are asked to generate a random sequence of numbers, most typically the digits 1 to 9. Such number sequences are not mathematically random, and both extent and type of bias allow one to characterize the brain's “internal random number generator”. We assume that certain patterns and their variations will frequently occur in humanly generated random number sequences. Thus, we introduce a pattern-based analysis of random number sequences. Twenty he...

  17. Model-based analyses of bioequivalence crossover trials using the stochastic approximation expectation maximisation algorithm.

    OpenAIRE

    Dubois, Anne; Lavielle, Marc; Gsteiger, Sandro; Pigeolet, Etienne; Mentré, France

    2011-01-01

    In this work, we develop a bioequivalence analysis using nonlinear mixed effects models (NLMEM) that mimics the standard noncompartmental analysis (NCA). We estimate NLMEM parameters, including between-subject and within-subject variability and treatment, period and sequence effects. We explain how to perform a Wald test on a secondary parameter, and we propose an extension of the likelihood ratio test for bioequivalence. We compare these NLMEM-based bioequivalence tests with standard NCA-bas...

  18. Critical experiments analyses by using 70 energy group library based on ENDF/B-VI

    Energy Technology Data Exchange (ETDEWEB)

    Tahara, Yoshihisa; Matsumoto, Hideki [Mitsubishi Heavy Industries Ltd., Yokohama (Japan). Nuclear Energy Systems Engineering Center; Huria, H.C.; Ouisloumen, M.

    1998-03-01

    The newly developed 70-group library has been validated by comparing kinf values from the continuous-energy Monte Carlo code MCNP and the two-dimensional spectrum calculation code PHOENIX-CP, which employs the Discrete Angular Flux Method based on Collision Probability. The library has also been validated against a large number of critical experiments and numerical benchmarks for assemblies with MOX and Gd fuels. (author)

  19. Agent-based modelling as scientific method: a case study analysing primate social behaviour

    OpenAIRE

    Bryson, Joanna J.; Ando, Yasushi; Lehmann, Hagen

    2007-01-01

    A scientific methodology in general should provide two things: first, a means of explanation and, second, a mechanism for improving that explanation. Agent-based modelling (ABM) is a method that facilitates exploring the collective effects of individual action selection. The explanatory force of the model is the extent to which an observed meta-level phenomenon can be accounted for by the behaviour of its micro-level actors. This article demonstrates that this methodology can be applied to th...

  20. Analysing saturable antibody binding based on serum data and pharmacokinetic modelling

    Energy Technology Data Exchange (ETDEWEB)

    Kletting, Peter; Kiryakos, Hady; Reske, Sven N; Glatting, Gerhard, E-mail: gerhard.glatting@uni-ulm.d, E-mail: peter.kletting@uniklinik-ulm.d [Klinik fuer Nuklearmedizin, Universitaet Ulm, D-89070 Ulm (Germany)

    2011-01-07

    In radioimmunotherapy, organ dose calculations are frequently based on pretherapeutic biodistribution measurements, assuming equivalence between pretherapeutic and therapeutic biodistribution. However, when saturation of antibody binding sites is important, this assumption might not be justified. Residual antibody and different amounts of administered antibody may lead to a considerably altered therapeutic biodistribution. In this study we developed a method based on serum activity measurements to investigate this effect in radioimmunotherapy with 90Y-labelled anti-CD66 antibody. Pretherapeutic and therapeutic serum activity data of ten patients with acute leukaemia were fitted to a set of four parsimonious pharmacokinetic models. All models included the key mechanisms of antibody binding, immunoreactivity and degradation; however, they differed with respect to linear or nonlinear binding and global or individual fitting of the model parameters. The empirically most supported model was chosen according to the corrected Akaike information criterion. The nonlinear models were most supported by the data (sum of probabilities ≈100%). Using the presented method, we identified relevant saturable binding for radioimmunotherapy with 90Y-labelled anti-CD66 antibody solely based on serum data. This general method may also be applicable to investigate other systems where saturation of binding sites might be important.
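    The model-selection step described above (corrected Akaike information criterion and the resulting model probabilities) can be sketched as follows; the four log-likelihoods, parameter counts, and data size are invented placeholders, not the study's fitted values.

    ```python
    import math

    # Corrected AIC (AICc) and Akaike weights for a small set of candidate models.
    # All log-likelihoods, parameter counts and the sample size below are hypothetical.

    def aicc(log_likelihood: float, k_params: int, n_data: int) -> float:
        aic = 2 * k_params - 2 * log_likelihood
        return aic + (2 * k_params * (k_params + 1)) / (n_data - k_params - 1)

    models = {  # name: (log-likelihood, number of parameters)
        "linear-global":        (-210.0, 4),
        "linear-individual":    (-205.0, 8),
        "nonlinear-global":     (-188.0, 6),
        "nonlinear-individual": (-184.0, 10),
    }
    n = 120     # hypothetical number of serum activity measurements

    scores = {name: aicc(ll, k, n) for name, (ll, k) in models.items()}
    best = min(scores.values())
    weights = {name: math.exp(-0.5 * (s - best)) for name, s in scores.items()}
    total = sum(weights.values())
    for name in models:
        print(f"{name:22s} AICc={scores[name]:7.1f}  weight={weights[name] / total:.3f}")
    ```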

  1. Analysing saturable antibody binding based on serum data and pharmacokinetic modelling

    Science.gov (United States)

    Kletting, Peter; Kiryakos, Hady; Reske, Sven N.; Glatting, Gerhard

    2011-01-01

    In radioimmunotherapy, organ dose calculations are frequently based on pretherapeutic biodistribution measurements, assuming equivalence between pretherapeutic and therapeutic biodistribution. However, when saturation of antibody binding sites is important, this assumption might not be justified. Residual antibody and different amounts of administered antibody may lead to a considerably altered therapeutic biodistribution. In this study we developed a method based on serum activity measurements to investigate this effect in radioimmunotherapy with 90Y-labelled anti-CD66 antibody. Pretherapeutic and therapeutic serum activity data of ten patients with acute leukaemia were fitted to a set of four parsimonious pharmacokinetic models. All models included the key mechanisms of antibody binding, immunoreactivity and degradation; however, they differed with respect to linear or nonlinear binding and global or individual fitting of the model parameters. The empirically most supported model was chosen according to the corrected Akaike information criterion. The nonlinear models were most supported by the data (sum of probabilities ≈100%). Using the presented method, we identified relevant saturable binding for radioimmunotherapy with 90Y-labelled anti-CD66 antibody solely based on serum data. This general method may also be applicable to investigate other systems where saturation of binding sites might be important.
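
As an illustration of the model-selection step described in this record, the sketch below (Python, assuming NumPy) ranks a set of hypothetical least-squares pharmacokinetic fits by the corrected Akaike information criterion and converts the scores into Akaike weights; the model names, residual sums of squares, sample size and parameter counts are invented placeholders, not the study's actual fits.

```python
# Hedged sketch: corrected-AIC (AICc) comparison of candidate pharmacokinetic
# models fitted by least squares to n serum activity samples.
import numpy as np

def aicc(rss, n, k):
    """Corrected Akaike information criterion for a least-squares fit."""
    aic = n * np.log(rss / n) + 2 * k
    return aic + (2 * k * (k + 1)) / (n - k - 1)

n = 24  # number of serum samples (hypothetical)
candidates = {  # model name -> (residual sum of squares, number of parameters), all illustrative
    "linear_global": (15.2, 3),
    "linear_individual": (12.8, 5),
    "nonlinear_global": (6.1, 4),
    "nonlinear_individual": (5.4, 6),
}

scores = {m: aicc(rss, n, k) for m, (rss, k) in candidates.items()}
best = min(scores, key=scores.get)

# Akaike weights: the "probability" of each model within the candidate set
delta = np.array([scores[m] - scores[best] for m in scores])
weights = np.exp(-0.5 * delta) / np.exp(-0.5 * delta).sum()
for (m, s), w in zip(scores.items(), weights):
    print(f"{m:22s} AICc={s:8.2f} weight={w:.3f}")
```

Summing the weights of the nonlinear candidates is one way to arrive at a statement of the form "sum of probabilities ≈ 100%" for a model family.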

  2. Integrated Genomic and Network-Based Analyses of Complex Diseases and Human Disease Network.

    Science.gov (United States)

    Al-Harazi, Olfat; Al Insaif, Sadiq; Al-Ajlan, Monirah A; Kaya, Namik; Dzimiri, Nduna; Colak, Dilek

    2016-06-20

    A disease phenotype generally reflects various pathobiological processes that interact in a complex network. The highly interconnected nature of the human protein interaction network (interactome) indicates that, at the molecular level, it is difficult to consider diseases as being independent of one another. Recently, genome-wide molecular measurements, data mining and bioinformatics approaches have provided the means to explore human diseases from a molecular basis. The exploration of diseases and a system of disease relationships based on the integration of genome-wide molecular data with the human interactome could offer a powerful perspective for understanding the molecular architecture of diseases. Recently, subnetwork markers have proven to be more robust and reliable than individual biomarker genes selected based on gene expression profiles alone, and achieve higher accuracy in disease classification. We have applied one of these methodologies to idiopathic dilated cardiomyopathy (IDCM) data that we have generated using a microarray and identified significant subnetworks associated with the disease. In this paper, we review the recent endeavours in this direction, and summarize the existing methodologies and computational tools for network-based analysis of complex diseases and molecular relationships among apparently different disorders and human disease network. We also discuss the future research trends and topics of this promising field. PMID:27318646

  3. Assessing an organizational culture instrument based on the Competing Values Framework: Exploratory and confirmatory factor analyses

    Directory of Open Access Journals (Sweden)

    Mohr David C

    2007-04-01

Full Text Available Abstract Background The Competing Values Framework (CVF) has been widely used in health services research to assess organizational culture as a predictor of quality improvement implementation, employee and patient satisfaction, and team functioning, among other outcomes. CVF instruments generally are presented as well-validated with reliable aggregated subscales. However, only one study in the health sector has been conducted for the express purpose of validation, and that study population was limited to hospital managers from a single geographic locale. Methods We used exploratory and confirmatory factor analyses to examine the underlying structure of data from a CVF instrument. We analyzed cross-sectional data from a work environment survey conducted in the Veterans Health Administration (VHA). The study population comprised all staff in non-supervisory positions. The survey included 14 items adapted from a popular CVF instrument, which measures organizational culture according to four subscales: hierarchical, entrepreneurial, team, and rational. Results Data from 71,776 non-supervisory employees (approximate response rate 51%) from 168 VHA facilities were used in this analysis. Internal consistency of the subscales was moderate to strong (α = 0.68 to 0.85). However, the entrepreneurial, team, and rational subscales had higher correlations across subscales than within, indicating poor divergent properties. Exploratory factor analysis revealed two factors, comprising the ten items from the entrepreneurial, team, and rational subscales loading on the first factor, and two items from the hierarchical subscale loading on the second factor, along with one item from the rational subscale that cross-loaded on both factors. Results from confirmatory factor analysis suggested that the two-subscale solution provides a more parsimonious fit to the data as compared to the original four-subscale model. Conclusion This study suggests that there may be problems
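
The internal-consistency figures quoted in this record (α = 0.68 to 0.85) come from Cronbach's alpha computed per subscale. The sketch below (Python, assuming NumPy) shows that computation on toy data; the item grouping and the random responses are hypothetical, so the resulting alpha will be low, whereas genuinely correlated survey items would give values like those reported.

```python
# Hedged sketch: Cronbach's alpha for one subscale of a survey instrument.
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of Likert responses."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()     # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the summed scale
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(0)
team_items = rng.integers(1, 6, size=(200, 4))     # 4 hypothetical "team" items, 200 respondents
print(round(cronbach_alpha(team_items), 2))
```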

  4. Cost analyses of a web-based behavioral intervention to enhance fruit and vegetable consumption

    Directory of Open Access Journals (Sweden)

    McClure Jennifer B

    2009-12-01

Full Text Available Abstract Background The purpose of this paper is to evaluate costs associated with the online intervention trial, Making Effective Nutritional Choices for Cancer Prevention (MENU), and to connect the findings to the study outcomes. Methods Using prospective data collected during the MENU development and implementation phases, we estimated overall costs per person, incremental costs for the three arms of the MENU intervention, and incremental costs per change in fruit and vegetable (F&V) consumption across the studied population. The MENU study was conducted in five HMO sites of the Cancer Research Network. The number of eligible study participants who were enrolled in the study was 2,540. Recruited participants were randomized into (1) an untailored website program, (2) a tailored website program, or (3) a tailored web program plus personalized counseling (HOBI) via email. The primary measures for these analyses include the total intervention costs, average cost per participant, and the average cost per mean change in daily intake of F&V, stratified by study arm. Results The mean change in F&V consumption was greater in both the tailored arm and statistically higher in the HOBI arm relative to the untailored arm. The untailored arm achieved +2.34 servings increase vs. the tailored website arm (+2.68) and the HOBI arm (+2.80 servings increase). Total intervention costs for MENU participants who completed the 12-month follow-up assessment, by study arm, were estimated to be $197,197 or $110 respectively. This translates to $69 per participant in the untailored web site intervention, $81 per participant in the tailored website intervention, and $184 per participant in the HOBI intervention and a cost per average change in F&V consumption to be $35, $27 and $61 respectively. Conclusions Providing personalized "tailored" messages and additional personalized support via email generated an additional $12-$115 per participant, over the untailored web program

  5. Seismic fragility analyses of nuclear power plant structures based on the recorded earthquake data in Korea

    International Nuclear Information System (INIS)

    This paper briefly introduces an improved method for evaluating seismic fragilities of components of nuclear power plants in Korea. Engineering characteristics of small magnitude earthquake spectra recorded in the Korean peninsula during the last several years are also discussed in this paper. For the purpose of evaluating the effects of the recorded earthquake on the seismic fragilities of Korean nuclear power plant structures, several cases of comparative studies have been performed. The study results show that seismic fragility analysis based on the Newmark's spectra in Korea might over-estimate the seismic capacities of Korean facilities. (author)
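
Seismic fragility evaluations of the kind this record describes are commonly expressed through a lognormal fragility curve, P_f(a) = Φ(ln(a/Am)/β), giving the conditional failure probability at ground-motion level a. The sketch below (Python standard library only) evaluates that generic formulation; the median capacity Am and composite uncertainty β are illustrative numbers, not values from the Korean study.

```python
# Hedged sketch: a generic lognormal fragility model as used in seismic PRA.
from math import log
from statistics import NormalDist

def fragility(a, Am, beta):
    """Conditional probability of failure at peak ground acceleration a (g)."""
    return NormalDist().cdf(log(a / Am) / beta)

for a in (0.1, 0.2, 0.3, 0.5):
    print(a, round(fragility(a, Am=0.87, beta=0.35), 4))  # Am, beta are illustrative
```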

  6. Generation and Nonlinear Dynamical Analyses of Fractional-Order Memristor-Based Lorenz Systems

    Directory of Open Access Journals (Sweden)

    Huiling Xi

    2014-11-01

    Full Text Available In this paper, four fractional-order memristor-based Lorenz systems with the flux-controlled memristor characterized by a monotone-increasing piecewise linear function, a quadratic nonlinearity, a smooth continuous cubic nonlinearity and a quartic nonlinearity are presented, respectively. The nonlinear dynamics are analyzed by using numerical simulation methods, including phase portraits, bifurcation diagrams, the largest Lyapunov exponent and power spectrum diagrams. Some interesting phenomena, such as inverse period-doubling bifurcation and intermittent chaos, are found to exist in the proposed systems.
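
To make the memristive coupling concrete, the sketch below (Python, assuming NumPy/SciPy) integrates an illustrative integer-order, flux-controlled memristive Lorenz-type system in which a monotone-increasing piecewise-linear charge-flux curve gives a piecewise-constant memductance W(φ). The equations and parameters are stand-ins, not the paper's systems, and the fractional-order versions studied there would require a fractional solver (e.g. a Grünwald-Letnikov scheme) rather than solve_ivp.

```python
# Hedged sketch: integer-order memristive Lorenz-type system with a
# piecewise-constant memductance (illustrative equations and parameters).
import numpy as np
from scipy.integrate import solve_ivp

a, b = 0.6, 1.4                       # memductance levels (hypothetical)
sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0

def W(phi):                           # piecewise-constant memductance dq/dphi
    return a if abs(phi) < 1.0 else b

def rhs(t, s):
    x, y, z, phi = s
    return [sigma * (y - x),
            rho * x - x * z - W(phi) * y,   # memristive damping on y
            x * y - beta * z,
            y]                              # flux driven by y

sol = solve_ivp(rhs, (0.0, 100.0), [1.0, 1.0, 1.0, 0.0], max_step=0.01)
print(sol.y[:, -1])   # final state; plotting sol.y[0] vs sol.y[2] gives a phase portrait
```

Bifurcation diagrams and Lyapunov exponents as used in the paper would then be obtained by sweeping a parameter and post-processing trajectories like this one.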

  7. Analysing the Relationship between Learning Styles and Navigation Behaviour in Web-Based Educational System

    OpenAIRE

    Nabila Bousbia; Issam Rebaï; Jean-Marc Labat; Amar Balla

    2010-01-01

The aim of our research is to automatically deduce the learning style from the analysis of browsing behaviour. To find how to deduce the learning style, we are investigating, in this paper, the relationships between the learner's navigation behaviour and his/her learning style in web-based learning. To explore this relation, we carried out an experiment with 27 students of computer science at the engineering school (ESI-Algeria). The students used a hypermedia course on an e-learning platform...

  8. Eye gaze in intelligent user interfaces gaze-based analyses, models and applications

    CERN Document Server

    Nakano, Yukiko I; Bader, Thomas

    2013-01-01

    Remarkable progress in eye-tracking technologies opened the way to design novel attention-based intelligent user interfaces, and highlighted the importance of better understanding of eye-gaze in human-computer interaction and human-human communication. For instance, a user's focus of attention is useful in interpreting the user's intentions, their understanding of the conversation, and their attitude towards the conversation. In human face-to-face communication, eye gaze plays an important role in floor management, grounding, and engagement in conversation.Eye Gaze in Intelligent User Interfac

  9. Reconstruction of Holocene coastal depositional environments based on sedimentological and palaeontological analyses, Zakynthos Island, Western Greece Mediterranean Sea

    Science.gov (United States)

    Avramidis, Pavlos; Iliopoulos, George; Papadopoulou, Penelope; Nikolaou, Konstantinos; Kontopoulos, Nikolaos; Wijngaarden, Gert

    2014-05-01

    Zakynthos Island is one of the most seismically active regions in Europe and the Holocene coastal depositional environments were influenced both by tectonic activity and sea level rise. In the present study detailed sedimentological, palaeontological and 14C dating analyses were used in order to reconstruct the Holocene coastal depositional environments as well as the different rates of sedimentation, based on data from three cores up to 30 m deep. The results of the analyses indicate changes in depositional environments from marine to brackish lagoonal and lagoon / barrier systems with temporary intrusions of marine water via storms or tsunamigenic events. High sedimentation rates in coastal areas of Zakynthos Island correspond well to the most widespread Holocene warm and humid phases. The interpretation of the sedimentological environments reveals that Zakynthos Island before 8300 BP was constituted by two islands, where the present southern part of the island was separated from the northern one by a shallow and narrow sea channel.

  10. Augmentation of French grunt diet description using combined visual and DNA-based analyses

    Science.gov (United States)

    Hargrove, John S.; Parkyn, Daryl C.; Murie, Debra J.; Demopoulos, Amanda W.J.; Austin, James D.

    2012-01-01

Trophic linkages within a coral-reef ecosystem may be difficult to discern in fish species that reside on, but do not forage on, coral reefs. Furthermore, dietary analysis of fish can be difficult in situations where prey is thoroughly macerated, resulting in many visually unrecognisable food items. The present study examined whether the inclusion of a DNA-based method could improve the identification of prey consumed by French grunt, Haemulon flavolineatum, a reef fish that possesses pharyngeal teeth and forages on soft-bodied prey items. Visual analysis indicated that crustaceans were most abundant numerically (38.9%), followed by sipunculans (31.0%) and polychaete worms (5.2%), with a substantial number of unidentified prey (12.7%). For the subset of prey with both visual and molecular data, there was a marked reduction in the number of unidentified sipunculans (visual – 31.1%, combined – 4.4%), unidentified crustaceans (visual – 15.6%, combined – 6.7%), and unidentified taxa (visual – 11.1%, combined – 0.0%). Utilising results from both methodologies resulted in an increased number of prey placed at the family level (visual – 6, combined – 33) and species level (visual – 0, combined – 4). Although more costly than visual analysis alone, our study demonstrated the feasibility of DNA-based identification of visually unidentifiable prey in the stomach contents of fish.

  11. Drive-based recording analyses at >800 Gfc/in² using shingled recording

    Science.gov (United States)

    William Cross, R.; Montemorra, Michael

    2012-02-01

Since the introduction of perpendicular recording, conventional perpendicular scaling has enabled the hard disk drive industry to deliver products ranging from ~130 to well over 500 Gb/in² in a little over 4 years. The incredible areal density growth spurt enabled by perpendicular recording is now endangered by an inability to effectively balance writeability with erasure effects at the system level. Shingled magnetic recording (SMR) offers an effective means to continue perpendicular areal density growth using conventional heads and tuned media designs. The use of specially designed edge-write head structures (also known as 'corner writers') should further increase the AD gain potential for shingled recording. In this paper, we will demonstrate the drive-based recording performance characteristics of a shingled recording system at areal densities in excess of 800 Gb/in² using a conventional head. Using a production drive base, developmental heads/media and a number of sophisticated analytical routines, we have studied the recording performance of a shingled magnetic recording subsystem. Our observations confirm excellent writeability in excess of 400 ktpi and a perpendicular system with acceptable noise balance, especially at extreme ID and OD skews where the benefits of SMR are quite pronounced. We believe that this demonstration illustrates that SMR is not only capable of productization, but is likely the path of least resistance toward production drive areal density closer to 1 Tb/in² and beyond.

  12. Cosmological Analyses Based On The Combined Planck And WMAP Mission Datasets

    Science.gov (United States)

    Bennett, Charles

We propose to: (1) make a detailed comparison of WMAP, Planck, and other cosmic microwave background (CMB) data to understand areas of conflict, and if possible, resolve them; (2) combine WMAP and Planck data into a unified cosmological dataset; and (3) extend cosmological analyses with the combined data. Recent cosmological measurements have revolutionized cosmology and the CMB has played a crucial role. The Planck mission team just released cosmological data and papers, this on the heels of the WMAP team's release of final nine-year data and papers. This proposal is to compare and attempt to understand the subtle but important differences between the two recently released WMAP and Planck cosmological results, to combine the data so as to benefit from the full available small and larger scale measurements, and to use this to enhance cosmological solutions. The WMAP and Planck CMB cosmology datasets are broadly consistent with one another. Yet, differences exist beyond the fact that Planck data extend to finer angular scales than WMAP data. We propose to go beyond the "quick look" we have done so far to identify and help resolve discrepancies. We provide two examples of the kinds of discrepancies that should be resolved. Even though the Planck data release relied on the absolute calibration established by WMAP, the two sets of analyzed data appear to be off by a factor of 0.975. This small but significant discrepancy is difficult to explain and merits investigation. Also, while cosmological parameters from Planck agree with WMAP parameters within 1.1σ of the larger WMAP uncertainty, this large a discrepancy is difficult to explain in detail since the cosmic variance uncertainties that play a large role in the parameter uncertainties are common to Planck and WMAP: both missions view the same sky. These are just two examples; additional careful and detailed comparisons are required. Over the course of the last several years a number of scientists around the world

  13. Audio-visual perception of 3D cinematography: an fMRI study using condition-based and computation-based analyses.

    Science.gov (United States)

    Ogawa, Akitoshi; Bordier, Cecile; Macaluso, Emiliano

    2013-01-01

    The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard "condition-based" designs, as well as "computational" methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-sources signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli. PMID

  14. Audio-visual perception of 3D cinematography: an fMRI study using condition-based and computation-based analyses.

    Directory of Open Access Journals (Sweden)

    Akitoshi Ogawa

Full Text Available The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard "condition-based" designs, as well as "computational" methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-sources signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life

  15. Drive-based recording analyses at >800 Gfc/in2 using shingled recording

    International Nuclear Information System (INIS)

    Since the introduction of perpendicular recording, conventional perpendicular scaling has enabled the hard disk drive industry to deliver products ranging from ∼130 to well over 500 Gb/in2 in a little over 4 years. The incredible areal density growth spurt enabled by perpendicular recording is now endangered by an inability to effectively balance writeability with erasure effects at the system level. Shingled magnetic recording (SMR) offers an effective means to continue perpendicular areal density growth using conventional heads and tuned media designs. The use of specially designed edge-write head structures (also known as 'corner writers') should further increase the AD gain potential for shingled recording. In this paper, we will demonstrate the drive-based recording performance characteristics of a shingled recording system at areal densities in excess of 800 Gb/in2 using a conventional head. Using a production drive base, developmental heads/media and a number of sophisticated analytical routines, we have studied the recording performance of a shingled magnetic recording subsystem. Our observations confirm excellent writeability in excess of 400 ktpi and a perpendicular system with acceptable noise balance, especially at extreme ID and OD skews where the benefits of SMR are quite pronounced. We believe that this demonstration illustrates that SMR is not only capable of productization, but is likely the path of least resistance toward production drive areal density closer to 1 Tb/in2 and beyond. - Research highlights: → Drive-based recording demonstrations at 805 Gf/in2 has been demonstrated using both 95 and 65 mm drive platforms at roughly 430 ktpi and 1.87 Mfci. → Limiting factors for shingled recording include side reading, which is dominated by the reader crosstrack skirt profile, MT10 being a representative metric. → Media jitter and associated DC media SNR further limit areal density, dominated by crosstrack transition curvature, downtrack

  16. Drive-based recording analyses at >800 Gfc/in² using shingled recording

    Energy Technology Data Exchange (ETDEWEB)

    William Cross, R., E-mail: William.r.cross@seagate.com [Seagate Technology, 389 Disc Drive, Longmont, CO 80503 (United States); Montemorra, Michael, E-mail: Mike.r.montemorra@seagate.com [Seagate Technology, 389 Disc Drive, Longmont, CO 80503 (United States)

    2012-02-15

Since the introduction of perpendicular recording, conventional perpendicular scaling has enabled the hard disk drive industry to deliver products ranging from ~130 to well over 500 Gb/in² in a little over 4 years. The incredible areal density growth spurt enabled by perpendicular recording is now endangered by an inability to effectively balance writeability with erasure effects at the system level. Shingled magnetic recording (SMR) offers an effective means to continue perpendicular areal density growth using conventional heads and tuned media designs. The use of specially designed edge-write head structures (also known as 'corner writers') should further increase the AD gain potential for shingled recording. In this paper, we will demonstrate the drive-based recording performance characteristics of a shingled recording system at areal densities in excess of 800 Gb/in² using a conventional head. Using a production drive base, developmental heads/media and a number of sophisticated analytical routines, we have studied the recording performance of a shingled magnetic recording subsystem. Our observations confirm excellent writeability in excess of 400 ktpi and a perpendicular system with acceptable noise balance, especially at extreme ID and OD skews where the benefits of SMR are quite pronounced. We believe that this demonstration illustrates that SMR is not only capable of productization, but is likely the path of least resistance toward production drive areal density closer to 1 Tb/in² and beyond. - Research Highlights: > Drive-based recording demonstrations at 805 Gf/in² has been demonstrated using both 95 and 65 mm drive platforms at roughly 430 ktpi and 1.87 Mfci. > Limiting factors for shingled recording include side reading, which is dominated by the reader crosstrack skirt profile, MT10 being a representative metric. > Media jitter and associated DC media SNR further limit areal density, dominated by crosstrack transition curvature, downtrack

  17. Space nuclear-power reactor design based on combined neutronic and thermal-fluid analyses

    International Nuclear Information System (INIS)

    The design and performance analysis of a space nuclear-power system requires sophisticated analytical capabilities such as those developed during the nuclear rocket propulsion (Rover) program. In particular, optimizing the size of a space nuclear reactor for a given power level requires satisfying the conflicting requirements of nuclear criticality and heat removal. The optimization involves the determination of the coolant void (volume) fraction for which the reactor diameter is a minimum and temperature and structural limits are satisfied. A minimum exists because the critical diameter increases with increasing void fraction, whereas the reactor diameter needed to remove a specified power decreases with void fraction. The purpose of this presentation is to describe and demonstrate our analytical capability for the determination of minimum reactor size. The analysis is based on combining neutronic criticality calculations with OPTION-code thermal-fluid calculations
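
The optimization this record describes can be pictured as finding the coolant void fraction at which two opposing diameter requirements balance: the critical diameter grows with void fraction while the heat-removal diameter shrinks with it, so the feasible diameter at each void fraction is the larger of the two and the design point is its minimum. The sketch below (Python, assuming NumPy) uses purely hypothetical monotone functional forms to illustrate the search; it is not a neutronic or OPTION-code calculation.

```python
# Hedged sketch: minimum reactor diameter over void fraction, given one
# constraint that increases and one that decreases with void fraction.
import numpy as np

def d_critical(void):                  # critical diameter (cm), grows with void fraction (illustrative)
    return 40.0 / np.sqrt(1.0 - void)

def d_thermal(void, power_mw=2.0):     # diameter needed to remove the power, shrinks with void (illustrative)
    return 55.0 * power_mw ** 0.5 * (1.0 - void) / np.sqrt(void)

voids = np.linspace(0.05, 0.6, 500)
d_required = np.maximum(d_critical(voids), d_thermal(voids))  # both constraints must hold
i = d_required.argmin()
print(f"optimum void fraction ~ {voids[i]:.3f}, minimum diameter ~ {d_required[i]:.1f} cm")
```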

  18. Fuel assemblies mechanical behaviour improvements based on design changes and loading patterns computational analyses

    International Nuclear Information System (INIS)

In the past few years, incomplete RCCA insertion events (IRI) have been taking place at some nuclear plants. Large guide thimble distortion, caused by high compressive loads together with irradiation-induced material creep and growth, is considered the primary cause of those events. This disturbing phenomenon is worsened when some fuel assemblies are deformed to the extent that they push the neighbouring fuel assemblies and the distortion is transmitted along the core. In order to better understand this mechanism, ENUSA has developed a methodology based on finite element core simulation to enable assessment of the propensity of a given core loading pattern to propagate distortion along the core. At the same time, the core loading pattern can be decided in interaction with nuclear design to obtain the optimum response from both nuclear and mechanical points of view, with the objective of progressively attenuating core distortion. (author)

  19. Scenario-based analyses of energy system development and its environmental implications in Thailand

    International Nuclear Information System (INIS)

Thailand is one of the fastest growing energy-intensive economies in Southeast Asia. To formulate sound energy policies in the country, it is important to understand the impact of energy use on the environment over the long term. This study examines energy system development and its associated greenhouse gas and local air pollutant emissions under four scenarios in Thailand through the year 2050. The four scenarios involve different growth paths for economy, population, energy efficiency and penetration of renewable energy technologies. The paper assesses the changes in primary energy supply mix, sector-wise final energy demand, energy import dependency and CO2, SO2 and NOx emissions under the four scenarios using the end-use based Asia-Pacific Integrated Assessment Model (AIM/Enduse) of Thailand

  20. Surface-Based Analyses of Anatomical Properties of the Visual Cortex in Macular Degeneration.

    Directory of Open Access Journals (Sweden)

    Doety Prins

Full Text Available Macular degeneration (MD) can cause a central visual field defect. In a previous study, we found volumetric reductions along the entire visual pathways of MD patients, possibly indicating degeneration of inactive neuronal tissue. This may have important implications. In particular, new therapeutic strategies to restore retinal function rely on intact visual pathways and cortex to reestablish visual function. Here we reanalyze the data of our previous study using surface-based morphometry (SBM) rather than voxel-based morphometry (VBM). This can help determine the robustness of the findings and will lead to a better understanding of the nature of neuroanatomical changes associated with MD. The metrics of interest were acquired by performing SBM analysis on T1-weighted MRI data acquired from 113 subjects: patients with juvenile MD (JMD; n = 34), patients with age-related MD (AMD; n = 24) and healthy age-matched controls (HC; n = 55). Relative to age-matched controls, JMD patients showed a thinner cortex, a smaller cortical surface area and a lower grey matter volume in V1 and V2, while AMD patients showed thinning of the cortex in V2. Neither patient group showed a significant difference in mean curvature of the visual cortex. The thinner cortex, smaller surface area and lower grey matter volume in the visual cortex of JMD patients are consistent with our previous results showing a volumetric reduction in their visual cortex. Finding comparable results using two rather different analysis techniques suggests the presence of marked cortical degeneration in the JMD patients. In the AMD patients, we found a thinner cortex in V2 but not in V1. In contrast to our previous VBM analysis, SBM revealed no volumetric reductions of the visual cortex. This suggests that the cortical changes in AMD patients are relatively subtle, as they apparently can be missed by one of the methods.

  1. Risk-benefit analyses of SG tube maintenance based on probabilistic fracture mechanics

    International Nuclear Information System (INIS)

    As an application of probabilistic fracture mechanics (PFM), a risk-benefit analysis was performed for the purpose of optimizing maintenance activities of steam generator (SG) tubes used in pressurized water reactors (PWRs). The probabilities of the SG tube leakage and rupture are defined as risks in this study. A model was made modifying pc-PRAISE (Piping Reliability Analysis Including Seismic Events) to evaluate the risks during 60 year operations due to stress corrosion cracking (SCC) of the tubes under various maintenance strategies for SG tubes. In the risk analysis, parameters such as inspection accuracy, inspection interval, sampling inspection and crack propagation law were selected for sensitivity analysis. Based on the risk analysis, a risk-benefit analysis was conducted when implementing two maintenance strategies taking both costs and revenues for 60 year operations into account. In the risk-benefit analysis, the expected cost of leakage or rupture was calculated by multiplying 'probability of leakage or rupture' by 'expected loss of leakage or rupture accident'. To justify whether it is worthwhile implementing the maintenance strategies or not, the net present value (NPV) was calculated as an index, which is one of the most fundamental financial indices for decision-making based on the discounted cash flow (DCF) method. The results demonstrated that in the risk analysis, the risks are influenced significantly by the crack propagation law, accuracy of inspection and sampling inspection. In the risk-benefit analysis, it was suggested that investment to improve inspection accuracy would reduce the total costs of 60 year operations significantly and increase the NPV. Although the analysis was mainly conducted for SG tubes made of Inconel 600 mill anneal (MA) material, the analysis was also carried out for Inconel 690 thermal treatment (TT) material, making assumptions on its crack initiation and crack propagation law. In addition, the effect of introducing
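
The economic side of this record combines an expected failure cost (failure probability from the PFM analysis multiplied by the expected loss) with a discounted-cash-flow net present value over the 60-year operating period. The sketch below (plain Python) shows that arithmetic; the probability, loss, cost and discount-rate figures are illustrative placeholders, not the study's inputs.

```python
# Hedged sketch: expected failure cost and DCF net present value of a
# maintenance strategy over a 60-year operating period.
def npv(cash_flows, rate):
    """cash_flows[t] is the net cash flow in year t (t = 0 .. T)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

years = 60
p_rupture_per_year = 1.0e-4        # annual rupture probability from a PFM analysis (illustrative)
loss_if_rupture = 5.0e8            # expected loss of a rupture event, $ (illustrative)
inspection_cost = 2.0e6            # annual inspection cost, $ (illustrative)
revenue = 3.0e7                    # annual revenue attributable to the SG, $ (illustrative)

expected_failure_cost = p_rupture_per_year * loss_if_rupture
annual_net = revenue - inspection_cost - expected_failure_cost
print(f"NPV over {years} y: ${npv([0.0] + [annual_net] * years, rate=0.05):,.0f}")
```

Comparing the NPV of alternative strategies (e.g. with and without improved inspection accuracy) is then a direct decision criterion, as the record describes.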

  2. Fatigue Crack Propagation Under Variable Amplitude Loading Analyses Based on Plastic Energy Approach

    Directory of Open Access Journals (Sweden)

    Sofiane Maachou

    2014-04-01

Full Text Available Plasticity effects at the crack tip have been recognized as the "motor" of crack propagation: the growth of cracks is related to the existence of a crack-tip plastic zone, whose formation and intensification is accompanied by energy dissipation. In the current state of knowledge, fatigue crack propagation is modeled using the crack closure concept. The fatigue crack growth behavior of the aluminum alloy 2024 T351 under constant amplitude and variable amplitude loading is analyzed in terms of energy parameters. In the case of VAL (variable amplitude loading) tests, the evolution of the hysteretic energy dissipated per block is shown to be similar to that observed under constant amplitude loading. A linear relationship between the crack growth rate and the hysteretic energy dissipated per block is obtained at high growth rates. For lower growth rates, the relationship between crack growth rate and hysteretic energy dissipated per block can be represented by a power law. In this paper, an analysis of fatigue crack propagation under variable amplitude loading based on an energetic approach is proposed.
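
A power-law relation of the kind mentioned above, da/dN = C·Q^m with Q the hysteretic energy dissipated per block, is conveniently fitted by linear regression in log-log space. The sketch below (Python, assuming NumPy) does exactly that on synthetic data; the (Q, da/dN) pairs are illustrative, not the paper's measurements.

```python
# Hedged sketch: fitting da/dN = C * Q^m by least squares in log-log space.
import numpy as np

Q = np.array([0.8, 1.5, 3.0, 6.0, 12.0, 25.0])                   # energy per block (J), synthetic
dadN = np.array([2e-5, 5e-5, 1.4e-4, 3.6e-4, 9e-4, 2.3e-3])      # crack growth per block (mm), synthetic

m, logC = np.polyfit(np.log(Q), np.log(dadN), 1)                 # slope = exponent, intercept = ln(C)
C = np.exp(logC)
print(f"da/dN ~= {C:.3e} * Q^{m:.2f}")
```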

  3. The Core Competitiveness of the Wisdom Tourism Food Analyses Based on the Internet of Things

    Directory of Open Access Journals (Sweden)

    Qingyun Chen

    2015-07-01

Full Text Available As part of the world's largest industry, the tourism food sector has developed rapidly in recent years. The shift from "digital food" to "wisdom food" brings new opportunities and challenges for the sustainable development of the sector. However, its informational development still lags behind: the exploitation and utilization of information resources lack an effective platform as well as a benign circulation and interaction mechanism. Combining the Internet of Things (IoT) with wisdom tourism food has therefore become a new trend. This study constructed a framework for tourism food core competitiveness based on the IoT. It then proposed a measurement model for tourism food core competitiveness and tested the model through exploratory and confirmatory factor analysis; the indicators demonstrate that the model is effective and show that resource protection ability, operation management ability, service ability and tourism food service chain integration ability all influence tourism food core competitiveness.

  4. UniPrimer: A Web-Based Primer Design Tool for Comparative Analyses of Primate Genomes

    Directory of Open Access Journals (Sweden)

    Nomin Batnyam

    2012-01-01

Full Text Available Whole genome sequences of various primates have been released due to advanced DNA-sequencing technology. A combination of computational data mining and the polymerase chain reaction (PCR) assay to validate the data is an excellent method for conducting comparative genomics. Thus, designing primers for PCR is an essential procedure for a comparative analysis of primate genomes. Here, we developed and introduced UniPrimer for use in those studies. UniPrimer is a web-based tool that designs PCR- and DNA-sequencing primers. It compares the sequences from six different primates (human, chimpanzee, gorilla, orangutan, gibbon, and rhesus macaque) and designs primers on the conserved region across species. UniPrimer is linked to RepeatMasker, Primer3Plus, and OligoCalc softwares to produce primers with high accuracy and UCSC In-Silico PCR to confirm whether the designed primers work. To test the performance of UniPrimer, we designed primers on sample sequences using UniPrimer and manually designed primers for the same sequences. The comparison of the two processes showed that UniPrimer was more effective than manual work in terms of saving time and reducing errors.

  5. UniPrimer: A Web-Based Primer Design Tool for Comparative Analyses of Primate Genomes.

    Science.gov (United States)

    Batnyam, Nomin; Lee, Jimin; Lee, Jungnam; Hong, Seung Bok; Oh, Sejong; Han, Kyudong

    2012-01-01

    Whole genome sequences of various primates have been released due to advanced DNA-sequencing technology. A combination of computational data mining and the polymerase chain reaction (PCR) assay to validate the data is an excellent method for conducting comparative genomics. Thus, designing primers for PCR is an essential procedure for a comparative analysis of primate genomes. Here, we developed and introduced UniPrimer for use in those studies. UniPrimer is a web-based tool that designs PCR- and DNA-sequencing primers. It compares the sequences from six different primates (human, chimpanzee, gorilla, orangutan, gibbon, and rhesus macaque) and designs primers on the conserved region across species. UniPrimer is linked to RepeatMasker, Primer3Plus, and OligoCalc softwares to produce primers with high accuracy and UCSC In-Silico PCR to confirm whether the designed primers work. To test the performance of UniPrimer, we designed primers on sample sequences using UniPrimer and manually designed primers for the same sequences. The comparison of the two processes showed that UniPrimer was more effective than manual work in terms of saving time and reducing errors. PMID:22693428
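
The core idea behind placing primers on regions conserved across species can be illustrated with a short sketch (plain Python): scan a set of aligned orthologous sequences for windows that are identical and ungapped in every species and long enough to host a primer. The function and the 20-bp threshold below are toy placeholders; UniPrimer itself additionally masks repeats and delegates the actual primer picking to Primer3-based tools.

```python
# Hedged sketch: find fully conserved, ungapped windows in a multiple alignment.
def conserved_windows(aligned_seqs, min_len=20):
    """Yield (start, end) column ranges identical (and ungapped) in all sequences."""
    length = len(aligned_seqs[0])
    start = None
    for i in range(length):
        column = {s[i] for s in aligned_seqs}
        ok = len(column) == 1 and "-" not in column
        if ok and start is None:
            start = i
        elif not ok and start is not None:
            if i - start >= min_len:
                yield (start, i)
            start = None
    if start is not None and length - start >= min_len:
        yield (start, length)

# Toy usage with three very short "aligned" sequences (illustrative only):
seqs = ["ACGTACGTACGTACGTACGTACGTA",
        "ACGTACGTACGTACGTACGTACGTA",
        "ACGTACGTACGAACGTACGTACGTA"]
print(list(conserved_windows(seqs, min_len=10)))
```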

  6. SEM-EBSD based Realistic Modeling and Crystallographic Homogenization FE Analyses of LDH Formability Tests

    Science.gov (United States)

    Kuramae, Hiroyuki; Ngoc Tam, Nguyen; Nakamura, Yasunori; Sakamoto, Hidetoshi; Morimoto, Hideo; Nakamachi, Eiji

    2007-05-01

A homogenization algorithm is introduced into the elastic/crystalline viscoplastic finite element (FE) procedure to develop a multi-scale analysis code that predicts the formability of sheet metal at the macro scale and, simultaneously, the crystal texture and hardening evolutions at the micro scale. Isotropic and kinematic hardening laws are employed in the crystalline plasticity constitutive equation. For the multi-scale structure, two scales are considered. One is a microscopic polycrystal structure and the other a macroscopic elastic-plastic continuum. We measure crystal morphologies using a scanning electron microscope (SEM) with electron backscattered diffraction (EBSD), and define a three-dimensional representative volume element (RVE) of the micro polycrystal structure that satisfies the periodicity condition of the crystal orientation distribution. Since nonlinear multi-scale FE analysis requires large computation times, a parallel computing technique is needed. To realize the parallel analysis on a PC cluster system, dynamic explicit FE formulations are employed. Applying a domain partitioning technique to the FE mesh of the macro continuum, homogenized stresses based on the micro crystal structures are computed in parallel without solving simultaneous linear equations. The parallel FEM code is applied to simulate the limit dome height (LDH) test problem and the hemispherical cup deep drawing problem for aluminum alloy AL6022, mild steel DQSK, high-strength steel HSLA, and dual-phase steel DP600 sheet metals. The localized distribution of thickness strain and the texture evolution are obtained.
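
The homogenization step hands the macro model a volume-averaged stress, σ_macro = (1/V) Σ_e σ_e V_e, computed over the RVE. The sketch below (Python, assuming NumPy) shows that averaging on random placeholder data standing in for the crystal-plasticity FE solution; it is not the paper's code.

```python
# Hedged sketch: element-volume-weighted (homogenized) macroscopic stress of an RVE.
import numpy as np

rng = np.random.default_rng(0)
n_elems = 1000
elem_stress = rng.normal(scale=50e6, size=(n_elems, 3, 3))   # Pa, per-element Cauchy stress (placeholder)
elem_vol = rng.random(n_elems) + 0.5                         # element volumes (placeholder)

sigma_macro = np.einsum("e,eij->ij", elem_vol, elem_stress) / elem_vol.sum()
print(np.round(sigma_macro / 1e6, 2))                        # MPa
```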

  7. Pareto frontier analyses based decision making tool for transportation of hazardous waste

    International Nuclear Information System (INIS)

    Highlights: ► Posteriori method using multi-objective approach to solve bi-objective routing problem. ► System optimization (with multiple source–destination pairs) in a capacity constrained network using non-dominated sorting. ► Tools like cost elasticity and angle based focus used to analyze Pareto frontier to aid stakeholders make informed decisions. ► A real life case study of Kolkata Metropolitan Area to explain the workability of the model. - Abstract: Transportation of hazardous wastes through a region poses immense threat on the development along its road network. The risk to the population, exposed to such activities, has been documented in the past. However, a comprehensive framework for routing hazardous wastes has often been overlooked. A regional Hazardous Waste Management scheme should incorporate a comprehensive framework for hazardous waste transportation. This framework would incorporate the various stakeholders involved in decision making. Hence, a multi-objective approach is required to safeguard the interest of all the concerned stakeholders. The objective of this study is to design a methodology for routing of hazardous wastes between the generating units and the disposal facilities through a capacity constrained network. The proposed methodology uses posteriori method with multi-objective approach to find non-dominated solutions for the system consisting of multiple origins and destinations. A case study of transportation of hazardous wastes in Kolkata Metropolitan Area has also been provided to elucidate the methodology.
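
The posteriori, non-dominated-sorting idea described above reduces, in its simplest form, to extracting the Pareto-optimal routes from a set of candidates scored on two objectives such as transport cost and population risk. The sketch below (plain Python) shows that extraction; the candidate route scores are illustrative numbers, and a full implementation would of course generate the candidates on the capacity-constrained network first.

```python
# Hedged sketch: Pareto front (minimise cost, minimise risk) over candidate routes.
def pareto_front(points):
    """Return indices of points not dominated in a (minimise, minimise) sense."""
    front = []
    for i, (c1, r1) in enumerate(points):
        dominated = any(c2 <= c1 and r2 <= r1 and (c2 < c1 or r2 < r1)
                        for j, (c2, r2) in enumerate(points) if j != i)
        if not dominated:
            front.append(i)
    return front

routes = [(120, 0.8), (100, 1.5), (150, 0.4), (110, 1.2), (130, 0.9)]  # (cost, risk), illustrative
print(pareto_front(routes))   # indices of routes the stakeholders should weigh against each other
```

Measures such as cost elasticity along the resulting frontier can then be computed from neighbouring frontier points to support the stakeholder discussion mentioned in the highlights.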

  8. Contextualising and Analysing Planetary Rover Image Products through the Web-Based PRoGIS

    Science.gov (United States)

    Morley, Jeremy; Sprinks, James; Muller, Jan-Peter; Tao, Yu; Paar, Gerhard; Huber, Ben; Bauer, Arnold; Willner, Konrad; Traxler, Christoph; Garov, Andrey; Karachevtseva, Irina

    2014-05-01

    The international planetary science community has launched, landed and operated dozens of human and robotic missions to the planets and the Moon. They have collected various surface imagery that has only been partially utilized for further scientific purposes. The FP7 project PRoViDE (Planetary Robotics Vision Data Exploitation) is assembling a major portion of the imaging data gathered so far from planetary surface missions into a unique database, bringing them into a spatial context and providing access to a complete set of 3D vision products. Processing is complemented by a multi-resolution visualization engine that combines various levels of detail for a seamless and immersive real-time access to dynamically rendered 3D scenes. PRoViDE aims to (1) complete relevant 3D vision processing of planetary surface missions, such as Surveyor, Viking, Pathfinder, MER, MSL, Phoenix, Huygens, and Lunar ground-level imagery from Apollo, Russian Lunokhod and selected Luna missions, (2) provide highest resolution & accuracy remote sensing (orbital) vision data processing results for these sites to embed the robotic imagery and its products into spatial planetary context, (3) collect 3D Vision processing and remote sensing products within a single coherent spatial data base, (4) realise seamless fusion between orbital and ground vision data, (5) demonstrate the potential of planetary surface vision data by maximising image quality visualisation in 3D publishing platform, (6) collect and formulate use cases for novel scientific application scenarios exploiting the newly introduced spatial relationships and presentation, (7) demonstrate the concepts for MSL, (9) realize on-line dissemination of key data & its presentation by a web-based GIS and rendering tool named PRoGIS (Planetary Robotics GIS). PRoGIS is designed to give access to rover image archives in geographical context, using projected image view cones, obtained from existing meta-data and updated according to

  9. Analysing Amazonian forest productivity using a new individual and trait-based model (TFS v.1)

    Directory of Open Access Journals (Sweden)

    N. M. Fyllas

    2014-02-01

Full Text Available Repeated long-term censuses have revealed large-scale spatial patterns in Amazon Basin forest structure and dynamism, with some forests in the west of the Basin having rates of aboveground biomass production and tree recruitment up to twice as high as forests in the east. Possible causes for this variation could be the climatic and edaphic gradients across the Basin and/or the spatial distribution of tree species composition. To help understand causes of this variation a new individual-based model of tropical forest growth designed to take full advantage of the forest census data available from the Amazonian Forest Inventory Network (RAINFOR) has been developed. The model incorporates variations in tree size distribution, functional traits and soil physical properties and runs at the stand level with four functional traits, leaf dry mass per area (Ma), leaf nitrogen (NL) and phosphorus (PL) content and wood density (DW), used to represent a continuum of plant strategies found in tropical forests. We first applied the model to validate canopy-level water fluxes at three Amazon eddy flux sites. For all three sites the canopy-level water fluxes were adequately simulated. We then applied the model at seven plots, where intensive measurements of carbon allocation are available. Tree-by-tree multi-annual growth rates generally agreed well with observations for small trees, but with deviations identified for large trees. At the stand level, simulations at 40 plots were used to explore the influence of climate and soil fertility on the gross (ΠG) and net (ΠN) primary production rates as well as the carbon use efficiency (CU). Simulated ΠG, ΠN and CU were not associated with temperature. However, all three measures of stand level productivity were positively related to annual precipitation and soil fertility.

  10. Design and Development of Microcontroller-Based Clinical Chemistry Analyser for Measurement of Various Blood Biochemistry Parameters

    Directory of Open Access Journals (Sweden)

    R. C. Gupta

    2005-01-01

Full Text Available The clinical chemistry analyser is a high-performance microcontroller-based photometric biochemical analyser that measures various blood biochemical parameters such as blood glucose, urea, protein, bilirubin, and so forth, and also measures and monitors enzyme activity during kinetic tests such as ALT (alanine aminotransferase), amylase, AST (aspartate aminotransferase), and so forth. These tests are of great significance in biochemistry and are used for diagnostic purposes and for classifying various disorders and diseases such as diabetes, liver malfunction, renal diseases, and so forth. An inexpensive clinical chemistry analyser developed by the authors is described in this paper. It is an open system in which any reagent kit available on the market can be used. The system is based on the principle of absorbance-transmittance photometry. The design is built around an 80C31 microcontroller with RAM, EPROM, and peripheral interface devices. The developed system incorporates a light source, an optical module, interference filters of various wavelengths, a Peltier device for maintaining the required temperature of the mixture in the flow cell, a peristaltic pump for sample aspiration, a graphic LCD for displaying blood parameters, patients' test results and kinetic test graphs, a 40-column mini thermal printer, and a 32-key keyboard for executing various functions. Laboratory testing of the instrument covered the versatility of the analyser, the flexibility of the software, and sample handling. The prototype was tested and evaluated successfully on over 1000 blood samples for seventeen blood parameters. Evaluation was carried out at the Department of Biochemistry, Government Medical College and Hospital. The test results were found to be comparable with those of other standard instruments.
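
The arithmetic at the heart of such an absorbance-transmittance photometer is the Beer-Lambert relation A = log10(I0/I) and, for end-point assays, scaling against a standard of known concentration run on the same chemistry. The sketch below (plain Python) illustrates this; the detector counts and the 100 mg/dL standard are illustrative values, not readings from the instrument described in this record.

```python
# Hedged sketch: end-point photometric assay, Beer-Lambert absorbance plus a
# one-point calibration against a standard of known concentration.
from math import log10

def absorbance(i_sample, i_blank):
    return log10(i_blank / i_sample)

i_blank, i_standard, i_patient = 1000.0, 560.0, 430.0   # detector counts (illustrative)
c_standard = 100.0                                       # mg/dL, e.g. a glucose standard (illustrative)

a_std = absorbance(i_standard, i_blank)
a_pat = absorbance(i_patient, i_blank)
c_patient = c_standard * a_pat / a_std                   # concentration proportional to absorbance
print(f"patient result ~ {c_patient:.1f} mg/dL")
```

Kinetic (enzyme) tests differ only in that the rate of absorbance change per minute, rather than a single end-point absorbance, is converted to activity.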

  11. GenoMatrix: A Software Package for Pedigree-Based and Genomic Prediction Analyses on Complex Traits.

    Science.gov (United States)

    Nazarian, Alireza; Gezan, Salvador Alejandro

    2016-07-01

    Genomic and pedigree-based best linear unbiased prediction methodologies (G-BLUP and P-BLUP) have proven themselves efficient for partitioning the phenotypic variance of complex traits into its components, estimating the individuals' genetic merits, and predicting unobserved (or yet-to-be observed) phenotypes in many species and fields of study. The GenoMatrix software, presented here, is a user-friendly package to facilitate the process of using genome-wide marker data and parentage information for G-BLUP and P-BLUP analyses on complex traits. It provides users with a collection of applications which help them on a set of tasks from performing quality control on data to constructing and manipulating the genomic and pedigree-based relationship matrices and obtaining their inverses. Such matrices will be then used in downstream analyses by other statistical packages. The package also enables users to obtain predicted values for unobserved individuals based on the genetic values of observed related individuals. GenoMatrix is available to the research community as a Windows 64bit executable and can be downloaded free of charge at: http://compbio.ufl.edu/software/genomatrix/. PMID:27025440
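
The genomic relationship matrix that packages like this one construct from a 0/1/2 marker matrix is commonly the VanRaden (2008) matrix, G = ZZ'/(2 Σ p_i(1 − p_i)) with Z the centred genotypes. The sketch below (Python, assuming NumPy) builds that matrix from random toy genotypes; it is a generic illustration, not GenoMatrix's code or file formats.

```python
# Hedged sketch: VanRaden genomic relationship matrix from 0/1/2 genotypes.
import numpy as np

rng = np.random.default_rng(0)
M = rng.integers(0, 3, size=(50, 500)).astype(float)   # individuals x SNPs, coded 0/1/2 (toy data)

p = M.mean(axis=0) / 2.0                               # per-SNP allele frequencies
Z = M - 2.0 * p                                        # centred genotype matrix
G = Z @ Z.T / (2.0 * np.sum(p * (1.0 - p)))            # VanRaden (2008), method 1
print(G.shape, np.round(G[:3, :3], 3))
```

The inverse of G (or of the pedigree-based A matrix) is what the downstream G-BLUP/P-BLUP mixed-model software then consumes.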

  12. Exergy, exergoeconomic and environmental analyses and evolutionary algorithm based multi-objective optimization of combined cycle power plants

    International Nuclear Information System (INIS)

A comprehensive exergy, exergoeconomic and environmental impact analysis and optimization is reported for several combined cycle power plants (CCPPs). In the first part, thermodynamic analyses based on energy and exergy of the CCPPs are performed, and the effect of supplementary firing on the natural gas-fired CCPP is investigated. The latter step includes the effect of supplementary firing on the performance of the bottoming cycle and CO2 emissions, and utilizes the first and second laws of thermodynamics. In the second part, a multi-objective optimization is performed to determine the 'best' design parameters, accounting for exergetic, economic and environmental factors. The optimization considers three objective functions: CCPP exergy efficiency, total cost rate of the system products and CO2 emissions of the overall plant. The environmental impact in terms of CO2 emissions is integrated with the exergoeconomic objective function as a new objective function. The results of both exergy and exergoeconomic analyses show that the largest exergy destructions occur in the CCPP combustion chamber, and that increasing the gas turbine inlet temperature decreases the CCPP cost of exergy destruction. The optimization results demonstrate that CO2 emissions are reduced by selecting the best components and using a low fuel injection rate into the combustion chamber. -- Highlights: → Comprehensive thermodynamic modeling of a combined cycle power plant. → Exergy, economic and environmental analyses of the system. → Investigation of the role of multiobjective exergoenvironmental optimization as a tool for more environmentally-benign design.

  13. Using a laser-based CO2 carbon isotope analyser to investigate gas transfer in geological media

    International Nuclear Information System (INIS)

CO2 stable carbon isotopes are very attractive in environmental research to investigate both natural and anthropogenic carbon sources. Laser-based CO2 carbon isotope analysis provides continuous measurement at high temporal resolution and is a promising alternative to isotope ratio mass spectrometry (IRMS). We performed a thorough assessment of a commercially available CO2 Carbon Isotope Analyser (CCIA DLT-100, Los Gatos Research) that allows in situ measurement of 13C in CO2. Using a set of reference gases of known CO2 concentration and carbon isotopic composition, we evaluated the precision, long-term stability, temperature sensitivity and concentration dependence of the analyser. Despite good precision calculated from the Allan variance (5.0 ppm for CO2 concentration and 0.05‰ for δ13C at 60 s averaging), real performance is degraded by two main sources of error: temperature sensitivity and the dependence of the δ13C reading on CO2 concentration. Data processing is required to correct for these errors. Following application of these corrections, we achieve an accuracy of 8.7 ppm for CO2 concentration and 1.3‰ for δ13C, which is poorer than mass spectrometry performance but still allows field applications. With this portable analyser we measured the CO2 flux degassed from rock in an underground tunnel. The carbon isotopic composition obtained agrees with IRMS measurements and can be used to identify the carbon source. (authors)
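
The precision figures quoted in this record come from an Allan-variance analysis of the analyser's time series. The sketch below (Python, assuming NumPy) computes the classic, non-overlapping Allan deviation at several averaging times on a synthetic white-noise signal standing in for a 1 Hz CO2 or δ13C record; it is a generic illustration, not the authors' processing chain.

```python
# Hedged sketch: classic (non-overlapping) Allan deviation versus averaging time.
import numpy as np

def allan_deviation(x, taus, fs=1.0):
    """x: 1-D time series sampled at fs Hz; taus: averaging times in seconds."""
    out = []
    for tau in taus:
        m = int(round(tau * fs))                 # samples per averaging bin
        n = len(x) // m
        means = x[: n * m].reshape(n, m).mean(axis=1)
        out.append(np.sqrt(0.5 * np.mean(np.diff(means) ** 2)))
    return np.array(out)

x = np.random.default_rng(1).normal(0.0, 5.0, 36000)   # ~10 h of synthetic 1 Hz noise
print(allan_deviation(x, taus=[1, 10, 60, 600]))
```

For white noise the deviation falls as 1/sqrt(tau); the averaging time at which drift makes the curve turn upward is the natural choice for quoting instrument precision.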

  14. Analysing Amazonian forest productivity using a new individual and trait-based model (TFS v.1)

    Science.gov (United States)

    Fyllas, N. M.; Gloor, E.; Mercado, L. M.; Sitch, S.; Quesada, C. A.; Domingues, T. F.; Galbraith, D. R.; Torre-Lezama, A.; Vilanova, E.; Ramírez-Angulo, H.; Higuchi, N.; Neill, D. A.; Silveira, M.; Ferreira, L.; Aymard C., G. A.; Malhi, Y.; Phillips, O. L.; Lloyd, J.

    2014-07-01

Repeated long-term censuses have revealed large-scale spatial patterns in Amazon basin forest structure and dynamism, with some forests in the west of the basin having rates of aboveground biomass production and tree recruitment up to twice as high as forests in the east. Possible causes for this variation could be the climatic and edaphic gradients across the basin and/or the spatial distribution of tree species composition. To help understand causes of this variation a new individual-based model of tropical forest growth, designed to take full advantage of the forest census data available from the Amazonian Forest Inventory Network (RAINFOR), has been developed. The model allows for within-stand variations in tree size distribution and key functional traits and between-stand differences in climate and soil physical and chemical properties. It runs at the stand level with four functional traits - leaf dry mass per area (Ma), leaf nitrogen (NL) and phosphorus (PL) content and wood density (DW) varying from tree to tree - in a way that replicates the observed continua found within each stand. We first applied the model to validate canopy-level water fluxes at three eddy covariance flux measurement sites. For all three sites the canopy-level water fluxes were adequately simulated. We then applied the model at seven plots, where intensive measurements of carbon allocation are available. Tree-by-tree multi-annual growth rates generally agreed well with observations for small trees, but with deviations identified for larger trees. At the stand level, simulations at 40 plots were used to explore the influence of climate and soil nutrient availability on the gross (ΠG) and net (ΠN) primary production rates as well as the carbon use efficiency (CU). Simulated ΠG, ΠN and CU were not associated with temperature. On the other hand, all three measures of stand level productivity were positively related to both mean annual precipitation and soil nutrient status

  15. Investigation of a wet ethanol operated HCCI engine based on first and second law analyses

    International Nuclear Information System (INIS)

    are in the HCCI engine (around 89%) followed by fuel vaporizer (4.9%) and catalytic converter (4.5%). → Based on simulation results, it is found that second law efficiency of wet ethanol operated HCCI engine is higher than the pure ethanol fuelled HCCI engine.

  16. Energy and exergy analyses of a new four-step copper-chlorine cycle for geothermal-based hydrogen production

    International Nuclear Information System (INIS)

    In this paper, energy and exergy analyses of geothermal-based hydrogen production via thermochemical water decomposition using a new four-step copper-chlorine (Cu-Cl) cycle are conducted, and the respective cycle energy and exergy efficiencies are examined. A parametric study is also performed to investigate how each step of the cycle and the overall cycle performance are affected by the reference environment temperature, the reaction temperatures, and the energy efficiency of the geothermal power plant itself. The overall energy and exergy efficiencies of the cycle are found to be 21.67% and 19.35%, respectively, for a reference case.
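
    For reference, one standard way such cycle efficiencies are defined (an illustrative formulation; the exact system boundaries and symbols used in the paper may differ):

        \eta_{\mathrm{cycle}} = \frac{\dot m_{\mathrm{H_2}}\,\mathrm{LHV}_{\mathrm{H_2}}}{\dot Q_{\mathrm{in}} + \dot W_{\mathrm{in}}}, \qquad
        \psi_{\mathrm{cycle}} = \frac{\dot m_{\mathrm{H_2}}\, ex^{\mathrm{ch}}_{\mathrm{H_2}}}{\dot Q_{\mathrm{in}}\left(1 - \frac{T_0}{T_{\mathrm{source}}}\right) + \dot W_{\mathrm{in}}}

    where the heat input is weighted by the Carnot factor at the source temperature, which is why the exergy efficiency falls below the energy efficiency for the reference case quoted above.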

  17. Non-target screening analyses of organic contaminants in river systems as a base for monitoring measures

    Science.gov (United States)

    Schwarzbauer, J.

    2009-04-01

    Organic contaminants discharged to the aquatic environment exhibit a high diversity with respect to their molecular structures and the resulting physico-chemical properties. The chemical analysis of anthropogenic contamination in river systems remains important, especially with respect to (i) the identification and structure elucidation of novel contaminants, (ii) the characterisation of their environmental behaviour and (iii) their risk to natural systems. A large proportion of riverine contamination is caused by low-molecular-weight organic compounds, such as pesticides, plasticizers, pharmaceuticals, personal care products and technical additives. Some of them, like PCBs or PAHs, have already been investigated thoroughly and, consequently, their behaviour in aqueous systems is well described. Although analyses of organic substances in river water have traditionally focused on selected pollutants, in particular on common priority pollutants which are monitored routinely, the occurrence of further contaminants, e.g. pharmaceuticals, personal care products or chelating agents, has received increasing attention within the last decade. In parallel, screening analyses revealing an enormous diversity of low-molecular-weight organic contaminants in wastewater effluents and river water are attracting more and more notice. Since many of these substances have so far received little attention, studying their occurrence and fate in natural environments will be an important future task. Furthermore, a main aim of environmental studies should be to provide a comprehensive view of the state of pollution of river water, in particular with respect to lipophilic low-molecular-weight organic contaminants. However, such non-target screening analyses have been performed only rarely in the past. Hence, we applied extended non-target screening analyses to longitudinal sections of the rivers Rhine, Rur and Lippe (Germany) on the basis of GC/MS analyses. The investigations

  18. A comparison between evapotranspiration estimates based on remotely sensed surface energy balance and ground-based soil water balance analyses

    Science.gov (United States)

    Remotely sensed and in-situ data were used to investigate dynamics of root zone soil moisture and evapotranspiration (ET) at four Mesonet stations in north-central Oklahoma over an 11-year period (2000-2010). Two moisture deficit indicators based on soil matric potential had spatial and temporal pat...

  19. The phylogenetic relationship of the family Lutjanidae based on analyses of AFLP and mitochondrial 12S rRNA sequences

    Institute of Scientific and Technical Information of China (English)

    ZHANG Junbin; LIU Xin

    2006-01-01

    Fishes of the family Lutjanidae are commercially important in the South China Sea. However, the phylogeny of lutjanids is still unclear and remains controversial. Here, the phylogeny of lutjanids was studied using Amplified Fragment Length Polymorphism (AFLP) analysis of genomic DNA and sequence analysis of the mitochondrial 12S rRNA gene, employing 10 Lutjanidae species and 1 Lethrinidae species. The topologies of the minimum evolution (ME) trees based on the two analyses were congruent except for the positions of the genera Pristipomoides and Caesio. The optimal substitution model, TrN + G, for the 12S rRNA gene sequences of lutjanids was obtained using the MODELTEST 3.6 software, and maximum likelihood (ML) analysis supports the topology displayed by the ME tree. A log-likelihood test suggests that the use of molecular clock calibrations to estimate species divergence times is valid. Phylogenetic analyses using AFLP data and mitochondrial 12S rRNA gene sequences indicated the monophyly of the genus Lutjanus. However, further studies are required to resolve the phylogenetic relationships among the other genera. In addition, the results demonstrated that the AFLP genetic marker is suitable for phylogenetic analysis of lutjanids.

  20. Failure assessments of corroded pipelines with axial defects using stress-based criteria: Numerical studies and verification analyses

    International Nuclear Information System (INIS)

    Conventional procedures used to assess the integrity of corroded piping systems with axial defects generally employ simplified failure criteria based upon a plastic collapse failure mechanism incorporating the tensile properties of the pipe material. These methods establish acceptance criteria for defects based on limited experimental data for low-strength structural steels, which do not necessarily address specific requirements for the high-grade steels currently in use. For these cases, failure assessments may be overly conservative or show significant scatter in their predictions, leading to unnecessary repair or replacement of in-service pipelines. Motivated by these observations, this study examines the applicability of a stress-based criterion, based upon plastic instability analysis, to predict the failure pressure of corroded pipelines with axial defects. A central focus is to gain additional insight into the effects of defect geometry and material properties on the attainment of a local limit load, to support the development of stress-based burst strength criteria. The work provides an extensive body of results which lend further support to adopting failure criteria for corroded pipelines based upon ligament instability analyses. A verification study conducted on burst tests of large-diameter pipe specimens with different defect lengths shows the effectiveness of a stress-based criterion using local ligament instability in burst pressure predictions, even though the adopted burst criterion exhibits a potential dependence on defect geometry and possibly on the material's strain hardening capacity. Overall, the results presented here suggest that the use of stress-based criteria based upon plastic instability analysis of the defect ligament is a valid engineering tool for integrity assessments of pipelines with axial corrosion defects

  1. A numerical investigation on exergy analyses of a pyroelectric triglycine sulfate (TGS)-based solar energy harvesting system

    Science.gov (United States)

    Sharma, Manish; Vaish, Rahul; Singh Chauhan, Vishal

    2016-02-01

    This study is a numerical demonstration of energy and exergy analyses of a solar energy harvesting system based on the pyroelectric effect. The performance of a triglycine sulfate (TGS) single crystal was investigated mathematically. The power output was optimized for different load resistances and load capacitances; the maximum power output of 0.95 μW was obtained across a load resistance of 40 MΩ and a 4.7 μF load capacitor. An exergy analysis was then performed for the pyroelectric energy harvesting system. The maximum electrical and thermal exergies obtained are 0.12 μW and 12 mW, respectively, and the corresponding maximum electrical and thermal exergy efficiencies are 0.000037% and 3.6%. The average thermal exergy efficiency is 2.15% for a cycle frequency of 0.014 Hz.

  2. Estimating Intermittency Exponent in Neutrally Stratified Atmospheric Surface Layer Flows: A Robust Framework based on Magnitude Cumulant and Surrogate Analyses

    CERN Document Server

    Basu, S; Lashermes, B; Arnéodo, A; Basu, Sukanta; Foufoula-Georgiou, Efi; Lashermes, Bruno; Arneodo, Alain

    2007-01-01

    This study proposes a novel framework based on magnitude cumulant and surrogate analyses to reliably detect and estimate the intermittency coefficient from short-length coarse-resolution turbulent time series. Intermittency coefficients estimated from a large number of neutrally stratified atmospheric surface layer turbulent series from various field campaigns are shown to remarkably concur with well-known laboratory experimental results. In addition, a surrogate-based hypothesis testing framework is proposed and shown to significantly reduce the likelihood of detecting a spurious non-zero intermittency coefficient from non-intermittent series. The discriminatory power of the proposed framework is promising for addressing the unresolved question of how atmospheric stability affects the intermittency properties of boundary layer turbulence.

  3. Analyses and Simulation of V-I Characteristics for Solar Cells Based on P-N Junction

    Institute of Scientific and Technical Information of China (English)

    ZHENG Jian-bang; REN Ju; GUO Wen-ge; HOU Chao-qi

    2005-01-01

    Through theoretical analysis of the Shockley equation and of the differences between a practical P-N junction and its ideal model, mathematical models of the P-N junction and of solar cells were obtained. Using Matlab, the V-I characteristics of diodes and solar cells were simulated, and a computer simulation model of solar cells based on the P-N junction was established. With this simulation model, the influences of the solar cell's internal resistances on open-circuit voltage and short-circuit current under a given illumination were analyzed numerically. The simulation results showed that the equivalent series resistance and the shunt resistance both strongly affect the V-I characteristics of the solar cell, but in different ways.
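
    An illustrative sketch of how series and shunt resistances enter such a simulated V-I curve, using the standard single-diode model with hypothetical parameter values (this is not the cited study's Matlab model):

        import numpy as np
        from scipy.optimize import brentq

        def solar_cell_current(v, i_ph=5.0, i_0=1e-9, n=1.3, r_s=0.02, r_sh=50.0, t=300.0):
            """Solve the implicit single-diode equation for cell current at terminal voltage v."""
            k, q = 1.380649e-23, 1.602176634e-19
            v_t = n * k * t / q
            def f(i):
                return i_ph - i_0 * (np.exp((v + i * r_s) / v_t) - 1.0) - (v + i * r_s) / r_sh - i
            return brentq(f, -2 * i_ph, 2 * i_ph)

        voltages = np.linspace(0.0, 0.65, 50)
        currents = [solar_cell_current(v) for v in voltages]
        # A larger r_s flattens the curve near the maximum-power point; a smaller r_sh
        # lowers the current plateau and the fill factor - the two distinct effects noted above.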

  4. An Evaluation Quality Framework for Analysing School-Based Learning (SBL) to Work-Based Learning (WBL) Transition Module

    Science.gov (United States)

    Alseddiqi, M.; Mishra, R.; Pislaru, C.

    2012-05-01

    The paper presents the results from a quality framework used to measure the effectiveness of a new engineering course entitled 'school-based learning (SBL) to work-based learning (WBL) transition module' in the Technical and Vocational Education (TVE) system in Bahrain. The framework is an extended version of existing information quality frameworks with respect to pedagogical and technological contexts, incorporating specific pedagogical and technological dimensions that reflect the requirements of modern industry in Bahrain. A questionnaire on users' views of the effectiveness of the new transition module was distributed to various stakeholders, including TVE teachers and students, to gather critical information for diagnosing, monitoring and evaluating different views and perceptions about the effectiveness of the new module. The analysis categorised the quality dimensions by their relative importance, using the principal component analysis available in SPSS, and clearly identified the most important quality dimensions integrated in the new module for SBL-to-WBL transition. It was also apparent that the new module contains workplace proficiencies, prepares TVE students for work placement, provides effective teaching and learning methodologies, integrates innovative technology in the process of learning, meets modern industrial needs, and presents a cooperative learning environment for TVE students. From the principal component analysis, the percentage of relative importance of each factor and of its quality dimensions was calculated; comparing these percentages identifies the most important factor as well as the most important quality dimensions. The re-arranged quality dimensions, together with the extended number of factors, improve the extended information quality framework into a revised quality framework.
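
    A minimal sketch of how principal component analysis can rank quality dimensions by explained variance. It uses scikit-learn rather than the SPSS procedure named in the study, and the response matrix and dimension names are hypothetical placeholders:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        # rows = questionnaire respondents, columns = quality dimensions (hypothetical data)
        dimension_names = ["workplace proficiency", "teaching methods", "technology use",
                           "industry relevance", "cooperative learning"]
        responses = np.random.default_rng(0).integers(1, 6, size=(120, len(dimension_names)))

        scores = StandardScaler().fit_transform(responses)
        pca = PCA().fit(scores)

        # percentage of total variance captured by each principal component
        for i, ratio in enumerate(pca.explained_variance_ratio_, start=1):
            print(f"PC{i}: {100 * ratio:.1f}% of variance")

        # loadings indicate how strongly each quality dimension contributes to a component
        loadings = pca.components_.T * np.sqrt(pca.explained_variance_)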

  5. An Evaluation Quality Framework for Analysing School-Based Learning (SBL) to Work-Based Learning (WBL) Transition Module

    International Nuclear Information System (INIS)

    The paper presents the results from a quality framework used to measure the effectiveness of a new engineering course entitled 'school-based learning (SBL) to work-based learning (WBL) transition module' in the Technical and Vocational Education (TVE) system in Bahrain. The framework is an extended version of existing information quality frameworks with respect to pedagogical and technological contexts, incorporating specific pedagogical and technological dimensions that reflect the requirements of modern industry in Bahrain. A questionnaire on users' views of the effectiveness of the new transition module was distributed to various stakeholders, including TVE teachers and students, to gather critical information for diagnosing, monitoring and evaluating different views and perceptions about the effectiveness of the new module. The analysis categorised the quality dimensions by their relative importance, using the principal component analysis available in SPSS, and clearly identified the most important quality dimensions integrated in the new module for SBL-to-WBL transition. It was also apparent that the new module contains workplace proficiencies, prepares TVE students for work placement, provides effective teaching and learning methodologies, integrates innovative technology in the process of learning, meets modern industrial needs, and presents a cooperative learning environment for TVE students. From the principal component analysis, the percentage of relative importance of each factor and of its quality dimensions was calculated; comparing these percentages identifies the most important factor as well as the most important quality dimensions. The re-arranged quality dimensions, together with the extended number of factors, improve the extended information quality framework into a revised quality framework.

  6. Precursor analyses - The use of deterministic and PSA based methods in the event investigation process at nuclear power plants

    International Nuclear Information System (INIS)

    The efficient feedback of operating experience (OE) is a valuable source of information for improving the safety and reliability of nuclear power plants (NPPs). It is therefore essential to collect information on abnormal events from both internal and external sources. Internal operating experience is analysed to obtain a complete understanding of an event and of its safety implications. Corrective or improvement measures may then be developed, prioritized and implemented in the plant if considered appropriate. Information from external events may also be analysed in order to learn lessons from others' experience and prevent similar occurrences at our own plant. The traditional ways of investigating operational events have been predominantly qualitative. In recent years, a PSA-based method called probabilistic precursor event analysis has been developed, used and applied on a significant scale in many places for a number of plants. The method enables a quantitative estimation of the safety significance of operational events to be incorporated. The purpose of this report is to outline a synergistic process that makes more effective use of operating experience event information by combining the insights and knowledge gained from both approaches, traditional deterministic event investigation and PSA-based event analysis. The PSA-based view on operational events and PSA-based event analysis can support the process of operational event analysis at the following stages of the operational event investigation: (1) Initial screening stage. (It introduces an element of quantitative analysis into the selection process. Quantitative analysis of the safety significance of nuclear plant events can be a very useful measure when it comes to selecting internal and external operating experience information for its relevance.) (2) In-depth analysis. (PSA based event evaluation provides a quantitative measure for judging the significance of operational events, contributors to

  7. Reduction and technical simplification of testing protocol for walking based on repeatability analyses: An Interreg IVa pilot study

    Directory of Open Access Journals (Sweden)

    Nejc Sarabon

    2010-12-01

    The aim of this study was to define the most appropriate gait measurement protocols to be used in our future studies in the Mobility in Ageing project. A group of young healthy volunteers took part in the study. Each subject carried out a 10-metre walking test at five different speeds (preferred, very slow, very fast, slow, and fast). Each walking speed was repeated three times, making a total of 15 trials, which were carried out in random order. Each trial was simultaneously analysed by three observers using three different technical approaches: a stop watch, photo cells and an electronic kinematic dress. In analysing the repeatability of the trials, the results showed that three of the five self-selected walking speeds (preferred, very fast, and very slow) had a significantly higher repeatability of the average walking velocity, step length and cadence than the other two speeds. Additionally, the data showed that one of the three technical methods for gait assessment has better metric characteristics than the other two. In conclusion, based on repeatability and on technical and organizational simplicity, this study helped us to successfully define a simple and reliable walking test to be used in the main study of the project.

  8. FastGroupII: A web-based bioinformatics platform for analyses of large 16S rDNA libraries

    Directory of Open Access Journals (Sweden)

    McNairnie Pat

    2006-02-01

    Background: High-throughput sequencing makes it possible to rapidly obtain thousands of 16S rDNA sequences from environmental samples. Bioinformatic tools for the analyses of large 16S rDNA sequence databases are needed to comprehensively describe and compare these datasets. Results: FastGroupII is a web-based bioinformatics platform to dereplicate large 16S rDNA libraries. FastGroupII provides users with the option of four different dereplication methods, performs rarefaction analysis, and automatically calculates the Shannon-Wiener Index and Chao1. FastGroupII was tested on a set of 16S rDNA sequences from coral-associated Bacteria. The different grouping algorithms produced similar, but not identical, results. This suggests that 16S rDNA datasets need to be analyzed in multiple ways when being used for community ecology studies. Conclusion: FastGroupII is an effective bioinformatics tool for the trimming and dereplication of 16S rDNA sequences. Several standard diversity indices are calculated, and the raw sequences are prepared for downstream analyses.

  9. Detection of intestinal parasites by use of the cuvette-based automated microscopy analyser sediMAX(®).

    Science.gov (United States)

    Intra, J; Taverna, E; Sala, M R; Falbo, R; Cappellini, F; Brambilla, P

    2016-03-01

    Microscopy is the reference method for intestinal parasite identification. The cuvette-based automated microscopy analyser sediMAX 1 provides 15 digital images of each sediment sample. In this study, we evaluated this fully automated instrument for the detection of enteric parasites, helminths and protozoa. A total of 700 consecutively preserved samples, consisting of 60 positive samples (50 protozoa, ten helminths) and 640 negative samples, were analysed. Operators were blinded to each other's results. Samples were randomized and tested for parasite recognition both by manual microscopy and by sediMAX 1. The sediMAX 1 analysis was conducted on diluted faecal samples, allowing determination of morphology. The data obtained using sediMAX 1 showed a specificity of 100% and a sensitivity of 100%. Some species of helminths, such as Enterobius vermicularis, Strongyloides stercoralis, the Ancylostoma duodenale/Necator americanus complex, and schistosomes, were not considered in this work because they are rare in stool specimens, are not easily detectable by microscopy, and require specific recovery techniques. This study demonstrated for the first time that sediMAX 1 can be an aid in enteric parasite identification. PMID:26679923

  10. Formalisation des bases méthodologiques et conceptuelles d'une analyse spatiale des accidents de la route

    Directory of Open Access Journals (Sweden)

    Florence Huguenin Richard

    1999-06-01

    This article lays out the methodological and conceptual foundations of a spatial analysis of road-accident risk. Studying this phenomenon requires a large body of data describing the different dimensions of an accident, data that can be managed in a geographic information system. It also calls for methodological reflection on risk mapping, observation scales, the aggregation of qualitative and quantitative data, the use of statistical methods suited to road risk, and the integration of space as a factor of insecurity.

  11. Soil heavy metal contamination related to roasted stone coal slag: a study based on geostatistical and multivariate analyses.

    Science.gov (United States)

    Li, De'an; Jiang, Jianguo; Li, Tianran; Wang, Jiaming

    2016-07-01

    Soil was examined for vanadium (V) and related metal contamination near a stone coal mine in Hubei Province, China. In total, 92 surface and vertical (0-200 cm) soil samples were collected from the site. A handheld X-ray fluorescence spectrometer was used for in situ analysis of the soil concentrations of heavy metals, including V, chromium (Cr), copper (Cu), manganese (Mn), zinc (Zn), and lead (Pb). The mean concentrations of these metals were 931, 721, 279, 223, 163, and 11 mg/kg, respectively. Based on the Chinese Environmental Quality Standard for Soils guidelines, up to 88.0, 76.1, and 56.5 % of the soil samples had single factor pollution indices >3 for V, Cr, and Cu, respectively. Furthermore, 2.2 % of samples were slightly polluted with Zn, while there was no Mn or Pb contamination. GaussAmp curve fitting was performed based on the sample frequency distribution of the Nemerow pollution index. The fitted mean was 5.99, indicating severe pollution. The heavy metals were clustered into two groups, V/Cr/Cu/Zn and Mn/Pb, based on the spatial distributions, the Pearson correlation and principal component analyses. The positive correlations within the V/Cr/Cu/Zn group suggested that they originated from roasted stone coal slag. Finally, the negative correlation between the two groups was attributed to mechanical mixing of the slag and original soil. PMID:27068897
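
    For illustration, the single-factor and Nemerow composite pollution indices referred to above can be computed as follows; the screening values used here are hypothetical placeholders, not the actual limits of the Chinese Environmental Quality Standard for Soils:

        import numpy as np

        # measured mean concentrations (mg/kg) reported in the study
        concentrations = {"V": 931, "Cr": 721, "Cu": 279, "Mn": 223, "Zn": 163, "Pb": 11}
        # screening values S_i (mg/kg) - placeholders, not the standard's actual limits
        screening = {"V": 130, "Cr": 200, "Cu": 100, "Mn": 1500, "Zn": 250, "Pb": 80}

        # single-factor pollution index P_i = C_i / S_i  (P_i > 3 indicates heavy pollution)
        single = {m: concentrations[m] / screening[m] for m in concentrations}

        # Nemerow composite index combines the mean and the maximum single-factor index
        p = np.array(list(single.values()))
        nemerow = np.sqrt((p.max() ** 2 + p.mean() ** 2) / 2.0)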

  12. Analyses of the soil surface dynamic of South African Kalahari salt pans based on hyperspectral and multitemporal data

    Science.gov (United States)

    Milewski, Robert; Chabrillat, Sabine; Behling, Robert; Mielke, Christian; Schleicher, Anja Maria; Guanter, Luis

    2016-04-01

    The consequences of climate change represent a major threat to sustainable development and growth in Southern Africa. Understanding the impact on the geo- and biosphere is therefore of great importance in this region. In this context the Kalahari salt pans (also known as playas or sabkhas) and their peripheral saline and alkaline habitats are an ecosystem of major interest. They are very sensitive to environmental conditions, so hydrological, mineralogical and ecological responses to climatic variations can be analysed. Up to now, the soil composition of salt pans in this area has been assessed only mono-temporally and at a coarse regional scale. Furthermore, the dynamics of the salt pans, especially the formation of evaporites, are still uncertain and poorly understood. High-spectral-resolution remote sensing can estimate the evaporite content and mineralogy of soils from their surface reflectance properties in the Visible-Near InfraRed (VNIR, 400-1000 nm) and Short-Wave InfraRed (SWIR, 1000-2500 nm) regions. In these wavelength regions major chemical components of the soil interact with the electromagnetic radiation and produce characteristic absorption features that can be used to derive the properties of interest. Although such techniques are well established at the laboratory and field scale, the potential of current (Hyperion) and upcoming spaceborne sensors such as EnMAP for quantitative mineralogical and salt spectral mapping has yet to be demonstrated. Combined with hyperspectral methods, multitemporal remote sensing techniques allow us to derive the recent dynamics of these salt pans and link the mineralogical analysis of the pan surface to major physical processes in these dryland environments. In this study we focus on the analysis of the Namibian Omongwa salt pans based on satellite hyperspectral imagery and multispectral time-series data. First, a change detection analysis is applied using the Iterative

  13. The neuronal correlates of digits backward are revealed by voxel-based morphometry and resting-state functional connectivity analyses.

    Directory of Open Access Journals (Sweden)

    Rui Li

    Digits backward (DB) is a widely used neuropsychological measure that is believed to be a simple and effective index of the capacity of verbal working memory. However, its neural correlates remain elusive. The aim of this study is to investigate the neural correlates of DB in 299 healthy young adults by combining voxel-based morphometry (VBM) and resting-state functional connectivity (rsFC) analyses. The VBM analysis showed positive correlations between the DB scores and the gray matter volumes in the right anterior superior temporal gyrus (STG), the right posterior STG, the left inferior frontal gyrus and the left Rolandic operculum, which are four critical areas in the auditory phonological loop of verbal working memory. Voxel-based correlation analysis was then performed between the positive rsFCs of these four clusters and the DB scores. We found that the DB scores were positively correlated with the rsFCs within the salience network (SN), that is, between the right anterior STG, the dorsal anterior cingulate cortex and the right fronto-insular cortex. We also found that the DB scores were negatively correlated with the rsFC within an anti-correlation network of the SN, between the right posterior STG and the left posterior insula. Our findings suggest that DB performance is related to the structural and functional organization of the brain areas that are involved in the auditory phonological loop and the SN.

  14. Systematic Reviews and Meta-Analyses - Literature-based Recommendations for Evaluating Strengths, Weaknesses, and Clinical Value.

    Science.gov (United States)

    Beitz, Janice M; Bolton, Laura L

    2015-11-01

    Good quality systematic reviews (SRs) summarizing the best available evidence can help inform clinical decisions, improving patient and wound outcomes. Weak SRs can misinform readers, undermining care decisions and evidence-based practice. To examine the strengths and weaknesses of SRs and meta-analyses and the role of SRs in contemporary evidence-based wound care practice, the authors searched Medline and the Cumulative Index to Nursing and Allied Health Literature (CINAHL), using the search terms systematic review, meta-analysis, and evidence-based practice, for important terminology and recommendations to help clinicians evaluate SRs with meta-analysis. Reputable websites, recent textbooks, and the synthesized available literature were also reviewed to describe and summarize SR strengths and weaknesses. A checklist was developed for critically evaluating SR objectives, inclusion/exclusion criteria, study quality, data extraction and synthesis methods, meta-analysis homogeneity, accuracy of results, interpretation, and consistency between significant findings and the abstract or conclusions; the checklist was then applied to topical wound care SRs identified in Cochrane and MEDLINE searches. The best available evidence included in the SRs, drawn from 169 randomized controlled trials on 11,571 patients supporting the healing effects of topical interventions on burns, surgical sites, and diabetic, venous, or pressure ulcers, was summarized and showed that SRs and clinical trials can demonstrate different outcomes because the information/data are compiled differently. The results illustrate how evidence insufficient to support firm conclusions may still meet immediate needs to guide carefully considered clinical wound and patient care decisions while encouraging better future science. PMID:26544016

  15. Using uncertainty and sensitivity analyses in socioecological agent-based models to improve their analytical performance and policy relevance.

    Directory of Open Access Journals (Sweden)

    Arika Ligmann-Zielinska

    Agent-based models (ABMs) have been widely used to study socioecological systems. They are useful for studying such systems because of their ability to incorporate micro-level behaviors among interacting agents and to capture emergent phenomena arising from these interactions. However, ABMs are inherently stochastic and require proper handling of uncertainty. We propose a simulation framework based on quantitative uncertainty and sensitivity analyses to build parsimonious ABMs that serve two purposes: exploration of the outcome space to simulate low-probability but high-consequence events that may have significant policy implications, and explanation of model behavior to describe the system with higher accuracy. The proposed framework is applied to the problem of modeling farmland conservation resulting in land use change. We employ output variance decomposition based on quasi-random sampling of the input space and perform three computational experiments. First, we perform uncertainty analysis to improve model legitimacy, where the distribution of results informs us about the expected value that can be validated against independent data, and provides information on the variance around this mean as well as on extreme results. In our last two computational experiments, we employ sensitivity analysis to produce two simpler versions of the ABM. First, the input space is reduced to only those inputs that produced the variance of the initial ABM, resulting in a model with an output distribution similar to the initial model. Second, we refine the value of the most influential input, producing a model that maintains the mean of the output of the initial ABM but with less spread. These simplifications can be used to (1) efficiently explore model outcomes, including outliers that may be important considerations in the design of robust policies, and (2) conduct explanatory analysis that exposes the smallest number of inputs influencing the steady state of the modeled system.
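
    A minimal sketch of variance decomposition with quasi-random (Saltelli/Sobol) sampling. It uses the SALib package, which the paper does not name, and a toy stand-in model; the parameter names, bounds and output are hypothetical:

        import numpy as np
        from SALib.sample import saltelli
        from SALib.analyze import sobol

        # Hypothetical stand-in for the coupled ABM: three inputs, one scalar output
        problem = {
            "num_vars": 3,
            "names": ["adoption_rate", "neighbour_weight", "subsidy_level"],
            "bounds": [[0.0, 1.0], [0.0, 1.0], [0.0, 5.0]],
        }

        def toy_model(x):
            # placeholder for a single ABM run; returns e.g. hectares of farmland conserved
            return 100 * x[0] + 20 * x[0] * x[1] + 5 * np.sin(x[2])

        param_values = saltelli.sample(problem, 1024)          # quasi-random input sample
        outputs = np.array([toy_model(row) for row in param_values])

        si = sobol.analyze(problem, outputs)
        # si["S1"]: first-order indices; si["ST"]: total-order indices including interactions
        for name, s1, st in zip(problem["names"], si["S1"], si["ST"]):
            print(f"{name}: S1={s1:.2f}, ST={st:.2f}")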

  16. Age and gender effects on normal regional cerebral blood flow studied using two different voxel-based statistical analyses

    International Nuclear Information System (INIS)

    Fully automated analysis programs are increasingly used to aid the reading of regional cerebral blood flow SPECT studies, and they are increasingly based on the comparison of the patient study with a normal database. In this study, we evaluate the ability of Three-Dimensional Stereotactic Surface Projection (3D-SSP) to isolate the effects of age and gender in a previously studied normal population. The results were also compared with those obtained using Statistical Parametric Mapping (SPM99). Methods: Eighty-nine 99mTc-ECD SPECT studies performed in carefully screened healthy volunteers (46 females, 43 males; age 20-81 years) were analysed using 3D-SSP. A multivariate analysis based on the general linear model was performed with regions as intra-subject factor, gender as inter-subject factor and age as covariate. Results: Both age and gender had a significant interaction effect with regional tracer uptake. An age-related decline (p < 0.001) was found in the anterior cingulate gyrus, left frontal association cortex and left insula. Bilateral occipital association and left primary visual cortical uptake showed a significant relative increase with age (p < 0.001). Concerning the gender effect, women showed higher uptake (p < 0.01) in the parietal and right sensorimotor cortices. An age-by-gender interaction (p < 0.01) was found only in the left medial frontal cortex. The results were consistent with those obtained with SPM99. Conclusion: 3D-SSP analysis of normal rCBF variability is consistent with the literature and with other automated voxel-based techniques, which highlight the effects of both age and gender. (authors)

  17. Cost and quality effectiveness of objective-based and statistically-based quality control for volatile organic compounds analyses of gases

    International Nuclear Information System (INIS)

    Gas samples from drums of radioactive waste at the Department of Energy (DOE) Idaho National Engineering Laboratory are being characterized for 29 volatile organic compounds to determine the feasibility of storing the waste in DOE's Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. Quality requirements for the gas chromatography (GC) and GC/mass spectrometry chemical methods used to analyze the waste are specified in the Quality Assurance Program Plan for the WIPP Experimental Waste Characterization Program. Quality requirements consist of both objective criteria (data quality objectives, DQOs) and statistical criteria (process control). The DQOs apply to routine sample analyses, while the statistical criteria serve to determine and monitor the precision and accuracy (P&A) of the analysis methods and are also used to assign upper confidence limits to measurement results close to action levels. After over two years and more than 1000 sample analyses there are two general conclusions concerning the two approaches to quality control: (1) Objective criteria (e.g., ±25% precision, ±30% accuracy) based on customer needs and the criteria usually prescribed for similar EPA-approved methods are consistently attained during routine analyses. (2) Statistical criteria based on short-term method performance are almost an order of magnitude more stringent than the objective criteria and are difficult to satisfy following the same routine laboratory procedures that satisfy the objective criteria. A more cost-effective and representative approach to establishing statistical method performance criteria would be to utilize a moving average of P&A from control samples over a period of several months, to determine within-sample variation by one-way analysis of variance of several months' replicate sample analysis results, or both. Confidence intervals for results near action levels could also be determined by replicate analysis of the sample in
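
    A minimal sketch of the two alternatives proposed above - a moving average of control-sample recoveries and a one-way ANOVA of replicate results - using hypothetical recovery data, not the WIPP programme's actual measurements:

        import numpy as np
        from scipy import stats

        # hypothetical monthly control-sample recoveries (%) for one VOC analyte
        recoveries = np.array([92, 105, 98, 88, 110, 101, 95, 97, 103, 99, 94, 106], float)

        # moving average of accuracy over a three-month window to set performance criteria
        window = 3
        moving_avg = np.convolve(recoveries, np.ones(window) / window, mode="valid")
        limits = moving_avg.mean() + np.array([-3, 3]) * recoveries.std(ddof=1)

        # within-sample variation from replicate analyses in different months (one-way ANOVA)
        month_a = [98.2, 99.1, 97.6]
        month_b = [101.4, 100.2, 102.0]
        month_c = [96.8, 98.0, 97.1]
        f_stat, p_value = stats.f_oneway(month_a, month_b, month_c)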

  18. Optimisation of recovery protocols for double-base smokeless powder residues analysed by total vaporisation (TV) SPME/GC-MS.

    Science.gov (United States)

    Sauzier, Georgina; Bors, Dana; Ash, Jordan; Goodpaster, John V; Lewis, Simon W

    2016-09-01

    The investigation of explosive events requires appropriate evidential protocols to recover and preserve residues from the scene. In this study, a central composite design was used to determine statistically validated optimum recovery parameters for double-base smokeless powder residues on steel, analysed using total vaporisation (TV) SPME/GC-MS. Maximum recovery was obtained using isopropanol-wetted swabs stored under refrigerated conditions and then extracted for 15 min into acetone on the same day as sample collection. These parameters were applied to the recovery of post-blast residues deposited on steel witness surfaces following a PVC pipe bomb detonation, resulting in detection of all target components across the majority of samples. Higher overall recoveries were obtained from plates facing the sides of the device, consistent with the point of first failure occurring in the pipe body as observed in previous studies. The methodology employed here may be readily applied to a variety of other explosive compounds, and may thus assist in establishing 'best practice' procedures for explosive investigations. PMID:27343617

  19. Voxel-based analyses of gray/white matter volume and diffusion tensor data in major depression. Presidential award proceedings

    International Nuclear Information System (INIS)

    Previous neuroimaging studies have revealed that frontolimbic dysfunction may contribute to the pathophysiology of major depressive disorder. We used voxel-based analysis to simultaneously elucidate regional changes in gray/white matter volume, mean diffusivity (MD), and fractional anisotropy (FA) in the central nervous system of patients with unipolar major depression. We studied 21 right-handed patients and 42 age- and gender-matched right-handed normal subjects without central nervous system disorders. All image processing and statistical analyses were performed using SPM5 software. Local areas showing significant gray matter volume reduction in depressive patients compared with normal controls were observed in the right parahippocampal gyrus, hippocampus, bilateral middle frontal gyri, bilateral anterior cingulate cortices, left parietal and occipital lobes, and right superior temporal gyrus. Local areas showing increased mean diffusivity in depressive patients were observed in the bilateral parahippocampal gyri, hippocampus, pons, cerebellum, left frontal and temporal lobes, and right frontal lobe. There was no significant difference between the 2 groups for fractional anisotropy and white matter volume in the entire brain. Although there was no local area in which FA and MD were significantly correlated with disease severity, FA tended to correlate negatively with depression days (total accumulated days in depressive state) in the right anterior cingulate and the left frontal white matter (FDR-corrected P=0.055 for both areas). These results suggest that the frontolimbic neural circuit may play an important role in the neuropathology of patients with major depression. (author)

  20. Multigene-based analyses on evolutionary phylogeny of two controversial ciliate orders: Pleuronematida and Loxocephalida (Protista, Ciliophora, Oligohymenophorea).

    Science.gov (United States)

    Gao, Feng; Katz, Laura A; Song, Weibo

    2013-07-01

    Relationships among members of the ciliate subclass Scuticociliatia (Ciliophora, Oligohymenophorea) are largely unresolved. Phylogenetic studies of its orders Pleuronematida and Loxocephalida were initially based on small subunit ribosomal RNA gene (SSU-rDNA) analyses of a limited number of taxa. Here we characterized 37 sequences (SSU-rDNA, ITS-5.8S and LSU-rDNA) from 21 taxonomically controversial members of these orders. Phylogenetic trees constructed to assess the inter- and intra-generic relationships of pleuronematids and loxocephalids reveal the following: (1) the order Loxocephalida and its two families Loxocephalidae and Cinetochilidae are not monophyletic when more taxa are added; (2) the core pleuronematids are divided into two fully supported clades, however, the order Pleuronematida is not monophyletic because Cyclidium glaucoma is closer to Thigmotrichida; (3) the family Pleuronematidae and the genus Schizocalyptra are monophyletic, though rDNA sequences of Pleuronema species are highly variable; (4) Pseudoplatynematum and Sathrophilus are closely related to the subclass Astomatia, while Cinetochilum forms a monophyletic group with the subclass Apostomatia; and (5) Hippocomos falls in the order Pleuronematida and is closely related to Eurystomatellidae and Cyclidium plouneouri. Further, in an effort to provide a better resolution of evolutionary relationships, the secondary structures of ITS2 transcripts and the variable region 4 (V4) of the small subunit ribosomal RNA (SSU-rRNA) are predicted, revealing that ITS2 structures are conserved at the order level while V4 region structures are more variable than ITS2 structures. PMID:23541839

  1. Phylogenetic analyses of cyclidiids (Protista, Ciliophora, Scuticociliatia) based on multiple genes suggest their close relationship with thigmotrichids.

    Science.gov (United States)

    Gao, Feng; Gao, Shan; Wang, Pu; Katz, Laura A; Song, Weibo

    2014-06-01

    Cyclidiids and thigmotrichids are two diverse groups of scuticociliates, a diverse clade of ciliates that is often difficult to investigate due to the small size and conserved morphology of its members. Compared to other groups (e.g. hypotrichs and oligotrichs), the scuticociliates have received relatively little attention and their phylogenetic relationships are largely unresolved. To contribute to the understanding of their evolutionary history, we characterized 26 sequences for three linked genes (SSU-rDNA, 5.8S and LSU-rDNA) from 14 isolates of cyclidiids and thigmotrichids. Phylogenetic analyses reveal the following: (1) traditional cyclidiids are associated with thigmotrichids rather than pleuronematids as expected; (2) the validity of the newly reported genus Falcicyclidium is confirmed by the molecular data and we suggest transferring this genus to the family Ctedoctematidae; (3) neither the genus Cyclidium nor Protocyclidium is monophyletic, and the separation of Protocyclidium from Cyclidium is not supported; (4) the genus Cristigera is a well supported monophyletic group and may represent a new family; (5) according to both morphological and molecular information, Cyclidium plouneouri Dragesco, 1963 should be assigned to the genus Falcicyclidium and thus a new combination is suggested: Falcicyclidium plouneouri (Dragesco, 1963) n. comb.; and (6) based on the data available, a new genus is suggested: Acucyclidium gen. nov. with the type species Acucyclidium atractodes (Fan et al., 2011a) n. comb. PMID:24530638

  2. High temperature gas-cooled pebble bed reactor steady state thermal-hydraulics analyses based on CFD method

    International Nuclear Information System (INIS)

    Background: Based on the general-purpose CFD code Fluent, the PBMR-400 full-load nominal-condition thermal-hydraulic performance was studied by applying a local thermal non-equilibrium porous media model. Purpose: In thermal-hydraulic studies of gas-cooled pebble bed reactors, the core can be treated as a macroscopic porous medium with a strong internal heat source, which the original Fluent code cannot handle properly. Methods: By introducing a user-defined scalar (UDS) in the calculation domain of the reactor core and adding a new resistance term, we developed a non-equilibrium porous media model that gives an accurate description of the pebble-bed core. The mesh of the CFD code is finer than that of traditional pebble bed reactor thermal-hydraulic analysis codes such as THERMIX and TINTE, so more information about the coolant velocity field, the coolant temperature field and the solid-phase temperature field can be obtained. Results: The nominal-condition results of the CFD code are compared with those of the well-established thermal-hydraulic codes THERMIX and TINTE and show good consistency. Conclusion: The extended local thermal non-equilibrium model can be used to analyse the thermal-hydraulics of high-temperature pebble-bed reactors. (authors)
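
    For context, a common generic form of the local thermal non-equilibrium (two-temperature) porous-media energy equations that such an extension typically solves is sketched below in steady state; the symbols and source terms are illustrative, not the exact implementation of the cited work:

        \varepsilon\,(\rho c_p)_f\,\mathbf{u}\cdot\nabla T_f = \nabla\cdot\left(k_{f,\mathrm{eff}}\,\nabla T_f\right) + h_{sf}\,a_{sf}\,(T_s - T_f)
        0 = \nabla\cdot\left(k_{s,\mathrm{eff}}\,\nabla T_s\right) - h_{sf}\,a_{sf}\,(T_s - T_f) + (1-\varepsilon)\,q'''

    Here ε is the bed porosity, h_sf the interfacial heat-transfer coefficient, a_sf the specific surface area of the pebble bed, and q''' the volumetric heat source; the solid-phase temperature T_s is the extra scalar that the fluid-only energy equation of the standard porous media model cannot provide.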

  3. What Happened, and Why: Toward an Understanding of Human Error Based on Automated Analyses of Incident Reports. Volume 1

    Science.gov (United States)

    Maille, Nicolas P.; Statler, Irving C.; Ferryman, Thomas A.; Rosenthal, Loren; Shafto, Michael G.; Statler, Irving C.

    2006-01-01

    The objective of the Aviation System Monitoring and Modeling (ASMM) project of NASA's Aviation Safety and Security Program was to develop technologies that enable proactive management of safety risk, which entails identifying the precursor events and conditions that foreshadow most accidents. This presents a particular challenge in the aviation system, where people are key components and human error is frequently cited as a major contributing factor or cause of incidents and accidents. In the aviation "world", information about what happened can be extracted from quantitative data sources, but the experiential account of the incident reporter is the best available source of information about why an incident happened. This report describes a conceptual model and an approach to automated analyses of textual data sources for the subjective perspective of the reporter, to aid in understanding why an incident occurred. It explores a first-generation process for routinely searching large databases of textual reports of aviation incidents or accidents and reliably analyzing them for causal factors of human behavior (the why of an incident). We have defined a generic structure of information that is postulated to be a sound basis for defining similarities between aviation incidents. Based on this structure, we have introduced a simplifying structure, which we call the Scenario, as a pragmatic guide for identifying similarities in what happened based on the objective parameters that define the Context and the Outcome of a Scenario. We believe that it will be possible to design an automated analysis process, guided by the structure of the Scenario, that will help aviation-safety experts understand the systemic issues that are conducive to human error.

  4. Scientific LogAnalyzer: a web-based tool for analyses of server log files in psychological research.

    Science.gov (United States)

    Reips, Ulf-Dietrich; Stieger, Stefan

    2004-05-01

    Scientific LogAnalyzer is a platform-independent interactive Web service for the analysis of log files. Scientific LogAnalyzer offers several features not available in other log file analysis tools--for example, organizational criteria and computational algorithms suited to aid behavioral and social scientists. Scientific LogAnalyzer is highly flexible on the input side (unlimited types of log file formats), while strictly keeping a scientific output format. Features include (1) free definition of log file format, (2) searching and marking dependent on any combination of strings (necessary for identifying conditions in experiment data), (3) computation of response times, (4) detection of multiple sessions, (5) speedy analysis of large log files, (6) output in HTML and/or tab-delimited form, suitable for import into statistics software, and (7) a module for analyzing and visualizing drop-out. Several methodological features specifically needed in the analysis of data collected in Internet-based experiments have been implemented in the Web-based tool and are described in this article. A regression analysis with data from 44 log file analyses shows that the size of the log file and the domain name lookup are the two main factors determining the duration of an analysis. It is less than a minute for a standard experimental study with a 2 x 2 design, a dozen Web pages, and 48 participants (ca. 800 lines, including data from drop-outs). The current version of Scientific LogAnalyzer is freely available for small log files. Its Web address is http://genpsylab-logcrunsh.unizh.ch/. PMID:15354696
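
    As an illustration of the kind of computation such a tool automates, the sketch below derives per-page response times from timestamped server log lines. It is not Scientific LogAnalyzer's actual implementation; the Common-Log-Format-style input and field names are assumptions:

        import re
        from datetime import datetime
        from collections import defaultdict

        LINE = re.compile(r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "GET (?P<page>\S+)')

        def response_times(log_lines):
            """Time between successive page requests from the same address (one simple
            proxy for per-page response time in a Web-based experiment)."""
            last_seen, deltas = {}, defaultdict(list)
            for line in log_lines:
                m = LINE.match(line)
                if not m:
                    continue
                ts = datetime.strptime(m["ts"].split()[0], "%d/%b/%Y:%H:%M:%S")
                ip, page = m["ip"], m["page"]
                if ip in last_seen:
                    deltas[ip].append((page, (ts - last_seen[ip]).total_seconds()))
                last_seen[ip] = ts
            return deltas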

  5. Analysing EWviews

    DEFF Research Database (Denmark)

    Jelsøe, Erling; Jæger, Birgit

    2015-01-01

    When analysing the results of a Europe-wide citizen consultation on sustainable consumption, it is necessary to take a number of issues into account, such as the question of representativity and the tensions between national and European identities and between consumer and citizen orientations regarding

  6. Advances in global sensitivity analyses of demographic-based species distribution models to address uncertainties in dynamic landscapes.

    Science.gov (United States)

    Naujokaitis-Lewis, Ilona; Curtis, Janelle M R

    2016-01-01

    Developing a rigorous understanding of multiple global threats to species persistence requires the use of integrated modeling methods that capture processes which influence species distributions. Species distribution models (SDMs) coupled with population dynamics models can incorporate relationships between changing environments and demographics and are increasingly used to quantify relative extinction risks associated with climate and land-use changes. Despite their appeal, uncertainties associated with complex models can undermine their usefulness for advancing predictive ecology and informing conservation management decisions. We developed a computationally efficient and freely available tool (GRIP 2.0) that implements and automates a global sensitivity analysis of coupled SDM-population dynamics models for comparing the relative influence of demographic parameters and habitat attributes on predicted extinction risk. Advances over previous global sensitivity analyses include the ability to vary habitat suitability across gradients, as well as the habitat amount and configuration of spatially explicit suitability maps of real and simulated landscapes. Using GRIP 2.0, we carried out a multi-model global sensitivity analysis of a coupled SDM-population dynamics model of whitebark pine (Pinus albicaulis) in Mount Rainier National Park as a case study and quantified the relative influence of input parameters and their interactions on model predictions. Our results differed from the one-at-a-time analyses used in the original study, and we found that the most influential parameters included the total amount of suitable habitat within the landscape, survival rates, and the effects of a prevalent disease, white pine blister rust. Strong interactions between habitat amount and survival rates of older trees suggest the importance of habitat in mediating the negative influences of white pine blister rust. Our results underscore the importance of considering habitat attributes along

  7. Japanese standard method for safety evaluation using best estimate code based on uncertainty and scaling analyses with statistical approach

    International Nuclear Information System (INIS)

    Current licensing practice in Japan consists of using conservative boundary and initial conditions (BIC), assumptions and analytical codes. Safety analyses for licensing purposes are inherently deterministic; therefore, conservative BIC and assumptions, such as single failure, must be employed for the analyses. However, the use of conservative analytical codes is not considered essential. The standards committee of the Atomic Energy Society of Japan (AESJ) drew up the standard for using best-estimate codes for safety analyses in 2008, after three years of discussions reflecting recent domestic and international findings. (author)

  8. Insights into the phylogeny of sporadotrichid ciliates (Protozoa, Ciliophora: Hypotricha) based on genealogical analyses of multiple molecular markers

    Science.gov (United States)

    Hu, Xiaoyan; Hu, Xiaozhong; Al-Rasheid, Khaled A. S.; Al-Farraj, Saleh A.; Song, Weibo

    2011-01-01

    The sporadotrichid ciliates are an especially diverse group. A number of investigators have studied the morphological, morphogenetic, and molecular relationships among members of this group. Despite this, a consistent classification is still lacking and several important questions about the phylogenetic relationships within this group remain unsolved. To improve our understanding of these relationships, we constructed phylogenetic trees using the nucleotide sequences of the small-subunit rRNA (SSrRNA) gene and the amino acid sequences of actin I and α-tubulin. Analyses of SSrRNA gene sequences indicated that: 1) the Sporadotrichida sensu Lynn (2008) and the Oxytrichidae are polyphyletic; 2) the Uroleptus species, which are classified with the urostylids, formed a sister group with the oxytrichids; 3) Halteria grandinella, which is grouped morphologically with oligotrich species, clustered within the oxytrichids. These results are congruent with previous studies based on SSrRNA gene sequences. However, the amino acid sequences of actin I and α-tubulin yielded different topologies. The main results are: 1) in all phylogenetic trees, the genus Oxytricha was paraphyletic; 2) Uroleptus was sister to a subset of Urostyla and Holosticha, albeit with low support values; 3) Halteria grandinella was separated distantly from the Oxytrichidae in trees inferred from actin I amino acid sequences but clustered with oligotrichids in the α-tubulin analysis. The inconsistency among the trees inferred from these different molecular markers may be caused by the rapid accumulation of genetic changes characteristic of ciliates. Further studies with additional molecular markers and sampling of more taxa are expected to better address the relationships among sporadotrichids.

  9. Pseudomonas community structure and antagonistic potential in the rhizosphere: insights gained by combining phylogenetic and functional gene-based analyses.

    Science.gov (United States)

    Costa, Rodrigo; Gomes, Newton C M; Krögerrecklenfort, Ellen; Opelt, Katja; Berg, Gabriele; Smalla, Kornelia

    2007-09-01

    The Pseudomonas community structure and antagonistic potential in the rhizospheres of strawberry and oilseed rape (host plants of the fungal phytopathogen Verticillium dahliae) were assessed. The use of a new PCR-DGGE system, designed to target Pseudomonas-specific gacA gene fragments in environmental DNA, circumvented common biases of 16S rRNA gene-based DGGE analyses and proved to be a reliable tool to unravel the diversity of uncultured Pseudomonas in bulk and rhizosphere soils. Pseudomonas-specific gacA fingerprints of total-community (TC) rhizosphere DNA were surprisingly diverse, plant-specific and differed markedly from those of the corresponding bulk soils. By combining multiple culture-dependent and independent surveys, a group of Pseudomonas isolates antagonistic towards V. dahliae was shown to be genotypically conserved, to carry the phlD biosynthetic locus (involved in the biosynthesis of 2,4-diacetylphloroglucinol - 2,4-DAPG), and to correspond to a dominant and highly frequent Pseudomonas population in the rhizosphere of field-grown strawberries planted at three sites in Germany which have different land use histories. This population belongs to the Pseudomonas fluorescens phylogenetic lineage and showed closest relatedness to P. fluorescens strain F113 (97% gacA gene sequence identity in 492-bp sequences), a biocontrol agent and 2,4-DAPG producer. Partial gacA gene sequences derived from isolates, clones of the strawberry rhizosphere and DGGE bands retrieved in this study represent previously undescribed Pseudomonas gacA gene clusters as revealed by phylogenetic analysis. PMID:17686023

  10. Comprehensive phylogenetic reconstruction of amoebozoa based on concatenated analyses of SSU-rDNA and actin genes.

    Directory of Open Access Journals (Sweden)

    Daniel J G Lahr

    Evolutionary relationships within Amoebozoa have been the subject of controversy for two reasons: (1) the paucity of morphological characters in traditional surveys and (2) haphazard taxonomic sampling in modern molecular reconstructions. These, along with other factors, have prevented the erection of a definitive system that confidently resolves both higher- and lower-level relationships. Additionally, the recent recognition that many protosteloid amoebae are in fact scattered throughout the Amoebozoa suggests that phylogenetic reconstructions have been excluding an extensive and integral group of organisms. Here we provide a comprehensive phylogenetic reconstruction based on 139 taxa using molecular information from both SSU-rDNA and actin genes. We provide molecular data for 13 of those taxa, 12 of which had not been previously characterized. We explored the dataset extensively by generating 18 alternative reconstructions that assess the effect of missing data, long-branched taxa, unstable taxa, fast-evolving sites and the inclusion of environmental sequences. We compared reconstructions with each other as well as against previously published phylogenies. Our analyses show that many of the morphologically established lower-level relationships (defined here as relationships roughly equivalent to Order level or below) are congruent with molecular data. However, the data are insufficient to corroborate or reject the large majority of proposed higher-level relationships (above the Order level), with the exception of Tubulinea, Archamoebae and Myxogastrea, which are consistently recovered. Moreover, contrary to previous expectations, the inclusion of available environmental sequences does not significantly improve the Amoebozoa reconstruction. This is probably because key amoebozoan taxa are not easily amplified by environmental sequencing methodology due to high rates of molecular evolution and the regular occurrence of large indels and introns. Finally, in an effort

  11. Discrimination, correlation, and provenance of Bed I tephrostratigraphic markers, Olduvai Gorge, Tanzania, based on multivariate analyses of phenocryst compositions

    Science.gov (United States)

    Habermann, Jörg M.; McHenry, Lindsay J.; Stollhofen, Harald; Tolosana-Delgado, Raimon; Stanistreet, Ian G.; Deino, Alan L.

    2016-06-01

    The chronology of Pleistocene flora and fauna, including hominin remains and associated Oldowan industries in Bed I, Olduvai Gorge, Tanzania, is primarily based on 40Ar/39Ar dating of intercalated tuffs and lavas, combined with detailed tephrostratigraphic correlations within the basin. Although a high-resolution chronostratigraphic framework has been established for the eastern part of the Olduvai Basin, the western subbasin is less well known due in part to major lateral facies changes within Bed I combined with discontinuous exposure. We address these correlation difficulties using the discriminative power of the chemical composition of the major juvenile mineral phases (augite, anorthoclase, plagioclase) from tuffs, volcaniclastic sandstones, siliciclastic units, and lavas. We statistically evaluate these compositions, obtained from electron probe micro-analysis, applying principal component analysis and discriminant analysis to develop discriminant models that successfully classify most Bed I volcanic units. The correlations, resulting from integrated analyses of all target minerals, provide a basin-wide Bed I chemostratigraphic framework at high lateral and vertical resolution, consistent with the known geological context, that expands and refines the geochemical databases currently available. Correlation of proximal ignimbrites at the First Fault with medial and distal Lower Bed I successions of the western basin enables assessment of lateral facies and thickness trends that confirm Ngorongoro Volcano as the primary source for Lower Bed I, whereas Upper Bed I sediment supply is mainly from Olmoti Volcano. Compositional similarity between Tuff IA, Bed I lava, and Mafic Tuffs II and III single-grain fingerprints, together with north- and northwestward thinning of Bed I lava, suggests a common Ngorongoro source for these units. The techniques applied herein improve upon previous work by evaluating compositional affinities with statistical rigor rather than
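
    The classification workflow described above (principal component analysis of electron-microprobe phenocryst compositions followed by a discriminant model) can be sketched as below. This is a minimal illustration, not the authors' pipeline: the oxide list, the simulated compositions and the unit labels are placeholders.

        # Sketch: PCA + linear discriminant classification of tephra units from
        # phenocryst compositions. All compositions and labels are hypothetical.
        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(0)
        oxides = ["SiO2", "Al2O3", "FeO", "MgO", "CaO", "Na2O", "K2O"]

        def fake_unit(center, n=40):
            # simulated microprobe analyses (wt%) scattered around a unit mean
            return np.array(center) + rng.normal(scale=0.5, size=(n, len(oxides)))

        X = np.vstack([fake_unit(c) for c in ([50, 8, 9, 12, 20, 0.5, 0.2],
                                              [52, 7, 10, 11, 19, 0.6, 0.3],
                                              [55, 6, 11, 9, 18, 0.8, 0.4])])
        y = np.repeat(["Tuff IA", "Tuff IB", "Tuff IF"], 40)

        scaler = StandardScaler().fit(X)                  # centre/scale each oxide
        pca = PCA(n_components=3).fit(scaler.transform(X))
        lda = LinearDiscriminantAnalysis().fit(pca.transform(scaler.transform(X)), y)

        # classify an "unknown" distal grain against the reference units
        unknown = np.array([[52.1, 7.2, 9.8, 11.2, 19.1, 0.6, 0.3]])
        print(lda.predict(pca.transform(scaler.transform(unknown))))

    In practice the discriminant model would be trained on grains from securely correlated proximal units and then applied to grains from undated distal tuffs.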

  12. Using meta-analytic path analysis to test theoretical predictions in health behavior: An illustration based on meta-analyses of the theory of planned behavior

    OpenAIRE

    Hagger, Martin; Chan, Dervin K. C.; Protogerou, Cleo; Chatzisarantis, Nikos L. D.

    2016-01-01

    Objective Synthesizing research on social cognitive theories applied to health behavior is an important step in the development of an evidence base of psychological factors as targets for effective behavioral interventions. However, few meta-analyses of research on social cognitive theories in health contexts have conducted simultaneous tests of theoretically-stipulated pattern effects using path analysis. We argue that conducting path analyses of meta-analytic effects among constructs fr...
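
    Meta-analytic path analysis of the kind advocated here typically pools correlation matrices across studies and then fits the path model to the pooled matrix. A minimal sketch follows, with hypothetical construct correlations and sample sizes, and a simple sample-size-weighted pooling step standing in for a full multivariate (MASEM) procedure.

        import numpy as np

        # Hypothetical per-study correlation matrices among (attitude, norm, PBC,
        # intention, behaviour) and sample sizes; a proper analysis would pool
        # these with multivariate meta-analytic methods.
        vars_ = ["attitude", "norm", "pbc", "intention", "behaviour"]

        def pool(corrs, ns):
            w = np.array(ns, dtype=float)
            return np.tensordot(w / w.sum(), np.array(corrs), axes=1)

        study1 = np.array([[1.0, .35, .30, .50, .30],
                           [.35, 1.0, .25, .35, .20],
                           [.30, .25, 1.0, .45, .35],
                           [.50, .35, .45, 1.0, .45],
                           [.30, .20, .35, .45, 1.0]])
        study2 = study1 + 0.03            # slightly different second study
        np.fill_diagonal(study2, 1.0)
        R = pool([study1, study2], ns=[220, 340])

        def std_paths(R, predictors, outcome):
            """Standardised path coefficients: beta = Rxx^-1 rxy on the pooled matrix."""
            ix = [vars_.index(p) for p in predictors]
            iy = vars_.index(outcome)
            return np.linalg.solve(R[np.ix_(ix, ix)], R[ix, iy])

        print("intention ~", std_paths(R, ["attitude", "norm", "pbc"], "intention"))
        print("behaviour ~", std_paths(R, ["intention", "pbc"], "behaviour"))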

  13. Analysing a reading strategy based on the elaboration of questions and the pair of answers and questions

    OpenAIRE

    Wilmo Ernesto Francisco Junior

    2011-01-01

    This paper describes a reading activity developed with chemistry students. The main aim was to analyse the reflections produced after reading three articles about experimentation. This study was performed with 17 chemistry students from a federal university. The reading strategy involved writing productions. Questions and the pair of questions and answers elaborated from articles were analysed, as well as the contribution of the socialization of knowledge by means of discussions and a debate....

  14. Process of Integrating Screening and Detailed Risk-based Modeling Analyses to Ensure Consistent and Scientifically Defensible Results

    International Nuclear Information System (INIS)

    To support cleanup and closure of these tanks, modeling is performed to understand and predict potential impacts to human health and the environment. Pacific Northwest National Laboratory developed a screening tool for the United States Department of Energy, Office of River Protection that estimates the long-term human health risk, from a strategic planning perspective, posed by potential tank releases to the environment. This tool is being conditioned to more detailed model analyses to ensure consistency between studies and to provide scientific defensibility. Once the conditioning is complete, the system will be used to screen alternative cleanup and closure strategies. The integration of screening and detailed models provides consistent analyses, efficiencies in resources, and positive feedback between the various modeling groups. This approach of conditioning a screening methodology to more detailed analyses provides decision-makers with timely and defensible information and increases confidence in the results on the part of clients, regulators, and stakeholders

  15. Meta-Analyses of Human Cell-Based Cardiac Regeneration Therapies: What Can Systematic Reviews Tell Us About Cell Therapies for Ischemic Heart Disease?

    Science.gov (United States)

    Martin-Rendon, Enca

    2016-04-15

    Controversies from basic science, discrepancies from clinical trials, and divergent results from meta-analyses have recently arisen in the field of cell therapies for cardiovascular repair and regeneration. Noticeably, there are almost as many systematic reviews and meta-analyses published as there are well-conducted clinical studies. But how do we disentangle the confusion they have raised? This article addresses why results obtained from systematic reviews and meta-analyses of human cell-based cardiac regeneration therapies are still valid to inform the design of future clinical trials. It also addresses how meta-analyses are not free from limitations and how important it is to assess the quality of the evidence and the quality of the systematic reviews and finally how stronger conclusions can be drawn when several pieces of evidence converge. PMID:27081109

  16. Genetic variation in Opisthorchis viverrini (Trematoda: Opisthorchiidae) from northeast Thailand and Laos PDR based on random amplified polymorphic DNA analyses

    OpenAIRE

    Sithithaworn, Paiboon; Nuchjungreed, Chadaporn; Srisawangwong, Tuanchai; Ando, Katsuhiko; Petney, Trevor N.; Chilton, Neil B.; Andrews, Ross H.

    2006-01-01

    Genetic variation in Opisthorchis viverrini adults originating from different locations in northeast Thailand and Laos, People’s Democratic Republic (PDR), was examined using random amplified polymorphic DNA (RAPD) analyses. In an initial analysis, the genomic DNA of one fluke from each of ten localities was amplified using 15 random primers (10-mers); however, genetic variation among O. viverrini specimens was detected reliably for only four primers. A more detailed RAPD analysis using these...

  17. Parsimony and model-based analyses of indels in avian nuclear genes reveal congruent and incongruent phylogenetic signals.

    Science.gov (United States)

    Yuri, Tamaki; Kimball, Rebecca T; Harshman, John; Bowie, Rauri C K; Braun, Michael J; Chojnowski, Jena L; Han, Kin-Lan; Hackett, Shannon J; Huddleston, Christopher J; Moore, William S; Reddy, Sushma; Sheldon, Frederick H; Steadman, David W; Witt, Christopher C; Braun, Edward L

    2013-01-01

    Insertion/deletion (indel) mutations, which are represented by gaps in multiple sequence alignments, have been used to examine phylogenetic hypotheses for some time. However, most analyses combine gap data with the nucleotide sequences in which they are embedded, probably because most phylogenetic datasets include few gap characters. Here, we report analyses of 12,030 gap characters from an alignment of avian nuclear genes using maximum parsimony (MP) and a simple maximum likelihood (ML) framework. Both trees were similar, and they exhibited almost all of the strongly supported relationships in the nucleotide tree, although neither gap tree supported many relationships that have proven difficult to recover in previous studies. Moreover, independent lines of evidence typically corroborated the nucleotide topology instead of the gap topology when they disagreed, although the number of conflicting nodes with high bootstrap support was limited. Filtering to remove short indels did not substantially reduce homoplasy or reduce conflict. Combined analyses of nucleotides and gaps resulted in the nucleotide topology, but with increased support, suggesting that gap data may prove most useful when analyzed in combination with nucleotide substitutions. PMID:24832669
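
    A common way to obtain such gap characters is simple indel coding, in which each distinct gap span in the alignment becomes a binary presence/absence character that can then be analysed under MP or a binary ML model. Below is a minimal sketch on a toy alignment; coding one character per exact gap span is a simplification of published indel-coding schemes, not necessarily the scheme used by the authors.

        # Minimal "simple indel coding": each distinct gap span (start, end) in the
        # alignment becomes one binary character (1 = gap present in that taxon).
        import re

        alignment = {
            "taxonA": "ATG---CCTGA--A",
            "taxonB": "ATGTTACCTGA--A",
            "taxonC": "ATG---CCTGATTA",
            "taxonD": "ATGT-ACCTGATTA",
        }

        def gap_spans(seq):
            return {(m.start(), m.end()) for m in re.finditer(r"-+", seq)}

        # union of all gap spans defines the set of indel characters
        characters = sorted(set().union(*(gap_spans(s) for s in alignment.values())))

        matrix = {taxon: [int(span in gap_spans(seq)) for span in characters]
                  for taxon, seq in alignment.items()}

        print("characters (start, end):", characters)
        for taxon, row in matrix.items():
            print(taxon, "".join(map(str, row)))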

  18. Parsimony and Model-Based Analyses of Indels in Avian Nuclear Genes Reveal Congruent and Incongruent Phylogenetic Signals

    Directory of Open Access Journals (Sweden)

    Frederick H. Sheldon

    2013-03-01

    Full Text Available Insertion/deletion (indel) mutations, which are represented by gaps in multiple sequence alignments, have been used to examine phylogenetic hypotheses for some time. However, most analyses combine gap data with the nucleotide sequences in which they are embedded, probably because most phylogenetic datasets include few gap characters. Here, we report analyses of 12,030 gap characters from an alignment of avian nuclear genes using maximum parsimony (MP) and a simple maximum likelihood (ML) framework. Both trees were similar, and they exhibited almost all of the strongly supported relationships in the nucleotide tree, although neither gap tree supported many relationships that have proven difficult to recover in previous studies. Moreover, independent lines of evidence typically corroborated the nucleotide topology instead of the gap topology when they disagreed, although the number of conflicting nodes with high bootstrap support was limited. Filtering to remove short indels did not substantially reduce homoplasy or reduce conflict. Combined analyses of nucleotides and gaps resulted in the nucleotide topology, but with increased support, suggesting that gap data may prove most useful when analyzed in combination with nucleotide substitutions.

  19. Paleoenvironmental reconstruction based on palynofacies analyses of the Cansona Formation (Late Cretaceous), Sinú-San Jacinto Basin, northwest Colombia

    Science.gov (United States)

    Juliao-Lemus, Tatiana; Carvalho, Marcelo de Araujo; Torres, Diego; Plata, Angelo; Parra, Carlos

    2016-08-01

    To reconstruct the paleoenvironments of the Cansona Formation, a Cretaceous succession in Colombia whose paleoenvironmental interpretation is controversial (variously deep marine or shallow marine), palynofacies analyses were conducted on 93 samples from four sections of the Sinú San Jacinto Basin in the north, midwest, and southwest sectors. For the palynofacies analyses, the kerogen categories were counted and subjected to cluster analyses. Four palynofacies associations were revealed for the four sections: Palynofacies Association I (PA I), which consisted of microforaminiferal linings, scolecodonts, dinoflagellate cysts, pollen grains, and fungal hyphae; PA II, which consisted of translucent phytoclasts (non-biostructured and biostructured) and opaque phytoclasts (equidimensional and lath shaped); PA III, which consisted of pseudoamorphous particles, cuticles, resin, and fungal spores; and PA IV, which consisted of fluorescent and non-fluorescent amorphous organic matter and the freshwater alga Botryococcus. In contrast to early studies that suggested a generalized depositional environment for the Cansona Formation (deep or shallow conditions), this study suggests that the formation records conspicuous stratigraphic and lateral changes and hence different depositional environments. The Cerro Cansona (CC4 section) and Chalán (AP section) areas represent more proximal marine settings (Early Campanian-Maastrichtian), with an intermediate setting for the Lorica area (SC section) and deeper conditions for the Montería area (CP2 section).
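
    The cluster-analysis step on counted kerogen categories can be sketched as follows; the category list, sample names and percentages are hypothetical placeholders, and Ward linkage on closed (percentage) data is only one reasonable choice, not necessarily the one used in the study.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        # Hypothetical kerogen-category percentages per sample (rows); a real
        # analysis would use the counted categories from the 93 samples.
        categories = ["AOM", "opaque phytoclasts", "translucent phytoclasts",
                      "cuticles", "dinocysts", "microforam linings"]
        samples = {
            "CC4-01": [10, 55, 20, 5, 8, 2],
            "CC4-02": [12, 50, 22, 6, 8, 2],
            "SC-01":  [40, 15, 20, 10, 10, 5],
            "SC-02":  [45, 12, 18, 10, 10, 5],
            "CP2-01": [70, 5, 10, 5, 7, 3],
        }
        X = np.array(list(samples.values()), dtype=float)
        X = 100.0 * X / X.sum(axis=1, keepdims=True)    # close data to 100 %

        Z = linkage(X, method="ward")                    # agglomerative clustering
        groups = fcluster(Z, t=3, criterion="maxclust")  # cut tree into 3 associations
        for name, g in zip(samples, groups):
            print(name, "-> palynofacies association", g)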

  20. Florida Bay salinity and Everglades wetlands hydrology circa 1900 CE: A compilation of paleoecology-based statistical modeling analyses

    Science.gov (United States)

    Marshall, F.E.; Wingard, G.L.

    2012-01-01

    Throughout the 20th century, the Greater Everglades Ecosystem of south Florida was greatly altered by human activities. Construction of water-control structures and facilities altered the natural hydrologic patterns of the south Florida region and consequently impacted the coastal ecosystem. Restoration of the Greater Everglades Ecosystem is guided by the Comprehensive Everglades Restoration Plan (CERP), which is attempting to reverse some of the impacts of water management. In order to achieve this goal, it is essential to understand the predevelopment conditions (circa 1900 Common Era, CE) of the natural system, including the estuaries. The purpose of this report is to use empirical data derived from analyses of estuarine sediment cores and observations of modern hydrologic and salinity conditions to provide information on the natural system circa 1900 CE. A three-phase approach, developed in 2009, couples paleosalinity estimates derived from sediment cores to upstream hydrology using statistical models prepared from existing monitoring data. Results presented here update and improve previous analyses. A statistical method of estimating the paleosalinity from the core information improves the previous assemblage analyses, and the system of linear regression models was significantly upgraded and expanded.
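
    The coupling idea, regressing observed salinity at a coring site on an upstream hydrologic variable from modern monitoring data and then inverting the fit for a core-derived paleosalinity estimate, can be sketched as below. Variable names and values are hypothetical; the actual regression models in the report are more elaborate.

        import numpy as np

        # Hypothetical paired modern observations: upstream stage (m) vs. salinity
        # (psu) at a coring site.
        stage = np.array([1.2, 1.5, 1.8, 2.0, 2.3, 2.6, 2.9, 3.1])
        salinity = np.array([34.0, 31.5, 28.0, 26.5, 23.0, 20.5, 17.0, 15.5])

        # ordinary least squares: salinity = a + b * stage
        b, a = np.polyfit(stage, salinity, deg=1)
        print(f"salinity = {a:.1f} + {b:.1f} * stage")

        # invert the fit: what upstream stage is consistent with a core-derived
        # paleosalinity estimate of, say, 22 psu circa 1900 CE?
        paleo_salinity = 22.0
        print("implied stage:", (paleo_salinity - a) / b)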

  1. Analyse - technologies

    International Nuclear Information System (INIS)

    In this chapter of the DCC 1999 scientific report, the following theoretical studies are detailed: emulsions characterization by ultrasonics, high resolution wavelength meter, optimization methodology for diffractive and hybrid optic system, reliability for fast switches in power electronics, study of cesium isolation in irradiated fuels, chemical optodes based on evanescent wave absorption, radionuclides (Zirconium 93 and molybdenum 93) determination in irradiated fuels processing effluents, study of viscous liquid ultrafiltration using supercritical CO2 fluid. (A.L.B.)

  2. A robust University-NGO partnership: Analysing school efficiencies in Bolivia with community-based management techniques

    OpenAIRE

    Joao Neiva de Figueiredo; Ann Marie Jursca Keffer; Miguel Angel Marca Barrientos; Silvana Gonzalez

    2013-01-01

    Community-based management research is a collaborative effort between management, academics and communities in need with the specific goal of achieving social change to foster social justice. Because it is designed to promote and validate joint methods of discovery and community-based sources of knowledge, community-based management research has several unique characteristics, which may affect its execution. This article describes the process of a community-based management research project w...

  3. Scaling and wavelet-based analyses of the long-term heart rate variability of the Eastern Oyster

    CERN Document Server

    Ritto, P A; Alvarado-Gil, J J

    2004-01-01

    Characterisations of the long-term behaviour of heart rate variability in humans have emerged in the last few years as promising candidates to become clinically significant tools. We present two different statistical analyses of long time recordings of the heart rate variation in the Eastern Oyster. The circulatory system of this marine mollusk has important anatomical and physiological dissimilarities in comparison to that of humans and it is exposed to dramatically different environmental influences. Our results resemble those previously obtained in humans. This suggests that in spite of the discrepancies, the mechanisms of long-term cardiac control in both systems share a common underlying dynamic.
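
    One widely used scaling analysis of heart rate variability is detrended fluctuation analysis (DFA). The abstract does not specify the exact estimators used, so the following is only a generic sketch applied to a synthetic interbeat-interval series.

        import numpy as np

        def dfa(x, scales):
            """Detrended fluctuation analysis: returns F(n) for each window size n."""
            y = np.cumsum(x - np.mean(x))            # integrated (profile) series
            F = []
            for n in scales:
                nwin = len(y) // n
                segs = y[:nwin * n].reshape(nwin, n)
                t = np.arange(n)
                rms = []
                for seg in segs:                      # detrend each window linearly
                    coef = np.polyfit(t, seg, 1)
                    rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
                F.append(np.mean(rms))
            return np.array(F)

        # synthetic interbeat-interval series standing in for a long heartbeat record
        rng = np.random.default_rng(1)
        rr = 1.0 + 0.05 * np.cumsum(rng.normal(size=4096)) / 50 + 0.02 * rng.normal(size=4096)

        scales = np.array([8, 16, 32, 64, 128, 256])
        F = dfa(rr, scales)
        alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]   # scaling exponent
        print("DFA scaling exponent alpha:", round(alpha, 2))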

  4. Long-term fertilization alters chemically-separated soil organic carbon pools: Based on stable C isotope analyses

    Science.gov (United States)

    Dou, Xiaolin; He, Ping; Cheng, Xiaoli; Zhou, Wei

    2016-01-01

    Quantification of the dynamics of soil organic carbon (SOC) pools under the influence of long-term fertilization is essential for predicting carbon (C) sequestration. We combined soil chemical fractionation with stable C isotope analyses to investigate the C dynamics of the various SOC pools after 25 years of fertilization. Five types of soil samples (0-20, 20-40 cm), including the initial level (CK) and four fertilization treatments (inorganic nitrogen fertilizer, IN; balanced inorganic fertilizer, NPK; inorganic fertilizer plus farmyard manure, MNPK; inorganic fertilizer plus corn straw residue, SNPK), were separated into recalcitrant and labile fractions, and the fractions were analysed for C content, C:N ratios, δ13C values, and soil C and N recalcitrance indexes (RIC and RIN). Chemical fractionation showed that long-term MNPK fertilization strongly increased the SOC storage in both soil layers (0-20 cm: 1492.4 g C m-2; 20-40 cm: 1770.6 g C m-2) because of enhanced recalcitrant C (RC) and labile C (LC). The 25 years of inorganic fertilizer treatment did not increase the SOC storage, mainly because of the offsetting effects of enhanced RC and decreased LC, whereas the absence of a clear SOC increase under SNPK fertilization resulted from fast decay rates of soil C.

  5. Analysing the Correlation between Social Network Analysis Measures and Performance of Students in Social Network-Based Engineering Education

    Science.gov (United States)

    Putnik, Goran; Costa, Eric; Alves, Cátia; Castro, Hélio; Varela, Leonilde; Shah, Vaibhav

    2016-01-01

    Social network-based engineering education (SNEE) is designed and implemented as a model of Education 3.0 paradigm. SNEE represents a new learning methodology, which is based on the concept of social networks and represents an extended model of project-led education. The concept of social networks was applied in the real-life experiment,…

  6. Studies and analyses of the space shuttle main engine. Failure information propagation model data base and software

    Science.gov (United States)

    Tischer, A. E.

    1987-01-01

    The failure information propagation model (FIPM) data base was developed to store and manipulate the large amount of information anticipated for the various Space Shuttle Main Engine (SSME) FIPMs. The organization and structure of the FIPM data base is described, including a summary of the data fields and key attributes associated with each FIPM data file. The menu-driven software developed to facilitate and control the entry, modification, and listing of data base records is also discussed. The transfer of the FIPM data base and software to the NASA Marshall Space Flight Center is described. Complete listings of all of the data base definition commands and software procedures are included in the appendixes.

  7. A robust University-NGO partnership: Analysing school efficiencies in Bolivia with community-based management techniques

    Directory of Open Access Journals (Sweden)

    Joao Neiva de Figueiredo

    2013-09-01

    Full Text Available Community-based management research is a collaborative effort between management, academics and communities in need with the specific goal of achieving social change to foster social justice. Because it is designed to promote and validate joint methods of discovery and community-based sources of knowledge, community-based management research has several unique characteristics, which may affect its execution. This article describes the process of a community-based management research project which is descriptive in nature and uses quantitative techniques to examine school efficiencies in low-income communities in a developing country – Bolivia. The article describes the partnership between a US-based university and a Bolivian not-for-profit organisation, the research context and the history of the research project, including its various phases. It focuses on the (yet unpublished) process of the community-based research as opposed to its content (which has been published elsewhere). The article also makes the case that the robust partnership between the US-based university and the Bolivian NGO has been a determining factor in achieving positive results. Strengths and limitations are examined in the hope that the experience may be helpful to others conducting descriptive quantitative management research using community-engaged frameworks in cross-cultural settings. Keywords: international partnership, community-engaged scholarship, education efficiency, multicultural low-income education.

  8. A novel method of analysing the damping function of VSC based facts control : multi-machine power systems

    Energy Technology Data Exchange (ETDEWEB)

    Du, W. [Southeast Univ., Nanjing (China)]|[Bath Univ., Bath (United Kingdom)]; Wang, H.F. [Queen's Univ. of Belfast, Belfast (United Kingdom)]; Dunn, R. [Bath Univ., Bath (United Kingdom)]

    2008-07-01

    Oscillations in power transmission can be mitigated by adding a supplementary damping function to the normal control task of a flexible AC transmission systems (FACTS). When properly designed, FACTS stabilizers can contribute positive damping to power system oscillations. This paper proposed and presented a general analytical approach to study the effectiveness of damping control. The approach was used to describe the damping function of voltage source converter (VSC) based FACTS stabilizers. The proposed method was derived based on the equal-area criterion and small-signal stability analysis. The paper discussed a single-machine infinite-bus power system installed with a VSC based FACTS device; the damping effect of VSC based FACTS damping control; and demonstrations of example power systems. These examples included a shunt VSC based FACTS with damping control; interaction of FACTS voltage and damping control; feedback signal of FACTS damping control; and extension to the case of a multi-machine power system. The effectiveness of VSC based FACTS stabilizers in damping power system oscillations was clearly explained with the proposed method. The proposed method could also be readily extended to multi-machine power systems. 6 refs., 7 figs.
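
    The small-signal part of such an assessment can be illustrated with the linearised swing equation of a single-machine infinite-bus system, where a supplementary damping term stands in for the contribution of a VSC-based FACTS stabilizer. All parameter values below are illustrative assumptions, not taken from the paper.

        import numpy as np

        # Linearised swing equation of a single-machine infinite-bus system:
        #   d(d_delta)/dt = w_s * d_omega
        #   M d(d_omega)/dt = -K1*d_delta - (D + D_sup)*d_omega
        # D_sup stands in for the supplementary damping contributed by a VSC-based
        # FACTS stabilizer; all numbers are illustrative.
        w_s = 2 * np.pi * 50      # synchronous speed, rad/s
        M   = 8.0                 # inertia term (2H), s
        K1  = 1.2                 # synchronising torque coefficient, pu
        D   = 0.5                 # natural damping, pu

        def damping_ratio(D_sup):
            A = np.array([[0.0, w_s],
                          [-K1 / M, -(D + D_sup) / M]])
            lam = np.linalg.eigvals(A)
            osc = lam[np.argmax(np.abs(lam.imag))]   # electromechanical mode
            return -osc.real / abs(osc)

        for D_sup in (0.0, 2.0, 5.0):
            print(f"D_sup = {D_sup:4.1f} pu  ->  damping ratio = {damping_ratio(D_sup):.3f}")

    A positive shift in the damping ratio with increasing D_sup is what a well-tuned supplementary damping controller is expected to deliver.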

  9. Präferenzmessung für Automobile mit alternativen Antriebssystemen - Eine Anwendung adaptiver hybrider Verfahren der Choice-based-Conjoint-Analyse

    OpenAIRE

    Bauer, Robert

    2015-01-01

    In this thesis, preferences for automobiles with alternative drive systems in Germany are investigated. Adaptive hybrid methods of choice-based conjoint analysis are used for the preference analysis. The results of the empirical study show that the existing heterogeneity of drivetrain preferences can be explained by individual psychographic characteristics. For example, respondents with particularly strong environmental awareness prefer battery-electric drivetrains in comparis...

  10. Development and validation of a preference based measure derived from the Cambridge Pulmonary Hypertension Outcome Review (CAMPHOR) for use in cost utility analyses

    OpenAIRE

    Meads David M; Ratcliffe Julie; McKenna Stephen P; Brazier John E

    2008-01-01

    Abstract Background Pulmonary Hypertension is a severe and incurable disease with poor prognosis. A suite of new disease-specific measures – the Cambridge Pulmonary Hypertension Outcome Review (CAMPHOR) – was recently developed for use in this condition. The purpose of this study was to develop and validate a preference based measure from the CAMPHOR that could be used in cost-utility analyses. Methods Items were selected that covered major issues covered by the CAMPHOR QoL scale (activities,...

  11. Waste dumps rehabilitation measures based on physico-chemical analyses in Zăghid mining area (Sălaj County, Romania)

    OpenAIRE

    Ildiko M. Varga; Ramona Bălc; Cristian V. Maloş; Cristian Şamşudean; Florin Borbei

    2011-01-01

    The present study deals with an abandoned coal mine from the Zăghid area, North-Western Transylvanian Basin (Sălaj County). The mining activity was stopped in 2005, without any attempt at ecological rehabilitation of the mined area and especially of the waste dumps left behind. The proposed rehabilitation models are based on physical-chemical analyses of soil and waste samples (e.g. pH, EC, salinity, humidity, porosity, density, plasticity, organic substances, mineralogical composition, heavy metals...

  12. Development of a standard data base for FBR core nuclear design. 7. Advances in JUPITER experiment analyses

    International Nuclear Information System (INIS)

    The present report compiles the advances in the experiment analyses of JUPITER, a joint research program between the U.S. DOE and PNC of Japan carried out at the Zero Power Physics Reactor (ZPPR) large fast critical facility at ANL-Idaho from 1978 to 1988. The advances are the use of the latest nuclear data library and the application of analytical methods which treat mechanisms in more detail or use fewer modeling approximations. As a result of using the latest nuclear data library, C/E values of nearly all characteristics approached unity, and the discrepancies between cores were reduced; thus the latest data library is shown to be effective for the analysis of nuclear characteristics. Further, the advances in analytical methods brought C/E values closer to unity and clarified the causes of differences between the calculational and experimental values. It is judged that the JUPITER integral data are very effective for the production of the unified nuclear constants intended for the core design of the demonstration fast breeder reactor. Furthermore, for improved accuracy of the analytical system, an advance in analytical methods for the evaluation of flux distributions in blanket regions is very effective. In particular, the accuracy for radially heterogeneous cores would be greatly improved by such an advance. (J.P.N.). 146 refs

  13. ATOP - The Advanced Taiwan Ocean Prediction System Based on the mpiPOM. Part 1: Model Descriptions, Analyses and Results

    Directory of Open Access Journals (Sweden)

    Leo Oey

    2013-01-01

    Full Text Available A data-assimilated Taiwan Ocean Prediction (ATOP) system is being developed at the National Central University, Taiwan. The model simulates sea-surface height, three-dimensional currents, temperature and salinity and turbulent mixing. The model has options for tracer and particle-tracking algorithms, as well as for wave-induced Stokes drift and wave-enhanced mixing and bottom drag. Two different forecast domains have been tested: a large-grid domain that encompasses the entire North Pacific Ocean at 0.1° × 0.1° horizontal resolution and 41 vertical sigma levels, and a smaller western North Pacific domain which at present also has the same horizontal resolution. In both domains, 25-year spin-up runs from 1988 - 2011 were first conducted, forced by six-hourly Cross-Calibrated Multi-Platform (CCMP) and NCEP reanalysis Global Forecast System (GFS) winds. The results are then used as initial conditions to conduct ocean analyses from January 2012 through February 2012, when updated hindcasts and real-time forecasts begin using the GFS winds. This paper describes the ATOP system and compares the forecast results against satellite altimetry data for assessing model skills. The model results are also shown to compare well with observations of (i) the Kuroshio intrusion in the northern South China Sea, and (ii) the Subtropical Countercurrent. A review and comparison with other models in the literature are also given.

  14. Effects of can parameters on canned-forging process of TiAl base alloy(Ⅰ)--Microstructural analyses

    Institute of Scientific and Technical Information of China (English)

    刘咏; 韦伟峰; 黄伯云; 何双珍; 周科朝; 贺跃辉

    2002-01-01

    By using the thermal simulation technique, the conventional canned-forging process of a TiAl based alloy was studied, and the effect of can parameters on the microstructures of the TiAl alloy was analyzed. The results show that the deformation microstructure of the TiAl based alloy without canning is inhomogeneous: in the lateral area, cracks and shearing lines can be found, while in the central area a fine-grained shearing zone can be found. The effect of the can is to reduce the secondary tensile stress; however, this effect is obtained only when the deformation of the steel can is consistent with that of the TiAl alloy ingot. Moreover, a thick can enhances the microstructural homogeneity of the TiAl based alloy. As the H/D ratio of the ingot increases, the deformation of the TiAl alloy becomes more unsteady; therefore, a thicker can is needed.

  15. Breast density changes associated with postmenopausal hormone replacement therapy: post hoc radiologist- and computer-based analyses

    DEFF Research Database (Denmark)

    Nielsen, Mads; Pettersen, Paola; Alexandersen, P;

    2010-01-01

    (1 mg) continuously combined with drospirenone (2 mg) was administered to postmenopausal women for up to 2 years (26 treatment cycles, 28 d/cycle) in a randomized, placebo-controlled trial. This post hoc analysis assessed the changes in breast density measured from digitized images by two radiologist...... mineral density at the spine and femur were also assessed. Results: Breast density assessed by the radiologist-based approaches increased significantly from baseline in the HT group (P < 0.01), with significant divergence from placebo at 2 years (P < 0.01). Heterogeneity examination of radiograph score by...... computer-based technique was unchanged in the HT group and decreased significantly with placebo (P < 0.001) to produce a significant group divergence (P < 0.05). Changes in mammographic markers by radiologist- and computer-based approaches correlated with each other in the HT group (P < 0.01) but not in...

  16. Are decisions using cost-utility analyses robust to choice of SF-36/SF-12 preference-based algorithm?

    Directory of Open Access Journals (Sweden)

    Walton Surrey M

    2005-03-01

    Full Text Available Abstract Background Cost utility analysis (CUA) using SF-36/SF-12 data has been facilitated by the development of several preference-based algorithms. The purpose of this study was to illustrate how decision-making could be affected by the choice of preference-based algorithms for the SF-36 and SF-12, and provide some guidance on selecting an appropriate algorithm. Methods Two sets of data were used: (1) a clinical trial of adult asthma patients; and (2) a longitudinal study of post-stroke patients. Incremental costs were assumed to be $2000 per year over standard treatment, and QALY gains realized over a 1-year period. Ten published algorithms were identified, denoted by first author: Brazier (SF-36), Brazier (SF-12), Shmueli, Fryback, Lundberg, Nichol, Franks (3 algorithms), and Lawrence. Incremental cost-utility ratios (ICURs) for each algorithm, stated in dollars per quality-adjusted life year ($/QALY), were ranked and compared between datasets. Results In the asthma patients, estimated ICURs ranged from Lawrence's SF-12 algorithm at $30,769/QALY (95% CI: 26,316 to 36,697) to Brazier's SF-36 algorithm at $63,492/QALY (95% CI: 48,780 to 83,333). ICURs for the stroke cohort varied slightly more dramatically. The MEPS-based algorithm by Franks et al. provided the lowest ICUR at $27,972/QALY (95% CI: 20,942 to 41,667). The Fryback and Shmueli algorithms provided ICURs that were greater than $50,000/QALY and did not have confidence intervals that overlapped with most of the other algorithms. The ICUR-based ranking of algorithms was strongly correlated between the asthma and stroke datasets (r = 0.60). Conclusion SF-36/SF-12 preference-based algorithms produced a wide range of ICURs that could potentially lead to different reimbursement decisions. Brazier's SF-36 and SF-12 algorithms have a strong methodological and theoretical basis and tended to generate relatively higher ICUR estimates, considerations that support a preference for these algorithms over the
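
    The underlying arithmetic is simply ICUR = incremental cost / incremental QALYs. With the study's assumed $2000 incremental cost, the 1-year utility gains implied by the reported asthma-cohort ICURs can be back-calculated and reused to show how the ratio shifts across algorithms.

        # ICUR = incremental cost / incremental QALYs. The $2000 incremental cost is
        # the study's assumption; the utility gains below are back-calculated from
        # the ICURs quoted in the abstract.
        incremental_cost = 2000.0          # dollars per year
        utility_gain = {                   # QALYs gained over the 1-year horizon
            "Lawrence (SF-12)": 0.0650,    # -> ~$30,769/QALY
            "Franks (MEPS)":    0.0715,    # -> ~$27,972/QALY
            "Brazier (SF-36)":  0.0315,    # -> ~$63,492/QALY
        }
        for algorithm, dq in utility_gain.items():
            print(f"{algorithm:18s} ICUR = ${incremental_cost / dq:,.0f}/QALY")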

  17. Ein empirischer Vergleich der Prozessaufzeichnungsmethoden Mouselab und Eyetracking bei Präferenzmessungen mittels Choice-based Conjoint Analyse

    DEFF Research Database (Denmark)

    Meissner, Martin; Decker, Reinhold; Pfeiffer, Jella

    2010-01-01

    In choice-based conjoint (CBC) analysis respondents’ decisions in choice settings are used to determine relevant attributes and attribute levels of the products considered. Yet, the cognitive process preceding the choice decision is usually ignored. The eye tracking technique can be used to gain...

  18. Re-examining the phylogeny of clinically relevant Candida species and allied genera based on multigene analyses

    NARCIS (Netherlands)

    Tsui, Clement K M; Daniel, Heide-Marie; Robert, Vincent; Meyer, Wieland

    2008-01-01

    Yeasts of the artificial genus Candida include plant endophytes, insect symbionts, and opportunistic human pathogens. Phylogenies based on rRNA gene and actin sequences confirmed that the genus is not monophyletic, and the relationships among Candida species and allied teleomorph genera are not clear

  19. A Structural Equation Model to Analyse the Antecedents to Students' Web-Based Problem-Solving Performance

    Science.gov (United States)

    Hwang, Gwo-Jen; Kuo, Fan-Ray

    2015-01-01

    Web-based problem-solving, a compound ability of critical thinking, creative thinking, reasoning thinking and information-searching abilities, has been recognised as an important competence for elementary school students. Some researchers have reported the possible correlations between problem-solving competence and information searching ability;…

  20. Multi-parametric cytometry from a complex cellular sample: Improvements and limits of manual versus computational-based interactive analyses.

    Science.gov (United States)

    Gondois-Rey, F; Granjeaud, S; Rouillier, P; Rioualen, C; Bidaut, G; Olive, D

    2016-05-01

    The wide possibilities opened by the development of multi-parametric cytometry are limited by the inadequacy of classical methods of analysis for the multi-dimensional character of the data. While new computational tools seem ideally adapted and have been applied successfully, their adoption is still low among flow cytometrists. With the aim of integrating unsupervised computational tools into the management of multi-stained samples, we investigated their advantages and limits by comparison with manual gating on a typical sample analyzed in routine immunomonitoring. A single tube of PBMC, containing 11 populations characterized by different sizes and stained with 9 fluorescent markers, was used. We investigated the impact of the strategy choice on manual gating variability, an undocumented pitfall of the analysis process, and we identified rules to optimize it. While assessing automatic gating as an alternative, we introduced the Multi-Experiment Viewer software (MeV) and validated it for merging clusters and interactively annotating populations. This procedure allowed the discovery of both targeted and unexpected populations. However, careful examination of computed clusters in standard dot plots revealed some heterogeneity, often below 10%, that was overcome by increasing the number of clusters to be computed. MeV facilitated the identification of populations by displaying both the MFI and the marker signature of the dataset simultaneously. The procedure described here appears fully adapted to the homogeneous management of large numbers of multi-stained samples and improves multi-parametric analyses in a way close to the classic approach. © 2016 International Society for Advancement of Cytometry. PMID:27059253

  1. Metagenome-based diversity analyses suggest a significant contribution of non-cyanobacterial lineages to carbonate precipitation in modern microbialites

    Directory of Open Access Journals (Sweden)

    Purificacion Lopez-Garcia

    2015-08-01

    Full Text Available Cyanobacteria are thought to play a key role in carbonate formation due to their metabolic activity, but other organisms carrying out oxygenic photosynthesis (photosynthetic eukaryotes) or other metabolisms (e.g. anoxygenic photosynthesis, sulfate reduction) may also contribute to carbonate formation. To obtain more quantitative information than that provided by more classical PCR-dependent methods, we studied the microbial diversity of microbialites from the Alchichica crater lake (Mexico) by mining for 16S/18S rRNA genes in metagenomes obtained by direct sequencing of environmental DNA. We studied samples collected at the Western (AL-W) and Northern (AL-N) shores of the lake and, at the latter site, along a depth gradient (1, 5, 10 and 15 m depth). The associated microbial communities were mainly composed of bacteria, most of which seemed heterotrophic, whereas archaea were negligible. Eukaryotes composed a relatively minor fraction dominated by photosynthetic lineages, diatoms in AL-W, influenced by Si-rich seepage waters, and green algae in AL-N samples. Members of the Gammaproteobacteria and Alphaproteobacteria classes of Proteobacteria, Cyanobacteria and Bacteroidetes were the most abundant bacterial taxa, followed by Planctomycetes, Deltaproteobacteria (Proteobacteria), Verrucomicrobia, Actinobacteria, Firmicutes and Chloroflexi. Community composition varied among sites and with depth. Although cyanobacteria were the most important bacterial group contributing to the carbonate precipitation potential, photosynthetic eukaryotes, anoxygenic photosynthesizers and sulfate reducers were also very abundant. Cyanobacteria affiliated to Pleurocapsales largely increased with depth. Scanning electron microscopy (SEM) observations showed considerable areas of aragonite-encrusted Pleurocapsa-like cyanobacteria at microscale. Multivariate statistical analyses showed a strong positive correlation of Pleurocapsales and Chroococcales with aragonite formation at

  2. A systematic literature review on reviews and meta-analyses of biologically based CAM-practices for cancer patients

    DEFF Research Database (Denmark)

    Paludan-Müller, Christine; Lunde, Anita; Johannessen, Helle

    2010-01-01

    for the improvement of quality of life. Breast cancer was the most common single type of cancer reviewed (8 reviews), all focused on the relief of side effects, primarily by supplements containing soy/plant hormones. The use of these supplements should be discouraged due to a risk for progression of breast cancer...... patients. There is a need for future studies on specific type of practices in relation to side effects and quality of life parameters. Research Methods: -Systematic reviews, meta analysis Funding: The research was funded by The Danish Cancer Society and the University of Southern Denmark.......Purpose To provide an overview and evaluate the evidence of biologically based CAM-practices for cancer patients. Methods Pubmed, Social Science Citation Index, AMED and the Cochrane library were systematically searched for reviews on effects of biologically based CAM-practices, including herbal...

  3. Urban gravity model based on cross-correlation function and Fourier analyses of spatio-temporal process

    International Nuclear Information System (INIS)

    This paper is devoted to gaining new insight into urban physics. The conventional urban gravity model based on Newton's Law of Universal Gravitation is recast into a new expression based on a cross-correlation function. Endowed with a time-lag parameter and time functions, the developed model can integrate the temporal dimension into the spatial processes of cities. A pair of gravity spectra can be obtained for the spatial interaction of any two cities via the Fourier transform, and the series of attraction quantities is shown to equal the average values of the interaction volumes from the traditional model. The method is applied to four cities in China, illustrating how to employ the improved model to characterize the spatio-temporal process of urban interaction. The new gravity model reveals the relationship between the concept of energy and the notion of interaction, and suggests asymmetric interaction between cities, commonly observed in the real world.
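
    The core idea, replacing the static size product of the gravity model with a lagged cross-correlation of two city-size time series and then taking its Fourier transform to obtain gravity spectra, can be sketched as follows. The series are synthetic, and the exact functional form (normalisation, distance-decay exponent) is an assumption for illustration, not the paper's formula.

        import numpy as np

        # Two synthetic city-size series (e.g., annual population or output); the
        # lagged cross-correlation replaces the static size product so interaction
        # becomes a function of time lag.
        rng = np.random.default_rng(2)
        t = np.arange(64)
        city_i = 100 + 2.0 * t + 5 * rng.normal(size=t.size)
        city_j = 80 + 1.5 * t + 5 * rng.normal(size=t.size)
        r = 50.0                                    # distance between the cities
        b = 2.0                                     # distance-decay exponent

        def cross_corr(x, y, lag):
            """Sample cross-covariance of x and y at a non-negative lag."""
            x, y = x - x.mean(), y - y.mean()
            return np.sum(x[:len(x) - lag] * y[lag:]) / len(x)

        lags = range(0, 11)
        gravity = [cross_corr(city_i, city_j, k) / r**b for k in lags]  # lagged "gravity"
        print("lag-0 interaction:", round(gravity[0], 2))

        # "gravity spectra": Fourier transform of the lagged interaction series
        spectrum = np.abs(np.fft.rfft(gravity))
        print("spectral amplitudes:", np.round(spectrum, 2))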

  4. High resolution monitoring of marine protists based on an observation strategy integrating automated on board ship filtration and molecular analyses

    OpenAIRE

    Metfies, Katja; Schroeder, Friedhelm; Hessel, Johanna; Wollschlaeger, Jochen; Micheller, Sebastian; Wolf, Christian; Kilias, Estelle; Sprong, Pim; Neuhaus, Stefan; Frickenhaus, Stephan; Petersen, Wilhelm

    2016-01-01

    Information on recent photosynthetic biomass distribution and biogeography with adequate temporal and spatial resolution is urgently needed to better understand consequences of environmental change for marine ecosystems. Here we introduce and review a molecular-based observation strategy for high resolution assessment of marine protists in space and time. The observation strategy is the result of extensive technology developments, adaptations and evaluations which are documented in a number o...

  5. Aviation and programmatic analyses; Volume 1, Task 1: Aviation data base development and application. [for NASA OAST programs

    Science.gov (United States)

    1977-01-01

    A method was developed for using the NASA aviation data base and computer programs in conjunction with the GE management analysis and projection service to perform simple and complex economic analysis for planning, forecasting, and evaluating OAST programs. Capabilities of the system are discussed along with procedures for making basic data tabulations, updates and entries. The system is applied in an agricultural aviation study in order to assess its value for actual utility in the OAST working environment.

  6. SNP array-based copy number and genotype analyses for preimplantation genetic diagnosis of human unbalanced translocations

    OpenAIRE

    van Uum, Chris MJ; Stevens, Servi JC; Dreesen, Joseph CFM; Drüsedau, Marion; Smeets, Hubert J.; Hollanders-Crombach, Bertien; Die-Smulders, Christine EM de; Geraedts, Joep PM; Engelen, John JM; Coonen, Edith

    2012-01-01

    Preimplantation genetic diagnosis (PGD) for chromosomal rearrangements (CR) is mainly based on fluorescence in situ hybridisation (FISH). Application of this technique is limited by the number of available fluorochromes, the extensive preclinical work-up and technical and interpretative artefacts. We aimed to develop a universal, off-the-shelf protocol for PGD by combining single-nucleotide polymorphism (SNP) array-derived copy number (CN) determination and genotyping for detection of unbalan...

  7. The Implementation of Interventions for Problem Behavior Based on the Results of Precursor Functional Analyses in the Early Childhood Setting

    OpenAIRE

    Halversen, Hayley

    2016-01-01

    This study consisted of three parts. We first used a video observation method and statistical analysis to identify benign behaviors that occurred before the problem behavior. These benign behaviors are known as precursor behaviors. We then used a precursor functional analysis to identify the function of the precursor behaviors. Lastly, we developed and implemented an intervention based on the results of the precursor functional analysis. The interventions effectively reduced problem behavior ...

  8. Flood Mapping and Flood Dynamics of the Mekong Delta: ENVISAT-ASAR-WSM Based Time Series Analyses

    OpenAIRE

    Künzer, Claudia; Guo, Huadong; Huth, Juliane; Leinenkugel, Patrick; Li, Xinwu; Dech, Stefan

    2013-01-01

    Satellite remote sensing is a valuable tool for monitoring flooding. Microwave sensors are especially appropriate instruments, as they allow the differentiation of inundated from non-inundated areas, regardless of levels of solar illumination or frequency of cloud cover in regions experiencing substantial rainy seasons. In the current study we present the longest synthetic aperture radar-based time series of flood and inundation information derived for the Mekong Delta that has been analyzed ...

  9. Thermometrie, Säure-Base-Biamperometrie und wechselspannungsbasierte Biamperometrie als neuartige Indikationsverfahren in der Maßanalyse

    OpenAIRE

    Sobieszuk, Grzegorz

    2013-01-01

    Titration is one of the most important methods of quantitative analysis in the pharmaceutical area. The difficult task within this field has always been the determination of the equivalence points of the chemical reactions. The long history of titration encompasses the use of colorful plant extracts, synthetic chemical compounds such as phenolphthalein, and instrumental electrochemical methods such as potentiometry. The latest research field concerning titration conc...

  10. Network-Based Meta-Analyses of Associations of Multiple Gene Expression Profiles with Bone Mineral Density Variations in Women

    OpenAIRE

    He, Hao; Cao, Shaolong; Niu, Tianhua; Zhou, Yu; Zhang, Lan; Zeng, Yong; Zhu, Wei; Wang, Yu-Ping; Deng, Hong-Wen

    2016-01-01

    Background Existing microarray studies of bone mineral density (BMD) have been critical for understanding the pathophysiology of osteoporosis, and have identified a number of candidate genes. However, these studies were limited by their relatively small sample sizes and were usually analyzed individually. Here, we propose a novel network-based meta-analysis approach that combines data across six microarray studies to identify functional modules from human protein-protein interaction (PPI) dat...
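
    As a minimal stand-in for the cross-study combination step, per-gene evidence can be pooled across studies (here with Stouffer's method) and mapped onto PPI edges to seed candidate modules. Gene names, p-values and edges below are toy values, and the authors' actual network-based module-search procedure is more sophisticated.

        import numpy as np
        from scipy.stats import norm

        # Toy per-gene p-values from several expression studies and a few PPI edges;
        # the real approach combines six microarray studies and searches for modules.
        pvals = {
            "SOST":    [0.01, 0.03, 0.20, 0.04],
            "LRP5":    [0.02, 0.05, 0.04, 0.10],
            "TNFSF11": [0.50, 0.30, 0.40, 0.60],
            "ESR1":    [0.03, 0.08, 0.02, 0.07],
        }
        ppi_edges = [("SOST", "LRP5"), ("LRP5", "ESR1"), ("LRP5", "TNFSF11")]

        def stouffer(ps):
            """Combine one-sided p-values across studies with equal weights."""
            z = norm.isf(np.asarray(ps))          # p -> z
            return norm.sf(z.sum() / np.sqrt(len(ps)))

        combined = {g: stouffer(p) for g, p in pvals.items()}
        print({g: round(p, 4) for g, p in combined.items()})

        # crude module seed: keep PPI edges whose both endpoints are significant
        module = [e for e in ppi_edges if combined[e[0]] < 0.05 and combined[e[1]] < 0.05]
        print("candidate module edges:", module)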

  11. Analysing value-based management as decision-making tool in a petrochemical company / Zonwabele Zweli Tom

    OpenAIRE

    Tom, Zonwabele Zweli

    2014-01-01

    The study aims to evaluate the understanding of value-based management (VBM) as a decision-making tool, how it is embraced at all management levels, and its impact on the performance of a petrochemical company. The application of VBM links business strategy, finance, performance management and management processes together to create value. VBM is a powerful management framework that aims to focus all managerial processes on shareholder value creation. It encourages employees at all...

  12. An automatic generation of non-uniform mesh for CFD analyses of image-based multiscale human airway models

    Science.gov (United States)

    Miyawaki, Shinjiro; Tawhai, Merryn H.; Hoffman, Eric A.; Lin, Ching-Long

    2014-11-01

    The authors have developed a method to automatically generate non-uniform CFD mesh for image-based human airway models. The sizes of generated tetrahedral elements vary in both radial and longitudinal directions to account for boundary layer and multiscale nature of pulmonary airflow. The proposed method takes advantage of our previously developed centerline-based geometry reconstruction method. In order to generate the mesh branch by branch in parallel, we used the open-source programs Gmsh and TetGen for surface and volume meshes, respectively. Both programs can specify element sizes by means of background mesh. The size of an arbitrary element in the domain is a function of wall distance, element size on the wall, and element size at the center of airway lumen. The element sizes on the wall are computed based on local flow rate and airway diameter. The total number of elements in the non-uniform mesh (10 M) was about half of that in the uniform mesh, although the computational time for the non-uniform mesh was about twice longer (170 min). The proposed method generates CFD meshes with fine elements near the wall and smooth variation of element size in longitudinal direction, which are required, e.g., for simulations with high flow rate. NIH Grants R01-HL094315, U01-HL114494, and S10-RR022421. Computer time provided by XSEDE.
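
    A sizing function of the kind described, with element size varying with wall distance between a wall value set from local flow rate and diameter and a coarser value at the lumen centre, might look like the sketch below. The specific formulas (a Reynolds-number-based wall size and a linear blend) are assumptions for illustration, not the authors' expressions.

        import numpy as np

        def wall_element_size(flow_rate, diameter, nu=1.5e-5, frac=0.1):
            """Wall element size from a crude boundary-layer estimate.
            Uses delta ~ D / sqrt(Re) and takes a fraction of it; this formula is an
            assumption for illustration only."""
            area = np.pi * diameter**2 / 4.0
            Re = (flow_rate / area) * diameter / nu        # mean-velocity Reynolds number
            delta = diameter / np.sqrt(max(Re, 1.0))       # boundary-layer thickness scale
            return frac * delta

        def element_size(wall_distance, diameter, h_wall, h_center):
            """Blend element size from h_wall at the wall to h_center at the lumen centre."""
            xi = np.clip(2.0 * wall_distance / diameter, 0.0, 1.0)   # 0 at wall, 1 at centre
            return h_wall + xi * (h_center - h_wall)

        D = 0.012                 # airway diameter, m (illustrative)
        Q = 3.0e-4                # local flow rate, m^3/s (illustrative)
        h_w = wall_element_size(Q, D)
        h_c = D / 10.0            # coarser target size at the centre of the lumen

        for d in (0.0, 0.001, 0.003, 0.006):
            print(f"wall distance {d*1000:4.1f} mm -> element size {element_size(d, D, h_w, h_c)*1000:.3f} mm")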

  13. Characteristics and molecular phylogeny of Fasciola flukes from Bangladesh, determined based on spermatogenesis and nuclear and mitochondrial DNA analyses.

    Science.gov (United States)

    Mohanta, Uday Kumar; Ichikawa-Seki, Madoka; Shoriki, Takuya; Katakura, Ken; Itagaki, Tadashi

    2014-07-01

    This study aimed to precisely discriminate Fasciola spp. based on DNA sequences of nuclear internal transcribed spacer 1 (ITS1) and mitochondrial nicotinamide adenine dinucleotide (NADH) dehydrogenase subunit 1 (nad1) gene. We collected 150 adult flukes from the bile ducts of cattle, buffaloes, sheep, and goats from six different regions of Bangladesh. Spermatogenic status was determined by analyzing stained seminal vesicles. The ITS1 types were analyzed using the polymerase chain reaction-restriction fragment length polymorphism (PCR-RFLP) method. The nad1 haplotypes were identified based on PCR and direct sequencing and analyzed phylogenetically by comparing with nad1 haplotypes of Fasciola spp. from other Asian countries. Of the 127 aspermic flukes, 98 were identified as Fg type in ITS1, whereas 29 were identified as Fh/Fg type, indicating a combination of ITS1 sequences of Fasciola hepatica and Fasciola gigantica. All the 127 aspermic flukes showed Fsp-NDI-Bd11 in nad1 haplotype with nucleotide sequences identical to aspermic Fasciola sp. from Asian countries. Further, 20 spermic flukes were identified as F. gigantica based on their spermatogenic status and Fg type in ITS1. F. gigantica population was thought to be introduced into Bangladesh considerably earlier than the aspermic Fasciola sp. because 11 haplotypes with high haplotype diversity were detected from the F. gigantica population. However, three flukes from Bangladesh could not be precisely identified, because their spermatogenic status, ITS1 types, and nad1 haplotypes were ambiguous. Therefore, developing a robust method to distinguish aspermic Fasciola sp. from other Fasciola species is necessary in the future. PMID:24781019

  14. Energy and exergy based performance analyses of a solid oxide fuel cell integrated combined cycle power plant

    International Nuclear Information System (INIS)

    Highlights: • Energy and exergy based performance of an SOFC integrated combined cycle is presented. • The system utilizes the GT exhaust for fuel preheating, air preheating and steam generation. • The study considers the effect of additional fuel burning in the combustion chamber. • A detailed parametric analysis is presented to show the effect of various operating parameters. • System performance is compared with another system with the air recuperator before the fuel recuperator. - Abstract: This article provides the energy and exergy based performance analysis of a solid oxide fuel cell (SOFC) – gas turbine (GT) – steam turbine (ST) combined cycle power plant. The system utilizes the GT exhaust heat for fuel and air preheating subsequently in a fuel recuperator (FR) and an air recuperator (AR) before finally producing steam in a heat recovery steam generator (HRSG) coupled with the ST cycle. It considers 30% external reforming in a pre-reformer (PR) by steam extracted from the bottoming ST plant. The study considers the effect of additional fuel burning in the combustion chamber (CC) as a means for increasing the net GT and ST power output. A detailed parametric analysis based on variation of compressor pressure ratio (CPR), fuel flow rate (FFR), air flow rate (AFR), current density, single level boiler pressure and ST inlet temperature (STIT) is also provided. Results indicate improved system performance at higher CPR. The optimum single level boiler pressure is found to be 40 bar with 50% additional fuel burning. Burning of additional fuel improves the GT and ST power output, albeit with a reduction in the plant’s overall efficiency. A further comparison with a similar system in which the AR is placed ahead of the FR indicates slightly better performance of the proposed system with FR ahead of AR (FRAOAR)
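
    The first-law and second-law efficiency definitions behind such an analysis are straightforward; the sketch below evaluates them for illustrative fuel and power figures that are not results from the paper.

        # First-law (energy) and second-law (exergy) efficiencies of a combined plant:
        #   eta_I  = W_net / (m_fuel * LHV)
        #   eta_II = W_net / (m_fuel * e_ch)
        # Numbers are illustrative placeholders.
        m_fuel = 0.10          # methane mass flow, kg/s
        LHV    = 50.0e6        # lower heating value of CH4, J/kg (approx.)
        e_ch   = 51.8e6        # standard chemical exergy of CH4, J/kg (approx.)

        W_sofc = 1.8e6         # SOFC electrical power, W
        W_gt   = 0.9e6         # gas-turbine power, W
        W_st   = 0.5e6         # steam-turbine power, W
        W_net  = W_sofc + W_gt + W_st

        eta_energy = W_net / (m_fuel * LHV)
        eta_exergy = W_net / (m_fuel * e_ch)
        print(f"net power        : {W_net/1e6:.2f} MW")
        print(f"energy efficiency: {eta_energy:.1%}")
        print(f"exergy efficiency: {eta_exergy:.1%}")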

  15. Analyses on gravity variation before and after the Lijiang earthquake based on a finite rectangular dislocation model

    Institute of Scientific and Technical Information of China (English)

    燕乃玲; 李辉; 申重阳

    2003-01-01

    The methods to calculate the gravity variation due to crustal deformation were discussed based on a model of dislocation on a finite rectangular plane. Taking the Lijiang MS=7.0 earthquake as an example, the principle for determining the fault parameters was described and the results were given. Of particular interest were the characteristics of the gravity variations for different dislocation types. A comparison between the calculated results and the practical measurements showed that the model could to some extent account for the observations, but it failed to explain the spatial gravity variation farther from the fault.

  16. Comparative physical-chemical characterization of encapsulated lipid-based isotretinoin products assessed by particle size distribution and thermal behavior analyses

    International Nuclear Information System (INIS)

    Isotretinoin is the drug of choice for the management of severe recalcitrant nodular acne. Nevertheless, some of its physical-chemical properties are still poorly known. Hence, the aim of our study was to comparatively evaluate the particle size distribution (PSD) and to characterize the thermal behavior of the three encapsulated isotretinoin products in oil suspension (one reference and two generics) commercialized in Brazil. Here, we show that the PSD, estimated by laser diffraction and by polarized light microscopy, differed between the generics and the reference product. However, the thermal behavior of the three products, determined by thermogravimetry (TGA), differential thermal (DTA) analyses and differential scanning calorimetry (DSC), displayed no significant differences, and the products were more thermostable than the isotretinoin standard used as internal control. Thus, our study suggests that PSD analyses in isotretinoin lipid-based formulations should be routinely performed in order to improve their quality and bioavailability.

  17. Localisation of nursery areas based on comparative analyses of the horizontal and vertical distribution patterns of juvenile Baltic cod (Gadus morhua)

    DEFF Research Database (Denmark)

    Nielsen, J. Rasmus; Lundgren, Bo; Kristensen, Kasper;

    2013-01-01

    Knowledge of the spatial distribution of juvenile cod is essential for obtaining precise recruitment data to conduct sustainable management of the eastern and western Baltic cod stocks. In this study, the horizontal and vertical distribution and density patterns of settled juvenile 0- and 1-group...... Baltic cod are determined, and their nursery areas are localised according to the environmental factors affecting them. Comparative statistical analyses of biological, hydrographic and hydroacoustic data are carried out based on standard ICES demersal trawl surveys and special integrated trawl and...... acoustic research surveys. Horizontal distribution maps for the 2001–2010 cohorts of juvenile cod are further generated by applying a statistical log-Gaussian Cox process model to the standard trawl survey data. The analyses indicate size-dependent horizontal and distinct vertical and diurnal distribution...

  18. Comparative physical-chemical characterization of encapsulated lipid-based isotretinoin products assessed by particle size distribution and thermal behavior analyses

    Energy Technology Data Exchange (ETDEWEB)

    Guimaraes, Carla Aiolfi, E-mail: carlaaiolfi@usp.br [Department of Pharmacy, Faculty of Pharmaceutical Sciences, University of Sao Paulo, Sao Paulo, SP 05508-000 (Brazil); Menaa, Farid [Department of Dermatology, School of Medicine Wuerzburg, Wuerzburg 97080 (Germany); Fluorotronics, Inc., 1425 Russ Bvld, San Diego Technology Incubator, San Diego, CA 92101 (United States); Menaa, Bouzid, E-mail: bouzid.menaa@gmail.com [Fluorotronics, Inc., 1425 Russ Bvld, San Diego Technology Incubator, San Diego, CA 92101 (United States); Quenca-Guillen, Joyce S. [Department of Pharmacy, Faculty of Pharmaceutical Sciences, University of Sao Paulo, Sao Paulo, SP 05508-000 (Brazil); Matos, Jivaldo do Rosario [Department of Fundamental Chemistry, Institute of Chemistry, University of Sao Paulo, Sao Paulo, SP 05508-000 (Brazil); Mercuri, Lucildes Pita [Department of Exact and Earth Sciences, Federal University of Sao Paulo, Diadema, SP 09972-270 (Brazil); Braz, Andre Borges [Department of Engineering of Mines and Oil, Polytechnical School, University of Sao Paulo, SP 05508-900 (Brazil); Rossetti, Fabia Cristina [Department of Pharmaceutical Sciences, Faculty of Pharmaceutical Sciences of Ribeirao Preto, University of Sao Paulo, Ribeirao Preto, SP 14015-120 (Brazil); Kedor-Hackmann, Erika Rosa Maria; Santoro, Maria Ines Rocha Miritello [Department of Pharmacy, Faculty of Pharmaceutical Sciences, University of Sao Paulo, Sao Paulo, SP 05508-000 (Brazil)

    2010-06-10

    Isotretinoin is the drug of choice for the management of severe recalcitrant nodular acne. Nevertheless, some of its physical-chemical properties are still poorly known. Hence, the aim of our study was to comparatively evaluate the particle size distribution (PSD) and characterize the thermal behavior of the three encapsulated isotretinoin products in oil suspension (one reference and two generics) commercialized in Brazil. Here, we show that the PSD, estimated by laser diffraction and by polarized light microscopy, differed between the generics and the reference product. However, the thermal behavior of the three products, determined by thermogravimetry (TGA), differential thermal (DTA) analyses and differential scanning calorimetry (DSC), displayed no significant differences, and all three were more thermostable than the isotretinoin standard used as internal control. Thus, our study suggests that PSD analyses in isotretinoin lipid-based formulations should be routinely performed in order to improve their quality and bioavailability.

  19. A Microarray Based Genomic Hybridization Method for Identification of New Genes in Plants: Case Analyses of Arabidopsis and Oryza

    Institute of Scientific and Technical Information of China (English)

    Chuanzhu Fan; Maria D. Vibranovski; Ying Chen; Manyuan Long

    2007-01-01

    To systematically estimate the gene duplication events in closely related species, we have to use comparative genomic approaches, either through genomic sequence comparison or comparative genomic hybridization (CGH). Given the scarcity of complete genomic sequences of plant species, in the present study we adopted an array based CGH to investigate gene duplications in the genus Arabidopsis. Fragmented genomic DNA from four species, namely Arabidopsis thaliana, A. lyrata subsp. lyrata, A. lyrata subsp. petraea, and A. halleri, was hybridized to Affymetrix (Santa Clara, CA, USA) tiling arrays that are designed from the genomic sequences of A. thaliana. Pairwise comparisons of signal intensity were made to infer the potential duplicated candidates along each phylogenetic branch. Ninety-four potential candidates of gene duplication along the genus were identified. Among them, the majority (69 of 94) were A. thaliana lineage specific. This result indicates that the array based CGH approach may be used to identify candidates of duplication in other plant genera containing closely related species, such as Oryza, particularly for the AA genome species. We compared the degree of gene duplication through retrotransposition between O. sativa and A. thaliana and found a strikingly higher number of chimeric retroposed genes in rice. The higher rate of gene duplication through retroposition and other mechanisms may indicate that the grass species is able to adapt to more diverse environments.
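
    The pairwise intensity comparison described above can be illustrated with a deliberately simplified sketch: per-probe log2 ratios of hybridization intensity (test species vs. reference), median-normalized and thresholded to flag candidate duplications. The intensities and the cutoff below are invented for illustration and are much cruder than the statistics used in the study.

```python
# Toy sketch of the pairwise-comparison idea behind array-based CGH:
# per-probe log2 ratios (test species vs. reference), median-normalized,
# with probes above an ad hoc threshold flagged as duplication candidates.
import numpy as np

rng = np.random.default_rng(2)
n_probes = 10_000
reference = rng.lognormal(mean=7.0, sigma=0.4, size=n_probes)   # reference signal
test = reference * rng.lognormal(mean=0.0, sigma=0.2, size=n_probes)

# Plant a handful of "duplicated" probes with roughly doubled signal.
dup_idx = rng.choice(n_probes, size=20, replace=False)
test[dup_idx] *= 2.0

log_ratio = np.log2(test) - np.log2(reference)
log_ratio -= np.median(log_ratio)               # remove global offset

candidates = np.flatnonzero(log_ratio > 0.8)    # ad hoc cutoff
recovered = np.intersect1d(candidates, dup_idx)
print(f"{candidates.size} candidate probes, "
      f"{recovered.size} of 20 planted duplications recovered")
```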

  20. Relevance Vector Machine Based Analyses of MRR and SR of Electrodischarge Machining Designed by Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    Kanhu Charan Nayak

    2013-01-01

    Full Text Available The relevance vector machine (RVM) is found to be one of the best predictive models in the area of pattern recognition and machine learning. The important performance parameters such as the material removal rate (MRR) and surface roughness (SR) are influenced by various machining parameters, namely discharge current (Ip), pulse on time (Ton), and duty cycle (tau), in the electrodischarge machining (EDM) process. In this communication, the MRR and SR of EN19 tool steel have been predicted using an RVM model, and the analysis of variance (ANOVA) was performed by implementing response surface methodology (RSM). The input parameters used for the RVM model are discharge current (Ip), pulse on time (Ton), and duty cycle (tau). At the output, the corresponding model predicts both MRR and SR. The performance of the model is determined by the regression test error, obtained by comparing the predicted MRR and SR from the model with experimental data; the experiments were designed using central composite design (CCD) based RSM. Our results show that the regression error is minimized by using a cubic kernel function based RVM model, and ANOVA indicates that the discharge current is one of the most significant machining parameters for MRR and SR.
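
    The abstract does not reproduce the authors' implementation, so the following is only a minimal sketch of an RVM-style prediction: a cubic polynomial kernel design matrix combined with automatic relevance determination (the sparse Bayesian mechanism underlying the RVM), here via scikit-learn's ARDRegression. The machining data are synthetic placeholders, not the EN19 measurements.

```python
# RVM-flavoured sketch: cubic kernel basis + ARD (sparse Bayesian) regression.
# All data below are invented; only the general recipe is illustrated.
import numpy as np
from sklearn.linear_model import ARDRegression
from sklearn.metrics.pairwise import polynomial_kernel
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.uniform([8, 50, 0.4], [24, 500, 0.9], size=(30, 3))      # Ip, Ton, tau
mrr = 0.02 * X[:, 0] ** 1.5 + 0.001 * X[:, 1] + rng.normal(0, 0.05, 30)

Xs = StandardScaler().fit_transform(X)                            # keep kernel well scaled
X_train, X_test = Xs[:24], Xs[24:]
y_train, y_test = mrr[:24], mrr[24:]

# Cubic polynomial kernel design matrix: one basis function per training point.
K_train = polynomial_kernel(X_train, X_train, degree=3)
K_test = polynomial_kernel(X_test, X_train, degree=3)

model = ARDRegression()            # prunes irrelevant basis functions
model.fit(K_train, y_train)
pred = model.predict(K_test)
print("regression test error (RMSE):", np.sqrt(np.mean((pred - y_test) ** 2)))
```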

  1. Risk analyses of mortality due to malignant neoplasms among atomic bomb survivors in Hiroshima Prefecture based on ABS93D

    International Nuclear Information System (INIS)

    Risk of mortality due to malignant neoplasms was analyzed among atomic bomb survivors using ABS93D (Atomic Bomb Survivors 1993 Dose). The period subjected to analysis ran from Jan. 1, 1968 to Dec. 31, 1992. The subjects were 47,204 persons in total, registered as atomic bomb survivors in the authors' facility database and essentially living in Hiroshima Prefecture, for whom an ABS93D dose estimate was available or who had been exposed at a distance greater than 3 km from the explosion site without having entered the city. They were divided into two groups, exposed (≥5 mGy of bone marrow dose) and non-exposed (<5 mGy). The organ dose was the sum of the neutron and gamma-ray doses based on ABS93D. The neoplasms analyzed were leukemia and cancers of the esophagus, stomach, liver, pancreas, colon, lung, mammary gland and uterus. The risk ratio of the exposed group relative to the non-exposed group per 1 Gy, the ratio as a function of dose, and the change of the ratio over time were calculated, and some cancers showed a statistically significantly elevated risk in the exposed group. (K.H.)

  2. Combined Statistical Analyses of Peptide Intensities and Peptide Occurrences Improves Identification of Significant Peptides from MS-based Proteomics Data

    Energy Technology Data Exchange (ETDEWEB)

    Webb-Robertson, Bobbie-Jo M.; McCue, Lee Ann; Waters, Katrina M.; Matzke, Melissa M.; Jacobs, Jon M.; Metz, Thomas O.; Varnum, Susan M.; Pounds, Joel G.

    2010-11-01

    Liquid chromatography-mass spectrometry-based (LC-MS) proteomics uses peak intensities of proteolytic peptides to infer the differential abundance of peptides/proteins. However, substantial run-to-run variability in peptide intensities and observations (presence/absence) of peptides makes data analysis quite challenging. The missing abundance values in LC-MS proteomics data are difficult to address with traditional imputation-based approaches because the mechanisms by which data are missing are unknown a priori. Data can be missing due to random mechanisms such as experimental error, or non-random mechanisms such as a true biological effect. We present a statistical approach that uses a test of independence known as a G-test to test the null hypothesis of independence between the number of missing values and the experimental groups. We pair the G-test results evaluating independence of missing data (IMD) with a standard analysis of variance (ANOVA) that uses only means and variances computed from the observed data. Each peptide is therefore represented by two statistical confidence metrics, one for qualitative differential observation and one for quantitative differential intensity. We use two simulated and two real LC-MS datasets to demonstrate the robustness and sensitivity of the ANOVA-IMD approach for assigning confidence to peptides with significant differential abundance among experimental groups.
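
    A minimal per-peptide sketch of the ANOVA-IMD idea follows: a G-test of independence on the counts of missing versus observed values across experimental groups (the qualitative signal), paired with a one-way ANOVA on the observed intensities (the quantitative signal). The data layout and variable names are assumptions for illustration, not the authors' code.

```python
# Per-peptide ANOVA-IMD sketch: G-test on missingness counts + ANOVA on
# observed intensities.  The example data are synthetic.
import numpy as np
from scipy.stats import chi2_contingency, f_oneway

def anova_imd(intensity, groups):
    """intensity: 1-D array for one peptide (NaN = not observed);
    groups: same-length array of group labels."""
    labels = np.unique(groups)

    # G-test of independence between missingness and group membership.
    table = np.array([[np.sum(np.isnan(intensity[groups == g])),
                       np.sum(~np.isnan(intensity[groups == g]))]
                      for g in labels])
    g_stat, p_imd, _, _ = chi2_contingency(table, lambda_="log-likelihood")

    # ANOVA on the observed (non-missing) intensities only.
    observed = [intensity[(groups == g) & ~np.isnan(intensity)] for g in labels]
    f_stat, p_anova = f_oneway(*observed)
    return p_imd, p_anova

rng = np.random.default_rng(1)
groups = np.repeat(["control", "treated"], 10)
peptide = rng.normal(20, 1, 20)
peptide[10:] += 1.5                   # treated group more abundant
peptide[[2, 5, 13]] = np.nan          # a few missing observations
print(anova_imd(peptide, groups))
```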

  3. Method for evaluating an extended Fault Tree to analyse the dependability of complex systems: Application to a satellite-based railway system

    International Nuclear Information System (INIS)

    Evaluating dependability of complex systems requires the evolution of the system states over time to be analysed. The problem is to develop modelling approaches that adequately take the evolution of the different operating and failed states of the system components into account. The Fault Tree (FT) is a well-known method that efficiently analyses the failure causes of a system and serves for reliability and availability evaluations. As FT is not adapted to dynamic systems with repairable multi-state components, extensions of FT (eFT) have been developed. However, efficient quantitative evaluation processes for eFT are missing. Petri nets have the advantage of allowing such evaluation, but their construction is difficult to manage and their simulation performances are unsatisfactory. Therefore, we propose in this paper a new powerful process to quantitatively analyse eFT. It is based on the use of the PN method, which relies on the failed states highlighted by the eFT, combined with a new analytical modelling approach for critical events that depend on time duration. The performance of the new process is demonstrated through a theoretical example of eFT, and the practical use of the method is shown on a satellite-based railway system. - Highlights: • New approach modelling critical events stemming from degraded-state duration. • Evaluating a repairable, multi-state and time duration dependent Fault Tree. • Practical solution for dependability analysis of a GNSS-based localisation. • Taking into account the local impacts on the GNSS-based localisation

  4. Methyl-binding domain protein-based DNA isolation from human blood serum combines DNA analyses and serum-autoantibody testing

    Directory of Open Access Journals (Sweden)

    Jungbauer Christof

    2011-09-01

    Full Text Available Abstract Background Circulating cell free DNA in serum as well as serum-autoantibodies and the serum proteome have great potential to contribute to early cancer diagnostics via non-invasive blood tests. However, most DNA preparation protocols destroy the protein fraction and therefore do not allow subsequent protein analyses. In this study a novel approach based on methyl binding domain protein (MBD) is described to overcome the technical difficulties of combining DNA and protein analysis out of one single serum sample. Methods Serum or plasma samples from 98 control individuals and 54 breast cancer patients were evaluated upon silica membrane- or MBD affinity-based DNA isolation via qPCR targeting potential DNA methylation markers as well as by protein-microarrays for tumor-autoantibody testing. Results In control individuals, an average DNA level of 22.8 ± 25.7 ng/ml was detected applying the silica membrane based protocol and 8.5 ± 7.5 ng/ml using the MBD approach, both values strongly dependent on the serum sample preparation methods used. In contrast to malignant and benign tumor serum samples, cell free DNA concentrations were significantly elevated in sera of metastasizing breast cancer patients. Technical evaluation revealed that serum upon MBD-based DNA isolation is suitable for protein-array analyses when data are consistent with untreated serum samples. Conclusion MBD affinity purification allows DNA isolation under native conditions retaining the protein function, thus for example enabling combined analyses of DNA methylation and autoantigen profiles from the same serum sample and thereby improving minimally invasive diagnostics.

  5. Mise au point de pains composites à base de melanges de farines de sorgho-ble et analyse texturale

    Directory of Open Access Journals (Sweden)

    Blecker C.

    1999-01-01

    Full Text Available Composite breads based on sorghum-wheat flour blends and textural analysis. Breadmaking properties of flour blends containing various levels of sorghum flour with wheat flour were investigated. Good quality bread (loaf volume, crumb structure, external appearance) comparable to bread entirely made from wheat flour could be produced with a level of incorporation of sorghum flour up to 30%. Beyond this level, the loaf volume decreases and the internal characteristics of the crumb deteriorate. The staling of bread made with composite flour is also influenced by the level of substitution of sorghum flour: the higher this level, the higher the rate of crumb firming. Addition of 0.5% emulsifiers, DATEM and SSL, to the sorghum flour blends improves the crumb characteristics and consequently delays bread firming.

  6. Chemical deamidation: a common pitfall in large-scale N-linked glycoproteomic mass spectrometry-based analyses

    DEFF Research Database (Denmark)

    Palmisano, Giuseppe; Melo-Braga, Marcella Nunes; Engholm-Keller, Kasper; Parker, Benjamin L; Larsen, Martin R

    2012-01-01

    . We have evaluated this common large-scale N-linked glycoproteomic strategy and proved potential pitfalls using Escherichia coli as a model organism, since it lacks the N-glycosylation machinery found in mammalian systems and some pathogenic microbes. After isolation and proteolytic digestion of E....... coli membrane proteins, we investigated the presence of deamidated asparagines. The results show the presence of deamidated asparagines especially with close proximity to a glycine residue or other small amino acid, as previously described for spontaneous in vivo deamidation. Moreover, we have......-linked consensus sites based on common N-linked glycoproteomics strategies without proper control experiments. Beside showing the spontaneous deamidation we provide alternative methods for validation that should be used in such experiments....

  7. Uncertainty and sensitivity analyses applied to the DRAGONv4.05 code lattice calculations and based on JENDL-4 data

    International Nuclear Information System (INIS)

    Highlights: ► DRAGONv4.05 code uses its own microscopic cross-sections library format (DRAGLIB). ► DRAGLIB library based on JENDL-4 data (and its covariance’s) was statistically perturbed. ► Therefore, uncertainty analysis was performed on DRAGONv4.05 lattice calculations. ► Latin Hypercube Sampling was used to sample the uncertain input space. ► Sensitivity analysis of microscopic cross-sections complemented the study. - Abstract: In this paper, multi-group microscopic cross-section uncertainties are propagated through the DRAGON (Version 4.05) lattice code in order to perform uncertainty analysis on k∞ and 2-group homogenized macroscopic cross-sections. The test case corresponds to a 17 × 17 PWR fuel assembly segment without poison at full power conditions. A statistical methodology is employed for such purposes, where cross-sections of certain isotopes of various elements belonging to the 172 groups DRAGLIB library format, are considered as normal random variables. This library was based on JENDL-4 data, because JENDL-4 contains a large amount of isotopic covariance matrices among the different major nuclear data libraries. Thus, multi-group uncertainty was computed for the different isotopic reactions by means of ERRORRJ. The preferred sampling strategy for the current study corresponds to the quasi-random Latin Hypercube Sampling (LHS). This technique allows a much better coverage of the input uncertainties than simple random sampling (SRS) because it densely stratifies across the range of each input probability distribution. In order to prove this, the uncertain input space was re-sampled 10 times, and it is shown that the variability of the replicated mean of the different k∞ samples is much less for the LHS case, than for the SRS case. The uncertainty assessment of the output space should be based on the theory of non-parametric multivariate tolerance limits, due to the fact that k∞ and some of the macroscopic cross-sections are
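
    The sampling comparison made in this study can be illustrated with a toy, self-contained sketch: normally distributed cross-section perturbations are drawn either by Latin Hypercube Sampling or by simple random sampling and propagated through a made-up linear k∞ surrogate; the spread of the replicated sample means shows the better stratification of LHS. Nothing below touches DRAGON or DRAGLIB; the surrogate and its sensitivities are invented.

```python
# LHS vs. SRS illustration with an invented k-inf surrogate.
import numpy as np
from scipy.stats import norm, qmc

def k_inf_surrogate(xs):
    """Hypothetical response: k-inf as a linear function of relative
    perturbations of a few multi-group cross-sections."""
    sens = np.array([0.30, -0.20, 0.10, 0.05])      # assumed sensitivities
    return 1.30 + xs @ sens

n_dim, n_samples, n_replicates = 4, 59, 10
rel_sd = 0.02 * np.ones(n_dim)                      # assumed 2 % relative 1-sigma

def spread_of_means(sampler):
    means = []
    for seed in range(n_replicates):
        u = sampler(seed)                           # uniform (0,1) design
        xs = norm.ppf(u) * rel_sd                   # map to normal perturbations
        means.append(k_inf_surrogate(xs).mean())
    return np.std(means)

lhs = lambda seed: qmc.LatinHypercube(d=n_dim, seed=seed).random(n_samples)
srs = lambda seed: np.random.default_rng(seed).random((n_samples, n_dim))

print("spread of replicated means, LHS:", spread_of_means(lhs))
print("spread of replicated means, SRS:", spread_of_means(srs))
```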

  8. Degradation of dental ZrO2-based materials after hydrothermal fatigue. Part I: XRD, XRF, and FESEM analyses.

    Science.gov (United States)

    Perdigão, Jorge; Pinto, Ana M; Monteiro, Regina C C; Braz Fernandes, Francisco M; Laranjeira, Pedro; Veiga, João P

    2012-01-01

    The aim was to investigate the effect of simulated low-temperature degradation (s-LTD) and hydrothermal fatigue on the degradation of three ZrO(2)-based dental materials. Lava, IPS, and NanoZr discs were randomly assigned to (1) Control-Storage in distilled water at 37°C; (2) Aging at 134°C for 5 h (s-LTD); (3) Thermocycling in saliva for 30,000 cycles (TF). XRD revealed that ZrO(2) m phase was identified in all groups but TF increased the m phase only for Lava. Under the FESEM, Lava showed no alterations under s-LTD, but displayed corrosion areas up to 60 µm wide after TF. We conclude that TF accelerated the degradation of Lava through an increase in the m phase and grain pull-out from the material surface. PMID:22447060

  9. Analyses on schedule-cost coefficient correlation of spaceflight project based on historical statistics and its application

    Institute of Scientific and Technical Information of China (English)

    Liu Yanqiong; Chen Yingwu

    2006-01-01

    When analyzing the uncertainty of the cost and schedule of a spaceflight project, the value of the schedule-cost correlation coefficient needs to be known. This paper deduces the schedule distribution, considering the effect of the cost, and proposes an estimation formula for the correlation coefficient between ln(schedule) and cost. On this basis, and using a Taylor expansion, an expression relating the schedule-cost correlation coefficient to the ln-schedule-cost correlation coefficient is put forward. By analyzing the properties of the estimation formula for the ln-schedule-cost correlation coefficient, general rules are proposed for ascertaining the value of the schedule-cost correlation coefficient. An example demonstrates how to approximately correct the schedule-cost correlation coefficient based on historical statistics, and reveals that the traditionally assigned value is inaccurate. The universality of this estimation method is analyzed.
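
    The abstract does not reproduce the paper's expression, but the kind of conversion it describes can be illustrated for the simplest case, where ln(schedule) and cost are jointly normal; a textbook result then gives corr(S, C) = corr(ln S, C) · σ / sqrt(exp(σ²) − 1) with σ the standard deviation of ln S. The Monte Carlo check below assumes this setting and invented parameter values; it is not the paper's formula.

```python
# Monte Carlo check of a lognormal-normal correlation conversion,
# assuming ln(schedule) and cost are jointly normal (illustration only).
import numpy as np

rng = np.random.default_rng(42)
rho_ln = 0.6          # corr(ln S, C), assumed
sigma = 0.5           # standard deviation of ln S, assumed
mu, mu_c, sd_c = 3.0, 100.0, 15.0

n = 2_000_000
z1 = rng.standard_normal(n)
z2 = rho_ln * z1 + np.sqrt(1 - rho_ln**2) * rng.standard_normal(n)
ln_s = mu + sigma * z1
cost = mu_c + sd_c * z2

empirical = np.corrcoef(np.exp(ln_s), cost)[0, 1]
analytic = rho_ln * sigma / np.sqrt(np.exp(sigma**2) - 1.0)
print(f"empirical corr(S, C) = {empirical:.4f}, analytic = {analytic:.4f}")
```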

  10. Flood Mapping and Flood Dynamics of the Mekong Delta: ENVISAT-ASAR-WSM Based Time Series Analyses

    Directory of Open Access Journals (Sweden)

    Stefan Dech

    2013-02-01

    Full Text Available Satellite remote sensing is a valuable tool for monitoring flooding. Microwave sensors are especially appropriate instruments, as they allow the differentiation of inundated from non-inundated areas, regardless of levels of solar illumination or frequency of cloud cover in regions experiencing substantial rainy seasons. In the current study we present the longest synthetic aperture radar-based time series of flood and inundation information derived for the Mekong Delta that has been analyzed for this region so far. We employed overall 60 Envisat ASAR Wide Swath Mode data sets at a spatial resolution of 150 meters acquired during the years 2007–2011 to facilitate a thorough understanding of the flood regime in the Mekong Delta. The Mekong Delta in southern Vietnam comprises 13 provinces and is home to 18 million inhabitants. Extreme dry seasons from late December to May and wet seasons from June to December characterize people’s rural life. In this study, we show which areas of the delta are frequently affected by floods and which regions remain dry all year round. Furthermore, we present which areas are flooded at which frequency and elucidate the patterns of flood progression over the course of the rainy season. In this context, we also examine the impact of dykes on floodwater emergence and assess the relationship between retrieved flood occurrence patterns and land use. In addition, the advantages and shortcomings of ENVISAT ASAR-WSM based flood mapping are discussed. The results contribute to a comprehensive understanding of Mekong Delta flood dynamics in an environment where the flow regime is influenced by the Mekong River, overland water-flow, anthropogenic floodwater control, as well as the tides.

  11. Laser Beam Focus Analyser

    DEFF Research Database (Denmark)

    Nielsen, Peter Carøe; Hansen, Hans Nørgaard; Olsen, Flemming Ove;

    2007-01-01

    The quantitative and qualitative description of laser beam characteristics is important for process implementation and optimisation. In particular, a need for quantitative characterisation of beam diameter was identified when using fibre lasers for micro manufacturing. Here the beam diameter limits...... the obtainable features in direct laser machining as well as heat affected zones in welding processes. This paper describes the development of a measuring unit capable of analysing beam shape and diameter of lasers to be used in manufacturing processes. The analyser is based on the principle of a rotating...... mechanical wire being swept through the laser beam at varying Z-heights. The reflected signal is analysed and the resulting beam profile determined. The development comprised the design of a flexible fixture capable of providing both rotation and Z-axis movement, control software including data capture...
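
    One downstream step of such a measurement is the reduction of a single wire-scan trace to a beam diameter; a generic sketch is to fit a Gaussian to the recorded signal at one Z-height and report the 1/e² width. The data below are synthetic and the processing is generic, not the instrument's actual software.

```python
# Gaussian fit of a (synthetic) wire-scan trace to extract a 1/e^2 beam diameter.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, x0, w):
    """1/e^2 half-width parameterisation: I = amp * exp(-2 (x - x0)^2 / w^2)."""
    return amp * np.exp(-2.0 * (x - x0) ** 2 / w ** 2)

x = np.linspace(-100, 100, 201)                       # wire position (um)
rng = np.random.default_rng(4)
signal = gaussian(x, amp=1.0, x0=5.0, w=30.0) + rng.normal(0, 0.02, x.size)

(amp, x0, w), _ = curve_fit(gaussian, x, signal, p0=(signal.max(), 0.0, 20.0))
print(f"beam centre: {x0:.1f} um, 1/e^2 beam diameter: {2 * abs(w):.1f} um")
```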

  12. Lack of Spatial Subdivision for the Snapper Lutjanus purpureus (Lutjanidae - Perciformes) from Southwest Atlantic Based on Multi-Locus Analyses.

    Science.gov (United States)

    da Silva, Raimundo; Sampaio, Iracilda; Schneider, Horacio; Gomes, Grazielle

    2016-01-01

    The Caribbean snapper Lutjanus purpureus is a marine fish species commonly found associated with rocky seabeds and widely distributed along the Western Atlantic. Data on stock delineation and stock recognition are essential for establishing conservation measures for commercially fished species. However, few studies have investigated the population genetic structure of this economically valuable species, and previous studies (based on only a portion of the mitochondrial DNA) provide an incomplete picture. The present study used a multi-locus approach (12 segments of mitochondrial and nuclear DNA) to elucidate the levels of genetic diversity and genetic connectivity of L. purpureus populations and their demographic history. L. purpureus has high levels of genetic diversity, which probably implies high effective population size values for the species. The data show that this species is genetically homogeneous throughout the geographic region analyzed, most likely as a result of dispersal during the larval phase. Regarding demographic history, a historical population growth event occurred, likely due to sea level changes during the Pleistocene. PMID:27556738

  13. Purity analyses of high-purity organic compounds with nitroxyl radicals based on the Curie–Weiss law

    Energy Technology Data Exchange (ETDEWEB)

    Matsumoto, Nobuhiro, E-mail: nobu-matsumoto@aist.go.jp; Shimosaka, Takuya [National Metrology Institute of Japan (NMIJ), National Institute of Advanced Industrial Science and Technology (AIST), AIST Central-3, 1-1-1 Umezono, Tsukuba, Ibaraki 305-8563 (Japan)

    2015-05-07

    This work reports an attempt to quantify the purities of powders of high-purity organic compounds with stable nitroxyl radicals (namely, 2,2,6,6-tetramethylpiperidine 1-oxyl (TEMPO), 1-oxyl-2,2,6,6-tetramethyl-4-hydroxypiperidine (TEMPOL), and 4-hydroxy-2,2,6,6-tetramethylpiperidine 1-oxyl benzoate (4-hydroxy-TEMPO benzoate)) in terms of mass fractions by using our "effective magnetic moment method," which is based on both the Curie–Weiss law and a fundamental equation of electron paramagnetic resonance (ESR). The temperature dependence of the magnetic moment resulting from the radicals was measured with a superconducting quantum interference device magnetometer. The g value for each compound was measured with an X-band ESR spectrometer. The resulting purities were (0.998 ± 0.064) kg kg−1 for TEMPO, (1.019 ± 0.040) kg kg−1 for TEMPOL, and (1.001 ± 0.048) kg kg−1 for 4-hydroxy-TEMPO benzoate. These results demonstrate that this analytical method, a candidate for a potential primary direct method, can measure the purities with expanded uncertainties of approximately 5%.
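
    The arithmetic behind a Curie–Weiss purity estimate of this kind can be sketched in a few lines: fit χ(T) = C/(T − θ) + χ0 to the measured susceptibility, convert the fitted Curie constant into an amount of S = 1/2 radicals, and express it as a mass fraction. The sketch below uses CGS-emu units and invented data, g value and sample mass; it is an illustration of the principle, not the authors' procedure or uncertainty budget.

```python
# Curie-Weiss purity sketch (CGS-emu units, synthetic data).
import numpy as np
from scipy.optimize import curve_fit

N_A, K_B, MU_B = 6.02214076e23, 1.380649e-16, 9.2740100783e-21  # CGS constants
g, S = 2.006, 0.5                                                # assumed g value, S = 1/2
M_MOLAR = 156.25          # g/mol, TEMPO
m_sample = 0.010          # g, assumed sample mass
C_per_mol = N_A * g**2 * S * (S + 1) * MU_B**2 / (3 * K_B)       # ~0.375 emu K/mol

def curie_weiss(T, C, theta, chi0):
    return C / (T - theta) + chi0

# Synthetic chi(T) data in emu (moment / field) for a notional 98 %-pure sample.
T = np.linspace(2.0, 300.0, 60)
true_C = 0.98 * (m_sample / M_MOLAR) * C_per_mol
chi = curie_weiss(T, true_C, -0.5, 1e-7) + np.random.default_rng(0).normal(0, 1e-8, T.size)

(C_fit, theta_fit, chi0_fit), _ = curve_fit(curie_weiss, T, chi, p0=(1e-5, 0.0, 0.0))
purity = C_fit / C_per_mol * M_MOLAR / m_sample   # moles of radical -> mass fraction
print(f"estimated purity: {purity:.3f} kg/kg")
```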

  14. Modelling software failures of digital I and C in probabilistic safety analyses based on the TELEPERM registered XS operating experience

    International Nuclear Information System (INIS)

    Digital instrumentation and control (I and C) systems appear as upgrades in existing nuclear power plants (NPPs) and in new plant designs. In order to assess the impact of digital system failures, quantifiable reliability models are needed along with data for digital systems that are compatible with existing probabilistic safety assessments (PSA). The paper focuses on the modelling of software failures of digital I and C systems in probabilistic assessments. An analysis of software faults, failures and effects is presented to derive relevant failure modes of system and application software for the PSA. The estimations of software failure probabilities are based on an analysis of the operating experience of TELEPERM registered XS (TXS). For the assessment of application software failures, the analysis combines the TXS operating experience at an application function level with conservative engineering judgments. Probabilities of failure to actuate on demand and of spurious actuation of a typical reactor protection application are estimated. Moreover, the paper gives guidelines for the modelling of software failures in the PSA. The strategy presented in this paper is generic and can be applied to different software platforms and their applications.

  15. Considerations on the age of the Bambui Group (MG, Brazil) based on isotopic analyses of Sr and Pb

    International Nuclear Information System (INIS)

    Based on radiometric ages, the Bambui Group deposition time is related to the end of the Precambrian. However, the ages determined and released through scientific journals are not in agreement (600-1350 m.y.), and many doubts about the geochronological picture of this important lithostratigraphic unit remained for a long time. As a result of the work developed by Metamig, CPGeo (IG-USP) and IPEN (SP), Rb/Sr and Pb/Pb isotopic determinations were done on 31 rock samples and 17 galenas collected from the Bambui Basin distributed in Minas Gerais State. The Rb/Sr ages of 590 m.y. for the Pirapora Formation, 620 m.y. for the Tres Marias Formation, and 640 m.y. for the Paraopeba Formation situated in the stable area are linked to sedimentation processes. In the Paracatu region the age of 680 m.y. found for the Paraopeba Formation is related to metamorphic events. The lead isotopic ratios from the galenas suggest an isotopic evolution in two stages. The first ended with the separation of lead from the mantle and its incorporation into the crust during events of the Transamazonic Cycle. The second ended when the lead was incorporated into the galenas and seems to be related to one or more events of the Brazilian Cycle. (Author)

  16. Preliminary assessment of late quaternary vegetation and climate of southeastern Utah based on analyses of packrat middens

    International Nuclear Information System (INIS)

    Packrat midden sequences from two caves (elevations 1585 and 2195 m; 5200 and 7200 ft) southwest of the Abajo Mountains in southeast Utah record vegetation changes that are attributed to climatic changes occurring during the last 13,000 years. These data are useful in assessing potential future climates at proposed nuclear waste sites in the area. Paleoclimates are reconstructed by defining modern elevational analogs for the vegetation assemblages identified in the middens. Based on the midden record, the climate most different from the present occurred prior to approximately 10,000 years before present (BP), when mean annual temperature was probably 3 to 4°C (5.5 to 7°F) cooler than present. However, cooling could not have exceeded 5°C (9°F) at 1585 m (5200 ft). Accompanying mean annual precipitation is estimated to have been from 35 to 140% greater than at present, with rainfall concentrated in the winter months. Vegetational changes beginning approximately 10,000 years BP are attributed to increased summer and mean annual temperatures, a decreasing frequency of spring freezes, and a shift from winter- to summer-dominant rainfall. Greater effective moisture than present is inferred at both cave sites from approximately 8000 to 4000 years BP. Modern flora was present at both sites by about 2000 years BP

  17. NMR-based metabonomic analyses of the effects of ultrasmall superparamagnetic particles of iron oxide (USPIO) on macrophage metabolism

    International Nuclear Information System (INIS)

    The metabonomic changes in murine RAW264.7 macrophage-like cell line induced by ultrasmall superparamagnetic particles of iron oxides (USPIO) have been investigated, by analyzing both the cells and culture media, using high-resolution NMR in conjunction with multivariate statistical methods. Upon treatment with USPIO, macrophage cells showed a significant decrease in the levels of triglycerides, essential amino acids such as valine, isoleucine, and choline metabolites together with an increase of glycerophospholipids, tyrosine, phenylalanine, lysine, glycine, and glutamate. Such cellular responses to USPIO were also detectable in compositional changes of cell media, showing an obvious depletion of the primary nutrition molecules, such as glucose and amino acids and the production of end-products of glycolysis, such as pyruvate, acetate, and lactate and intermediates of TCA cycle such as succinate and citrate. At 48 h treatment, there was a differential response to incubation with USPIO in both cell metabonome and medium components, indicating that USPIO are phagocytosed and released by macrophages. Furthermore, information on cell membrane modification can be derived from the changes in choline-like metabolites. These results not only suggest that NMR-based metabonomic methods have sufficient sensitivity to identify the metabolic consequences of murine RAW264.7 macrophage-like cell line response to USPIO in vitro, but also provide useful information on the effects of USPIO on cellular metabolism.

  18. Recombination structure and genetic relatedness among members of the family Bromoviridae based on their RNAs 1 and 2 sequence analyses.

    Science.gov (United States)

    Boulila, Moncef

    2009-06-01

    In determining putative recombination events and their evolution rates in the RNAs 1 and 2 of the currently known members of the family Bromoviridae, a detailed study comprising 107 accessions retrieved from the international databases has been carried out by using the RECCO and RDP v3.31beta algorithms. These programs allowed the detection of potential recombination sites in all five virus genera composing the family Bromoviridae with various degrees of consistency. The RNAs 1 and 2 yielded fully congruent inferred phylogenies that clearly delineated five clusters representing the five studied virus genera. In this respect, we proposed to classify the Ilarviruses in three distinct subgroups instead of 10 as mentioned in several reports of the International Committee on Taxonomy of Viruses, whose suggestions were based on antigenic differences. Moreover, we confirmed that Alfalfa mosaic virus should be considered as a component of the Ilarvirus genus instead of being the unique representative of the Alfamovirus genus. In addition, Pelargonium zonate spot and Olive latent 2 viruses fully deserve their affiliation to the family Bromoviridae. PMID:19255837

  19. NMR-based metabonomic analyses of the effects of ultrasmall superparamagnetic particles of iron oxide (USPIO) on macrophage metabolism

    Energy Technology Data Exchange (ETDEWEB)

    Feng Jianghua [Wuhan Institute of Physics and Mathematics, Chinese Academy of Sciences, State Key Laboratory of Magnetic Resonance and Atomic and Molecular Physics (China); Zhao Jing [China Institute of Atomic Energy (China); Hao Fuhua [Wuhan Institute of Physics and Mathematics, Chinese Academy of Sciences, State Key Laboratory of Magnetic Resonance and Atomic and Molecular Physics (China); Chen Chang [Institute of Biophysics, The Chinese Academy of Sciences, National Laboratory of Biomacromolecules (China); Bhakoo, Kishore [Singapore Bioimaging Consortium Agency for Science, Technology and Research (A-STAR) (Singapore); Tang, Huiru, E-mail: huiru.tang@wipm.ac.cn [Wuhan Institute of Physics and Mathematics, Chinese Academy of Sciences, State Key Laboratory of Magnetic Resonance and Atomic and Molecular Physics (China)

    2011-05-15

    The metabonomic changes in murine RAW264.7 macrophage-like cell line induced by ultrasmall superparamagnetic particles of iron oxides (USPIO) have been investigated, by analyzing both the cells and culture media, using high-resolution NMR in conjunction with multivariate statistical methods. Upon treatment with USPIO, macrophage cells showed a significant decrease in the levels of triglycerides, essential amino acids such as valine, isoleucine, and choline metabolites together with an increase of glycerophospholipids, tyrosine, phenylalanine, lysine, glycine, and glutamate. Such cellular responses to USPIO were also detectable in compositional changes of cell media, showing an obvious depletion of the primary nutrition molecules, such as glucose and amino acids and the production of end-products of glycolysis, such as pyruvate, acetate, and lactate and intermediates of TCA cycle such as succinate and citrate. At 48 h treatment, there was a differential response to incubation with USPIO in both cell metabonome and medium components, indicating that USPIO are phagocytosed and released by macrophages. Furthermore, information on cell membrane modification can be derived from the changes in choline-like metabolites. These results not only suggest that NMR-based metabonomic methods have sufficient sensitivity to identify the metabolic consequences of murine RAW264.7 macrophage-like cell line response to USPIO in vitro, but also provide useful information on the effects of USPIO on cellular metabolism.

  20. Estimation of the Whitefly Bemisia tabaci Genome Size Based on k-mer and Flow Cytometric Analyses.

    Science.gov (United States)

    Chen, Wenbo; Hasegawa, Daniel K; Arumuganathan, Kathiravetpillai; Simmons, Alvin M; Wintermantel, William M; Fei, Zhangjun; Ling, Kai-Shu

    2015-01-01

    Whiteflies of the Bemisia tabaci (Hemiptera: Aleyrodidae) cryptic species complex are among the most important agricultural insect pests in the world. These phloem-feeding insects can colonize over 1000 species of plants worldwide and inflict severe economic losses to crops, mainly through the transmission of pathogenic viruses. Surprisingly, there is very little genomic information about whiteflies. As a starting point to genome sequencing, we report a new estimation of the genome size of the B. tabaci B biotype or Middle East-Asia Minor 1 (MEAM1) population. Using an isogenic whitefly colony with over 6500 haploid male individuals for genomic DNA, three paired-end genomic libraries with insert sizes of ~300 bp, 500 bp and 1 Kb were constructed and sequenced on an Illumina HiSeq 2500 system. A total of ~50 billion base pairs of sequences were obtained from each library. K-mer analysis using these sequences revealed that the genome size of the whitefly was ~682.3 Mb. In addition, the flow cytometric analysis estimated the haploid genome size of the whitefly to be ~690 Mb. Considering the congruency between both estimation methods, we predict the haploid genome size of B. tabaci MEAM1 to be ~680-690 Mb. Our data provide a baseline for ongoing efforts to assemble and annotate the B. tabaci genome. PMID:26463411
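
    The k-mer arithmetic behind such an estimate is short: the genome size is approximately the total number of k-mers divided by the depth of the main (homozygous) coverage peak of the k-mer histogram. The sketch below uses an invented histogram; real input would typically be the two-column output of a k-mer counter's "histo" command.

```python
# k-mer genome-size sketch: total k-mers / homozygous coverage peak.
import numpy as np

# column 0: multiplicity (coverage depth), column 1: number of distinct k-mers
hist = np.array([
    [1, 9_500_000], [2, 1_200_000], [3,   400_000],   # sequencing-error tail
    [20,  600_000], [25, 4_800_000], [30, 6_900_000],
    [35, 5_100_000], [40,  900_000], [60,   120_000],  # repeats / contamination
], dtype=float)

depth, n_kmers = hist[:, 0], hist[:, 1]

# Ignore the low-coverage error tail before locating the main peak.
valid = depth >= 5
peak_depth = depth[valid][np.argmax(n_kmers[valid])]

total_kmers = np.sum(depth[valid] * n_kmers[valid])
genome_size = total_kmers / peak_depth
print(f"peak depth: {peak_depth:.0f}x, genome size: {genome_size / 1e6:.1f} Mb")
```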

  1. Estimation of the Whitefly Bemisia tabaci Genome Size Based on k-mer and Flow Cytometric Analyses

    Directory of Open Access Journals (Sweden)

    Wenbo Chen

    2015-07-01

    Full Text Available Whiteflies of the Bemisia tabaci (Hemiptera: Aleyrodidae) cryptic species complex are among the most important agricultural insect pests in the world. These phloem-feeding insects can colonize over 1000 species of plants worldwide and inflict severe economic losses to crops, mainly through the transmission of pathogenic viruses. Surprisingly, there is very little genomic information about whiteflies. As a starting point to genome sequencing, we report a new estimation of the genome size of the B. tabaci B biotype or Middle East-Asia Minor 1 (MEAM1) population. Using an isogenic whitefly colony with over 6500 haploid male individuals for genomic DNA, three paired-end genomic libraries with insert sizes of ~300 bp, 500 bp and 1 Kb were constructed and sequenced on an Illumina HiSeq 2500 system. A total of ~50 billion base pairs of sequences were obtained from each library. K-mer analysis using these sequences revealed that the genome size of the whitefly was ~682.3 Mb. In addition, the flow cytometric analysis estimated the haploid genome size of the whitefly to be ~690 Mb. Considering the congruency between both estimation methods, we predict the haploid genome size of B. tabaci MEAM1 to be ~680–690 Mb. Our data provide a baseline for ongoing efforts to assemble and annotate the B. tabaci genome.

  2. Partial reconfiguration of a peripheral in an FPGA-based SoC to analyse performance-area behaviour

    Science.gov (United States)

    Cardona, Andres; Guo, Yi; Ferrer, Carles

    2011-05-01

    Systems on Chip (SoC) are present in a wide range of applications. This diversity, together with the number of critical variables involved in the design process, makes SoC design a challenging topic. FPGAs have become a preferred device for developing and prototyping SoCs, and consequently Partial Reconfiguration (PR) has gained importance in this approach. Through PR it is possible to keep one section of the FPGA operating while another section is disabled and partially reconfigured to provide new functionality. In this way hardware resources can be time-multiplexed, making it possible to reduce size, cost and power. Here we focus on the implementation of a SoC, on an FPGA-based board, with one of its peripherals being a reconfigurable partition (RP). Inside this RP, different hardware modules defined as reconfigurable modules (RM) can be configured. Thus, the system can take on different hardware configurations depending on the application needs and FPGA limitations, while the rest of the system continues working. To this end a MicroBlaze soft-core processor is used in the system design and a Virtex-5 FPGA board is used for its implementation. A remote sensing application is used to explore the capabilities of this approach. By identifying the section(s) of the application suitable for being time-shared, it is possible to define the RMs to place inside the RP. Different configurations were carried out and measurements of area were taken. Preliminary results on performance-area utilisation are presented to validate the improvement in flexibility and resource usage.

  3. Semiclathrate-based CO2 capture from flue gas mixtures: An experimental approach with thermodynamic and Raman spectroscopic analyses

    International Nuclear Information System (INIS)

    Highlights: • Semiclathrates were used for post-combustion CO2 capture. • The highest gas uptake was observed for the TBAC (3.3 mol%) semiclathrate. • CO2 was enriched to approximately 60% in the semiclathrate phase. • Gas enclathration in the semiclathrate lattices was confirmed with Raman spectroscopy. - Abstract: Semiclathrate-based CO2 capture from flue gas in the presence of various quaternary ammonium salts (QASs) such as tetra-n-butyl ammonium bromide (TBAB), tetra-n-butyl ammonium chloride (TBAC), and tetra-n-butyl ammonium fluoride (TBAF) was investigated with a primary focus on the thermodynamic, kinetic, and spectroscopic aspects. The thermodynamic stability of the CO2 (20%) + N2 (80%) + QAS semiclathrates was examined with an isochoric method using a high pressure reactor as well as with dissociation enthalpy measurement using a high pressure micro-differential scanning calorimeter (HP μ-DSC). The TBAF semiclathrate with CO2 (20%) + N2 (80%) showed the most significant equilibrium pressure reduction at a specified temperature. However, the TBAC semiclathrate had the highest gas uptake and steepest CO2 concentration change in the vapor phase, which indicates the largest gas storage capacity for CO2 capture. CO2 was observed to be preferentially captured and enriched to approximately 60% in the semiclathrate phase. The CO2 selectivity was independent of the type of QASs used. The Raman spectroscopic results revealed that both CO2 and N2 are enclathrated in the small cages of the QAS semiclathrates and that the enclathration of guest gas molecules does not change the structure of the semiclathrates

  4. The generic MESSy submodel TENDENCY (v1.0) for process-based analyses in Earth system models

    Directory of Open Access Journals (Sweden)

    R. Eichinger

    2014-07-01

    Full Text Available The tendencies of prognostic variables in Earth system models are usually only accessible, e.g. for output, as a sum over all physical, dynamical and chemical processes at the end of one time integration step. Information about the contribution of individual processes to the total tendency is lost, if no special precautions are implemented. The knowledge on individual contributions, however, can be of importance to track down specific mechanisms in the model system. We present the new MESSy (Modular Earth Submodel System) infrastructure submodel TENDENCY and use it exemplarily within the EMAC (ECHAM/MESSy Atmospheric Chemistry) model to trace process-based tendencies of prognostic variables. The main idea is the outsourcing of the tendency accounting for the state variables from the process operators (submodels) to the TENDENCY submodel itself. In this way, a record of the tendencies of all process–prognostic variable pairs can be stored. The selection of these pairs can be specified by the user, tailor-made for the desired application, in order to minimise memory requirements. Moreover, a standard interface allows the access to the individual process tendencies by other submodels, e.g. for on-line diagnostics or for additional parameterisations, which depend on individual process tendencies. An optional closure test assures the correct treatment of tendency accounting in all submodels and thus serves to reduce the model's susceptibility. TENDENCY is independent of the time integration scheme and therefore the concept is applicable to other model systems as well. Test simulations with TENDENCY show an increase of computing time for the EMAC model (in a setup without atmospheric chemistry) of 1.8 ± 1% due to the additional subroutine calls when using TENDENCY. Exemplary results reveal the dissolving mechanisms of the stratospheric tape recorder signal in height over time. The separation of the tendency of the specific humidity into the respective
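
    The central bookkeeping idea (processes register their individual tendency contributions, and a closure test checks that the recorded contributions sum to the total applied tendency) can be mimicked in a few lines. The sketch below is language-agnostic pseudocode in Python, not the actual Fortran TENDENCY interface inside MESSy/EMAC.

```python
# Sketch of TENDENCY-style per-process tendency accounting with a closure test.
import numpy as np

class TendencyAccountant:
    def __init__(self, shape):
        self.records = {}       # (process, variable) -> accumulated tendency
        self.total = {}         # variable -> total applied tendency
        self.shape = shape

    def add(self, process, variable, tend):
        key = (process, variable)
        self.records[key] = self.records.get(key, np.zeros(self.shape)) + tend
        self.total[variable] = self.total.get(variable, np.zeros(self.shape)) + tend

    def closure_test(self, variable, atol=1e-12):
        recorded = sum(t for (p, v), t in self.records.items() if v == variable)
        return np.allclose(recorded, self.total[variable], atol=atol)

acc = TendencyAccountant(shape=(4,))             # e.g. 4 grid cells
acc.add("advection", "q", np.array([1e-6, -2e-6, 0.5e-6, 0.0]))
acc.add("convection", "q", np.array([0.2e-6, 0.1e-6, -0.3e-6, 0.0]))
acc.add("large_scale_cloud", "q", np.array([-0.1e-6, 0.0, 0.0, 0.4e-6]))
print("closure test passed:", acc.closure_test("q"))
```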

  5. LC-DAD-UV and LC-ESI-MS-based Analyses, Antioxidant Capacity, and Antimicrobial Activity of a Polar Fraction from Iryanthera ulei Leaves

    OpenAIRE

    Freddy A. Bernal; Luis E. Cuca-Suárez; Yamaguchi, Lydia F.; Ericsson D. Coy-Barrera

    2013-01-01

    LC-DAD-UV and LC-ESI-MS-based analyses were performed in order to chemically characterize a phenol-enriched fraction obtained from Iryanthera ulei leaves-derived ethanol extract. Eight glycosylated flavonoids, two free-flavonoids and two neolignans were detected to be part of its isopropyl acetate-soluble (iPS) fraction. Presence of afzelin 1 was confirmed by isolation. Total Phenolic (TPC) and Total Flavonoid Contents (TFC), Antioxidant Capacity (DPPH, ABTS•+, and FRAP methods), as well a...

  6. LC-DAD-UV and LC-ESI-MS-based Analyses, Antioxidant Capacity, and Antimicrobial Activity of a Polar Fraction from Iryanthera ulei Leaves

    Directory of Open Access Journals (Sweden)

    Freddy A. Bernal

    2013-03-01

    Full Text Available LC-DAD-UV and LC-ESI-MS-based analyses were performed in order to chemically characterize a phenol-enriched fraction obtained from Iryanthera ulei leaves-derived ethanol extract. Eight glycosylated flavonoids, two free-flavonoids and two neolignans were detected to be part of its isopropyl acetate-soluble (iPS) fraction. Presence of afzelin 1 was confirmed by isolation. Total Phenolic (TPC) and Total Flavonoid Contents (TFC), Antioxidant Capacity (DPPH, ABTS•+, and FRAP methods), as well as antimicrobial activity against five strains were determined.

  7. Boron analyses in the reactor coolant system of French PWR by acid-base titration ([B]) and ICP-MS (10B atomic %): key to NPP safety

    International Nuclear Information System (INIS)

    Boron is widely used by nuclear power plants, and especially by EDF Pressurized Water Reactors, to control the neutron rate in the reactor coolant system and, in this way, the fission reaction. Boron analysis is thus a major safety factor that enables operators to guarantee the permanent control of the reactor. Two kinds of analyses carried out by EDF on the boron species, recently upgraded with respect to new method validation standards and developed to enhance the measurement quality by reducing uncertainties, will be discussed here: acid-base titration of boron, and boron isotopic composition by Inductively Coupled Plasma Mass Spectrometer - ICP MS. (authors)

  8. The CM SAF SSM/I-based total column water vapour climate data record: methods and evaluation against re-analyses and satellite

    Directory of Open Access Journals (Sweden)

    M. Schröder

    2012-09-01

    Full Text Available The "European Organisation for the Exploitation of Meteorological Satellites" (EUMETSAT Satellite Application Facility on Climate Monitoring (CM SAF aims at the provision and sound validation of well documented Climate Data Records (CDRs in sustained and operational environments. In this study, a total column water vapour (WVPA climatology from CM SAF is presented and inter-compared to water vapour data records from various data sources. Based on homogenised brightness temperatures from the Special Sensor Microwave Imager (SSM/I, a climatology of WVPA has been generated within the Hamburg Ocean-Atmosphere Fluxes and Parameters from Satellite (HOAPS framework. Within a research and operation transition activity the HOAPS data and operations capabilities have been successfully transferred to the CM SAF where the complete HOAPS data and processing schemes are hosted in an operational environment. An objective analysis for interpolation, kriging, has been developed and applied to the swath-based WVPA retrievals from the HOAPS data set. The resulting climatology consists of daily and monthly mean fields of WVPA over the global ice-free ocean. The temporal coverage ranges from July 1987 to August 2006. After a comparison to the precursor product the CM SAF SSM/I-based climatology has been comprehensively compared to different types of meteorological analyses from the European Centre for Medium-Range Weather Forecasts (ECMWF-ERA40, ERA INTERIM and operational analyses and from the Japan Meteorological Agency (JMA-JRA. This inter-comparison shows an overall good agreement between the climatology and the analyses, with daily absolute biases generally smaller than 2 kg m−2. The absolute bias to JRA and ERA INTERIM is typically smaller than 0.5 kg m−2. For the period 1991–2006, the root mean square error (RMSE to both reanalysis is approximately 2 kg m−2. As SSM/I WVPA and radiances are assimilated in JMA and all

  9. The CM SAF SSM/I-based total column water vapour climate data record: methods and evaluation against re-analyses and satellite

    Directory of Open Access Journals (Sweden)

    M. Schröder

    2013-03-01

    Full Text Available The European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT Satellite Application Facility on Climate Monitoring (CM SAF aims at the provision and sound validation of well documented Climate Data Records (CDRs in sustained and operational environments. In this study, a total column water vapour path (WVPA climatology from CM SAF is presented and inter-compared to water vapour data records from various data sources. Based on homogenised brightness temperatures from the Special Sensor Microwave Imager (SSM/I, a climatology of WVPA has been generated within the Hamburg Ocean–Atmosphere Fluxes and Parameters from Satellite (HOAPS framework. Within a research and operation transition activity the HOAPS data and operation capabilities have been successfully transferred to the CM SAF where the complete HOAPS data and processing schemes are hosted in an operational environment. An objective analysis for interpolation, namely kriging, has been applied to the swath-based WVPA retrievals from the HOAPS data set. The resulting climatology consists of daily and monthly mean fields of WVPA over the global ice-free ocean. The temporal coverage ranges from July 1987 to August 2006. After a comparison to the precursor product the CM SAF SSM/I-based climatology has been comprehensively compared to different types of meteorological analyses from the European Centre for Medium-Range Weather Forecasts (ECMWF-ERA40, ERA INTERIM and operational analyses and from the Japan Meteorological Agency (JMA–JRA. This inter-comparison shows an overall good agreement between the climatology and the analyses, with daily absolute biases generally smaller than 2 kg m−2. The absolute value of the bias to JRA and ERA INTERIM is typically smaller than 0.5 kg m−2. For the period 1991–2006, the root mean square error (RMSE for both reanalyses is approximately 2 kg m−2. As SSM/I WVPA and radiances are assimilated into JMA and all ECMWF analyses and
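
    The kriging used operationally sits inside the HOAPS/CM SAF processing chain; as a stand-alone illustration of the interpolation step, the following ordinary-kriging sketch uses an exponential covariance and solves the kriging system with a Lagrange multiplier for a single target location. Coordinates, range and sill are invented, and the real processing is of course far more elaborate.

```python
# Self-contained ordinary-kriging sketch (exponential covariance, one target point).
import numpy as np

def exp_cov(h, sill=4.0, corr_range=500.0):
    """Exponential covariance as a function of distance h (km); parameters assumed."""
    return sill * np.exp(-h / corr_range)

def ordinary_kriging(xy, z, target):
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    d0 = np.linalg.norm(xy - target, axis=-1)

    # Kriging system with unbiasedness constraint (Lagrange multiplier).
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_cov(d)
    A[n, n] = 0.0
    b = np.append(exp_cov(d0), 1.0)

    sol = np.linalg.solve(A, b)
    weights, mu = sol[:n], sol[n]
    estimate = weights @ z
    variance = exp_cov(0.0) - weights @ exp_cov(d0) - mu
    return estimate, variance

rng = np.random.default_rng(3)
xy = rng.uniform(0, 1000, size=(25, 2))          # swath sample positions (km)
z = 30 + 0.01 * xy[:, 0] + rng.normal(0, 1, 25)  # synthetic WVPA (kg m-2)
print(ordinary_kriging(xy, z, target=np.array([500.0, 500.0])))
```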

  10. Analyses of atmospheric pollutants in Hong Kong and the Pearl River Delta by observation-based methods

    Science.gov (United States)

    Yuan, Zibing

    Despite continuous efforts devoted to pollution control by the Hong Kong (HK) environmental authorities in the past decade, the air pollution in HK has been deteriorating in recent years. In this thesis work a variety of observation-based approaches were applied to analyze the air pollutant monitoring data in HK and the Pearl River Delta (PRD) area. The two major pollutants of interest are ozone and respirable suspended particulate (RSP, or PM10), which exceed the Air Quality Objective more frequently. Receptor models serve as powerful tools for source identification, estimation of source contributions, and source localization when incorporated with wind profiles. This thesis work presents the first-ever application of two advanced receptor models, positive matrix factorization (PMF) and Unmix, on the PM10 and VOCs speciation data in HK. Speciated PM10 data were collected from a monitoring network in HK between July-1998 and Dec-2005. Seven and nine sources were identified by Unmix and PMF, respectively. Overall, secondary sulfate and vehicle emissions gave the largest contribution to PM10 (27% each), followed by biomass burning/waste incineration (13%) and secondary nitrate (11%). Sources were classified as local and regional based on their seasonal and spatial variations as well as source directional analysis. Regional sources accounted for about 56% of the ambient PM10 mass on an annual basis, and even higher (67%) during winter. Regional contributions also showed an increasing trend, with their annual averaged fraction rising from 53% in 1999 to 64% in 2005. The particulate pollution in HK is therefore sensitive to the regional influence, and regional air quality management strategies are crucial in reducing PM levels in HK. On the other hand, many species with significant adverse health impacts were produced locally. Local control measures should be strengthened for better protection of public health. Secondary organic carbon (SOC) could be a significant portion of
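
    As a simplified stand-in for the receptor modelling described above, plain non-negative matrix factorization of a (samples × chemical species) PM10 matrix illustrates the factor-decomposition idea; real PMF additionally weights every entry by its measurement uncertainty, and Unmix imposes different constraints. The data below are synthetic and the seven-factor choice simply mirrors the abstract.

```python
# Un-weighted NMF as an illustrative stand-in for PMF-style source apportionment.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(7)
n_samples, n_species, n_factors = 300, 20, 7

# Build a synthetic data set from known non-negative factors plus noise.
true_contributions = rng.gamma(shape=2.0, scale=1.0, size=(n_samples, n_factors))
true_profiles = rng.dirichlet(alpha=np.ones(n_species), size=n_factors)
X = true_contributions @ true_profiles + rng.normal(0, 0.01, (n_samples, n_species))
X = np.clip(X, 0, None)                    # NMF requires non-negative input

model = NMF(n_components=n_factors, init="nndsvda", max_iter=1000, random_state=0)
G = model.fit_transform(X)                 # factor contributions per sample
F = model.components_                      # factor profiles (species signatures)

# Total reconstructed mass per factor, as a crude apportionment.
factor_mass = G.sum(axis=0) * F.sum(axis=1)
print("relative factor contributions:", np.round(factor_mass / factor_mass.sum(), 3))
```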

  11. Late Frasnian-Famennian climates based on palynomorph analyses and the question of the Late Devonian glaciations

    Science.gov (United States)

    Streel, Maurice; Caputo, Mário V.; Loboziak, Stanislas; Melo, José Henrique G.

    2000-11-01

    Palynomorph distribution in Euramerica and western Gondwana, from the Latest Givetian to the Latest Famennian, may be explained, to some extent, by climatic changes. Detailed miospore stratigraphy dates accurately the successive steps of these changes. Interpretation is built on three postulates which are discussed: Euramerica at slightly lower latitudes than generally accepted by most paleomagnetic reconstructions; a conodont time-scale accepted as the most used available subdivision of time; and Late Devonian sea-level fluctuations mainly governed by glacio-eustasy. The Frasnian-Famennian timescale is also evaluated. The comparison, based on conodont correlations, between Givetian and most of the Frasnian miospore assemblages from, respectively, northern and southern Euramerica demonstrates a high taxonomic diversity in the equatorial belt and much difference between supposed equatorial and (sub) tropical vegetations. On the contrary, a similar vegetation pattern and therefore probably compatible climatic conditions were present from tropical to subpolar areas. A rather hot climate culminated during the Latest Frasnian when equatorial miospore assemblages reached their maximum width. The miospore diversity shows also a rather clear global Late Frasnian minimum which is also recorded during the Early and Middle Famennian but only in low latitude regions while, in high latitude, very cold climates without perennial snow may explain the scarcity of miospores and so, of vegetation. The Early and Middle Famennian conspicuous latitudinal gradient of the vegetation seems to attenuate towards the Late and Latest Famennian but this might be above all the result of the development, of cosmopolitan coastal lowland vegetations (downstream swamps) depending more on the moisture and equable local microclimates than on the probably adverse climates of distant hinterland areas. During that time, periods of cold climate without perennial snow cover and with rare vegetation may

  12. How and for whom does web-based acceptance and commitment therapy work? Mediation and moderation analyses of web-based ACT for depressive symptoms

    OpenAIRE

    Pots, Wendy T.M.; Trompetter, Hester R.; Schreurs, Karlein M. G.; Bohlmeijer, Ernst T.

    2016-01-01

    Background Acceptance and Commitment Therapy (ACT) has been demonstrated to be effective in reducing depressive symptoms. However, little is known how and for whom therapeutic change occurs, specifically in web-based interventions. This study focuses on the mediators, moderators and predictors of change during a web-based ACT intervention. Methods Data from 236 adults from the general population with mild to moderate depressive symptoms, randomized to either web-based ACT (n = 82) or one of t...

  13. Biofunctionalized magnetic nanospheres-based cell sorting strategy for efficient isolation, detection and subtype analyses of heterogeneous circulating hepatocellular carcinoma cells.

    Science.gov (United States)

    Chen, Lan; Wu, Ling-Ling; Zhang, Zhi-Ling; Hu, Jiao; Tang, Man; Qi, Chu-Bo; Li, Na; Pang, Dai-Wen

    2016-11-15

    Hepatocellular carcinoma (HCC) is an awful threat to human health. Early-stage HCC may be detected by isolation of circulating tumor cells (CTCs) from peripheral blood samples, which is beneficial to the diagnosis and therapy. However, the extreme rarity and high heterogeneity of HCC CTCs have been restricting the relevant research. To achieve an efficient isolation, reliable detection and subtype analyses of heterogeneous HCC CTCs, herein, we present a cell sorting strategy based on anti-CD45 antibody-modified magnetic nanospheres. By this strategy, leukocyte depletion efficiency was up to 99.9% within 30min in mimic clinical samples, and the purity of the spiked HCC cells was improved 265-317-fold. Besides, the isolated HCC cells remained viable at 92.3% and could be directly recultured. Moreover, coupling the convenient, fast and effective cell sorting strategy with specific ICC identification via biomarkers AFP and GPC3, HCC CTCs were detectable in peripheral blood samples, showing the potential for HCC CTC detection in clinic. Notably, this immunomagnetic cell sorting strategy enabled isolating more heterogeneous HCC cells compared with the established EpCAM-based methods, and further achieved characterization of three different CTC subtypes from one clinical HCC blood sample, which may assist clinical HCC analyses such as prognosis or personalized treatment. PMID:27240010

  14. Discriminant analyses of stock prices by using multifractality of time series generated via multi-agent systems and interpolation based on wavelet transforms

    Science.gov (United States)

    Tokinaga, Shozo; Ikeda, Yoshikazu

    In investments, it is not easy to identify traders' behavior from stock prices, and agent systems may help us. This paper deals with discriminant analyses of stock prices using the multifractality of time series generated via multi-agent systems and interpolation based on Wavelet Transforms. We assume five types of agents, where some of the agents prefer forecast equations or production rules. It is then shown that the artificial stock price time series behaves as a multifractal time series whose features are characterized by the Hausdorff dimension D(h). As a result, we see the relationship between the reliability (reproducibility) of multifractality and D(h) given a sufficient number of time series data. However, sufficient samples are generally needed to estimate D(h); we therefore use interpolation of multifractal time series based on the Wavelet Transform.
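
    One common structure-function recipe for estimating a singularity spectrum D(h) from a time series is sketched below: scaling exponents ζ(q) are fitted from moments of increments over a range of scales and converted to (h, D(h)) by a Legendre transform. The input here is a plain random walk (a monofractal, so D(h) should concentrate near h = 0.5); the agent-generated price series and the wavelet-based interpolation of the paper are not reproduced, and this is one of several possible formalisms.

```python
# Structure-function sketch of a D(h) estimate on a toy random-walk series.
import numpy as np

rng = np.random.default_rng(11)
x = np.cumsum(rng.standard_normal(2**16))        # toy "price" series

scales = np.unique(np.logspace(1, 3, 20).astype(int))
qs = np.linspace(-3, 3, 13)

# zeta(q): slope of log E|x(t+s) - x(t)|^q versus log s.
log_moments = np.array([[np.log(np.mean(np.abs(x[s:] - x[:-s]) ** q))
                         for s in scales] for q in qs])
zeta = np.array([np.polyfit(np.log(scales), row, 1)[0] for row in log_moments])

# Legendre transform: h(q) = d zeta / d q,  D(h) = 1 + q*h - zeta(q).
h = np.gradient(zeta, qs)
D = 1.0 + qs * h - zeta
for hq, Dq in zip(h, D):
    print(f"h = {hq:5.2f}   D(h) = {Dq:5.2f}")
```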

  15. Building-related symptoms among U.S. office workers and risk factors for moisture and contamination: Preliminary analyses of U.S. EPA BASE Data

    Energy Technology Data Exchange (ETDEWEB)

    Mendell, Mark J.; Cozen, Myrna

    2002-09-01

    The authors assessed relationships between health symptoms in office workers and risk factors related to moisture and contamination, using data collected from a representative sample of U.S. office buildings in the U.S. EPA BASE study. Methods: Analyses assessed associations between three types of weekly, work-related symptoms (lower respiratory, mucous membrane, and neurologic) and risk factors for moisture or contamination in these office buildings. Multivariate logistic regression models were used to estimate the strength of associations for these risk factors as odds ratios (ORs) adjusted for personal-level potential confounding variables related to demographics, health, job, and workspace. A number of risk factors were significantly associated (i.e., 95% confidence limits excluded 1.0) with small to moderate increases in one or more symptom outcomes. Significantly elevated ORs for mucous membrane symptoms were associated with the following risk factors: presence of humidification system in good condition versus none (OR = 1.4); air handler inspection annually versus daily (OR = 1.6); current water damage in the building (OR = 1.2); and less than daily vacuuming in study space (OR = 1.2). Significantly elevated ORs for lower respiratory symptoms were associated with: air handler inspection annually versus daily (OR = 2.0); air handler inspection less than daily but at least semi-annually (OR = 1.6); less than daily cleaning of offices (OR = 1.7); and less than daily vacuuming of the study space (OR = 1.4). Only two statistically significant risk factors for neurologic symptoms were identified: presence of any humidification system versus none (OR = 1.3); and less than daily vacuuming of the study space (OR = 1.3). Dirty cooling coils, dirty or poorly draining drain pans, and standing water near outdoor air intakes, evaluated by inspection, were not identified as risk factors in these analyses, despite predictions based on previous findings elsewhere, except that very
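
    As a hedged illustration of the modelling step described above, the sketch below fits a multivariate logistic regression on synthetic worker-level data and converts the coefficients to adjusted odds ratios with 95% confidence limits; the variable names are invented stand-ins for the BASE risk factors and covariates.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # synthetic stand-in for the worker-level BASE records (names are invented)
        rng = np.random.default_rng(0)
        n = 3000
        df = pd.DataFrame({
            "mucous_symptoms": rng.integers(0, 2, n),       # 1 = weekly, work-related symptom
            "humidifier_good_cond": rng.integers(0, 2, n),  # hypothetical risk-factor indicator
            "water_damage": rng.integers(0, 2, n),
            "age": rng.integers(18, 65, n),
            "female": rng.integers(0, 2, n),
            "smoker": rng.integers(0, 2, n),
        })

        # multivariate logistic regression; exponentiated coefficients are adjusted ORs
        model = smf.logit(
            "mucous_symptoms ~ humidifier_good_cond + water_damage + age + female + smoker",
            data=df,
        ).fit(disp=0)
        print(pd.concat([np.exp(model.params), np.exp(model.conf_int())], axis=1))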

  16. A comparison of geostatistically-based inverse techniques for use in performance assessment analyses at the WIPP site: results from Test Case No. 1

    International Nuclear Information System (INIS)

    The groundwater flow pathway in the Culebra Dolomite aquifer at the Waste Isolation Pilot Plant (WIPP) has been identified as a potentially important pathway for radionuclide migration to the accessible environment. Consequently, uncertainties in the models used to describe flow and transport in the Culebra need to be addressed. A 'Geostatistics Test Problem' is being developed to evaluate a number of inverse techniques that may be used for flow calculations in the WIPP performance assessment (PA). The Test Problem is actually a series of test cases, each being developed as a highly complex synthetic data set; the intent is for the ensemble of these data sets to span the range of possible conceptual models of groundwater flow at the WIPP site. This paper describes the results from Test Case No. 1. Of the five techniques compared, those based on the linearized form of the groundwater flow equation exhibited less bias and less spread in their GWTT distribution functions; the semi-analytical method had the least bias. While the results are not sufficient to make generalizations about which techniques may be better suited for the WIPP PA (only one test case has been exercised), analyses of the data from this test case provide some indication of the relative importance of other aspects of the flow modeling (besides inverse method or geostatistical approach) in PA. The ancillary analyses examine the effect of gridding and the effect of boundary conditions on the groundwater travel time estimates

  17. How and for whom does web-based acceptance and commitment therapy work? Mediation and moderation analyses of web-based ACT for depressive symptoms

    NARCIS (Netherlands)

    Pots, Wendy T.M.; Trompetter, Hester R.; Schreurs, Karlein M.G.; Bohlmeijer, Ernst T.

    2016-01-01

    Background Acceptance and Commitment Therapy (ACT) has been demonstrated to be effective in reducing depressive symptoms. However, little is known how and for whom therapeutic change occurs, specifically in web-based interventions. This study focuses on the mediators, moderators and predictors of ch

  18. Study of oxidation behaviour of Zr-based bulk amorphous alloy Zr65Cu17.5Ni10Al7.5 by thermogravimetric analyser

    Indian Academy of Sciences (India)

    A Dhawan; K Raetzke; F Faupel; S K Sharma

    2001-06-01

    The oxidation behaviour of Zr-based bulk amorphous alloy Zr65Cu17.5Ni10Al7.5 has been studied in an air environment at various temperatures in the temperature range 591–684 K using a thermogravimetric analyser (TGA). The oxidation kinetics of the alloy in the amorphous phase obeys the parabolic rate law for oxidation in the temperature range 591–664 K. The values of the activation energy and pre-factor as calculated from the Arrhenius temperature dependence of the rate constants have been found to be 1.80 eV and 2.12 × 10⁹ g cm⁻² s⁻¹/², respectively.

  19. An artificial neural network-based response surface method for reliability analyses of c-φ slopes with spatially variable soil

    Science.gov (United States)

    Shu, Su-xun; Gong, Wen-hui

    2016-03-01

    This paper presents an artificial neural network (ANN)-based response surface method that can be used to predict the failure probability of c-φ slopes with spatially variable soil. In this method, the Latin hypercube sampling technique is adopted to generate input datasets for establishing an ANN model; the random finite element method is then utilized to calculate the corresponding output datasets considering the spatial variability of soil properties; and finally, an ANN model is trained to construct the response surface of failure probability and obtain an approximate function that incorporates the relevant variables. The results of the illustrated example indicate that the proposed method provides credible and accurate estimations of failure probability. As a result, the obtained approximate function can be used as an alternative to the specific analysis process in c-φ slope reliability analyses.
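
    The workflow in the abstract (Latin hypercube design, expensive probabilistic finite-element runs, ANN response surface) can be sketched as below. The failure-probability function is a hypothetical stand-in for the random finite element model, and the parameter ranges are assumed for illustration only.

        import numpy as np
        from scipy.stats import qmc
        from sklearn.neural_network import MLPRegressor

        # Hypothetical stand-in for one expensive random-finite-element evaluation that
        # returns a failure probability for given soil statistics (e.g. COV of c and phi).
        def expensive_failure_probability(cov_c, cov_phi):
            return 1.0 / (1.0 + np.exp(-(3.0 * cov_c + 2.0 * cov_phi - 2.0)))

        # 1) Latin hypercube design over the assumed input ranges
        sampler = qmc.LatinHypercube(d=2, seed=1)
        X = qmc.scale(sampler.random(n=60), [0.1, 0.05], [0.5, 0.3])
        y = np.array([expensive_failure_probability(*row) for row in X])

        # 2) Train the ANN response surface on the (inputs, failure probability) pairs
        surrogate = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                                 random_state=0).fit(X, y)

        # 3) Cheap surrogate predictions replace further finite-element runs
        print(surrogate.predict([[0.3, 0.15]]))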

  20. Comparison of in vitro breast cancer visibility in analyser-based computed tomography with histopathology, mammography, computed tomography and magnetic resonance imaging.

    Science.gov (United States)

    Keyriläinen, Jani; Fernández, Manuel; Bravin, Alberto; Karjalainen-Lindsberg, Marja Liisa; Leidenius, Marjut; von Smitten, Karl; Tenhunen, Mikko; Kangasmäki, Aki; Sipilä, Petri; Nemoz, Christian; Virkkunen, Pekka; Suortti, Pekka

    2011-09-01

    High-resolution analyser-based X-ray imaging computed tomography (HR ABI-CT) findings on in vitro human breast cancer are compared with histopathology, mammography, computed tomography (CT) and magnetic resonance imaging. The HR ABI-CT images provided significantly better low-contrast visibility compared with the standard radiological images. Fine cancer structures indistinguishable and superimposed in mammograms were seen, and could be matched with the histopathological results. The mean glandular dose was less than 1 mGy in mammography and 12-13 mGy in CT and ABI-CT. The excellent visibility of in vitro breast cancer suggests that HR ABI-CT may have a valuable role in the future as an adjunct or even alternative to current breast diagnostics, when radiation dose is further decreased, and compact synchrotron radiation sources become available. PMID:21862846

  1. Analyses of the influencing factors of soil microbial functional gene diversity in tropical rainforest based on GeoChip 5.0

    Directory of Open Access Journals (Sweden)

    Jing Cong

    2015-09-01

    To examine soil microbial functional gene diversity and causative factors in tropical rainforests, we used a microarray-based metagenomic tool named GeoChip 5.0 to profile this diversity. We found high microbial functional gene diversity and distinct soil microbial metabolic potential for biogeochemical processes in the tropical rainforest. Soil available nitrogen was most strongly associated with soil microbial functional gene structure. Here, we mainly describe the experiment design, the data processing, and the soil biogeochemical analyses attached to the study, which was published in BMC Microbiology in 2015; the raw data have been deposited in NCBI's Gene Expression Omnibus (accession number GSE69171).

  2. A new internet-based tool for reporting and analysing patient-reported outcomes and the feasibility of repeated data collection from patients with myeloproliferative neoplasms

    DEFF Research Database (Denmark)

    Brochmann, Nana; Zwisler, Ann-Dorthe; Kjerholt, Mette;

    2016-01-01

    PURPOSE: An Internet-based tool for reporting and analysing patient-reported outcomes (PROs) has been developed. The tool enables merging PROs with blood test results and allows for computation of treatment responses. Data may be visualized by graphical analysis and may be exported for downstream...... statistical processing. The aim of this study was to investigate whether patients with myeloproliferative neoplasms (MPNs) were willing and able to use the tool and fill out questionnaires regularly. METHODS: Participants were recruited from the outpatient clinic at the Department of Haematology, Roskilde...... University Hospital, Denmark. Validated questionnaires that were used were European Organisation for Research and Treatment of Cancer Quality of Life Questionnaire-Core 30, Myeloproliferative Neoplasm Symptom Assessment Form, Brief Fatigue Inventory and Short Form 36 Health Survey. Questionnaires were filled...

  3. Development of x-ray scintillator functioning also as an analyser grating used in grating-based x-ray differential phase contrast imaging

    Institute of Scientific and Technical Information of China (English)

    Lei Yao-Hu; Liu Xin; Guo Jin-Chuan; Zhao Zhi-Gang; Niu Han-Ben

    2011-01-01

    In order to push grating-based phase contrast imaging systems towards use in hospitals and laboratories, this paper designs and develops a novel structure of x-ray scintillator functioning also as an analyser grating, which has been proposed for grating-based x-ray differential phase contrast imaging. According to this design, the scintillator should have a periodical structure in one dimension with the pitch equaling the period of the self-image of the phase grating at the Talbot distance, where one half of the pitch is pixellated and is made of x-ray sensitive fluorescent material, such as CsI(Tl), and the remaining part of the pitch is made of x-ray insensitive material, such as silicon. To realize the design, a deep pore array with a high aspect ratio and a specially designed grating pattern are successfully manufactured on a 5 inch silicon wafer by the photo-assisted electrochemical etching method. Other related problems, such as oxidation-caused geometrical distortion, the filling of CsI(Tl) into deep pores and the removal of inside bubbles, have been overcome. Its pixel size, depth and grating pitch are 3 μm × 7.5 μm, 150 μm and 3 μm, respectively. The microstructure of the scintillator has been examined microscopically and macroscopically by scanning electron microscope and x-ray resolution chart testing, respectively. The preliminary measurements have shown that the proposed scintillator, also functioning as an analyser grating, has been successfully designed and developed.
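
    The analyser pitch is tied to the Talbot self-imaging geometry mentioned above. The snippet below only checks orders of magnitude with the textbook Talbot length z_T = 2p²/λ; the grating period and design energy are assumed values, not taken from the paper, which in practice would use fractional Talbot distances.

        # Order-of-magnitude check of the Talbot geometry that fixes the analyser pitch.
        # The grating period and X-ray energy below are assumed, not taken from the paper.
        h_c_eV_nm = 1239.84                        # hc in eV*nm
        energy_eV = 28e3                           # assumed design energy
        wavelength_m = (h_c_eV_nm / energy_eV) * 1e-9
        p = 4e-6                                   # assumed phase-grating period (m)
        talbot_length = 2 * p ** 2 / wavelength_m  # textbook Talbot length z_T = 2 p^2 / lambda
        print(f"lambda = {wavelength_m:.3e} m, z_T = {talbot_length:.3f} m")
        # Practical Talbot-Lau setups work at fractional Talbot distances (z_T divided by
        # small integers), and a pi-shifting phase grating halves the self-image period,
        # which is the period the analyser pitch has to match.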

  4. A methodology for eliciting, representing, and analysing stakeholder knowledge for decision making on complex socio-ecological systems: from cognitive maps to agent-based models.

    Science.gov (United States)

    Elsawah, Sondoss; Guillaume, Joseph H A; Filatova, Tatiana; Rook, Josefine; Jakeman, Anthony J

    2015-03-15

    This paper aims to contribute to developing better ways for incorporating essential human elements in decision making processes for modelling of complex socio-ecological systems. It presents a step-wise methodology for integrating perceptions of stakeholders (qualitative) into formal simulation models (quantitative) with the ultimate goal of improving understanding and communication about decision making in complex socio-ecological systems. The methodology integrates cognitive mapping and agent based modelling. It cascades through a sequence of qualitative/soft and numerical methods comprising: (1) Interviews to elicit mental models; (2) Cognitive maps to represent and analyse individual and group mental models; (3) Time-sequence diagrams to chronologically structure the decision making process; (4) All-encompassing conceptual model of decision making, and (5) computational (in this case agent-based) Model. We apply the proposed methodology (labelled ICTAM) in a case study of viticulture irrigation in South Australia. Finally, we use strengths-weakness-opportunities-threats (SWOT) analysis to reflect on the methodology. Results show that the methodology leverages the use of cognitive mapping to capture the richness of decision making and mental models, and provides a combination of divergent and convergent analysis methods leading to the construction of an Agent Based Model. PMID:25622296

  5. Design premises for a KBS-3V repository based on results from the safety assessment SR-Can and some subsequent analyses

    Energy Technology Data Exchange (ETDEWEB)

    2009-11-15

    deterioration over the assessment period. The basic approach for prescribing such margins is to consider whether the design assessed in SR-Can Main report was sufficient to result in safety. In case this design would imply too strict requirements, and in cases the SR-Can design was judged inadequate or not sufficiently analysed in the SR-Can report, some additional analyses have been undertaken to provide a better basis for setting the design premises. The resulting design premises constitute design constraints, which, if all fulfilled, form a good basis for demonstrating repository safety, according to the analyses in SR-Can and subsequent analyses. Some of the design premises may be modified in future stages of SKB's programme, as a result of analyses based on more detailed site data and a more developed understanding of processes of importance for long-term safety. Furthermore, a different balance between design requirements may result in the same level of safety. This report presents one technically reasonable balance, whereas future development and evaluations may result in other balances being deemed as more optimal. It should also be noted that in developing the reference design, the production reports should give credible evidence that the final product after construction and quality control fulfils the specifications of the reference design. To cover uncertainties in production and quality control that may be difficult to quantify in detail at the present design stage, the developer of the reference design need usually consider a margin to the conditions that would verify the design premises, but whether there is a need for such margins lies outside the scope of the current document. The term 'withstand' is used in this document in descriptions of load cases on repository components. The statement that a component withstands a particular load means that it upholds its related safety function when exposed to the load in question. For example, if the

  6. Design premises for a KBS-3V repository based on results from the safety assessment SR-Can and some subsequent analyses

    International Nuclear Information System (INIS)

    deterioration over the assessment period. The basic approach for prescribing such margins is to consider whether the design assessed in SR-Can Main report was sufficient to result in safety. In case this design would imply too strict requirements, and in cases the SR-Can design was judged inadequate or not sufficiently analysed in the SR-Can report, some additional analyses have been undertaken to provide a better basis for setting the design premises. The resulting design premises constitute design constraints, which, if all fulfilled, form a good basis for demonstrating repository safety, according to the analyses in SR-Can and subsequent analyses. Some of the design premises may be modified in future stages of SKB's programme, as a result of analyses based on more detailed site data and a more developed understanding of processes of importance for long-term safety. Furthermore, a different balance between design requirements may result in the same level of safety. This report presents one technically reasonable balance, whereas future development and evaluations may result in other balances being deemed as more optimal. It should also be noted that in developing the reference design, the production reports should give credible evidence that the final product after construction and quality control fulfils the specifications of the reference design. To cover uncertainties in production and quality control that may be difficult to quantify in detail at the present design stage, the developer of the reference design need usually consider a margin to the conditions that would verify the design premises, but whether there is a need for such margins lies outside the scope of the current document. The term 'withstand' is used in this document in descriptions of load cases on repository components. The statement that a component withstands a particular load means that it upholds its related safety function when exposed to the load in question. For example, if the canister is said to

  7. Biomass feedstock analyses

    Energy Technology Data Exchange (ETDEWEB)

    Wilen, C.; Moilanen, A.; Kurkela, E. [VTT Energy, Espoo (Finland). Energy Production Technologies

    1996-12-31

    The overall objectives of the project 'Feasibility of electricity production from biomass by pressurized gasification systems' within the EC Research Programme JOULE II were to evaluate the potential of advanced power production systems based on biomass gasification and to study the technical and economic feasibility of these new processes with different types of biomass feedstocks. This report was prepared as part of this R and D project. The objectives of this task were to perform fuel analyses of potential woody and herbaceous biomasses with specific regard to the gasification properties of the selected feedstocks. The analyses of 15 Scandinavian and European biomass feedstocks included density, proximate and ultimate analyses, trace compounds, ash composition and fusion behaviour in oxidizing and reducing atmospheres. The wood-derived fuels, such as whole-tree chips, forest residues, bark and to some extent willow, can be expected to have good gasification properties. Difficulties caused by ash fusion and sintering in straw combustion and gasification are generally known. The ash and alkali metal contents of the European biomasses harvested in Italy resembled those of the Nordic straws, and it is expected that they behave to a great extent as straw in gasification. Any direct relation between the ash fusion behaviour (determined according to the standard method) and, for instance, the alkali metal content was not found in the laboratory determinations. A more profound characterisation of the fuels would require gasification experiments in a thermobalance and a PDU (Process Development Unit) rig. (orig.) (10 refs.)

  8. Effect of a novel motion correction algorithm (SSF) on the image quality of coronary CTA with intermediate heart rates: Segment-based and vessel-based analyses

    Energy Technology Data Exchange (ETDEWEB)

    Li, Qianwen, E-mail: qianwen18@126.com; Li, Pengyu, E-mail: lipyu818@gmail.com; Su, Zhuangzhi, E-mail: suzhuangzhi@xwh.ccmu.edu.cn; Yao, Xinyu, E-mail: 314985151@qq.com; Wang, Yan, E-mail: wy19851121@126.com; Wang, Chen, E-mail: fskwangchen@gmail.com; Du, Xiangying, E-mail: duxying_xw@163.com; Li, Kuncheng, E-mail: kuncheng.li@gmail.com

    2014-11-15

    Highlights: • SSF provided better image quality than single-sector and bi-sector reconstruction among the intermediate heart rates (65–75 bpm). • Evidence for the application of prospective ECG-triggered coronary CTA with SSF onto an expanded heart rate range. • Information about the inconsistent effectiveness of SSF among the segments of coronary artery. - Abstract: Purpose: To evaluate the effect of SnapShot Freeze (SSF) reconstruction at an intermediate heart-rate (HR) range (65–75 bpm) and compare this method with single-sector reconstruction and bi-sector reconstruction on segmental and vessel bases in retrospective coronary computed tomography angiography (CCTA). Materials and methods: Retrospective electrocardiogram-gated CCTA was performed on 37 consecutive patients with HR between 65 and 75 bpm using a 64-row CT scanner. Retrospective single-sector reconstruction, bi-sector reconstruction, and SSF were performed for each patient. Multi-phase single-sector reconstruction was performed to select the optimal phase. SSF and bi-sector images were also reconstructed at the optimal phase. The images were interpreted in an intent-to-diagnose fashion by two experienced readers using a 5-point scale, with 3 points as diagnostically acceptable. Image quality among the three reconstruction groups were compared on per-patient, per-vessel, and per-segment bases. Results: The average HR of the enrolled patients was 69.4 ± 2.7 bpm. A total of 111 vessels and 481 coronary segments were assessed. SSF provided significantly higher interpretability of the coronary segments than bi-sector reconstructions. The qualified and excellent rates of SSF (97.9% and 82.3%) were significantly higher than those of single-sector (92.9% and 66.3%) and bi-sector (90.9% and 64.7%) reconstructions. The image quality score (IQS) using SSF was also significantly higher than those of single-sector and bi-sector reconstructions both on per-patient and per-vessel bases. On per

  9. Effect of a novel motion correction algorithm (SSF) on the image quality of coronary CTA with intermediate heart rates: Segment-based and vessel-based analyses

    International Nuclear Information System (INIS)

    Highlights: • SSF provided better image quality than single-sector and bi-sector reconstruction among the intermediate heart rates (65–75 bpm). • Evidence for the application of prospective ECG-triggered coronary CTA with SSF onto an expanded heart rate range. • Information about the inconsistent effectiveness of SSF among the segments of coronary artery. - Abstract: Purpose: To evaluate the effect of SnapShot Freeze (SSF) reconstruction at an intermediate heart-rate (HR) range (65–75 bpm) and compare this method with single-sector reconstruction and bi-sector reconstruction on segmental and vessel bases in retrospective coronary computed tomography angiography (CCTA). Materials and methods: Retrospective electrocardiogram-gated CCTA was performed on 37 consecutive patients with HR between 65 and 75 bpm using a 64-row CT scanner. Retrospective single-sector reconstruction, bi-sector reconstruction, and SSF were performed for each patient. Multi-phase single-sector reconstruction was performed to select the optimal phase. SSF and bi-sector images were also reconstructed at the optimal phase. The images were interpreted in an intent-to-diagnose fashion by two experienced readers using a 5-point scale, with 3 points as diagnostically acceptable. Image quality among the three reconstruction groups were compared on per-patient, per-vessel, and per-segment bases. Results: The average HR of the enrolled patients was 69.4 ± 2.7 bpm. A total of 111 vessels and 481 coronary segments were assessed. SSF provided significantly higher interpretability of the coronary segments than bi-sector reconstructions. The qualified and excellent rates of SSF (97.9% and 82.3%) were significantly higher than those of single-sector (92.9% and 66.3%) and bi-sector (90.9% and 64.7%) reconstructions. The image quality score (IQS) using SSF was also significantly higher than those of single-sector and bi-sector reconstructions both on per-patient and per-vessel bases. On per

  10. All cause mortality and the case for age specific alcohol consumption guidelines: pooled analyses of up to 10 population based cohorts

    Science.gov (United States)

    Coombs, Ngaire; Stamatakis, Emmanuel; Biddulph, Jane P

    2015-01-01

    Objectives To examine the suitability of age specific limits for alcohol consumption and to explore the association between alcohol consumption and mortality in different age groups. Design Population based data from Health Survey for England 1998-2008, linked to national mortality registration data and pooled for analysis using proportional hazards regression. Analyses were stratified by sex and age group (50-64 and ≥65 years). Setting Up to 10 waves of the Health Survey for England, which samples the non-institutionalised general population resident in England. Participants The derivation of two analytical samples was based on the availability of comparable alcohol consumption data, covariate data, and linked mortality data among adults aged 50 years or more. Two samples were used, each utilising a different variable for alcohol usage: self reported average weekly consumption over the past year and self reported consumption on the heaviest day in the past week. In fully adjusted analyses, the former sample comprised Health Survey for England years 1998-2002, 18 368 participants, and 4102 deaths over a median follow-up of 9.7 years, whereas the latter comprised Health Survey for England years 1999-2008, 34 523 participants, and 4220 deaths over a median follow-up of 6.5 years. Main outcome measure All cause mortality, defined as any death recorded between the date of interview and the end of data linkage on 31 March 2011. Results In unadjusted models, protective effects were identified across a broad range of alcohol usage in all age-sex groups. These effects were attenuated across most use categories on adjustment for a range of personal, socioeconomic, and lifestyle factors. After the exclusion of former drinkers, these effects were further attenuated. Compared with self reported never drinkers, significant protective associations were limited to younger men (50-64 years) and older women (≥65 years). Among younger men, the range of protective effects was
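
    A hedged sketch of the pooled analysis step: a proportional hazards model of all-cause mortality with alcohol use and covariates, as one would fit within a single age-sex stratum. The data frame below is synthetic and the variable names are illustrative, not the survey's.

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        # synthetic stand-in for one age-sex stratum of the pooled, mortality-linked data
        rng = np.random.default_rng(2)
        n = 2000
        df = pd.DataFrame({
            "followup_years": rng.exponential(8.0, n).clip(0.1, 13.0),
            "died": rng.integers(0, 2, n),
            "drinks_weekly": rng.integers(0, 2, n),   # 1 = reports drinking in the past week
            "smoker": rng.integers(0, 2, n),
            "age": rng.integers(50, 65, n),
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="followup_years", event_col="died")
        print(cph.summary)   # hazard ratios (exp(coef)) with 95% confidence intervals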

  11. Long and short-term atmospheric radiation analyses based on coupled measurements at high altitude remote stations and extensive air shower modeling

    Science.gov (United States)

    Hubert, G.; Federico, C. A.; Pazianotto, M. T.; Gonzales, O. L.

    2016-02-01

    This paper describes the ACROPOL and OPD high-altitude stations devoted to characterizing atmospheric radiation fields. The ACROPOL platform, located at the summit of the Pic du Midi in the French Pyrenees at 2885 m above sea level, has operated scientific equipment since May 2011, including a BSS neutron spectrometer and detectors based on semiconductors and scintillators. In the framework of an IEAv and ONERA collaboration, a second neutron spectrometer has been operated in parallel since February 2015 at the summit of the Pico dos Dias in Brazil at 1864 m above sea level. Both high-altitude stations allow investigation of the long-term dynamics, i.e. the spectral variation of cosmic-ray-induced neutrons and the effects of local and seasonal changes, as well as the short-term dynamics during solar flare events. This paper presents long- and short-term analyses, including measurement and modeling investigations using data from both high-altitude stations. The modeling approach, based on the ATMORAD computational platform, was used to link the measurements from the two stations.

  12. Waste dumps rehabilitation measures based on physico-chemical analyses in Zăghid mining area (Sălaj County, Romania)

    Directory of Open Access Journals (Sweden)

    Ildiko M. Varga

    2011-08-01

    The present study deals with an abandoned coal mine from the Zăghid area, North-Western Transylvanian Basin (Sălaj County). The mining activity was stopped in 2005, without any attempt of ecological rehabilitation of the mined area and especially of the waste dumps left behind. The proposed rehabilitation models are based on physical-chemical analyses of soil and waste samples (e.g. pH, EC, salinity, humidity, porosity, density, plasticity, organic substances, mineralogical composition, heavy metals). An erosion map has been drawn based on the determined mineralogical composition (according to STAS 1913/5-85, using the Galton curve) of the tailings and the soil type. The values obtained for moisture and plasticity have been used to determine the ideal general inclination angle of the landfill systems in the studied perimeter. Through chemical analysis, heavy metals like Ni and Cu have been identified as the main pollution factors for surface and underground water. The concentration of heavy metals in the waters from the Zăghid area is high in the water bodies formed on the waste dumps, but also in the mine water. This analysis is useful in establishing the actual state of the waste dumps and their content, and the negative effects they exercise on the environment, in order to select the rehabilitation model for the waste dumps from the Zăghid mining area. The main measures consist in: waste dump leveling, soil remediation, perennial plant culture and acid mine water decontamination.

  13. Clinical map document based on XML (cMDX): document architecture with mapping feature for reporting and analysing prostate cancer in radical prostatectomy specimens

    Directory of Open Access Journals (Sweden)

    Bettendorf Olaf

    2010-11-01

    Background The pathology report of radical prostatectomy specimens plays an important role in clinical decisions and the prognostic evaluation in Prostate Cancer (PCa). The anatomical schema is a helpful tool to document PCa extension for clinical and research purposes. To achieve electronic documentation and analysis, an appropriate documentation model for anatomical schemas is needed. For this purpose we developed cMDX. Methods The document architecture of cMDX was designed according to Open Packaging Conventions by separating the whole data into template data and patient data. Analogue custom XML elements were considered to harmonize the graphical representation (e.g. tumour extension) with the textual data (e.g. histological patterns). The graphical documentation was based on the four-layer visualization model that forms the interaction between different custom XML elements. Sensitive personal data were encrypted with a 256-bit cryptographic algorithm to avoid misuse. In order to assess the clinical value, we retrospectively analysed the tumour extension in 255 patients after radical prostatectomy. Results The pathology report with cMDX can represent pathological findings of the prostate in schematic styles. Such reports can be integrated into the hospital information system. "cMDX" documents can be converted into different data formats like text, graphics and PDF. Supplementary tools like the cMDX Editor and an analyser tool were implemented. The graphical analysis of 255 prostatectomy specimens showed that PCa were mostly localized in the peripheral zone (mean: 73% ± 25%). 54% of PCa showed a multifocal growth pattern. Conclusions cMDX can be used for routine histopathological reporting of radical prostatectomy specimens and provide data for scientific analysis.
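
    To make the template/patient-data separation concrete, the snippet below builds a toy document with Python's ElementTree; the element and attribute names are invented for illustration and do not reproduce the actual cMDX schema.

        import xml.etree.ElementTree as ET

        # toy cMDX-like document: template data (the schema) kept apart from patient data
        root = ET.Element("cmdxDocument")

        template = ET.SubElement(root, "template")
        ET.SubElement(template, "schema", name="prostate", layers="4")

        patient = ET.SubElement(root, "patientData")
        finding = ET.SubElement(patient, "finding", type="tumourExtension")
        ET.SubElement(finding, "region", zone="peripheral", percentage="73")
        ET.SubElement(finding, "histology", gleason="3+4")

        print(ET.tostring(root, encoding="unicode"))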

  14. Time point-based integrative analyses of deep-transcriptome identify four signal pathways in blastemal regeneration of zebrafish lower jaw.

    Science.gov (United States)

    Zhang, Hui; Wang, Xuelong; Lyu, Kailun; Gao, Siqi; Wang, Guan; Fan, Chunxin; Zhang, Xin A; Yan, Jizhou

    2015-03-01

    There has been growing interest in applying tissue engineering to stem cell-based regeneration therapies. We have previously reported that zebrafish can faithfully regenerate complicated tissue structures through blastemal cell type conversions and tissue reorganization. To unveil the regenerative factors and engineering arts of blastemal regeneration, we conducted transcriptomal analyses at four time points corresponding to preamputation, re-epitheliation, blastemal formation, and respecification. By combining the hierarchical gene ontology term network, the DAVID annotation system, and Euclidean distance clustering, we identified four signaling pathways: foxi1-foxo1b-pou3f1, pax3a-mant3a-col11/col2, pou5f1-cdx4-kdrl, and isl1-wnt11 PCP-sox9a. Results from immunohistochemical staining and promoter-driven transgenic fish suggest that these pathways, respectively, define wound epidermis reconstitution, cell type conversions, blastemal angiogenesis/vasculogenesis, and cartilage matrix-orientation. Foxi1 morpholino-knockdown caused expansions of Foxo1b- and Pax3a-expression in the basal layer-blastemal junction region. Moreover, foxi1 morphants displayed increased sox9a and hoxa2b transcripts in the embryonic pharyngeal arches. Thus, a Foxi1 signal switch is required to establish correct tissue patterns, including re-epitheliation and blastema formation. This study provides novel insight into a blastema regeneration strategy devised by epithelial cell transdifferentiation, blood vessel engineering, and cartilage matrix deposition. PMID:25420467
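
    One of the analysis steps named above is Euclidean distance clustering of per-gene profiles across the four time points. The sketch below shows that step generically on a random expression matrix; it is not the authors' pipeline, and the cluster count is arbitrary.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        # random stand-in for a (genes x 4 time points) expression matrix
        rng = np.random.default_rng(3)
        expression = rng.normal(size=(500, 4))

        # agglomerative clustering of the per-gene profiles with Euclidean distance
        Z = linkage(expression, method="average", metric="euclidean")
        clusters = fcluster(Z, t=4, criterion="maxclust")   # arbitrary cut into 4 groups
        print(np.bincount(clusters)[1:])                    # genes per cluster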

  15. Associations of indoor carbon dioxide concentrations and environmental susceptibilities with mucous membrane and lower respiratory building related symptoms in the BASE study: Analyses of the 100 building dataset

    Energy Technology Data Exchange (ETDEWEB)

    Erdmann, Christine A.; Apte, Michael G.

    2003-09-01

    Using the US EPA 100 office-building BASE Study dataset, they conducted multivariate logistic regression analyses to quantify the relationship between indoor CO₂ concentrations (dCO₂) and mucous membrane (MM) and lower respiratory system (LResp) building related symptoms, adjusting for age, sex, smoking status, presence of carpet in workspace, thermal exposure, relative humidity, and a marker for entrained automobile exhaust. In addition, they tested the hypothesis that certain environmentally-mediated health conditions (e.g., allergies and asthma) confer increased susceptibility to building related symptoms within office buildings. Adjusted odds ratios (ORs) for statistically significant, dose-dependent associations (p < 0.05) for dry eyes, sore throat, nose/sinus congestion, and wheeze symptoms with 100 ppm increases in dCO₂ ranged from 1.1 to 1.2. These results suggest that increases in the ventilation rates per person among typical office buildings will, on average, reduce the prevalence of several building related symptoms by up to 70%, even when these buildings meet the existing ASHRAE ventilation standards for office buildings. Building occupants with certain environmentally-mediated health conditions are more likely to experience building related symptoms than those without these conditions (statistically significant ORs ranged from 2 to 11).

  16. Integrated ecotoxicological assessment of marine sediments affected by land-based marine fish farm effluents: physicochemical, acute toxicity and benthic community analyses.

    Science.gov (United States)

    Silva, C; Yáñez, E; Martín-Díaz, M L; Riba, I; DelValls, T A

    2013-08-01

    An integrated ecotoxicological assessment of marine sediments affected by land-based marine fish farm effluents was developed using physicochemical and benthic community structure analyses and standardised laboratory bioassays with bacteria (Vibrio fischeri), amphipods (Ampelisca brevicornis) and sea urchin larvae (Paracentrotus lividus). Intertidal sediment samples were collected at five sites of the Rio San Pedro (RSP) creek, from the aquaculture effluent to a clean site. The effective concentration (EC50) from bacterial bioluminescence, A. brevicornis survival on whole sediments and P. lividus larval developmental success on sediment elutriates were assessed. Numbers of species, abundance and Shannon diversity were the biodiversity indicators measured in the benthic fauna of sediment samples. In parallel, redox potential, pH, organic matter and metal levels (Cd, Cu, Ni, Pb and Zn) in the sediment and dissolved oxygen in the interstitial water were measured in situ. Water and sediment physicochemical analysis revealed a spatial gradient in the RSP, evidenced by hypoxia/anoxia, reduced and acidic conditions, and high organic enrichment and metal concentrations at the most contaminated sites. The benthic fauna biodiversity decreased, and the bioassays depicted decreases in EC50, A. brevicornis survival and P. lividus larval success, at sampling sites closer to the studied fish farms. This study demonstrates that sediments polluted by fish farm effluents may lead to alterations of the biodiversity of the exposed organisms. PMID:23681739
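
    The EC50 values referred to above come from dose-response curves. The sketch below shows one standard way to extract an EC50 by fitting a four-parameter logistic curve; the data points are invented and the routine is generic, not the standardised bioassay protocol itself.

        import numpy as np
        from scipy.optimize import curve_fit

        # four-parameter logistic dose-response curve; the points below are invented
        def four_pl(conc, bottom, top, ec50, hill):
            return bottom + (top - bottom) / (1.0 + (conc / ec50) ** hill)

        conc = np.array([1, 3, 10, 30, 100, 300], dtype=float)      # exposure level
        response = np.array([98, 95, 80, 45, 15, 5], dtype=float)   # % of control signal

        params, _ = curve_fit(four_pl, conc, response, p0=[0, 100, 30, 1], maxfev=10000)
        print(f"EC50 ~ {params[2]:.1f} (in the units of the concentration axis)")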

  17. Association between forgone care and household income among the elderly in five Western European countries – analyses based on survey data from the SHARE-study

    Directory of Open Access Journals (Sweden)

    Stirbu Irina

    2009-03-01

    Background Studies on the association between access to health care and household income have rarely included an assessment of 'forgone care', but this indicator could add to our understanding of the inverse care law. We hypothesize that reporting forgone care is more prevalent in low income groups. Methods The study is based on the 'Survey of Health, Ageing and Retirement in Europe' (SHARE), focusing on the non-institutionalized population aged 50 years or older. Data are included from France, Germany, Greece, Italy and Sweden. The dependent variable is assessed by the following question: During the last twelve months, did you forgo any types of care because of the costs you would have to pay, or because this care was not available or not easily accessible? The main independent variable is household income, adjusted for household size and split into quintiles, calculating the quintile limits for each country separately. Information on age, sex, self-assessed health and chronic disease is included as well. Logistic regression models were used for the multivariate analyses. Results The overall level of forgone care differs considerably between the five countries (e.g. about 10 percent in Greece and 6 percent in Sweden). Low income groups report forgone care more often than high income groups. This association can also be found in analyses restricted to the subsample of persons with chronic disease. Associations between forgone care and income are particularly strong in Germany and Greece. Taking the example of Germany, forgone care in the lowest income quintile is 1.98 times (95% CI: 1.08–3.63) as high as in the highest income quintile. Conclusion Forgone care should be reduced even if it is not justified by an 'objective' need for health care, as it could be an independent stressor in its own right, and as patient satisfaction is a strong predictor of compliance. These efforts should focus on population groups with particularly high

  18. Simultaneous detection of eight swine reproductive and respiratory pathogens using a novel GeXP analyser-based multiplex PCR assay.

    Science.gov (United States)

    Zhang, Minxiu; Xie, Zhixun; Xie, Liji; Deng, Xianwen; Xie, Zhiqin; Luo, Sisi; Liu, Jiabo; Pang, Yaoshan; Khan, Mazhar I

    2015-11-01

    A new high-throughput GenomeLab Gene Expression Profiler (GeXP) analyser-based multiplex PCR assay was developed for the detection of eight reproductive and respiratory pathogens in swine. The reproductive and respiratory pathogens include North American porcine reproductive and respiratory syndrome virus (PRRSV-NA), classical swine fever virus (CSFV), porcine circovirus 2 (PCV-2), swine influenza virus (SIV) (including H1 and H3 subtypes), porcine parvovirus (PPV), pseudorabies virus (PRV) and Japanese encephalitis virus (JEV). Nine pairs of specific chimeric primers were designed and used to initiate PCRs, and one pair of universal primers was used for subsequent PCR cycles. The specificity of the GeXP assay was examined using positive controls for each virus. The sensitivity was evaluated using serial ten-fold dilutions of in vitro-transcribed RNA from all of the RNA viruses and plasmids from DNA viruses. The GeXP assay was further evaluated using 114 clinical specimens and was compared with real-time PCR/single RT-PCR methods. The specificity of the GeXP assay for each pathogen was examined using single cDNA/DNA template. Specific amplification peaks of the reproductive and respiratory pathogens were observed on the GeXP analyser. The minimum copies per reaction detected for each virus by the GeXP assay were as follows: 1000 copies/μl for PRV; 100 copies/μl for CSFV, JEV, PCV-2 and PPV; and 10 copies/μl for SIV-H1, SIV-H3 and PRRSV-NA. Analysis of 114 clinical samples using the GeXP assay demonstrated that the GeXP assay had comparable detection to real-time PCR/single RT-PCR. This study demonstrated that the GeXP assay is a new method with high sensitivity and specificity for the identification of these swine reproductive and respiratory pathogens. The GeXP assay may be adopted for molecular epidemiological surveys of these reproductive and respiratory pathogens in swine populations. PMID:26259690

  19. EEG analyses with SOBI.

    Energy Technology Data Exchange (ETDEWEB)

    Glickman, Matthew R.; Tang, Akaysha (University of New Mexico, Albuquerque, NM)

    2009-02-01

    The motivating vision behind Sandia's MENTOR/PAL LDRD project has been that of systems which use real-time psychophysiological data to support and enhance human performance, both individually and of groups. Relevant and significant psychophysiological data being a necessary prerequisite to such systems, this LDRD has focused on identifying and refining such signals. The project has focused in particular on EEG (electroencephalogram) data as a promising candidate signal because it (potentially) provides a broad window on brain activity with relatively low cost and logistical constraints. We report here on two analyses performed on EEG data collected in this project using the SOBI (Second Order Blind Identification) algorithm to identify two independent sources of brain activity: one in the frontal lobe and one in the occipital. The first study looks at directional influences between the two components, while the second study looks at inferring gender based upon the frontal component.
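
    SOBI separates sources by jointly diagonalising several time-lagged covariance matrices. The sketch below implements the simpler single-lag relative of that idea (AMUSE) to convey the mechanics; it is not the full SOBI procedure used in the study, and the toy signals are invented.

        import numpy as np

        def amuse(X, lag=1):
            """Single-lag blind source separation (AMUSE); X is channels x samples."""
            X = X - X.mean(axis=1, keepdims=True)
            d, E = np.linalg.eigh(np.cov(X))                 # 1) whiten the data
            W = E @ np.diag(1.0 / np.sqrt(d)) @ E.T
            Z = W @ X
            C_tau = Z[:, lag:] @ Z[:, :-lag].T / (Z.shape[1] - lag)
            C_tau = (C_tau + C_tau.T) / 2.0                  # 2) symmetrised lagged covariance
            _, V = np.linalg.eigh(C_tau)                     # 3) rotation separating the sources
            return V.T @ Z

        # toy usage: two temporally structured sources mixed onto four "electrodes"
        rng = np.random.default_rng(4)
        t = np.arange(5000)
        S = np.vstack([np.sin(0.05 * t), np.sign(np.sin(0.013 * t))])
        S = S + 0.05 * rng.standard_normal(S.shape)
        A = rng.standard_normal((4, 2))                      # unknown mixing matrix
        X = A @ S + 0.05 * rng.standard_normal((4, t.size))  # sensor noise keeps cov full rank
        print(amuse(X).shape)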

  20. IPD-Work consortium: pre-defined meta-analyses of individual-participant data strengthen evidence base for a link between psychosocial factors and health.

    Science.gov (United States)

    Kivimäki, Mika; Singh-Manoux, Archana; Virtanen, Marianna; Ferrie, Jane E; Batty, G David; Rugulies, Reiner

    2015-05-01

    Established in 2008 and comprising over 60 researchers, the IPD-Work (individual-participant data meta-analysis in working populations) consortium is a collaborative research project that uses pre-defined meta-analyses of individual-participant data from multiple cohort studies representing a range of countries. The aim of the consortium is to estimate reliably the associations of work-related psychosocial factors with chronic diseases, disability, and mortality. Our findings are highly cited by the occupational health, epidemiology, and clinical medicine research community. However, some of the IPD-Work's findings have also generated disagreement as they challenge the importance of job strain as a major target for coronary heart disease (CHD) prevention, this is reflected in the critical discussion paper by Choi et al (1). In this invited reply to Choi et al, we aim to (i) describe how IPD-Work seeks to advance research on associations between work-related psychosocial risk factors and health; (ii) demonstrate as unfounded Choi et al's assertion that IPD-Work has underestimated associations between job strain and health endpoints; these include the dichotomous measurement of job strain, potential underestimation of the population attributable risk (PAR) of job strain for CHD, and policy implications arising from the findings of the IPD-Work consortium; and (iii) outline general principles for designing evidence-based policy and prevention from good-quality evidence, including future directions for research on psychosocial factors at work and health. In addition, we highlight some problems with Choi et al's approach. PMID:25654401

  1. Revisiting the phylogeny of Bombacoideae (Malvaceae): Novel relationships, morphologically cohesive clades, and a new tribal classification based on multilocus phylogenetic analyses.

    Science.gov (United States)

    Carvalho-Sobrinho, Jefferson G; Alverson, William S; Alcantara, Suzana; Queiroz, Luciano P; Mota, Aline C; Baum, David A

    2016-08-01

    Bombacoideae (Malvaceae) is a clade of deciduous trees with a marked dominance in many forests, especially in the Neotropics. The historical lack of a well-resolved phylogenetic framework for Bombacoideae hinders studies in this ecologically important group. We reexamined phylogenetic relationships in this clade based on a matrix of 6465 nuclear (ETS, ITS) and plastid (matK, trnL-trnF, trnS-trnG) DNA characters. We used maximum parsimony, maximum likelihood, and Bayesian inference to infer relationships among 108 species (∼70% of the total number of known species). We analyzed the evolution of selected morphological traits: trunk or branch prickles, calyx shape, endocarp type, seed shape, and seed number per fruit, using ML reconstructions of their ancestral states to identify possible synapomorphies for major clades. Novel phylogenetic relationships emerged from our analyses, including three major lineages marked by fruit or seed traits: the winged-seed clade (Bernoullia, Gyranthera, and Huberodendron), the spongy endocarp clade (Adansonia, Aguiaria, Catostemma, Cavanillesia, and Scleronema), and the Kapok clade (Bombax, Ceiba, Eriotheca, Neobuchia, Pachira, Pseudobombax, Rhodognaphalon, and Spirotheca). The Kapok clade, the most diverse lineage of the subfamily, includes sister relationships (i) between Pseudobombax and "Pochota fendleri" a historically incertae sedis taxon, and (ii) between the Paleotropical genera Bombax and Rhodognaphalon, implying just two bombacoid dispersals to the Old World, the other one involving Adansonia. This new phylogenetic framework offers new insights and a promising avenue for further evolutionary studies. In view of this information, we present a new tribal classification of the subfamily, accompanied by an identification key. PMID:27154210

  2. Evaluation of impact factors on PM2.5 based on long-term chemical components analyses in the megacity Beijing, China.

    Science.gov (United States)

    Chen, Yuan; Schleicher, Nina; Cen, Kuang; Liu, Xiuli; Yu, Yang; Zibat, Volker; Dietze, Volker; Fricker, Mathieu; Kaminski, Uwe; Chen, Yizhen; Chai, Fahe; Norra, Stefan

    2016-07-01

    Nine years of sampling and analyses of fine particles (PM2.5) were performed in Beijing from 2005 to 2013. Twenty-seven chemical elements and black carbon (BC) in PM2.5 were analyzed in order to study chemical characteristics and temporal distribution of Beijing aerosols. Principal component analysis defined different types of elemental sources, based on which the influences of a variety of anthropogenic activities, including governmental intervention measures, and natural sources on air quality were evaluated. For the first time, Ga is used as a tracer element for heating activities mainly using coal in Beijing, due to its correlation with BC and coal combustion, as well as its concentration variation between the heating and non-heating periods. The traffic restrictions effectively reduced emissions of relevant heavy metals such as As, Cd, Sn and Sb. The expected long-term effectiveness of the steel smelter relocation was not observed due to the nearby relocation with increased capacity. Firework displays during every Chinese Spring Festival season and special events such as the Olympic Games resulted in several times higher concentrations of K, Sr and Ba than on other days, and thus these elements were proposed as tracers for firework display. The impacts of all these factors were quantified and evaluated. Sand dust or dust storms induced higher concentrations of geogenic elements in PM2.5 compared to non-dust days. Sustainable mitigation measures, such as traffic restrictions, need to be continued and improved to obtain more "blue sky" days in the future. PMID:27115848
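
    The source-identification step rests on principal component analysis of the element concentration matrix. The sketch below runs that step generically on random data and reports the top loadings per component; the element names and data are placeholders, not the Beijing measurements.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        # random stand-in for a (samples x elements) concentration matrix
        rng = np.random.default_rng(5)
        elements = ["Al", "Fe", "Ga", "As", "Cd", "Sb", "K", "Ba", "BC"]
        X = rng.lognormal(size=(300, len(elements)))

        pca = PCA(n_components=3).fit(StandardScaler().fit_transform(X))
        for i, comp in enumerate(pca.components_):
            top = [elements[j] for j in np.argsort(np.abs(comp))[::-1][:3]]
            print(f"PC{i + 1}: {pca.explained_variance_ratio_[i]:.2f} of variance, top loadings {top}")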

  3. Comparative analyses of factors determining soil erosion rates based on network of Mediterranean monitored catchments for the innovative, adaptive and resilient agriculture of the future

    Science.gov (United States)

    Smetanová, Anna; Le Bissonnais, Yves; Raclot, Damien; Perdo Nunes, João; Licciardello, Feliciana; Mathys, Nicolle; Latron, Jérôme; Rodríguez Caballero, Emilio; Le Bouteiller, Caroline; Klotz, Sébastien; Mekki, Insaf; Gallart, Francesc; Solé Benet, Albert; Pérez Gallego, Nuria; Andrieux, Patrick; Jantzi, Hugo; Moussa, Roger; Planchon, Olivier; Marisa Santos, Juliana

    2015-04-01

    In order to project the soil erosion response to climate change in the fragile Mediterranean region, it is essential to understand its existing patterns. Soil erosion monitoring on a catchment scale enables analysis of the temporal and spatial variability of soil erosion and sediment delivery, while an integrating study of different catchments is often undertaken to decipher the general patterns. In this study, eight small catchments (with areas up to 1.32 km²), representative of the western part of the Mediterranean region (according to climate, bedrock, soils and main type of land use), were compared. These catchments, grouped in the R-OS Med Network, were situated in France (3), Spain (2), Portugal (1), Italy (1) and Tunisia (1). The average precipitation ranged from 236 to 1303 mm·a⁻¹ and the mean annual sediment yield varied from 7.5 to 6900 Mg·km⁻²·a⁻¹. The complex database was based on more than 120 years of hydrological and sediment data, with series between 3 and 29 years long. The variability of the sediment data was described on an annual and monthly basis. The relationship between the sediment yield and more than 35 factors influencing the sediment yield, including the characteristics of climate, topography, rainfall, runoff, land use, vegetation and soil cover, connectivity and dominant geomorphic processes, was studied. The preliminary results confirmed the differences in rainfall, runoff and sediment response, and revealed both the similarities and differences in the soil erosion responses of the catchments. They are further dependent on the variability of the factors themselves, with an important contribution from the state of soil properties, vegetation cover and land use. Anna Smetanová has received the support of the European Union, in the framework of the Marie-Curie FP7 COFUND People Programme, through the award of an AgreenSkills' fellowship (under grant agreement n° 267196)

  4. The impact of ancestral heath management on soils and landscapes. A reconstruction based on paleoecological analyses of soil records in the middle and southeast Netherlands.

    Science.gov (United States)

    van Mourik, Jan; Doorenbosch, Marieke

    2016-04-01

    The evolution of heath lands during the Holocene has been registered in various soil records. Paleoecological analyses of these records enable reconstruction of the changing economic and cultural management of heaths and the consequences for landscape and soils. Heaths are characteristic components of cultural landscape mosaics on sandy soils in the Netherlands. The natural habitat of heather species was moorland. At first, natural events like forest fires and storms caused small-scale forest degradation; in addition, forest degradation accelerated due to cultural activities like forest grazing, wood cutting and shifting cultivation. Heather plants invaded degraded forest soils and heaths developed. People learned to use the heaths for economic and cultural purposes. The impact of the heath management on landscape and soils was registered in soil records of barrows, drift sand sequences and plaggic Anthrosols. Based on pollen diagrams of such records we could reconstruct that heaths were developed and used for cattle grazing before the Bronze Age. During the Late Neolithic, the Bronze Age and Iron Age, people created the barrow landscape on the ancestral heaths. After the Iron Age people probably continued with cattle grazing on the heaths and plaggic agriculture until the Early Middle Ages. After 1000 AD two events affected the heaths. At first, deforestation for the sale of wood resulted in the first regional extension of sand drifting and heath degradation. After that, the introduction of the deep stable economy and heath sods digging resulted in acceleration of the rise of plaggic horizons, severe heath degradation and the second extension of sand drifting. At the end of the 19th century the heath lost its economic value due to the introduction of chemical fertilizers. The heaths were transformed into 'new' arable fields and forests and due to deep ploughing most soil archives were destroyed. Since 1980 AD, the remaining relicts of the ancestral heaths are

  5. Method for evaluating an extended Fault Tree to analyse the dependability of complex systems: application to a satellite-based railway system

    OpenAIRE

    NGUYEN, Khanh; Beugin, Julie; MARAIS, Juliette

    2015-01-01

    Evaluating the dependability of complex systems requires the evolution of the system states over time to be analysed. The problem is to develop modelling approaches that adequately take the evolution of the different operating and failed states of the system components into account. The Fault Tree (FT) is a well-known method that efficiently analyses the failure causes of a system and serves for reliability and availability evaluations. As FTs are not adapted to dynamic systems with repairable mult...
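
    For readers unfamiliar with the baseline the extended Fault Tree builds on, the sketch below shows the elementary gate arithmetic for independent basic events; the event probabilities and the GNSS/odometry example are invented, and the time-dependent, repairable behaviour addressed in the paper is not modelled here.

        # basic Fault Tree gate arithmetic for independent basic events (toy numbers)
        def and_gate(*p):                 # all inputs must fail
            prob = 1.0
            for x in p:
                prob *= x
            return prob

        def or_gate(*p):                  # any input failing is enough
            prob = 1.0
            for x in p:
                prob *= (1.0 - x)
            return 1.0 - prob

        gnss_loss = or_gate(1e-4, 5e-5)   # e.g. signal masking OR receiver fault (invented)
        odometry_loss = 2e-4
        print(and_gate(gnss_loss, odometry_loss))   # both position channels must fail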

  6. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    performance difficult. Likewise, a demonstration of the magnitude of conservatisms in the dose estimates that result from conservative inputs is difficult to determine. To respond to these issues, the DOE explored the significance of uncertainties and the magnitude of conservatisms in the SSPA Volumes 1 and 2 (BSC 2001 [DIRS 155950]; BSC 2001 [DIRS 154659]). The three main goals of this report are: (1) To briefly summarize and consolidate the discussion of much of the work that has been done over the past few years to evaluate, clarify, and improve the representation of uncertainties in the TSPA and performance projections for a potential repository. This report does not contain any new analyses of those uncertainties, but it summarizes in one place the main findings of that work. (2) To develop a strategy for how uncertainties may be handled in the TSPA and supporting analyses and models to support a License Application, should the site be recommended. It should be noted that the strategy outlined in this report is based on current information available to DOE. The strategy may be modified pending receipt of additional pertinent information, such as the Yucca Mountain Review Plan. (3) To discuss issues related to communication about uncertainties, and propose some approaches the DOE may use in the future to improve how it communicates uncertainty in its models and performance assessments to decision-makers and to technical audiences

  7. Chapter 5. Safety analyses

    International Nuclear Information System (INIS)

    the task focused on severe accident mitigation for nuclear power plants with VVER-440/V213. The main results of the previous analyses of nuclear power plant responses to severe accidents were summarised and the state of their analytical base performed in the past was evaluated in the year 2000 within the project. Possible severe accident mitigation and preventive measures were proposed and their applicability to nuclear power plants with VVER-440/V213 was investigated. After finishing the project, the obtained results will be used in the assessment activities and accident management of UJD. UJD has also been involved in the EVITA project, which is part of the 5th EC Framework Programme. The project aims at validation of the European computer code ASTEC, dedicated to severe accident modelling. In 2000, familiarisation with the computer code and some check calculations was started. Work on the project will also continue in 2001

  8. The impact of ancestral heath management on soils and landscapes: a reconstruction based on paleoecological analyses of soil records in the central and southeastern Netherlands

    Science.gov (United States)

    Doorenbosch, Marieke; van Mourik, Jan M.

    2016-07-01

    The evolution of heathlands during the Holocene has been registered in various soil records. Paleoecological analyses of these records enable reconstruction of the changing economic and cultural management of heaths and the consequences for landscape and soils. Heaths are characteristic components of cultural landscape mosaics on sandy soils in the Netherlands. The natural habitat of heather species was moorland. At first, natural events like forest fires and storms caused small-scale forest degradation; in addition on that, the forest degradation accelerated due to cultural activities like forest grazing, wood cutting, and shifting cultivation. Heather plants invaded degraded forest soils, and heaths developed. People learned to use the heaths for economic and cultural purposes. The impact of the heath management on landscape and soils was registered in soil records of barrows, drift sand sequences, and plaggic Anthrosols. Based on pollen diagrams of such records we could reconstruct that heaths were developed and used for cattle grazing before the Bronze Age. During the late Neolithic, the Bronze Age, and Iron Age, people created the barrow landscape on the ancestral heaths. After the Iron Age, people probably continued with cattle grazing on the heaths and plaggic agriculture until the early Middle Ages. Severe forest degradation by the production of charcoal for melting iron during the Iron Age till the 6th-7th century and during the 11th-13th century for the trade of wood resulted in extensive sand drifting, a threat to the valuable heaths. The introduction of the deep, stable economy and heath sods digging in the course of the 18th century resulted in acceleration of the rise of plaggic horizons, severe heath degradation, and again extension of sand drifting. At the end of the 19th century heath lost its economic value due to the introduction of chemical fertilizers. The heaths were transformed into "new" arable fields and forests, and due to deep ploughing

  9. The role of safety analyses in site selection. Some personal observations based on the experience from the Swiss site selection process

    Energy Technology Data Exchange (ETDEWEB)

    Zuidema, Piet [Nagra, Wettingen (Switzerland)

    2015-07-01

    In Switzerland, the site selection process according to the ''Sectoral Plan for Deep Geological Repositories'' (BFE 2008) has been underway since 2008. This process takes place in three stages. In stage 1, geological siting regions (six for the L/ILW repository and three for the HLW repository) have been identified; in stage 2, sites for the surface facilities have been identified for all siting regions in close co-operation with the siting regions, and a narrowing down of the number of siting regions based on geological criteria will take place. In stage 3, the sites for a general license application are selected and the general license applications will be submitted, which will eventually lead to the siting decision for both repository types. In the Swiss site selection process, safety has the highest priority. Many factors affect safety and thus a whole range of safety-related issues are considered in the identification and screening of siting possibilities. Besides dose calculations a range of quantitative and qualitative issues are considered. Dose calculations are performed in all three stages of the site selection process. In stage 1 generic safety calculations were made to develop criteria to be used for the identification of potential siting regions. In stage 2, dose calculations are made for comparing the different siting regions according to a procedure prescribed in detail by the regulator. Combined with qualitative evaluations this will lead to a narrowing down of the number of siting regions to at least two siting regions for each repository type. In stage 3 full safety cases will be prepared as part of the documentation for the general license applications. Besides the dose calculations, many other issues related to safety are analyzed in a quantitative and qualitative manner. These consider the 13 criteria defined in the Sectoral Plan and the corresponding indicators. The features analyzed cover the following broad themes: efficiency of

  10. The role of safety analyses in site selection. Some personal observations based on the experience from the Swiss site selection process

    International Nuclear Information System (INIS)

    In Switzerland, the site selection process according to the ''Sectoral Plan for Deep Geological Repositories'' (BFE 2008) has been underway since 2008. This process takes place in three stages. In stage 1, geological siting regions (six for the L/ILW repository and three for the HLW repository) have been identified; in stage 2, sites for the surface facilities have been identified for all siting regions in close co-operation with the siting regions, and a narrowing down of the number of siting regions based on geological criteria will take place. In stage 3, the sites for a general license application are selected and the general license applications will be submitted, which will eventually lead to the siting decision for both repository types. In the Swiss site selection process, safety has the highest priority. Many factors affect safety and thus a whole range of safety-related issues are considered in the identification and screening of siting possibilities. Besides dose calculations a range of quantitative and qualitative issues are considered. Dose calculations are performed in all three stages of the site selection process. In stage 1 generic safety calculations were made to develop criteria to be used for the identification of potential siting regions. In stage 2, dose calculations are made for comparing the different siting regions according to a procedure prescribed in detail by the regulator. Combined with qualitative evaluations this will lead to a narrowing down of the number of siting regions to at least two siting regions for each repository type. In stage 3 full safety cases will be prepared as part of the documentation for the general license applications. Besides the dose calculations, many other issues related to safety are analyzed in a quantitative and qualitative manner. These consider the 13 criteria defined in the Sectoral Plan and the corresponding indicators. The features analyzed cover the following broad themes: efficiency of

  11. Streptococcal taxonomy based on genome sequence analyses [v1; ref status: indexed, http://f1000r.es/o1

    Directory of Open Access Journals (Sweden)

    Cristiane C Thompson

    2013-03-01

    Full Text Available The identification of the clinically relevant viridans streptococci group, at species level, is still problematic. The aim of this study was to extract taxonomic information from the complete genome sequences of 67 streptococci, comprising 19 species, by means of genomic analyses, multilocus sequence analysis (MLSA), average amino acid identity (AAI), genomic signatures, genome-to-genome distances (GGD) and codon usage bias. We then attempted to determine the usefulness of these genomic tools for species identification in streptococci. Our results showed that MLSA, AAI and GGD analyses are robust markers to identify streptococci at the species level, for instance, S. pneumoniae, S. mitis, and S. oralis. A Streptococcus species can be defined as a group of strains that share ≥ 95% DNA similarity in MLSA and AAI, and > 70% DNA identity in GGD. This approach allows an advanced understanding of bacterial diversity.
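
    The species cut-offs quoted above (≥ 95% similarity in MLSA and AAI, > 70% identity in GGD) lend themselves to a simple decision rule. A minimal sketch of that rule is given below; the strain names and percentage values are invented for illustration and are not taken from the study.

      def same_species(mlsa: float, aai: float, ggd: float) -> bool:
          """True if two strains fall within one species under the abstract's
          thresholds (all values in percent)."""
          return mlsa >= 95.0 and aai >= 95.0 and ggd > 70.0

      # Hypothetical pairwise comparisons (illustrative values only)
      pairs = {
          ("strain A", "strain B"): (98.4, 97.9, 88.0),
          ("strain A", "strain C"): (93.1, 92.5, 41.0),
      }
      for (a, b), (mlsa, aai, ggd) in pairs.items():
          verdict = "same species" if same_species(mlsa, aai, ggd) else "different species"
          print(f"{a} vs {b}: {verdict}")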

  12. Identification of Distinct Breast Cancer Stem Cell Populations Based on Single-Cell Analyses of Functionally Enriched Stem and Progenitor Pools

    OpenAIRE

    Nina Akrap; Daniel Andersson; Eva Bom; Pernilla Gregersson; Anders Ståhlberg; Göran Landberg

    2016-01-01

    Summary The identification of breast cancer cell subpopulations featuring truly malignant stem cell qualities is a challenge due to the complexity of the disease and lack of general markers. By combining extensive single-cell gene expression profiling with three functional strategies for cancer stem cell enrichment including anchorage-independent culture, hypoxia, and analyses of low-proliferative, label-retaining cells derived from mammospheres, we identified distinct stem cell clusters in b...

  13. An online mass-based gas analyser for simultaneous determination of H2, CH4, CO, N2 and CO2: an automated sensor for process monitoring in industry

    International Nuclear Information System (INIS)

    An automated gas analyser has been designed, constructed and installed for online monitoring of H2, CH4, CO and CO2 in the reduction plant at Mobarakeh Steel Company, Iran. A small, low-resolution mass spectrometer is used in this instrument. The analyser accepts the sample directly at ambient pressure. Mass spectra are converted to percentages of the species in the mixture based on a derived mathematical expression and specially developed software. The instrument is capable of analysing six different gas inputs simultaneously and achieved a precision of better than 3%. (paper)
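
    The abstract states that mass spectra are converted to mixture percentages through a derived mathematical expression. A common way to do this is to solve a calibration (sensitivity) matrix by least squares; the sketch below illustrates that idea only, and the peak set and matrix values are invented, not the instrument's actual calibration.

      import numpy as np

      # Hypothetical sensitivity matrix: rows = m/z peaks (2, 16, 14, 28, 44),
      # columns = species (H2, CH4, CO, N2, CO2). Values are illustrative only.
      S = np.array([
          [1.00, 0.00, 0.00, 0.00, 0.00],   # m/z 2  (H2)
          [0.00, 1.00, 0.00, 0.00, 0.00],   # m/z 16 (CH4)
          [0.00, 0.16, 0.00, 0.07, 0.00],   # m/z 14 (CH4 and N2 fragments)
          [0.00, 0.02, 1.00, 1.00, 0.11],   # m/z 28 (CO, N2, CO2 fragment)
          [0.00, 0.00, 0.00, 0.00, 1.00],   # m/z 44 (CO2)
      ])
      peaks = np.array([0.30, 0.10, 0.05, 0.55, 0.08])   # measured peak intensities

      x, *_ = np.linalg.lstsq(S, peaks, rcond=None)      # per-species partial signals
      x = np.clip(x, 0.0, None)
      percentages = 100.0 * x / x.sum()                  # normalise to 100 %
      for name, p in zip(["H2", "CH4", "CO", "N2", "CO2"], percentages):
          print(f"{name}: {p:.1f} %")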

  14. Heat integration options based on pinch and exergy analyses of a thermosolar and heat pump in a fish tinning industrial process

    International Nuclear Information System (INIS)

    Thermosolar technology is being inserted gradually in industrial activities. In order to reach high energy efficiency, thermosolar can be linked to heat pump technology, combining more efficient conventional and renewable energy support for processes. Their integration in complex processes can be improved systematically through well established analytical tools, like pinch and exergy analyses. This work presents a methodological procedure for the analysis of different options of heat integration of a solar thermal and heat pump technologies in a tuna fish tinning process. The plant is located in a climatic zone where diffuse irradiation contributes more energy to the process than beam irradiation does. Pinch and exergy analyses are applied in the context of a low and middle temperatures, where the process demands big amounts of hot water and middle pressure steam. In order to recover internal heat, pinch analysis allows to understand the complexity of the heat exchange network of the process and to define thermal tendency objectives for energy optimization. Exergy analysis quantifies the variation that the quality of energy undergoes while it is used in the process according to the different way of integration. Both analytical tools, in combination with economical variables, provide a powerful methodological procedure finding the most favourable heat integration and, by this, they help in the technological decision making and in the design phase. - Highlights: ► Integration of solar thermal energy in batch canning process was assessed. ► Pinch and exergy analyses were used to determine the optimal energy supply configuration. ► Combination of heat pump and solar thermal energy improves the energy efficiency and reduces fossil fuel consumption

  15. Comparison Based on Exergetic Analyses of Two Hot Air Engines: A Gamma Type Stirling Engine and an Open Joule Cycle Ericsson Engine

    OpenAIRE

    Houda Hachem; Marie Creyx; Ramla Gheith; Eric Delacourt; Céline Morin; Fethi Aloui; Sassi Ben Nasrallah

    2015-01-01

    In this paper, a comparison of exergetic models between two hot air engines (a Gamma type Stirling prototype having a maximum output mechanical power of 500 W and an Ericsson hot air engine with a maximum power of 300 W) is made. Referring to previous energetic analyses, exergetic models are set up in order to quantify the exergy destruction and efficiencies in each type of engine. The repartition of the exergy fluxes in each part of the two engines are determined and represented in Sankey di...

  16. Age at Menarche, Level of Education, Parity and the Risk of Hysterectomy: A Systematic Review and Meta-Analyses of Population-Based Observational Studies.

    Directory of Open Access Journals (Sweden)

    Louise F Wilson

    Full Text Available Although rates have declined, hysterectomy is still a frequent gynaecological procedure. To date, there has been no systematic quantification of the relationships between early/mid-life exposures and hysterectomy. We performed a systematic review and meta-analyses to quantify the associations between age at menarche, education level, parity and hysterectomy. Eligible studies were identified by searches in PubMed and Embase through March 2015. Study-specific estimates were summarised using random effects meta-analysis. Heterogeneity was explored using sub-group analysis and meta-regression. Thirty-two study populations were identified for inclusion in at least one meta-analysis. Each year older at menarche was associated with lower risk of hysterectomy (summary hazard ratio 0.86, 95% confidence interval: 0.78, 0.95, I2 = 0%; summary odds ratio 0.88, 95% confidence interval: 0.82, 0.94, I2 = 61%). Low education levels conferred a higher risk of hysterectomy in the lowest versus highest level meta-analysis (summary hazard ratio 1.87, 95% confidence interval: 1.25, 2.80, I2 = 86%; summary odds ratio 1.51, 95% confidence interval: 1.35, 1.69, I2 = 90%) and in the dose-response meta-analysis (summary odds ratio 1.17, 95% confidence interval: 1.12, 1.23, I2 = 85%, per each level lower of education). Sub-group analysis showed that the birth cohort category of study participants, the reference category used for level of education, the year the included article was published, quality of the study (as assessed by the authors) and control for the key variables accounted for the high heterogeneity between studies in the education level meta-analyses. In the meta-analyses of studies of parity and hysterectomy the results were not statistically significant. The present meta-analyses suggest that the early life factors of age at menarche and lower education level are associated with hysterectomy, although this evidence should be interpreted with some caution due to variance
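
    The summary estimates above come from random-effects meta-analysis of study-level ratios. A minimal DerSimonian-Laird pooling of log hazard ratios is sketched below to illustrate the computation; the three studies and their numbers are invented and are not the data from this review.

      import numpy as np

      # Hypothetical per-study hazard ratios with 95 % confidence intervals
      hr      = np.array([0.84, 0.90, 0.80])
      ci_low  = np.array([0.74, 0.78, 0.65])
      ci_high = np.array([0.95, 1.04, 0.98])

      y  = np.log(hr)                                  # effects on the log scale
      se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
      w  = 1.0 / se**2                                 # fixed-effect weights

      # DerSimonian-Laird between-study variance (tau^2) and heterogeneity I^2
      y_fixed = np.sum(w * y) / np.sum(w)
      Q  = np.sum(w * (y - y_fixed) ** 2)
      df = len(y) - 1
      tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
      i2   = max(0.0, (Q - df) / Q) if Q > 0 else 0.0

      w_re  = 1.0 / (se**2 + tau2)                     # random-effects weights
      y_re  = np.sum(w_re * y) / np.sum(w_re)
      se_re = np.sqrt(1.0 / np.sum(w_re))
      print(f"pooled HR = {np.exp(y_re):.2f} "
            f"({np.exp(y_re - 1.96*se_re):.2f}-{np.exp(y_re + 1.96*se_re):.2f}), "
            f"I2 = {100*i2:.0f}%")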

  17. New ORIGEN2 libraries based on JENDL-4.0 and their validation for long-lived fission products by post irradiation examination analyses of LWR spent fuels

    International Nuclear Information System (INIS)

    Accurate inventory estimation of long-lived fission products (LLFPs) in LWR spent fuels is important for the quality management of high-level radioactive vitrified wastes and for long-term safety assessment of the vitrified wastes. In Japan, ORIGEN2 has been widely used to estimate the spent fuel compositions of light water reactors. However, equipped library data in the original ORIGEN2 code are old and are not validated enough for LLFPs, such as 79Se, 99Tc, 126Sn and 135Cs, because available post irradiation examination (PIE) data are limited for these nuclides, which have difficulties in radiochemical analyses compared with other fission products or actinides. For more accurate inventory estimation of the spent fuel composition, new ORIGEN2 libraries are developed from the latest Japanese nuclear data library JENDL-4.0 for neutron-induced cross sections and fission yields, and from other recent nuclear data libraries for decay half-lives, and so on. The new libraries are validated by PIE analyses of the sample fuels irradiated in Cooper (BWR), Calvert-Cliffs-1 (PWR) and H. B. Robinson-2 (PWR) in USA, and of a sample fuel in the recent Japanese PIE data from Ohi-1 (PWR). As a result, it was found that the new library gives good results for the inventory estimation of LLFPs. (author)

  18. THE VALIDITY OF USING ROC SOFTWARE FOR ANALYSING VISUAL GRADING CHARACTERISTICS DATA: AN INVESTIGATION BASED ON THE NOVEL SOFTWARE VGC ANALYZER.

    Science.gov (United States)

    Hansson, Jonny; Månsson, Lars Gunnar; Båth, Magnus

    2016-06-01

    The purpose of the present work was to investigate the validity of using single-reader-adapted receiver operating characteristics (ROC) software for analysis of visual grading characteristics (VGC) data. VGC data from four published VGC studies on optimisation of X-ray examinations, previously analysed using ROCFIT, were reanalysed using recently developed software dedicated to VGC analysis (VGC Analyzer), and the outcomes [the mean and 95 % confidence interval (CI) of the area under the VGC curve (AUCVGC) and the p-value] were compared. The studies included both paired and non-paired data and were reanalysed both for the fixed-reader and the random-reader situations. The results showed good agreement between the two software packages for the mean AUCVGC. For non-paired data, wider CIs were obtained with VGC Analyzer than previously reported, whereas for paired data, the previously reported CIs were similar or even broader. Similar observations were made for the p-values. The results indicate that the use of single-reader-adapted ROC software such as ROCFIT for analysing non-paired VGC data may lead to an increased risk of committing Type I errors, especially in the random-reader situation. On the other hand, the use of ROC software for analysis of paired VGC data may lead to an increased risk of committing Type II errors, especially in the fixed-reader situation. PMID:26979808
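
    The quantity AUCVGC compared above is, at its core, a rank-based (Mann-Whitney) area estimate computed from ordinal image-quality ratings in two conditions. The sketch below shows that point estimate only, with invented ratings; it does not reproduce the resampling schemes of VGC Analyzer or ROCFIT.

      import numpy as np

      def auc_vgc(ratings_ref, ratings_test):
          """Nonparametric AUC: probability that a rating from the test condition
          exceeds one from the reference condition, ties counted as 1/2."""
          ref  = np.asarray(ratings_ref,  dtype=float)
          test = np.asarray(ratings_test, dtype=float)
          greater = (test[:, None] > ref[None, :]).sum()
          ties    = (test[:, None] == ref[None, :]).sum()
          return (greater + 0.5 * ties) / (ref.size * test.size)

      # Invented ordinal quality scores (1-5) for two imaging conditions
      ref  = [2, 3, 3, 2, 4, 3, 2, 3]
      test = [3, 4, 3, 4, 5, 3, 4, 4]
      print(f"AUC_VGC = {auc_vgc(ref, test):.2f}")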

  19. Biomonitoring in a clean and a multi-contaminated estuary based on biomarkers and chemical analyses in the endobenthic worm Nereis diversicolor

    Energy Technology Data Exchange (ETDEWEB)

    Durou, Cyril [CNRS, Universite de Nantes, Pole Mer et Littoral, SMAB, 2 rue de la Houssiniere, BP 92208, F-44322 Nantes Cedex 3 (France) and Institut de Biologie et Ecologie Appliquees, CEREA, Universite Catholique de l' Ouest, 44 rue Rabelais, 49008 Angers Cedex 01 (France)]. E-mail: cyril.durou@uco.fr; Poirier, Laurence [CNRS, Universite de Nantes, Pole Mer et Littoral, SMAB, 2 rue de la Houssiniere, BP 92208, F-44322 Nantes Cedex 3 (France); Amiard, Jean-Claude [CNRS, Universite de Nantes, Pole Mer et Littoral, SMAB, 2 rue de la Houssiniere, BP 92208, F-44322 Nantes Cedex 3 (France); Budzinski, Helene [CNRS UMR 5472, LPTC, Universite de Bordeaux I, 33405 Talence (France); Gnassia-Barelli, Mauricette [UMR INRA UNSA 1112 ROSE, Faculte des Sciences, BP 71, 06108 Nice Cedex 2 (France); Lemenach, Karyn [CNRS UMR 5472, LPTC, Universite de Bordeaux I, 33405 Talence (France); Peluhet, Laurent [CNRS UMR 5472, LPTC, Universite de Bordeaux I, 33405 Talence (France); Mouneyrac, Catherine [CNRS, Universite de Nantes, Pole Mer et Littoral, SMAB, 2 rue de la Houssiniere, BP 92208, F-44322 Nantes Cedex 3 (France); Institut de Biologie et Ecologie Appliquees, CEREA, Universite Catholique de l' Ouest, 44 rue Rabelais, 49008 Angers Cedex 01 (France); Romeo, Michele [UMR INRA UNSA 1112 ROSE, Faculte des Sciences, BP 71, 06108 Nice Cedex 2 (France); Amiard-Triquet, Claude [CNRS, Universite de Nantes, Pole Mer et Littoral, SMAB, 2 rue de la Houssiniere, BP 92208, F-44322 Nantes Cedex 3 (France)

    2007-07-15

    Relationships between biochemical and physiological biomarkers (acetylcholinesterase [AChE], catalase, and glutathione S-transferase [GST] activities, thiobarbituric acid reactive substances, glycogen, lipids and proteins) and accumulated concentrations of contaminants (polychlorinated biphenyls [PCBs], polycyclic aromatic hydrocarbons and metals) were examined in the keystone species Nereis diversicolor. The chemical analyses of worms and sediments allowed the designation of the Seine estuary and the Authie estuary as a polluted and relatively clean site respectively. Worms from the Seine estuary exhibited higher GST and lower AChE activities. Generally, larger worms had higher concentrations of energy reserves. Principal component analyses clearly highlighted intersite differences: in the first plan, GST activities and chemical concentrations were inversely related to concentrations of energy reserves; in the second one, PCB concentrations and AChE activity were inversely related. Depleted levels of energy reserves could be a consequence of combating toxicants and might predict effects at higher levels of biological organization. The use of GST and AChE activities and energy reserve concentrations as biomarkers is validated in the field in this keystone species. - The use of N. diversicolor as a biomonitor of environmental quality via the measurement of biomarkers and accumulated concentrations of contaminants is validated in the field.

  20. Biomonitoring in a clean and a multi-contaminated estuary based on biomarkers and chemical analyses in the endobenthic worm Nereis diversicolor

    International Nuclear Information System (INIS)

    Relationships between biochemical and physiological biomarkers (acetylcholinesterase [AChE], catalase, and glutathione S-transferase [GST] activities, thiobarbituric acid reactive substances, glycogen, lipids and proteins) and accumulated concentrations of contaminants (polychlorinated biphenyls [PCBs], polycyclic aromatic hydrocarbons and metals) were examined in the keystone species Nereis diversicolor. The chemical analyses of worms and sediments allowed the designation of the Seine estuary and the Authie estuary as a polluted and relatively clean site respectively. Worms from the Seine estuary exhibited higher GST and lower AChE activities. Generally, larger worms had higher concentrations of energy reserves. Principal component analyses clearly highlighted intersite differences: in the first plan, GST activities and chemical concentrations were inversely related to concentrations of energy reserves; in the second one, PCB concentrations and AChE activity were inversely related. Depleted levels of energy reserves could be a consequence of combating toxicants and might predict effects at higher levels of biological organization. The use of GST and AChE activities and energy reserve concentrations as biomarkers is validated in the field in this keystone species. - The use of N. diversicolor as a biomonitor of environmental quality via the measurement of biomarkers and accumulated concentrations of contaminants is validated in the field

  1. Biomonitoring in a clean and a multi-contaminated estuary based on biomarkers and chemical analyses in the endobenthic worm Nereis diversicolor.

    Science.gov (United States)

    Durou, Cyril; Poirier, Laurence; Amiard, Jean-Claude; Budzinski, Hélène; Gnassia-Barelli, Mauricette; Lemenach, Karyn; Peluhet, Laurent; Mouneyrac, Catherine; Roméo, Michèle; Amiard-Triquet, Claude

    2007-07-01

    Relationships between biochemical and physiological biomarkers (acetylcholinesterase [AChE], catalase, and glutathione S-transferase [GST] activities, thiobarbituric acid reactive substances, glycogen, lipids and proteins) and accumulated concentrations of contaminants (polychlorinated biphenyls [PCBs], polycyclic aromatic hydrocarbons and metals) were examined in the keystone species Nereis diversicolor. The chemical analyses of worms and sediments allowed the designation of the Seine estuary and the Authie estuary as a polluted and relatively clean site respectively. Worms from the Seine estuary exhibited higher GST and lower AChE activities. Generally, larger worms had higher concentrations of energy reserves. Principal component analyses clearly highlighted intersite differences: in the first plan, GST activities and chemical concentrations were inversely related to concentrations of energy reserves; in the second one, PCB concentrations and AChE activity were inversely related. Depleted levels of energy reserves could be a consequence of combating toxicants and might predict effects at higher levels of biological organization. The use of GST and AChE activities and energy reserve concentrations as biomarkers is validated in the field in this keystone species. PMID:17289233

  2. The phylogenetic position of the Loimoidae Price, 1936 (Monogenoidea: Monocotylidea) based on analyses of partial rDNA sequences and morphological data.

    Science.gov (United States)

    Boeger, W A; Kritsky, D C; Domingues, M V; Bueno-Silva, M

    2014-06-01

    Phylogenetic analyses of partial sequences of 18S and 28S rDNA of some monogenoids, including monocotylids and a specimen of Loimosina sp. collected from a hammerhead shark off Brazil, indicated that the Loimoidae (as represented by the specimen of Loimosina sp.) represents an in-group taxon of the Monocotylidae. In all analyses, the Loimoidae fell within a major monocotylid clade including species of the Heterocotylinae, Decacotylinae, and Monocotylinae. The Loimoidae formed a terminal clade with two heterocotyline species, Troglocephalus rhinobatidis and Neoheterocotyle rhinobatis, for which it represented the sister taxon. The following morphological characters supported the clade comprising the Loimoidae, Heterocotylinae, Decacotylinae and Monocotylinae: single vagina present, presence of a narrow deep anchor root, and presence of a marginal haptoral membrane. The presence of cephalic pits was identified as a putative synapomorphy for the clade (Loimoidae (T. rhinobatidis, N. rhinobatis)). Although rDNA sequence data support the rejection of the Loimoidae and incorporating its species into the Monocotylidae, this action was not recommended pending a full phylogenetic analysis of morphological data. PMID:24491371

  3. Blood-based gene expression signatures of medication-free outpatients with major depressive disorder: integrative genome-wide and candidate gene analyses

    OpenAIRE

    Hiroaki Hori; Daimei Sasayama; Toshiya Teraishi; Noriko Yamamoto; Seiji Nakamura; Miho Ota; Kotaro Hattori; Yoshiharu Kim; Teruhiko Higuchi; Hiroshi Kunugi

    2016-01-01

    Several microarray-based studies have investigated gene expression profiles in major depressive disorder (MDD), yet with highly variable findings. We examined blood-based genome-wide expression signatures of MDD, focusing on molecular pathways and networks underlying differentially expressed genes (DEGs) and behaviours of hypothesis-driven, evidence-based candidate genes for depression. Agilent human whole-genome arrays were used to measure gene expression in 14 medication-free outpatients wi...

  4. Towards High-Quality Reflective Learning amongst Law Undergraduate Students: Analysing Students' Reflective Journals during a Problem-Based Learning Course

    Science.gov (United States)

    Rué, Joan; Font, Antoni; Cebrián, Gisela

    2013-01-01

    There is wide agreement that problem-based learning is a key strategy to promote individual abilities for "learning how to learn". This paper presents the main contributions that reflective journals and the problem-based learning approach can make to foster professional knowledge and quality learning in higher education. Thirty-six…

  5. Hydrologie du bassin de Marennes- Oléron. Analyse de la base de données "RAZLEC" 1977- 1995

    OpenAIRE

    Faury, Nicole; Razet, Daniel; Soletchnik, Patrick; GOULLETQUER Philippe; Ratiskol, Jacqueline; Garnier, Jacqueline

    1999-01-01

    The Marennes-Oléron basin has been the subject of hydrological monitoring since 1977. Five stations distributed across the basin are sampled twice a month. The evolution of the physical parameters (temperature, salinity, dissolved oxygen and suspended matter), nutrients (ammoniacal nitrogen, nitrite, nitrate and silicate), chlorophyll a and phaeopigments is examined through a statistical test (time series analysis) that decomposes the data according to the variation...

  6. Head and neck tumours: combined MRI assessment based on IVIM and TIC analyses for the differentiation of tumors of different histological types

    International Nuclear Information System (INIS)

    We evaluated the combined use of intravoxel incoherent motion (IVIM) and time-signal intensity curve (TIC) analyses to diagnose head and neck tumours. We compared perfusion-related parameters (PP) and molecular diffusion values (D) determined from IVIM theory and TIC profiles among 92 tumours with different histologies. IVIM parameters (f and D values) and TIC profiles in combination were distinct among the different types of head and neck tumours, including squamous cell carcinomas (SCCs), lymphomas, malignant salivary gland tumours, Warthin's tumours, pleomorphic adenomas and schwannomas. A multiparametric approach using both IVIM parameters and TIC profiles differentiated between benign and malignant tumours with 97 % accuracy and diagnosed different tumour types with 89 % accuracy. Combined use of IVIM parameters and TIC profiles has high efficacy in diagnosing head and neck tumours. (orig.)
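
    The perfusion fraction f and diffusion coefficient D referred to above are usually obtained by fitting the standard bi-exponential IVIM signal model to multi-b-value data; whether the study used exactly this parameterisation is an assumption here.

      % Standard IVIM signal model: f = perfusion fraction,
      % D = molecular diffusion coefficient, D* = pseudo-diffusion coefficient
      \[
        \frac{S(b)}{S_0} \;=\; f\,e^{-b D^{*}} \;+\; (1-f)\,e^{-b D}
      \]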

  7. Periodic safety analyses

    International Nuclear Information System (INIS)

    The IAEA Safety Guide 50-SG-S8, devoted to 'Safety Aspects of Foundations of Nuclear Power Plants', indicates that the operator of an NPP should establish a program for inspection of safe operation during construction, start-up and the service life of the plant, in order to obtain the data needed for estimating the lifetime of structures and components. At the same time the program should ensure that the safety margins are appropriate. Periodic safety analyses are an important part of the safety inspection program. Periodic safety reporting is a method for testing the whole safety system, or a part of it, against precise criteria. Periodic safety analyses are not meant for qualification of the plant components. Separate analyses are devoted to start-up, qualification of components and materials, and aging. All these analyses are described in this presentation. The last chapter describes the experience obtained for PWR-900 and PWR-1300 units from 1986-1989

  8. Molecular- and cultivation-based analyses of microbial communities in oil field water and in microcosms amended with nitrate to control H{sub 2}S production

    Energy Technology Data Exchange (ETDEWEB)

    Kumaraswamy, Raji; Ebert, Sara; Fedorak, Phillip M.; Foght, Julia M. [Alberta Univ., Edmonton, AB (Canada). Biological Sciences; Gray, Murray R. [Alberta Univ., Edmonton, AB (Canada). Chemical and Materials Engineering

    2011-03-15

    Nitrate injection into oil fields is an alternative to biocide addition for controlling sulfide production ('souring') caused by sulfate-reducing bacteria (SRB). This study examined the suitability of several cultivation-dependent and cultivation-independent methods to assess potential microbial activities (sulfidogenesis and nitrate reduction) and the impact of nitrate amendment on oil field microbiota. Microcosms containing produced waters from two Western Canadian oil fields exhibited sulfidogenesis that was inhibited by nitrate amendment. Most probable number (MPN) and fluorescent in situ hybridization (FISH) analyses of uncultivated produced waters showed low cell numbers ({<=}10{sup 3} MPN/ml) dominated by SRB (>95% relative abundance). MPN analysis also detected nitrate-reducing sulfide-oxidizing bacteria (NRSOB) and heterotrophic nitrate-reducing bacteria (HNRB) at numbers too low to be detected by FISH or denaturing gradient gel electrophoresis (DGGE). In microcosms containing produced water fortified with sulfate, near-stoichiometric concentrations of sulfide were produced. FISH analyses of the microcosms after 55 days of incubation revealed that Gammaproteobacteria increased from undetectable levels to 5-20% abundance, resulting in a decreased proportion of Deltaproteobacteria (50-60% abundance). DGGE analysis confirmed the presence of Delta- and Gammaproteobacteria and also detected Bacteroidetes. When sulfate-fortified produced waters were amended with nitrate, sulfidogenesis was inhibited and Deltaproteobacteria decreased to levels undetectable by FISH, with a concomitant increase in Gammaproteobacteria from below detection to 50-60% abundance. DGGE analysis of these microcosms yielded sequences of Gamma- and Epsilonproteobacteria related to presumptive HNRB and NRSOB (Halomonas, Marinobacterium, Marinobacter, Pseudomonas and Arcobacter), thus supporting chemical data indicating that nitrate-reducing bacteria out-compete SRB when nitrate is

  9. Determinants of the over-anticoagulation response during warfarin initiation therapy in Asian patients based on population pharmacokinetic-pharmacodynamic analyses.

    Directory of Open Access Journals (Sweden)

    Minami Ohara

    Full Text Available To clarify pharmacokinetic-pharmacodynamic (PK-PD) factors associated with the over-anticoagulation response in Asians during warfarin induction therapy, population PK-PD analyses were conducted in an attempt to predict the time-courses of the plasma S-warfarin concentration, Cp(S), and coagulation and anti-coagulation (INR) responses. In 99 Chinese patients we analyzed the relationships between dose and Cp(S) to estimate the clearance of S-warfarin, CL(S), and that between Cp(S) and the normal prothrombin concentration (NPT) as a coagulation marker for estimation of IC50. We also analyzed the non-linear relationship between NPT inhibition and the increase in INR to derive the non-linear index λ. Population analyses accurately predicted the time-courses of Cp(S), NPT and INR. Multivariate analysis showed that CYP2C9*3 mutation and body surface area were predictors of CL(S), that VKORC1 and CYP4F2 polymorphisms were predictors of IC50, and that baseline NPT was a predictor of λ. CL(S) and λ were significantly lower in patients with INR≥4 than in those with INR<4 (190 mL/h vs 265 mL/h, P<0.01, and 3.2 vs 3.7, P<0.01, respectively). Finally, logistic regression analysis revealed that CL(S), ALT and hypertension contributed significantly to INR≥4. All these results indicate that factors associated with the reduced metabolic activity of warfarin, represented by CL(S), might be critical determinants of the over-anticoagulation response during warfarin initiation in Asians. ClinicalTrials.gov NCT02065388.

  10. Report sensory analyses veal

    OpenAIRE

    Veldman, M.; Schelvis-Smit, A.A.M.

    2005-01-01

    On behalf of a client of Animal Sciences Group, different varieties of veal were analyzed by both instrumental and sensory analyses. The sensory evaluation was performed with a sensory analytical panel in the period of 13th of May and 31st of May, 2005. The three varieties of veal were: young bull, pink veal and white veal. The sensory descriptive analyses show that the three groups Young bulls, pink veal and white veal, differ significantly in red colour for the raw meat as well as the baked...

  11. Wavelet Analyses and Applications

    Science.gov (United States)

    Bordeianu, Cristian C.; Landau, Rubin H.; Paez, Manuel J.

    2009-01-01

    It is shown how a modern extension of Fourier analysis known as wavelet analysis is applied to signals containing multiscale information. First, a continuous wavelet transform is used to analyse the spectrum of a nonstationary signal (one whose form changes in time). The spectral analysis of such a signal gives the strength of the signal in each…
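
    A minimal continuous-wavelet example in the spirit of the description above is sketched here, using the PyWavelets package on a synthetic chirp (a nonstationary signal whose frequency rises with time); the sampling rate, scales and wavelet choice are arbitrary assumptions.

      import numpy as np
      import pywt

      fs = 200.0                                     # sampling rate in Hz (assumed)
      t = np.arange(0.0, 4.0, 1.0 / fs)
      signal = np.sin(2 * np.pi * (2 + 3 * t) * t)   # chirp: frequency grows with time

      # Continuous wavelet transform with a Morlet wavelet
      scales = np.arange(1, 64)
      coeffs, freqs = pywt.cwt(signal, scales, 'morl', sampling_period=1.0 / fs)

      # |coeffs| is the time-frequency map: signal strength at each scale and time
      power = np.abs(coeffs)
      print(power.shape, "dominant frequency near t=0:", freqs[power[:, 0].argmax()])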

  12. Probabilistic safety analyses (PSA)

    International Nuclear Information System (INIS)

    The guide shows how probabilistic safety analyses (PSA) are used in the design, construction and operation of light water reactor plants to help ensure that plant safety is adequate in all plant operational states

  13. Crystal analyser-based X-ray phase contrast imaging in the dark field: implementation and evaluation using excised tissue specimens

    International Nuclear Information System (INIS)

    We demonstrate the soft tissue discrimination capability of X-ray dark-field imaging (XDFI) using a variety of human tissue specimens. The experimental setup for XDFI comprises an X-ray source, an asymmetrically cut Bragg-type monochromator-collimator (MC), a Laue-case angle analyser (LAA) and a CCD camera. The specimen is placed between the MC and the LAA. For the light source, we used the beamline BL14C on a 2.5-GeV storage ring in the KEK Photon Factory, Tsukuba, Japan. In the eye specimen, phase contrast images from XDFI were able to discriminate soft-tissue structures, such as the iris, separated by aqueous humour on both sides, which have nearly equal absorption. Superiority of XDFI in imaging soft tissue was further demonstrated with a diseased iliac artery containing atherosclerotic plaque and breast samples with benign and malignant tumours. XDFI on breast tumours discriminated between the normal and diseased terminal duct lobular unit and between invasive and in-situ cancer. X-ray phase, as detected by XDFI, has superior contrast over absorption for soft tissue processes such as atherosclerotic plaque and breast cancer. (orig.)

  14. Crystal analyser-based X-ray phase contrast imaging in the dark field: implementation and evaluation using excised tissue specimens

    Energy Technology Data Exchange (ETDEWEB)

    Ando, Masami [RIST, Tokyo University of Science, Noda, Chiba (Japan); Sunaguchi, Naoki [Gunma University, Graduate School of Engineering, Kiryu, Gunma (Japan); Wu, Yanlin [The Graduate University for Advanced Studies, Department of Materials Structure Science, School of High Energy Accelerator Science, Tsukuba, Ibaraki (Japan); Do, Synho; Sung, Yongjin; Gupta, Rajiv [Massachusetts General Hospital and Harvard Medical School, Department of Radiology, Boston, MA (United States); Louissaint, Abner [Massachusetts General Hospital and Harvard Medical School, Department of Pathology, Boston, MA (United States); Yuasa, Tetsuya [Yamagata University, Faculty of Engineering, Yonezawa, Yamagata (Japan); Ichihara, Shu [Nagoya Medical Center, Department of Pathology, Nagoya, Aichi (Japan)

    2014-02-15

    We demonstrate the soft tissue discrimination capability of X-ray dark-field imaging (XDFI) using a variety of human tissue specimens. The experimental setup for XDFI comprises an X-ray source, an asymmetrically cut Bragg-type monochromator-collimator (MC), a Laue-case angle analyser (LAA) and a CCD camera. The specimen is placed between the MC and the LAA. For the light source, we used the beamline BL14C on a 2.5-GeV storage ring in the KEK Photon Factory, Tsukuba, Japan. In the eye specimen, phase contrast images from XDFI were able to discriminate soft-tissue structures, such as the iris, separated by aqueous humour on both sides, which have nearly equal absorption. Superiority of XDFI in imaging soft tissue was further demonstrated with a diseased iliac artery containing atherosclerotic plaque and breast samples with benign and malignant tumours. XDFI on breast tumours discriminated between the normal and diseased terminal duct lobular unit and between invasive and in-situ cancer. X-ray phase, as detected by XDFI, has superior contrast over absorption for soft tissue processes such as atherosclerotic plaque and breast cancer. (orig.)

  15. Identification of Distinct Breast Cancer Stem Cell Populations Based on Single-Cell Analyses of Functionally Enriched Stem and Progenitor Pools

    Directory of Open Access Journals (Sweden)

    Nina Akrap

    2016-01-01

    Full Text Available The identification of breast cancer cell subpopulations featuring truly malignant stem cell qualities is a challenge due to the complexity of the disease and lack of general markers. By combining extensive single-cell gene expression profiling with three functional strategies for cancer stem cell enrichment including anchorage-independent culture, hypoxia, and analyses of low-proliferative, label-retaining cells derived from mammospheres, we identified distinct stem cell clusters in breast cancer. Estrogen receptor (ERα+) tumors featured a clear hierarchical organization with switch-like and gradual transitions between different clusters, illustrating how breast cancer cells transfer between discrete differentiation states in a sequential manner. ERα− breast cancer showed less prominent clustering but shared a quiescent cancer stem cell pool with ERα+ cancer. The cellular organization model was supported by single-cell data from primary tumors. The findings allow us to understand the organization of breast cancers at the single-cell level, thereby permitting better identification and targeting of cancer stem cells.

  16. Comparison Based on Exergetic Analyses of Two Hot Air Engines: A Gamma Type Stirling Engine and an Open Joule Cycle Ericsson Engine

    Directory of Open Access Journals (Sweden)

    Houda Hachem

    2015-10-01

    Full Text Available In this paper, a comparison of exergetic models between two hot air engines (a Gamma type Stirling prototype with a maximum output mechanical power of 500 W and an Ericsson hot air engine with a maximum power of 300 W) is made. Referring to previous energetic analyses, exergetic models are set up in order to quantify the exergy destruction and efficiencies in each type of engine. The distribution of the exergy fluxes in each part of the two engines is determined and represented in Sankey diagrams, using dimensionless exergy fluxes. The results show that the proportion of destroyed exergy relative to the exergy flux from the hot source is similar in both engines. The compression cylinders generate the highest exergy destruction, whereas the expansion cylinders generate the lowest. The regenerator of the Stirling engine increases the exergy resource at the inlet of the expansion cylinder; a similar arrangement might also be set up in the Ericsson engine, using a preheater between the exhaust air and the compressed air transferred to the hot heat exchanger.

  17. Prospective separation and transcriptome analyses of cortical projection neurons and interneurons based on lineage tracing by Tbr2 (Eomes)-GFP/Dcx-mRFP reporters.

    Science.gov (United States)

    Liu, Jiancheng; Wu, Xiwei; Zhang, Heying; Qiu, Runxiang; Yoshikawa, Kazuaki; Lu, Qiang

    2016-06-01

    In the cerebral cortex, projection neurons and interneurons work coordinately to establish neural networks for normal cortical functions. While the specific mechanisms that control productions of projection neurons and interneurons are beginning to be revealed, a global characterization of the molecular differences between these two neuron types is crucial for a more comprehensive understanding of their developmental specifications and functions. In this study, using lineage tracing power of combining Tbr2(Eomes)-GFP and Dcx-mRFP reporter mice, we prospectively separated intermediate progenitor cell (IPC)-derived neurons (IPNs) from non-IPC-derived neurons (non-IPNs) of the embryonic cerebral cortex. Molecular characterizations revealed that IPNs and non-IPNs were enriched with projection neurons and interneurons, respectively. Expression profiling documented cell-specific genes including differentially expressed transcriptional regulators that might be involved in cellular specifications, for instance, our data found that SOX1 and SOX2, which were known for important functions in neural stem/progenitor cells, continued to be expressed by interneurons but not by projection neurons. Transcriptome analyses of cortical neurons isolated at different stages of neurogenesis revealed distinct temporal patterns of expression of genes involved in early-born or late-born neuron specification. These data present a resource useful for further investigation of the molecular regulations and functions of projection neurons and interneurons. © 2015 Wiley Periodicals, Inc. Develop Neurobiol 76: 587-599, 2016. PMID:26248544

  18. Standardized Software for Wind Load Forecast Error Analyses and Predictions Based on Wavelet-ARIMA Models - Applications at Multiple Geographically Distributed Wind Farms

    Energy Technology Data Exchange (ETDEWEB)

    Hou, Zhangshuan; Makarov, Yuri V.; Samaan, Nader A.; Etingov, Pavel V.

    2013-03-19

    Given the multi-scale variability and uncertainty of wind generation and forecast errors, it is a natural choice to use time-frequency representation (TFR) as a view of the corresponding time series represented over both time and frequency. Here we use the wavelet transform (WT) to expand the signal in terms of wavelet functions which are localized in both time and frequency. Each WT component is more stationary and has a consistent auto-correlation pattern. We combined wavelet analyses with time series forecasting approaches such as ARIMA and tested the approach at three different wind farms located far away from each other. The prediction capability is satisfactory -- the day-ahead prediction of errors matches the original error values very well, including the patterns. The observations are well located within the predictive intervals. Integrating our wavelet-ARIMA (‘stochastic’) model with the weather forecast model (‘deterministic’) will significantly improve our ability to predict wind power generation and reduce predictive uncertainty.
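
    A rough sketch of the wavelet-plus-ARIMA idea described above: decompose the forecast-error series with a discrete wavelet transform into more stationary components, fit a small ARIMA model to each component, and sum the component forecasts. It uses PyWavelets and statsmodels on a surrogate random series; the wavelet, decomposition level and ARIMA orders are assumptions, not the configuration used in the report.

      import numpy as np
      import pywt
      from statsmodels.tsa.arima.model import ARIMA

      rng = np.random.default_rng(0)
      errors = np.cumsum(rng.normal(size=256))     # surrogate forecast-error series

      # 1) Multi-level DWT; rebuild full-length components that sum to the series
      coeffs = pywt.wavedec(errors, 'db4', level=3)
      components = []
      for i in range(len(coeffs)):
          kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
          components.append(pywt.waverec(kept, 'db4')[:len(errors)])

      # 2) Fit an ARIMA model per component and sum the day-ahead forecasts
      horizon = 24
      prediction = np.zeros(horizon)
      for comp in components:
          fit = ARIMA(comp, order=(1, 0, 1)).fit()
          prediction += np.asarray(fit.forecast(steps=horizon))

      print(prediction[:5])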

  19. Analysing performance through value creation

    Directory of Open Access Journals (Sweden)

    Adrian TRIFAN

    2015-12-01

    Full Text Available This paper draws a parallel between two ways of measuring financial performance: the first uses data offered by accounting and lays emphasis on maximizing profit, while the second aims at creating value. The traditional approach to performance is based on indicators derived from accounting data: ROI, ROE, EPS. Traditional management, based on analysing accounting data, has shown its limits, and a new approach is needed, based on creating value. The evaluation of value-based performance tries to avoid the errors due to accounting data by using other specific indicators: EVA, MVA, TSR, CVA. The main objective shifts from maximizing income to maximizing the value created for shareholders. The theoretical part is accompanied by a practical analysis regarding the creation of value and an analysis of the main indicators used to evaluate this concept.
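
    The value-based indicators named above can be illustrated with a small worked example: EVA charges operating profit for the cost of the capital employed, and MVA compares market value with invested capital. All figures below are invented.

      # Invented figures for illustration only
      nopat            = 1_200_000.0   # net operating profit after tax
      invested_capital = 8_000_000.0
      wacc             = 0.10          # weighted average cost of capital
      market_value     = 10_500_000.0

      roi = nopat / invested_capital                 # traditional accounting view
      eva = nopat - wacc * invested_capital          # Economic Value Added
      mva = market_value - invested_capital          # Market Value Added

      print(f"ROI = {roi:.1%}")
      print(f"EVA = {eva:,.0f}  (positive because ROI exceeds the 10% cost of capital)")
      print(f"MVA = {mva:,.0f}")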

  20. A STUDY TO ANALYSE THE EFFICACY OF MODIFIED PILATES BASED EXERCISES AND THERAPEUTIC EXERCISES IN INDIVIDUALS WITH CHRONIC NON SPECIFIC LOW BACK PAIN: A RANDOMIZED CONTROLLED TRIAL

    Directory of Open Access Journals (Sweden)

    U.Albert Anand,

    2014-06-01

    Full Text Available Background: Chronic low back pain is an expensive and difficult condition to treat. Low back pain is the most common musculoskeletal symptom, seen in 85% of individuals in their lifetime. One of the interventions widely used by physiotherapists in the treatment of chronic non-specific low back pain (CNLBP) is exercise therapy based upon the Pilates principles. Objective: The purpose of the study was to find out the effect of modified Pilates-based exercises for patients with chronic non-specific low back pain. Design: A randomized controlled trial, pre test-post test design. Setting: The study was conducted in the Outpatient Department of Physiotherapy, K.G Hospital, Coimbatore, India. Patients: Fifty-two physically active subjects between 18 and 60 years old with chronic non-specific low back pain of more than 12 weeks' duration were randomly assigned to 2 groups. Interventions: Group A subjects underwent modified specific Pilates-based exercises with flexibility exercises and Group B subjects underwent therapeutic exercises with flexibility exercises; both programmes were conducted over 8 weeks. Measurements: Back-specific functional status was measured with the Oswestry Disability Index and pain intensity was measured with a visual analogue scale. Conclusion: The study concluded that the modified specific Pilates-based exercises were more effective than the therapeutic exercise programme in reducing pain and improving back-specific function, general health, personal care, social life and flexibility in individuals with chronic non-specific low back pain.

  1. Spectroscopic analyses on interaction of Amantadine-Salicylaldehyde, Amantadine-5-Chloro-Salicylaldehyde and Amantadine-o-Vanillin Schiff-Bases with bovine serum albumin (BSA)

    Science.gov (United States)

    Wang, Zhiqiu; Gao, Jingqun; Wang, Jun; Jin, Xudong; Zou, Mingming; Li, Kai; Kang, Pingli

    2011-12-01

    In this work, three Tricyclo [3.3.1.1(3,7)] decane-1-amine (Amantadine) Schiff-Bases, Amantadine-Salicylaldehyde (AS), Amantadine-5-Chloro-Salicylaldehyde (AS-5-C) and Amantadine-o-Vanillin (AS-o-V), were synthesized by a direct heating reflux method in ethanol solution and characterized by infrared spectroscopy and elemental analysis. Fluorescence quenching was used to study the interaction of these Amantadine Schiff-Bases (AS, AS-5-C and AS-o-V) with bovine serum albumin (BSA). From the fluorescence quenching calculations the bimolecular quenching constant (Kq), apparent quenching constant (KSV), effective binding constant (KA) and corresponding dissociation constant (KD), binding site number (n) and binding distance (r) were obtained. The results show that these Amantadine Schiff-Bases bind appreciably to BSA molecules and that the binding strength order is AS < AS-5-C = AS-o-V. Synchronous fluorescence spectroscopy reveals that these Amantadine Schiff-Bases adopt different ways of binding to BSA molecules: AS and AS-5-C are closer to the tryptophan (Trp) residues than to the tyrosine (Tyr) residues, while AS-o-V is equally close to the Tyr and Trp residues.
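
    The quenching and binding constants listed above are conventionally extracted from the Stern-Volmer and double-logarithm relations; it is assumed here that this standard treatment is the one applied.

      % Stern-Volmer quenching and double-logarithm binding relations (standard forms)
      \[
        \frac{F_0}{F} \;=\; 1 + K_{\mathrm{SV}}[Q] \;=\; 1 + K_q \tau_0 [Q],
        \qquad
        \log\frac{F_0 - F}{F} \;=\; \log K_A + n \log [Q]
      \]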

  2. BiodivERsA project VineDivers: Analysing interlinkages between soil biota and biodiversity-based ecosystem services in vineyards across Europe

    Science.gov (United States)

    Zaller, Johann G.; Winter, Silvia; Strauss, Peter; Querner, Pascal; Kriechbaum, Monika; Pachinger, Bärbel; Gómez, José A.; Campos, Mercedes; Landa, Blanca; Popescu, Daniela; Comsa, Maria; Iliescu, Maria; Tomoiaga, Liliana; Bunea, Claudiu-Ioan; Hoble, Adela; Marghitas, Liviu; Rusu, Teodor; Lora, Ángel; Guzmán, Gema; Bergmann, Holger

    2015-04-01

    Essential ecosystem services provided by viticultural landscapes result from diverse communities of above- and belowground organisms and their interactions. For centuries traditional viticulture was part of a multifunctional agricultural system including low-input grasslands and fruit trees resulting in a high functional biodiversity. However, in the last decades intensification and mechanisation of vineyard management caused a separation of production and conservation areas. As a result of management intensification including frequent tilling and/or use of pesticides several ecosystem services are affected leading to high rates of soil erosion, degradation of soil structure and fertility, contamination of groundwater and high levels of agricultural inputs. In this transdisciplinary BiodivERsA project we will examine to what extent differently intensive managed vineyards affect the activity and diversity of soil biota (e.g. earthworms, collembola, soil microorganisms) and how this feed back on aboveground biodiversity (e.g. weeds, pollinators). We will also investigate ecosystem services associated with soil faunal activity and biodiversity such as soil structure, the formation of stable soil aggregates, water infiltration, soil erosion as well as grape quality. These effects will become increasingly important as more extreme precipitation events are predicted with climate change. The socio-economic part of the project will investigate the role of diversely structured, species-rich viticultural landscapes as a cultural heritage providing aesthetic values for human well-being and recreation. The project objectives will be analysed at plot, field (vineyard) and landscape scales in vineyards located in Spain, France, Romania and Austria. A detailed engagement and dissemination plan for stakeholder at the different governance levels will accompany scientific research and will contribute to the implementation of best-practice recommendations for policy and farmers.

  3. Analysing the Concepts of Vengeance and Hono(ur in Shakespeare´s Hamlet and Sumarokov´s Gamlet: A Corpus-based Approach to Literature

    Directory of Open Access Journals (Sweden)

    Irina Keshabyan

    2009-12-01

    Full Text Available The present paper aims at carrying out a structural and lexical analysis of two contrasting plays, Shakespeare's Hamlet and Sumarokov's Gamlet, in a specific linguistic domain. In this contribution, we attempt to gain some insight into two essential content words, vengeance and hono(ur, their derivatives and related words, through quantitative analysis of these words and qualitative analysis of their collocates and concordances. A collocational approach is used to analyse and compare the ways the authors perceive the concepts of vengeance and hono(ur. In general, the findings indicate important similarities and/or differences between the structures of the plays per act and both texts' basic contents in relation to two important topics, vengeance and hono(ur.
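
    The collocation-and-concordance approach described above can be illustrated with a short NLTK sketch on a plain-text edition of the play; the file name, node word and window settings are assumptions for illustration, not the authors' actual corpus setup.

      import nltk
      from nltk.collocations import BigramAssocMeasures, BigramCollocationFinder

      # Plain-text edition of the play (file name is hypothetical)
      with open("hamlet.txt", encoding="utf-8") as fh:
          tokens = [w.lower() for w in nltk.word_tokenize(fh.read()) if w.isalpha()]

      # Concordance lines for one of the related content words under study
      nltk.Text(tokens).concordance("revenge", width=60, lines=5)

      # Collocates within a +/-3-word window, ranked by pointwise mutual information
      finder = BigramCollocationFinder.from_words(tokens, window_size=4)
      finder.apply_freq_filter(3)
      top_pairs = finder.nbest(BigramAssocMeasures().pmi, 200)
      print([pair for pair in top_pairs if "revenge" in pair][:10])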

  4. Genome-wide analyses of exonic copy number variants in a family-based study point to novel autism susceptibility genes.

    Directory of Open Access Journals (Sweden)

    Maja Bucan

    2009-06-01

    Full Text Available The genetics underlying the autism spectrum disorders (ASDs) is complex and remains poorly understood. Previous work has demonstrated an important role for structural variation in a subset of cases, but has lacked the resolution necessary to move beyond detection of large regions of potential interest to identification of individual genes. To pinpoint genes likely to contribute to ASD etiology, we performed high density genotyping in 912 multiplex families from the Autism Genetics Resource Exchange (AGRE) collection and contrasted results to those obtained for 1,488 healthy controls. Through prioritization of exonic deletions (eDels), exonic duplications (eDups), and whole gene duplication events (gDups), we identified more than 150 loci harboring rare variants in multiple unrelated probands, but no controls. Importantly, 27 of these were confirmed on examination of an independent replication cohort comprised of 859 cases and an additional 1,051 controls. Rare variants at known loci, including exonic deletions at NRXN1 and whole gene duplications encompassing UBE3A and several other genes in the 15q11-q13 region, were observed in the course of these analyses. Strong support was likewise observed for previously unreported genes such as BZRAP1, an adaptor molecule known to regulate synaptic transmission, with eDels or eDups observed in twelve unrelated cases but no controls (p = 2.3x10(-5)). Less is known about MDGA2, likewise observed to be case-specific (p = 1.3x10(-4)). But, it is notable that the encoded protein shows an unexpectedly high similarity to Contactin 4 (BLAST E-value = 3x10(-39)), which has also been linked to disease. That hundreds of distinct rare variants were each seen only once further highlights complexity in the ASDs and points to the continued need for larger cohorts.

  5. Studies on changes of elemental concentration in a human body by means of analyses of long hairs based on the standard-free method

    International Nuclear Information System (INIS)

    We established and reported two years ago the original methods for evaluating daily changes of elemental concentration in a body by means of a standard-free method for powdered beard samples taken daily with an electric shaver. It was found that the method is quite useful for investigating short- and long-term changes of elemental concentration in a body. However, the method for beard analysis is applicable only to men. In order to estimate daily changes of elemental concentration in a body for women and children, a new method which allows us to quantitatively analyze hair samples cut into 1 mm pieces has been developed and applied to three long hair samples taken from three persons. It is found that the method enables us to estimate both long- and short-term changes in elemental concentration in a body as well as beard analysis does. It is found that sulfur remains almost constant over a long period, and arsenic shows very rapid changes with a period of a few days, while mercury shows only long-term changes with a period of a few months. These behaviors are almost the same as those observed in beard analyses. On the other hand, bromine shows certain seasonal changes; its concentration tends to increase in summer and decrease in winter. Lead and calcium show very long-term changes, and the behavior of strontium is quite similar to that of calcium. The method is expected to give us information about the history of changes in elemental concentration in a human body over a few or more years. It is expected that the behavior of arsenic, showing rapid elevation within a few days, can be explained as a response to intakes of arsenic-rich foods. It is expected that the method gives us a clue to the identification of the main pathways of human exposure to certain toxic elements. (author)

  6. Possible future HERA analyses

    CERN Document Server

    Geiser, Achim

    2015-01-01

    A variety of possible future analyses of HERA data in the context of the HERA data preservation programme is collected, motivated, and commented. The focus is placed on possible future analyses of the existing $ep$ collider data and their physics scope. Comparisons to the original scope of the HERA programme are made, and cross references to topics also covered by other participants of the workshop are given. This includes topics on QCD, proton structure, diffraction, jets, hadronic final states, heavy flavours, electroweak physics, and the application of related theory and phenomenology topics like NNLO QCD calculations, low-x related models, nonperturbative QCD aspects, and electroweak radiative corrections. Synergies with other collider programmes are also addressed. In summary, the range of physics topics which can still be uniquely covered using the existing data is very broad and of considerable physics interest, often matching the interest of results from colliders currently in operation. Due to well-e...

  7. The CAMAC logic state analyser

    CERN Document Server

    Centro, Sandro

    1981-01-01

    Summary form only given, as follows. Large electronic experiments using distributed processors for parallel readout and data reduction need to analyse the data acquisition components status and monitor dead time constants of each active readout module and processor. For the UA1 experiment, a microprocessor-based CAMAC logic status analyser (CLSA) has been developed in order to implement these functions autonomously. CLSA is a single unit CAMAC module, able to record, up to 256 times, the logic status of 32 TTL inputs gated by a common clock, internal or external, with a maximum frequency of 2 MHz. The data stored in the internal CLSA memory can be read directly via CAMAC function or preprocessed by CLSA 6800 microprocessor. The 6800 resident firmware (4Kbyte) expands the module features to include an interactive monitor, data recording control, data reduction and histogram accumulation with statistics parameter evaluation. The microprocessor memory and the resident firmware can be externally extended using st...
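
    As a rough illustration of the capture scheme described above (32 TTL inputs latched on each clock edge into a 256-sample memory), the following sketch models the module's buffer and the per-channel statistics a host might accumulate. The word packing, buffer behaviour and access functions are assumptions for illustration, not the real CLSA register interface:

        # Toy software model of a 32-channel, 256-sample logic capture buffer.
        from collections import deque

        CAPTURE_DEPTH = 256   # samples the module can record per acquisition
        N_CHANNELS = 32       # TTL inputs sampled on each clock edge

        class ClsaModel:
            def __init__(self):
                self.memory = deque(maxlen=CAPTURE_DEPTH)

            def clock_edge(self, inputs_word):
                """Latch the 32 TTL inputs (packed into one integer) on a clock edge."""
                self.memory.append(inputs_word & 0xFFFFFFFF)

            def channel_histogram(self):
                """Count, per channel, how many recorded samples were logic high."""
                counts = [0] * N_CHANNELS
                for word in self.memory:
                    for ch in range(N_CHANNELS):
                        counts[ch] += (word >> ch) & 1
                return counts

        # Usage: feed a few fake samples and accumulate per-channel statistics.
        clsa = ClsaModel()
        for word in (0x00000001, 0x80000001, 0x00000003):
            clsa.clock_edge(word)
        print(clsa.channel_histogram()[:4])   # -> [3, 1, 0, 0]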

  8. Statistisk analyse med SPSS

    OpenAIRE

    Linnerud, Kristin; Oklevik, Ove; Slettvold, Harald

    2004-01-01

    This note originated from lectures and teaching for third-year students of economics and business administration at Sogn og Fjordane University College. It is particularly oriented towards the SPSS instruction given in the two courses ”OR 685 Marknadsanalyse og merkevarestrategi” and ”BD 616 Økonomistyring og analyse med programvare”.

  9. Possible future HERA analyses

    Energy Technology Data Exchange (ETDEWEB)

    Geiser, Achim

    2015-12-15

    A variety of possible future analyses of HERA data in the context of the HERA data preservation programme is collected, motivated, and commented. The focus is placed on possible future analyses of the existing ep collider data and their physics scope. Comparisons to the original scope of the HERA programme are made, and cross references to topics also covered by other participants of the workshop are given. This includes topics on QCD, proton structure, diffraction, jets, hadronic final states, heavy flavours, electroweak physics, and the application of related theory and phenomenology topics like NNLO QCD calculations, low-x related models, nonperturbative QCD aspects, and electroweak radiative corrections. Synergies with other collider programmes are also addressed. In summary, the range of physics topics which can still be uniquely covered using the existing data is very broad and of considerable physics interest, often matching the interest of results from colliders currently in operation. Due to well-established data and MC sets, calibrations, and analysis procedures the manpower and expertise needed for a particular analysis is often very much smaller than that needed for an ongoing experiment. Since centrally funded manpower to carry out such analyses is not available any longer, this contribution not only targets experienced self-funded experimentalists, but also theorists and master-level students who might wish to carry out such an analysis.

  10. A conceptual framework for formulating a focused and cost-effective fire protection program based on analyses of risk and the dynamics of fire effects

    International Nuclear Information System (INIS)

    This paper proposes a conceptual framework for developing a fire protection program at nuclear power plants based on probabilistic risk analysis (PRA) of fire hazards, and modeling the dynamics of fire effects. The process for categorizing nuclear power plant fire areas based on risk is described, followed by a discussion of fire safety design methods that can be used for different areas of the plant, depending on the degree of threat to plant safety from the fire hazard. This alternative framework has the potential to make programs more cost-effective, and comprehensive, since it will allow a more systematic and broader examination of fire risk, and provide a means to distinguish between high and low risk fire contributors. (orig.)

  11. A phase separation method for analyses of fluoroquinones in meats based on ultrasound-assisted salt-induced liquid-liquid microextraction and a new integrated device

    OpenAIRE

    Wang, H; Gao, M.; Xu, Y; W. Wang; Zheng, L; Dahlgren, RA; Wang, X.

    2015-01-01

    © 2015 Elsevier Ltd. Herein, we developed a novel integrated device to perform phase separation based on ultrasound-assisted, salt-induced, liquid-liquid microextraction for determination of five fluoroquinones in meats by HPLC analysis. The novel integrated device consisted of three simple HDPE (high density polyethylene) parts that were used to separate the solvent from the aqueous solution prior to retrieving the extractant. The extraction parameters were optimized using the response surfa...

  12. An overview of the higher level classification of Pucciniomycotina based on combined analyses of nuclear large and small subunit rDNA sequences

    OpenAIRE

    Aime, M. Catherine; Matheny, P. Brandon; Henk, Daniel. A.; Frieders, Elizabeth M.; Nilsson, R. Henrik; Piepenbring, Meike; McLaughlin, David J.; Szabo, Les J.; Begerow, Dominik; Sampaio, José Paulo; Bauer, Robert; Weiß, Michael; Oberwinkler, Franz; Hibbett, David

    2006-01-01

    In this study we provide a phylogenetically based introduction to the classes and orders of Pucciniomycotina (=Urediniomycetes), one of three subphyla of Basidiomycota. More than 8000 species of Pucciniomycotina have been described including putative saprotrophs and parasites of plants, animals and fungi. The overwhelming majority of these (~90%) belong to a single order of obligate plant pathogens, the Pucciniales (=Uredinales), or rust fungi. We have assembled a dataset of previously p...

  13. Net Energy, CO 2 Emission and Land-Based Cost-Benefit Analyses of Jatropha Biodiesel: A Case Study of the Panzhihua Region of Sichuan Province in China

    OpenAIRE

    Xiangzheng Deng; Jianzhi Han; Fang Yin

    2012-01-01

    Bioenergy is currently regarded as a renewable energy source with a high growth potential. Forest-based biodiesel, with the significant advantage of not competing with grain production on cultivated land, has been considered as a promising substitute for diesel fuel by many countries, including China. Consequently, extracting biodiesel from Jatropha curcas has become a growing industry. However, many key issues related to the development of this industry are still not fully resolved and the p...

  14. Analysing Individual Income Tax Planning in Colleges and Universities---Based on the Normal Wage Income and Annual Salary of One-time Bonus Tax

    Institute of Scientific and Technical Information of China (English)

    LI Xiao-hua

    2013-01-01

    Paying personal income tax affects the vital interests of every faculty member. In order to improve faculty members' enthusiasm for work, it is essential to plan personal income tax payments. Based on the method of paying personal income tax, this essay gives a strategic analysis of tax planning in combination with the actual salary situation of university faculty.

  15. Net Energy, CO 2 Emission and Land-Based Cost-Benefit Analyses of Jatropha Biodiesel: A Case Study of the Panzhihua Region of Sichuan Province in China

    OpenAIRE

    Xiangzheng Deng; Jianzhi Han; Fang Yin

    2012-01-01

    Bioenergy is currently regarded as a renewable energy source with a high growth potential. Forest-based biodiesel, with the significant advantage of not competing with grain production on cultivated land, has been considered as a promising substitute for diesel fuel by many countries, including China. Consequently, extracting biodiesel from Jatropha curcas has become a growing industry. However, many key issues related to the development of this indus...

  16. A DNA-based method for identification of krill species and its application to analysing the diet of marine vertebrate predators.

    Science.gov (United States)

    Jarman, S N; Gales, N J; Tierney, M; Gill, P C; Elliott, N G

    2002-12-01

    Accurate identification of species that are consumed by vertebrate predators is necessary for understanding marine food webs. Morphological methods for identifying prey components after consumption often fail to make accurate identifications of invertebrates because prey morphology becomes damaged during capture, ingestion and digestion. Another disadvantage of morphological methods for prey identification is that they often involve sampling procedures that are disruptive for the predator, such as stomach flushing or lethal collection. We have developed a DNA-based method for identifying species of krill (Crustacea: Malacostraca), an enormously abundant group of invertebrates that are directly consumed by many groups of marine vertebrates. The DNA-based approach allows identification of krill species present in samples of vertebrate stomach contents, vomit, and, more importantly, faeces. Utilizing samples of faeces from vertebrate predators minimizes the impact of dietary studies on the subject animals. We demonstrate our method first on samples of Adelie penguin (Pygoscelis adeliae) stomach contents, where DNA-based species identification can be confirmed by prey morphology. We then apply the method to faeces of Adelie penguins and to faeces of the endangered pygmy blue whale (Balaenoptera musculus brevicauda). In each of these cases, krill species consumed by the predators could be identified from their DNA present in faeces or stomach contents. PMID:12453250

  17. Energy, exergy and sustainability analyses of hybrid renewable energy based hydrogen and electricity production and storage systems: Modeling and case study

    International Nuclear Information System (INIS)

    In this study, hybrid renewable energy based hydrogen and electricity production and storage systems are conceptually modeled and analyzed in detail through energy, exergy and sustainability approaches. Several subsystems, namely hybrid geothermal energy-wind turbine-solar photovoltaic (PV) panel, inverter, electrolyzer, hydrogen storage system, Proton Exchange Membrane Fuel Cell (PEMFC), battery and loading system are considered. Also, a case study, based on hybrid wind–solar renewable energy system, is conducted and its results are presented. In addition, the dead state temperatures are considered as 0 °C, 10 °C, 20 °C and 30 °C, while the environment temperature is 30 °C. The maximum efficiencies of the wind turbine, solar PV panel, electrolyzer, PEMFC are calculated as 26.15%, 9.06%, 53.55%, and 33.06% through energy analysis, and 71.70%, 9.74%, 53.60%, and 33.02% through exergy analysis, respectively. Also, the overall exergy efficiency, ranging from 5.838% to 5.865%, is directly proportional to the dead state temperature and becomes higher than the corresponding energy efficiency of 3.44% for the entire system. -- Highlights: ► Developing a three-hybrid renewable energy (geothermal–wind–solar)-based system. ► Undertaking a parametric study at various dead state temperatures. ► Investigating the effect of dead state temperatures on exergy efficiency
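
    As a toy illustration of the energy-versus-exergy comparison reported above, the sketch below computes both efficiencies for a PV panel at a given dead state temperature. The Petela expression for the exergy content of solar radiation and all numerical values are assumptions made for the example, not the paper's system model:

        # Energy vs. exergy efficiency of a PV panel (illustrative numbers only).
        def pv_energy_efficiency(p_el, irradiance, area):
            return p_el / (irradiance * area)

        def solar_exergy_factor(t0, t_sun=5777.0):
            """Petela factor: exergy content of solar radiation per unit energy."""
            r = t0 / t_sun
            return 1.0 - (4.0 / 3.0) * r + (1.0 / 3.0) * r ** 4

        def pv_exergy_efficiency(p_el, irradiance, area, t0):
            return p_el / (irradiance * area * solar_exergy_factor(t0))

        # Example: 90 W from a 1 m^2 panel under 1000 W/m^2, dead state 303.15 K (30 C)
        print(pv_energy_efficiency(90, 1000, 1.0))            # 0.09
        print(pv_exergy_efficiency(90, 1000, 1.0, 303.15))    # slightly above 0.09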

  18. Distinct summer and winter bacterial communities in the active layer of Svalbard permafrost revealed by DNA- and RNA-based analyses

    Energy Technology Data Exchange (ETDEWEB)

    Schostag, Morten; Stibal, Marek; Jacobsen, Carsten S.; Baelum, Jacob; Tas, Neslihan; Elberling, Bo; Jansson, Janet K.; Semenchuk, Phillip; Prieme, Anders

    2015-04-30

    The active layer of soil overlaying permafrost in the Arctic is subjected to dramatic annual changes in temperature and soil chemistry, which likely affect bacterial activity and community structure. We studied seasonal variations in the bacterial community of active layer soil from Svalbard (78°N) by co-extracting DNA and RNA from 12 soil cores collected monthly over a year. PCR amplicons of 16S rRNA genes (DNA) and reverse transcribed transcripts (cDNA) were quantified and sequenced to test for the effect of low winter temperature and seasonal variation in concentration of easily degradable organic matter on the bacterial communities. The copy number of 16S rRNA genes and transcripts revealed no distinct seasonal changes indicating potential bacterial activity during winter despite soil temperatures well below -10°C. Multivariate statistical analysis of the bacterial diversity data (DNA and cDNA libraries) revealed a season-based clustering of the samples, and, e.g., the relative abundance of potentially active Cyanobacteria peaked in June and Alphaproteobacteria increased over the summer and then declined from October to November. The structure of the bulk (DNA-based) community was significantly correlated with pH and dissolved organic carbon, while the potentially active (RNA-based) community structure was not significantly correlated with any of the measured soil parameters. A large fraction of the 16S rRNA transcripts was assigned to nitrogen-fixing bacteria (up to 24% in June) and phototrophic organisms (up to 48% in June) illustrating the potential importance of nitrogen fixation in otherwise nitrogen poor Arctic ecosystems and of phototrophic bacterial activity on the soil surface.

  19. Analyse, Modélisation et Simulation des Pertes dans un Module Photovoltaïque à Base de Silicium Monocristallin

    OpenAIRE

    OULED SALEM, Mohamed

    2010-01-01

    Photovoltaic modules lose roughly 70% of the energy they receive on their surface. Several parameters can cause losses, which originate mainly from the physical properties of the module's base materials, such as the band-gap energy and the refractive index, or from the materials used for encapsulation and for grouping the cells within the module so as to allow their use at practical voltages and currents while ensuring their electrical insulation...

  20. Static and free vibration analyses and dynamic control of composite plates integrated with piezoelectric sensors and actuators by the cell-based smoothed discrete shear gap method (CS-FEM-DSG3)

    International Nuclear Information System (INIS)

    The cell-based smoothed discrete shear gap method (CS-FEM-DSG3) using three-node triangular elements was recently proposed to improve the performance of the discrete shear gap method (DSG3) for static and free vibration analyses of isotropic Mindlin plates. In this paper, the CS-FEM-DSG3 is further extended for static and free vibration analyses and dynamic control of composite plates integrated with piezoelectric sensors and actuators. In the piezoelectric composite plates, the electric potential is assumed to be a linear function through the thickness of each piezoelectric sublayer. A displacement and velocity feedback control algorithm is used for active control of the static deflection and the dynamic response of the plates through closed loop control with bonded or embedded distributed piezoelectric sensors and actuators. The accuracy and reliability of the proposed method is verified by comparing its numerical solutions with those of other available numerical results. (paper)

  1. A phase separation method for analyses of fluoroquinones in meats based on ultrasound-assisted salt-induced liquid-liquid microextraction and a new integrated device.

    Science.gov (United States)

    Wang, Huili; Gao, Ming; Xu, Youqu; Wang, Wenwei; Zheng, Lian; Dahlgren, Randy A; Wang, Xuedong

    2015-08-01

    Herein, we developed a novel integrated device to perform phase separation based on ultrasound-assisted, salt-induced, liquid-liquid microextraction for determination of five fluoroquinones in meats by HPLC analysis. The novel integrated device consisted of three simple HDPE (high density polyethylene) parts that were used to separate the solvent from the aqueous solution prior to retrieving the extractant. The extraction parameters were optimized using the response surface method based on central composite design: 589 μL of acetone solvent, pH 2.1, 4.1 min extraction time and 3.5 g of Na2SO4. The limits of detection were 0.056-0.64 μg kg(-1) and recoveries were 87.2-110.6% for the five fluoroquinones in muscle tissue from fish, chicken, pork and beef. This method is easily constructed from inexpensive materials, extraction efficiency is high, and the approach is compatible with HPLC analysis. Thus, it has excellent prospects for sample pre-treatment and analysis of fluoroquinones in meat samples. PMID:25885797

  2. A model-based approach for current voltage analyses to quantify degradation and fuel distribution in solid oxide fuel cell stacks

    Science.gov (United States)

    Linder, Markus; Hocker, Thomas; Meier, Christoph; Holzer, Lorenz; Friedrich, K. Andreas; Iwanschitz, Boris; Mai, Andreas; Schuler, J. Andreas

    2015-08-01

    Reliable quantification and thorough interpretation of the degradation of solid oxide fuel cell (SOFC) stacks under real conditions are critical for improving their long-term stability. The degradation behavior is often analyzed based on the evolution of current-voltage (V,I) curves. However, the overall resistances derived from such curves are affected by unavoidable fluctuations in fuel gas amount and composition and hence are difficult to interpret. Studying the evolution of internal repeat unit (RU) resistances is a more appropriate measure to assess stack degradation. RU resistances follow from EIS data through subtraction of the gas concentration impedance from the overall steady-state resistance. In this work a model-based approach is presented in which a local equilibrium model is used for spatial discretization of a SOFC stack RU running on hydrocarbon mixtures such as natural gas. Since fuel leakages, uneven fuel distribution and varying natural gas composition can influence performance during stack operation, they are taken into account by the model. The model extracts the time-dependent internal resistance from (V,I) data and local species concentrations without any fitting parameters. RU resistances can be compared with the sum of the resistances of different components, which allows one to make links between laboratory degradation experiments and the behavior of SOFC stacks during operation.
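
    The resistance bookkeeping described above can be illustrated with a minimal sketch: the degradation-relevant repeat-unit resistance is the overall steady-state resistance from one (V,I) point minus the gas concentration contribution. Function names and numerical values are illustrative assumptions, not the authors' model:

        def overall_resistance(ocv, v_cell, current):
            """Total resistance from one steady-state (V, I) operating point."""
            return (ocv - v_cell) / current

        def internal_ru_resistance(ocv, v_cell, current, r_gas_concentration):
            """Degradation-relevant resistance after removing the fuel-depletion term."""
            return overall_resistance(ocv, v_cell, current) - r_gas_concentration

        # Example: OCV 1.05 V, cell at 0.80 V and 20 A, gas concentration term 3 mOhm
        print(internal_ru_resistance(1.05, 0.80, 20.0, 0.003))   # ~0.0095 Ohm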

  3. Estimation of geological structure around underground tunnel based on cross-correlation analyses of random continuous signals from small scale core drilling

    International Nuclear Information System (INIS)

    We have studied a reflection imaging technique in which passive seismic signals from other construction activity, such as drilling noise, are used as a source signal, in order to develop a low-cost measurement method inside an underground gallery. In this paper, we show an application to data sets from the Mizunami Underground Research Laboratory of the Japan Atomic Energy Agency. The drilling was conducted with the main purpose of extracting rock core samples for rock mechanics research. The three-dimensional reflection imaging results and a comparison with borehole observations are shown in this paper. First, we present the concept and advantages of utilizing drilling noise inside a narrow underground gallery, and explain the analytical technique based on correlation analysis of three-component waveforms, by which three-dimensional imaging results can in principle be obtained even with a single observation point. Next, we describe an outline of the data acquisition at Mizunami and the characteristics of the observed drilling noise waveforms. Then, we show results of reflection imaging using drilling noise signals, and compare them with fractures identified from borehole observations. We discuss the relationship between the estimated three-dimensional reflection images and the possible spatial distribution of fractures. These results suggest that the proposed technique can be a useful tool when additional geological information is needed to achieve maximum safety at reasonably low cost in a narrow space, such as during development of a high level radioactive waste repository. (author)
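
    The correlation step at the heart of such drilling-noise imaging can be sketched as follows: cross-correlating a reference recording of the drill noise with a receiver trace concentrates reflected energy at lags equal to travel-time differences. The synthetic data, sampling rate and reflection delay below are purely illustrative assumptions:

        import numpy as np

        def correlate_traces(reference, receiver, fs):
            """Normalized cross-correlation; returns lag times (s) and values."""
            n = len(reference)
            corr = np.correlate(receiver, reference, mode="full")
            corr = corr / (np.linalg.norm(reference) * np.linalg.norm(receiver) + 1e-12)
            lags = np.arange(-n + 1, n) / fs
            return lags, corr

        # Synthetic example: random drilling-like noise plus a reflection 50 ms later
        fs = 1000.0
        rng = np.random.default_rng(0)
        noise = rng.standard_normal(2000)
        receiver = noise + 0.4 * np.roll(noise, 50)        # 50 samples = 0.05 s delay
        lags, corr = correlate_traces(noise, receiver, fs)
        positive = slice(len(noise), None)                 # inspect positive lags only
        print(lags[positive][np.argmax(np.abs(corr[positive]))])   # ~ +0.05 s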

  4. A 100% renewable electricity mix? Analyses and optimisations. Testing the boundaries of renewable energy-based electricity development in metropolitan France by 2050

    International Nuclear Information System (INIS)

    In 2013, ADEME published its energy and climate scenarios for the period 2030 to 2050, suggesting possible avenues to achieve a four-fold reduction in greenhouse-gas emissions by 2050 by cutting energy consumption by half and deploying renewable energy sources for electricity generation on a substantial scale. Both of these objectives were the basis for targets set by the President of France and subsequently adopted by Parliament in the Energy Transition Law to promote green growth. With this new study, ADEME submits an exploratory scientific prospective study. Questions of balance between production and demand and cost efficiency of renewable-based electricity mixes are investigated through an advanced optimisation. The electricity mixes are theoretical: they are created from scratch and do not take into account the current situation or the path needed to achieve a 100% renewable-based electricity system. The study aims at highlighting the technical measures to be implemented (strengthening grids, load shedding and storage) to support a policy of growth in renewable electricity technologies. It is also used to identify the key factors for developing renewable technologies at lower cost such as lower costs of technologies, demand-side management, development of flexibility, support of R and D of least-mature technologies and the social acceptance of renewable electricity installations. (authors)

  5. Molecular modeling of the human eukaryotic translation initiation factor 5A (eIF5A) based on spectroscopic and computational analyses

    International Nuclear Information System (INIS)

    The eukaryotic translation initiation factor 5A (eIF5A) is a protein ubiquitously present in archaea and eukarya, which undergoes a unique two-step post-translational modification called hypusination. Several studies have shown that hypusination is essential for a variety of functional roles for eIF5A, including cell proliferation and synthesis of proteins involved in cell cycle control. Up to now neither a totally selective inhibitor of hypusination nor an inhibitor capable of directly binding to eIF5A has been reported in the literature. The discovery of such an inhibitor might be achieved by computer-aided drug design based on the 3D structure of the human eIF5A. In this study, we present a molecular model for the human eIF5A protein based on the crystal structure of the eIF5A from Leishmania brasiliensis, and compare the modeled conformation of the loop bearing the hypusination site with circular dichroism data obtained with a synthetic peptide of this loop. Furthermore, analysis of amino acid variability between different human eIF5A isoforms revealed peculiar structural characteristics that are of functional relevance

  6. Multivariate analyses to assess the effects of surgeon and hospital volume on cancer survival rates: a nationwide population-based study in Taiwan.

    Directory of Open Access Journals (Sweden)

    Chun-Ming Chang

    Full Text Available BACKGROUND: Positive results between caseloads and outcomes have been validated in several procedures and cancer treatments. However, there is limited information available on the combined effects of surgeon and hospital caseloads. We used nationwide population-based data to explore the association between surgeon and hospital caseloads and survival rates for major cancers. METHODOLOGY: A total of 11,677 patients with incident cancer diagnosed in 2002 were identified from the Taiwan National Health Insurance Research Database. Survival analysis, the Cox proportional hazards model, and propensity scores were used to assess the relationship between 5-year survival rates and different caseload combinations. RESULTS: Based on the Cox proportional hazard model, cancer patients treated by low-volume surgeons in low-volume hospitals had poorer survival rates, and hazard ratios ranged from 1.3 in head and neck cancer to 1.8 in lung cancer after adjusting for patients' demographic variables, co-morbidities, and treatment modality. When analyzed using the propensity scores, the adjusted 5-year survival rates were poorer for patients treated by low-volume surgeons in low-volume hospitals, compared to those treated by high-volume surgeons in high-volume hospitals (P<0.005). CONCLUSIONS: After adjusting for differences in the case mix, cancer patients treated by low-volume surgeons in low-volume hospitals had poorer 5-year survival rates. Payers may implement quality care improvement in low-volume surgeons.
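
    The survival modelling referred to above (Cox proportional hazards with a volume-combination covariate) can be sketched with the open-source lifelines package. The toy data frame, column names and single covariate are assumptions for illustration; the study itself used nationwide claims data with propensity-score adjustment:

        import pandas as pd
        from lifelines import CoxPHFitter

        # Toy follow-up data: time in years, death indicator, and whether the
        # patient was treated by a low-volume surgeon in a low-volume hospital.
        df = pd.DataFrame({
            "time_years": [1.2, 4.8, 2.5, 0.9, 3.3, 5.0, 4.1, 2.0],
            "died":       [1,   0,   1,   1,   0,   0,   1,   1],
            "low_volume": [1,   0,   0,   1,   1,   0,   0,   1],
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="time_years", event_col="died")
        cph.print_summary()   # hazard ratio above 1 for low_volume means poorer survival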

  7. Pitfalls of voxel-based amyloid PET analyses for diagnosis of Alzheimer's disease. Artifacts due to non-specific uptake in the white matter and the skull

    International Nuclear Information System (INIS)

    Two methods are commonly used in brain image voxel-based analyses widely used for dementia work-ups: 3-dimensional stereotactic surface projections (3D-SSP) and statistical parametric mapping (SPM). The methods calculate the Z-scores of the cortical voxels that represent the significance of differences compared to a database of brain images with normal findings, and visualize them as surface brain maps. The methods are considered useful in amyloid positron emission tomography (PET) analyses to detect small amounts of amyloid-β deposits in early-stage Alzheimer's disease (AD), but are not fully validated. We analyzed the 11C-labeled 2-(2-[2-dimethylaminothiazol-5-yl]ethenyl)-6-(2-[fluoro]ethoxy)benzoxazole (BF-227) amyloid PET imaging of 56 subjects (20 individuals with mild cognitive impairment [MCI], 19 AD patients, and 17 non-demented [ND] volunteers) with 3D-SSP and the easy Z-score imaging system (eZIS) that is an SPM-based method. To clarify these methods' limitations, we visually compared Z-score maps output from the two methods and investigated the causes of discrepancies between them. Discrepancies were found in 27 subjects (9 MCI, 13 AD, and 5 ND). Relatively high white matter uptake was considered to cause higher Z-scores on 3D-SSP in 4 subjects (1 MCI and 3 ND). Meanwhile, in 17 subjects (6 MCI, 9 AD, and 2 ND), Z-score overestimation on eZIS corresponded with high skull uptake and disappeared after removing the skull uptake ('scalping'). Our results suggest that non-specific uptakes in the white matter and skull account for errors in voxel-based amyloid PET analyses. Thus, diagnoses based on 3D-SSP data require checking white matter uptake, and 'scalping' is recommended before eZIS analysis. (author)
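
    The core of both 3D-SSP and SPM-style analyses mentioned above is a voxel-wise Z-score against a normal database. A bare-bones sketch of that computation is given below; the array shapes, toy data and threshold are illustrative assumptions and omit the spatial normalisation, surface projection and smoothing steps of the real pipelines:

        import numpy as np

        def voxel_z_scores(patient_img, normal_db):
            """normal_db has shape (n_controls, x, y, z); patient_img has shape (x, y, z)."""
            mean_n = normal_db.mean(axis=0)
            sd_n = normal_db.std(axis=0, ddof=1)
            return (patient_img - mean_n) / np.where(sd_n > 0, sd_n, np.inf)

        rng = np.random.default_rng(1)
        normals = rng.normal(1.0, 0.1, size=(20, 8, 8, 8))   # toy SUVR-like volumes
        patient = rng.normal(1.0, 0.1, size=(8, 8, 8))
        patient[2:4, 2:4, 2:4] += 0.5                         # focal "uptake" region
        z = voxel_z_scores(patient, normals)
        print((z > 2.0).sum())   # number of voxels flagged as abnormally high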

  8. Digital differential analysers

    CERN Document Server

    Shilejko, A V; Higinbotham, W

    1964-01-01

    Digital Differential Analysers presents the principles, operations, design, and applications of digital differential analyzers, a machine with the ability to present initial quantities and the possibility of dividing them into separate functional units performing a number of basic mathematical operations. The book discusses the theoretical principles underlying the operation of digital differential analyzers, such as the use of the delta-modulation method and function-generator units. Digital integration methods and the classes of digital differential analyzer designs are also reviewed. The te
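
    The basic building block discussed in this book, the incremental (rate-multiplier) integrator, can be sketched in a few lines: on every dx increment the integrand is added to a remainder register, and an overflow emits one output pulse, so the pulse count approximates the scaled integral. The register width and test integrand below are arbitrary choices for illustration:

        class DdaIntegrator:
            def __init__(self, bits=16):
                self.capacity = 1 << bits   # register capacity acts as the scale factor
                self.remainder = 0

            def step(self, y):
                """Add the current integrand; return 1 if the register overflowed."""
                self.remainder += y
                if self.remainder >= self.capacity:
                    self.remainder -= self.capacity
                    return 1
                return 0

        # Integrate y = x over 2^16 unit steps: pulses ~ sum(x) / capacity ~ x^2 / 2
        dda = DdaIntegrator(bits=16)
        pulses = sum(dda.step(x) for x in range(1 << 16))
        print(pulses)   # 32767, i.e. approximately x^2 / (2 * capacity) for x = 2^16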

  9. Analysing Access Control Specifications

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, René Rydhof

    2009-01-01

    A common tool to answer this question, analysis of log files, faces the problem that the amount of logged data may be overwhelming. This problem gets even worse in the case of insider attacks, where the attacker’s actions usually will be logged as permissible, standard actions—if they are logged at all....... Recent events have revealed intimate knowledge of surveillance and control systems on the side of the attacker, making it often impossible to deduce the identity of an inside attacker from logged data. In this work we present an approach that analyses the access control configuration to identify the set...

  10. AMS analyses at ANSTO

    Energy Technology Data Exchange (ETDEWEB)

    Lawson, E.M. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia). Physics Division

    1998-03-01

    The major use of ANTARES is Accelerator Mass Spectrometry (AMS) with 14C being the most commonly analysed radioisotope - presently about 35 % of the available beam time on ANTARES is used for 14C measurements. The accelerator measurements are supported by, and dependent on, a strong sample preparation section. The ANTARES AMS facility supports a wide range of investigations into fields such as global climate change, ice cores, oceanography, dendrochronology, anthropology, and classical and Australian archaeology. Described here are some examples of the ways in which AMS has been applied to support research into the archaeology, prehistory and culture of this continent's indigenous Aboriginal peoples. (author)

  11. AMS analyses at ANSTO

    International Nuclear Information System (INIS)

    The major use of ANTARES is Accelerator Mass Spectrometry (AMS) with 14C being the most commonly analysed radioisotope - presently about 35 % of the available beam time on ANTARES is used for 14C measurements. The accelerator measurements are supported by, and dependent on, a strong sample preparation section. The ANTARES AMS facility supports a wide range of investigations into fields such as global climate change, ice cores, oceanography, dendrochronology, anthropology, and classical and Australian archaeology. Described here are some examples of the ways in which AMS has been applied to support research into the archaeology, prehistory and culture of this continent's indigenous Aboriginal peoples. (author)

  12. Systemdynamisk analyse av vannkraftsystem

    OpenAIRE

    Rydning, Anja

    2007-01-01

    In this thesis, a dynamic analysis of the Fortun hydropower plant is carried out. Three phenomena are given particular attention: surge oscillations between the surge shaft and the reservoir, pressure surges at the turbine caused by retardation pressure when the turbine discharge changes, and governing stability. The shaft oscillations and pressure surges are calculated analytically from the continuity and momentum equations. Models of the Fortun plant are built to compute pressure surges and shaft oscillations. One model e...

  13. An example of maintenance program based on a risk analysis; Un exemple de programme de maintenance et etabli sur une analyse des risques

    Energy Technology Data Exchange (ETDEWEB)

    Beringuier, J.P.; Laurens, F.; Notarianni, P.; Reys, P.; Dole, J.M. [Gaz de France (GDF), 75 - Paris (France)

    2000-07-01

    The object of this communication is to present the background of Gaz de France regarding the development, the implementation and the first results of a gas transmission pipelines inspection and rehabilitation program. The development of this program is based on a failure risk analysis. The result of this risk analysis is used to define the order of priority for pipelines inspection. This program allows to allocate maintenance expenses in the most effective way regarding risk reduction, costs and safety goals to be achieved. This program appears to be a modern and flexible tool to promote the safety level of gas transmission pipelines in the eyes of the authorities, the customers and the market regulator. (author)

  14. Mitogenomic analyses of eutherian relationships.

    Science.gov (United States)

    Arnason, U; Janke, A

    2002-01-01

    Reasonably correct phylogenies are fundamental to the testing of evolutionary hypotheses. Here, we present phylogenetic findings based on analyses of 67 complete mammalian mitochondrial (mt) genomes. The analyses, irrespective of whether they were performed at the amino acid (aa) level or on nucleotides (nt) of first and second codon positions, placed Erinaceomorpha (hedgehogs and their kin) as the sister group of remaining eutherians. Thus, the analyses separated Erinaceomorpha from other traditional lipotyphlans (e.g., tenrecs, moles, and shrews), making traditional Lipotyphla polyphyletic. Both the aa and nt data sets identified the two order-rich eutherian clades, the Cetferungulata (comprising Pholidota, Carnivora, Perissodactyla, Artiodactyla, and Cetacea) and the African clade (Tenrecomorpha, Macroscelidea, Tubulidentata, Hyracoidea, Proboscidea, and Sirenia). The study corroborated recent findings that have identified a sister-group relationship between Anthropoidea and Dermoptera (flying lemurs), thereby making our own order, Primates, a paraphyletic assembly. Molecular estimates using paleontologically well-established calibration points, placed the origin of most eutherian orders in Cretaceous times, 70-100 million years before present (MYBP). The same estimates place all primate divergences much earlier than traditionally believed. For example, the divergence between Homo and Pan is estimated to have taken place approximately 10 MYBP, a dating consistent with recent findings in primate paleontology. PMID:12438776

  15. Biological aerosol warner and analyser

    Science.gov (United States)

    Schlemmer, Harry; Kürbitz, Gunther; Miethe, Peter; Spieweck, Michael

    2006-05-01

    The development of an integrated sensor device BiSAM (Biological Sampling and Analysing Module) is presented which is designed for rapid detection of aerosol or dust particles potentially loaded with biological warfare agents. All functional steps from aerosol collection via immuno analysis to display of results are fully automated. The core component of the sensor device is an ultra sensitive rapid analyser PBA (Portable Benchtop Analyser) based on a 3 dimensional immuno filtration column of large internal area, Poly HRP marker technology and kinetic optical detection. High sensitivity despite of the short measuring time, high chemical stability of the micro column and robustness against interferents make the PBA an ideal tool for fielded sensor devices. It is especially favourable to combine the PBA with a bio collector because virtually no sample preparation is necessary. Overall, the BiSAM device is capable to detect and identify living micro organisms (bacteria, spores, viruses) as well as toxins in a measuring cycle of typically half an hour duration. In each batch up to 12 different tests can be run in parallel together with positive and negative controls to keep the false alarm rate low.

  16. Anthocyanin analyses of Vaccinium fruit dietary supplements

    Science.gov (United States)

    Vaccinium fruit ingredients within dietary supplements were identified by comparisons with anthocyanin analyses of known Vaccinium profiles (demonstration of anthocyanin fingerprinting). Available Vaccinium supplements were purchased and analyzed; their anthocyanin profiles (based on HPLC separation...

  17. Analyses of a Virtual World

    CERN Document Server

    Holovatch, Yurij; Szell, Michael; Thurner, Stefan

    2016-01-01

    We present an overview of a series of results obtained from the analysis of human behavior in a virtual environment. We focus on the massive multiplayer online game (MMOG) Pardus which has a worldwide participant base of more than 400,000 registered players. We provide evidence for striking statistical similarities between social structures and human-action dynamics in the real and virtual worlds. In this sense MMOGs provide an extraordinary way for accurate and falsifiable studies of social phenomena. We further discuss possibilities to apply methods and concepts developed in the course of these studies to analyse oral and written narratives.

  18. Some novel insights on HPV16 related cervical cancer pathogenesis based on analyses of LCR methylation, viral load, E7 and E2/E4 expressions.

    Directory of Open Access Journals (Sweden)

    Damayanti Das Ghosh

    Full Text Available This study was undertaken to decipher the interdependent roles of (i) methylation within E2 binding sites I and II (E2BS-I/II) and the replication origin (nt 7862) in the long control region (LCR), (ii) expression of the viral oncogene E7, (iii) expression of the transcript (E7-E1/E4) that encodes the E2 repressor protein and (iv) viral load, in human papillomavirus 16 (HPV16) related cervical cancer (CaCx) pathogenesis. The results revealed over-representation (p<0.001) of methylation at nucleotide 58 of E2BS-I among E2-intact CaCx cases compared to E2-disrupted cases. Bisulphite sequencing of the LCR revealed over-representation of methylation at nucleotide 58 or other CpGs in E2BS-I/II among E2-intact cases compared to E2-disrupted cases, and lack of methylation at the replication origin in both. The viral transcript (E7-E1/E4) that produces the repressor E2 was analyzed by APOT (amplification of papillomavirus oncogenic transcripts)-coupled quantitative RT-PCR (of the E7 and E4 genes) to distinguish episomal (pure or concomitant with integrated) from purely integrated viral genomes based on the ratio E7 C(T)/E4 C(T). Relative quantification based on the comparative C(T) (threshold cycle) method revealed 75.087-fold higher E7 mRNA expression in episomal cases over purely integrated cases. Viral load and E2 gene copy numbers were negatively correlated with E7 C(T) (p = 0.007) and E2 C(T) (p<0.0001), respectively, each normalized with ACTB C(T), among episomal cases only. The k-means clustering analysis considering E7 C(T) from the APOT-coupled quantitative RT-PCR assay, in conjunction with viral load, revealed immense heterogeneity among the HPV16 positive CaCx cases portraying integrated viral genomes. The findings provide novel insights into HPV16 related CaCx pathogenesis and highlight that CaCx cases that harbour episomal HPV16 genomes with intact E2 are likely to be biologically distinct from those with purely integrated viral genomes in terms of host genes and/or pathways involved in cervical
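
    The comparative threshold-cycle calculation underlying the relative expression estimate above is a simple arithmetic step; the sketch below shows the standard 2^-ΔΔC(T) form with made-up C(T) values (the study's actual values and normalisation details are not reproduced here):

        def fold_change(ct_target_case, ct_ref_case, ct_target_control, ct_ref_control):
            """Relative expression of target vs. reference gene, case over control."""
            d_ct_case = ct_target_case - ct_ref_case           # e.g. E7 vs ACTB, episomal
            d_ct_control = ct_target_control - ct_ref_control  # e.g. E7 vs ACTB, integrated
            dd_ct = d_ct_case - d_ct_control
            return 2.0 ** (-dd_ct)

        print(fold_change(22.0, 18.0, 28.2, 18.0))   # ~2^6.2, roughly 73-fold higher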

  19. The enhancing power of iodide on corrosion prevention of mild steel in the presence of a synthetic-soluble Schiff-base: Electrochemical and surface analyses

    Energy Technology Data Exchange (ETDEWEB)

    Lashgari, Mohsen, E-mail: Lashgari@iasbs.ac.i [Department of Chemistry, Institute for Advanced Studies in Basic Sciences (IASBS), Gava-Zang Blvd., Zanjan 45137-66731 (Iran, Islamic Republic of); Arshadi, Mohammad-Reza [Department of Chemistry, Sharif University of Technology, PO Box 11365-9516, Tehran (Iran, Islamic Republic of); Miandari, Somaieh [Department of Chemistry, Institute for Advanced Studies in Basic Sciences (IASBS), Gava-Zang Blvd., Zanjan 45137-66731 (Iran, Islamic Republic of)

    2010-08-01

    The inhibitory action of N,N'-1,3-propylen-bis(3-methoxysalicylidenimine) {PMSI} on mild steel corrosion in sulfuric acid medium was investigated through electrochemical methods and evaluations based on infrared spectroscopy and scanning electron micrographs. The studies revealed that the molecule is a good mixed-type inhibitor (mostly anodic), acts as a multi-dentate ligand and repels the corrosive agents by hydrophobic forces. Its adsorption on metal surface has a physicochemical nature and obeys the Langmuir isotherm. At a critical (optimum) concentration, an anomalous inhibitory behavior was observed and interpreted at microscopic level by means of desorption/adsorption process and horizontal ↔ vertical hypothesis. The addition of iodide into acid moreover causes a synergistic influence, a substantial enhancement on inhibitory performance. Finally, using isolated inhibitor calculations at B3LYP/6-31G + (d,p) level of theory, the equilibrium geometry of PMSI was determined and the energy required for hindrance avoidance was predicted.

  20. The enhancing power of iodide on corrosion prevention of mild steel in the presence of a synthetic-soluble Schiff-base: Electrochemical and surface analyses

    International Nuclear Information System (INIS)

    The inhibitory action of N,N'-1,3-propylen-bis(3-methoxysalicylidenimine) {PMSI} on mild steel corrosion in sulfuric acid medium was investigated through electrochemical methods and evaluations based on infrared spectroscopy and scanning electron micrographs. The studies revealed that the molecule is a good mixed-type inhibitor (mostly anodic), acts as a multi-dentate ligand and repels the corrosive agents by hydrophobic forces. Its adsorption on metal surface has a physicochemical nature and obeys the Langmuir isotherm. At a critical (optimum) concentration, an anomalous inhibitory behavior was observed and interpreted at microscopic level by means of desorption/adsorption process and horizontal ↔vertical hypothesis. The addition of iodide into acid moreover causes a synergistic influence, a substantial enhancement on inhibitory performance. Finally, using isolated inhibitor calculations at B3LYP/6-31G + (d,p) level of theory, the equilibrium geometry of PMSI was determined and the energy required for hindrance avoidance was predicted.

  1. Modelling software failures of digital I and C in probabilistic safety analyses based on the TELEPERM® XS operating experience

    Energy Technology Data Exchange (ETDEWEB)

    Jockenhoevel-Barttfeld, Mariana; Taurines Andre [AREVA GmbH, Erlangen (Germany); Baeckstroem, Ola [Lloyd Register Consulting, Stockholm (Sweden); Holmberg, Jan-Erik [Risk Pilot, Espoo (Finland); Porthin, Markus; Tyrvaeinen, Tero [VTT Technical Research Centre of Finland Ltd, Espoo (Finland)

    2015-03-15

    Digital instrumentation and control (I and C) systems appear as upgrades in existing nuclear power plants (NPPs) and in new plant designs. In order to assess the impact of digital system failures, quantifiable reliability models are needed along with data for digital systems that are compatible with existing probabilistic safety assessments (PSA). The paper focuses on the modelling of software failures of digital I and C systems in probabilistic assessments. An analysis of software faults, failures and effects is presented to derive relevant failure modes of system and application software for the PSA. The estimations of software failure probabilities are based on an analysis of the operating experience of TELEPERM® XS (TXS). For the assessment of application software failures the analysis combines the use of the TXS operating experience at an application function level combined with conservative engineering judgments. Failure probabilities to actuate on demand and of spurious actuation of typical reactor protection application are estimated. Moreover, the paper gives guidelines for the modelling of software failures in the PSA. The strategy presented in this paper is generic and can be applied to different software platforms and their applications.
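
    One standard way of turning operating experience into a failure-on-demand probability for PSA use is a Bayesian estimate with a Jeffreys prior; the sketch below shows that generic estimator with made-up demand counts. It is an assumption for illustration, not the TXS-specific estimation procedure described in the paper:

        def failure_on_demand_jeffreys(k_failures, n_demands):
            """Posterior mean of the per-demand failure probability, Beta(0.5, 0.5) prior."""
            return (k_failures + 0.5) / (n_demands + 1.0)

        # Example: no software-caused failures observed in 10,000 recorded demands
        print(failure_on_demand_jeffreys(0, 10_000))   # ~5e-05 per demand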

  2. Elaboration, analyse et modélisation mécanique numérique d’agro-composites à base de fibres courtes d’alfa

    Directory of Open Access Journals (Sweden)

    El-Abbassi Fatima-ezzahra

    2014-04-01

    Full Text Available In the present study, recently carried out at the Laboratoire d'Ingénierie et Sciences des Matériaux of the Université de Reims Champagne-Ardenne, we propose to valorise the alfa plant, in the form of short fibres, as a reinforcement for polymer matrices, in this case polypropylene (PP), first by extrusion and then by injection moulding. A first step consists in preparing, by chemical treatment and extraction, the short fibre to be used as reinforcement. This is followed by extrusion of composite compounds based on a PP matrix. The objective is then to develop a combined test-and-simulation approach, confronting the mechanical characterisation of injection-moulded PP-alfa agro-composites with two types of behavioural modelling: a micromechanical one based on the Mori-Tanaka model, and a numerical one using an original approach based on a so-called projected-fibre technique.

  3. Electricity generation analyses in an oil-exporting country: Transition to non-fossil fuel based power units in Saudi Arabia

    International Nuclear Information System (INIS)

    In Saudi Arabia, fossil fuel is the main source of power generation. Due to huge economic and demographic growth, electricity consumption in Saudi Arabia has increased and should continue to increase at a very fast rate. At the moment, more than half a million barrels of oil per day are used directly for power generation. Herein, we assess the power generation situation of the country and its future conditions through a modelling approach. For this purpose, we present the current situation by detailing the existing electricity generation mix. Then we develop an optimization model of the power sector which aims to define the best production and investment pattern to meet the expected demand. Subsequently, we carry out a sensitivity analysis so as to evaluate the robustness of the model by taking into account the variable integration of alternative (non-fossil fuel based) resources. The results point out that the choices of investment in the power sector strongly affect the potential oil exports of Saudi Arabia. (authors)

  4. A new system for parallel drug screening against multiple-resistant HIV mutants based on lentiviral self-inactivating (SIN) vectors and multi-colour analyses

    Directory of Open Access Journals (Sweden)

    Prokofjeva Maria M

    2013-01-01

    Full Text Available Abstract Background Despite progress in the development of combined antiretroviral therapies (cART), HIV infection remains a significant challenge for human health. Current problems of cART include multi-drug-resistant virus variants, long-term toxicity and enormous treatment costs. Therefore, the identification of novel effective drugs is urgently needed. Methods We developed a straightforward screening approach for simultaneously evaluating the sensitivity of multiple HIV gag-pol mutants to antiviral drugs in one assay. Our technique is based on multi-colour lentiviral self-inactivating (SIN) LeGO vector technology. Results We demonstrated the successful use of this approach for screening compounds against up to four HIV gag-pol variants (wild-type and three mutants) simultaneously. Importantly, the technique was adapted to Biosafety Level 1 conditions by utilising ecotropic pseudotypes. This allowed upscaling to a large-scale screening protocol exploited by pharmaceutical companies in a successful proof-of-concept experiment. Conclusions The technology developed here facilitates fast screening for anti-HIV activity of individual agents from large compound libraries. Although drugs targeting gag-pol variants were used here, our approach permits screening compounds that target several different, key cellular and viral functions of the HIV life-cycle. The modular principle of the method also allows the easy exchange of various mutations in HIV sequences. In conclusion, the methodology presented here provides a valuable new approach for the identification of novel anti-HIV drugs.

  5. Inhibition of precipitation of carbonate apatite by trisodium citrate analysed in base of the formation of chemical complexes in growth solution

    Science.gov (United States)

    Prywer, Jolanta; Olszynski, Marcin; Mielniczek-Brzóska, Ewa

    2015-11-01

    Effect of trisodium citrate on the precipitation of carbonate apatite is studied. The experimental series are performed in the solution of artificial urine. The investigations are related to infectious urinary stones formation as carbonate apatite is one of the main components of this kind of stones. To mimic a real infection in urinary tract the aqueous ammonia solution was added to the solution of artificial urine. The spectrophotometric results demonstrate that trisodium citrate increases induction time with respect to carbonate apatite formation and decreases the efficiency of carbonate apatite precipitation. The inhibitory effect of trisodium citrate on the precipitation of carbonate apatite is explained in base of chemical speciation analysis. Such an analysis demonstrates that the inhibitory effect is mainly related with the fact that trisodium citrate binds Ca2+ ions and causes the formation of CaCit- and Ca10(PO4)6CO3 complexes. Trisodium citrate binds Ca2+ ions in the range of pH from 6 to 9.5 for which carbonate apatite is favored to be formed.

  6. Mid-Adolescent Predictors of Adult Drinking Levels in Early Adulthood and Gender Differences: Longitudinal Analyses Based on the South Australian School Leavers Study

    Directory of Open Access Journals (Sweden)

    Paul H. Delfabbro

    2016-01-01

    Full Text Available There is considerable public health interest in understanding what factors during adolescence predict longer-term drinking patterns in adulthood. The aim of this study was to examine gender differences in the age 15 social and psychological predictors of less healthy drinking patterns in early adulthood. The study investigates the relative importance of internalising problems, other risky health behaviours, and peer relationships after controlling for family background characteristics. A sample of 812 young people who provided complete alcohol consumption data from the age of 15 to 20 years (5 measurement points) were drawn from South Australian secondary schools and given a detailed survey concerning their psychological and social wellbeing. Respondents were classified into two groups based upon a percentile division: those who drank at levels consistently below NHMRC guidelines and those who consistently drank at higher levels. The results showed that poorer age 15 scores on measures of psychological wellbeing including scores on the GHQ-12, self-esteem, and life-satisfaction as well as engagement in health-related behaviours such as smoking or drug-taking were associated with higher drinking levels in early adulthood. The pattern of results was generally similar for both genders. Higher drinking levels were most strongly associated with smoking and marijuana use and poorer psychological wellbeing during adolescence.

  7. Evaluation of metal and radionuclide data from neutron activation and acid-digestion-based spectrometry analyses of background soils. Significance in environmental restoration

    International Nuclear Information System (INIS)

    A faster, more cost-effective, and higher-quality data acquisition for natural background-level metals and radionuclides in soils is needed for remedial investigations of contaminated sites. The advantages and disadvantages of neutron activation analysis (NAA) compared with those of acid-digestion-based spectrometry (ADS) methods were evaluated using Al, Sb, As, Cr, Co, Fe, Mg, Mn, Hg, K, Ag, 232Th, 235U, 238U, V, and Zn data. The ADS methods used for this project were inductively coupled plasma (ICP), ICP-mass spectrometry (ICP-MS), and alpha spectrometry. Scatter plots showed that the NAA results for As, Co, Fe, Mn, 232Th, and 238U are reasonably correlated with the results from the other analytical methods. Compared to NAA, however, the ADS methods underestimated Al, Cr, Mg, K, V, and Zn. Because of the high detection limits of ADS methods, the scatter plots of Sb, Hg, and Ag did not show a definite relationship. The NAA results were highly correlated with the alpha spectrometry results for 232Th and 238U but poorly correlated for 235U. The NAA, including the delayed neutron counting, was a far superior technique for quantifying background levels of radionuclides (232Th, 235U, and 238U) and metals (Al, Cr, Mg, K, V, and Zn) in soils. (author)

  8. European UV DataBase (EUVDB) as a repository and quality analyser for solar spectral UV irradiance monitored in Sodankylä

    Science.gov (United States)

    Heikkilä, Anu; Kaurola, Jussi; Lakkala, Kaisa; Matti Karhu, Juha; Kyrö, Esko; Koskela, Tapani; Engelsen, Ola; Slaper, Harry; Seckmeyer, Gunther

    2016-08-01

    Databases gathering atmospheric data have great potential not only as data repositories but also as platforms for coherent quality assurance (QA). We report on the flagging system and QA tools designed for and implemented in the European UV DataBase (EUVDB; http://uv.fmi.fi/uvdb/) for measured data on solar spectral UV irradiance. We confine the study to the data measured by the Brewer #037 MkII spectroradiometer in Sodankylä (67.37° N, 26.63° E) in 1990-2014. The quality indicators associated with the UV irradiance spectra uploaded into the database are retrieved from the database and subjected to a statistical analysis. The study demonstrates the performance of the QA tools of the EUVDB. In addition, it yields an overall view of the availability and quality of the solar UV spectra recorded in Sodankylä over a quarter of a century. Over 90 % of the four main quality indicators are flagged as GREEN, indicating the highest achievable quality. For the BLACK flags, denoting data not meeting the pre-defined requirements, the percentages for all the indicators remain below 0.12 %.

  9. Distinct summer and winter bacterial communities in the active layer of Svalbard permafrost revealed by DNA- and RNA-based analyses

    DEFF Research Database (Denmark)

    Schostag, Morten; Stibal, Marek; Jacobsen, Carsten S.; Bælum, Jacob; Tas, Neslihan; Elberling, Bo; Jansson, Janet K.; Semenchuk, Philipp; Prieme, Anders

    2015-01-01

    The active layer of soil overlaying permafrost in the Arctic is subjected to dramatic annual changes in temperature and soil chemistry, which likely affect bacterial activity and community structure. We studied seasonal variations in the bacterial community of active layer soil from Svalbard (78ºN) by co-extracting DNA and RNA from 12 soil cores collected monthly over a year. PCR amplicons of 16S rRNA genes (DNA) and reverse transcribed transcripts (cDNA) were quantified and sequenced to test for the effect of low winter temperature and seasonal variation in concentration of easily degradable organic matter on the bacterial communities. The structure of the bulk (DNA-based) community was significantly correlated with pH and dissolved organic carbon, while the potentially active (RNA-based) community structure was not significantly correlated with any of the measured soil parameters. A large fraction of the 16S rRNA transcripts was assigned to nitrogen-fixing bacteria (up to 24% in June) and phototrophic organisms (up to 48% in June) illustrating the potential importance of nitrogen fixation in otherwise nitrogen poor Arctic ecosystems and of phototrophic bacterial activity on the soil surface.

  10. Rasch-family models are more valuable than score-based approaches for analysing longitudinal patient-reported outcomes with missing data.

    Science.gov (United States)

    de Bock, Elodie; Hardouin, Jean-Benoit; Blanchin, Myriam; Le Neel, Tanguy; Kubis, Gildas; Bonnaud-Antignac, Angélique; Dantan, Etienne; Sébille, Véronique

    2013-12-16

    The objective was to compare classical test theory and Rasch-family models derived from item response theory for the analysis of longitudinal patient-reported outcomes data with possibly informative intermittent missing items. A simulation study was performed in order to assess and compare the performance of classical test theory and Rasch model in terms of bias, control of the type I error and power of the test of time effect. The type I error was controlled for classical test theory and Rasch model whether data were complete or some items were missing. Both methods were unbiased and displayed similar power with complete data. When items were missing, Rasch model remained unbiased and displayed higher power than classical test theory. Rasch model performed better than the classical test theory approach regarding the analysis of longitudinal patient-reported outcomes with possibly informative intermittent missing items mainly for power. This study highlights the interest of Rasch-based models in clinical research and epidemiology for the analysis of incomplete patient-reported outcomes data. PMID:24346165
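
    For readers unfamiliar with the item-level modelling compared here, the Rasch item response function can be written down in a few lines; the item difficulties and trait value below are arbitrary illustrative numbers, and this is not the longitudinal estimation code used in the study:

        import math

        def rasch_probability(theta, difficulty):
            """Probability of endorsing an item given latent trait theta (logits)."""
            return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

        def expected_sum_score(theta, difficulties):
            """Expected classical-test-theory sum score implied by the Rasch model."""
            return sum(rasch_probability(theta, b) for b in difficulties)

        items = [-1.0, 0.0, 0.5, 1.5]          # item difficulties in logits
        print(rasch_probability(0.3, 0.0))     # ~0.574
        print(expected_sum_score(0.3, items))  # ~2.0 out of 4 items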

  11. The R Protein of SARS-CoV: Analyses of Structure and Function Based on Four Complete Genome Sequences of Isolates BJ01-BJ04

    Institute of Scientific and Technical Information of China (English)

    Zuyuan Xu; Zizhang Zhang; Jing Xu; Wei Wei; Jingui Zhu; Haiyan Sun; Xiaowei Zhang; Jun Zhou; Songgang Li; Jun Wang; Jian Wang; Haiqing Zhang; Shengli Bi; Huanming Yang; Xiangjun Tian; Jia Ji; Wei Li; Yan Li; Wei Tian; Yujun Han; Lili Wang

    2003-01-01

    The R (replicase) protein is the uniquely defined non-structural protein (NSP) responsible for RNA replication, mutation rate or fidelity, and regulation of transcription in coronaviruses and many other ssRNA viruses. Based on our complete genome sequences of four isolates (BJ01-BJ04) of SARS-CoV from Beijing, China, we analyzed the structure and predicted functions of the R protein in comparison with 13 other isolates of SARS-CoV and 6 other coronaviruses. The entire ORF (open-reading frame) encodes two major enzyme activities, RNA-dependent RNA polymerase (RdRp) and proteinase activities. The R polyprotein undergoes a complex proteolytic process to produce 15 function-related peptides. A hydrophobic domain (HOD) and a hydrophilic domain (HID) are newly identified within NSP1. The substitution rate of the R protein is close to the average of the SARS-CoV genome. The functional domains in all NSPs of the R protein give different phylogenetic results that suggest their different mutation rates under selective pressure. Eleven highly conserved regions in RdRp and twelve cleavage sites by 3CLP (chymotrypsin-like protein) have been identified as potential drug targets. Findings suggest that it is possible to obtain information about the phylogeny of SARS-CoV, as well as potential tools for drug design, genotyping and diagnostics of SARS.

  12. Electricity generation analyses in an oil-exporting country: Transition to non-fossil fuel based power units in Saudi Arabia

    International Nuclear Information System (INIS)

    In Saudi Arabia, fossil fuel is the main source of power generation. Due to huge economic and demographic growth, electricity consumption in Saudi Arabia has increased and should continue to increase at a very fast rate. At the moment, more than half a million barrels of oil per day are used directly for power generation. Herein, we assess the power generation situation of the country and its future conditions through a modelling approach. For this purpose, we present the current situation by detailing the existing electricity generation mix. Then we develop an optimization model of the power sector which aims to define the best production and investment pattern to meet the expected demand. Subsequently, we carry out a sensitivity analysis so as to evaluate the robustness of the model by taking into account the variability of integrating the other alternative (non-fossil-fuel-based) resources. The results point out that the choices of investment in the power sector strongly affect the potential oil exports of Saudi Arabia. For instance, by decarbonizing half of its generation mix, Saudi Arabia can release around 0.5 million barrels of oil equivalent per day from 2020. Moreover, the total power generation cost reduction can reach up to around 28% per year from 2030 if Saudi Arabia manages to attain the most optimal generation mix structure introduced in the model (50% of power from renewables and nuclear power plants and 50% from fossil power plants). - Highlights: • We model the current and future power generation situation of Saudi Arabia. • We take into account the integration of the other alternative resources. • We consider different scenarios of power generation structure for the country. • Optimal generation mix can release considerable amount of oil for export
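
    The abstract does not reproduce the model equations, but the core of such a generation-mix optimisation can be sketched as a linear program that minimises total generation cost subject to meeting demand and to a minimum non-fossil share (here the 50% mix mentioned above). The technology names, costs and capacities below are purely illustrative assumptions, not data from the study.

      from scipy.optimize import linprog

      demand = 400.0                                   # energy to be served, TWh/yr (assumed)
      techs = ["oil", "gas", "nuclear", "solar", "wind"]
      cost = [90.0, 70.0, 60.0, 55.0, 50.0]            # generation cost per unit (assumed)
      capacity = [300.0, 250.0, 150.0, 120.0, 120.0]   # available output, TWh/yr (assumed)

      # Equality constraint: total generation equals demand.
      A_eq, b_eq = [[1, 1, 1, 1, 1]], [demand]
      # Inequality constraint: nuclear + solar + wind >= 50% of demand,
      # written in linprog's <= form as -(x_nuc + x_sol + x_wind) <= -0.5 * demand.
      A_ub, b_ub = [[0, 0, -1, -1, -1]], [-0.5 * demand]
      bounds = [(0, c) for c in capacity]

      res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
      print(dict(zip(techs, res.x.round(1))), "total cost:", round(res.fun, 1))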

  13. SPIDIA-RNA: second external quality assessment for the pre-analytical phase of blood samples used for RNA based analyses.

    Directory of Open Access Journals (Sweden)

    Francesca Malentacchi

    Full Text Available One purpose of the EC-funded project SPIDIA is to develop evidence-based quality guidelines for the pre-analytical handling of blood samples for RNA molecular testing. To this end, two pan-European External Quality Assessments (EQAs) were implemented. Here we report the results of the second SPIDIA-RNA EQA. This second study included modifications in the protocol related to the blood collection process, the shipping conditions and pre-analytical specimen handling for participants. Participating laboratories received two identical proficiency blood specimens collected in tubes with or without an RNA stabilizer. For pre-defined specimen storage times and temperatures, laboratories were asked to perform RNA extraction from whole blood according to their usual procedure and to return extracted RNA to the SPIDIA facility for further analysis. These RNA samples were evaluated for purity, yield, integrity, stability, presence of interfering substances, and gene expression levels for the validated markers of RNA stability: FOS, IL1B, IL8, GAPDH, FOSB and TNFRSF10c. Analysis of the gene expression results of FOS, IL8, FOSB, and TNFRSF10c, however, indicated that the levels of these transcripts were significantly affected by blood collection tube type and storage temperature. These results demonstrated that only blood collection tubes containing a cellular RNA stabilizer allowed reliable gene expression analysis within 48 h from blood collection for all the genes investigated. The results of these two EQAs have been proposed for use in the development of a Technical Specification by the European Committee for Standardization.

  14. Environmental impacts of renewable energy. Geographic Information Systems (GIS) based analysis of cumulative effects; Umweltauswirkungen erneuerbarer Energien. GIS-gestuetzte Analyse kumulativer Wirkungen

    Energy Technology Data Exchange (ETDEWEB)

    Rhoden, Henning

    2015-04-15

    The energy transition, and thus the turn away from fossil and nuclear energy sources, rests on an increased expansion of renewable energies. This expansion takes place mainly in nature and the landscape, which conflicts with the objectives of the Federal Nature Conservation Act, for example concerning the scenery or the consequences of monoculture cultivation of energy crops. What happens when more than one type of renewable energy occurs concentrated in one landscape is investigated in this work. Do cumulative effects result in extended conflicts with the objectives of the Federal Nature Conservation Act, or can positive effects possibly be seen? A ''cumulative effect'' is defined as an additive-synergistic overall effect on a protected interest of the respective impact factors. These arise from one or more projects or plans and exert influence in a variety of ways. The investigations carried out make clear that cumulative effects of renewable energies may give rise to extended conflicts with the objectives of the Federal Nature Conservation Act. To prevent these conflicts, guidelines and regulations in the context of spatial planning are necessary to enable spatial planning to steer a sustainable expansion of renewable energy.

  15. APROS nuclear plant analyser

    International Nuclear Information System (INIS)

    The paper describes the build-up of the Loviisa plant primary circuit model using graphical user interface and generic components. The secondary circuit model of Loviisa is constructed in the same manner. The entire power plant model thus obtained is used for the calculation of two example transients. These examples originate from the Loviisa 2 unit dynamical tests in 1980. The Modular Plant Analyser results are compared with the Loviisa Unit 2 measurement data. This comparison indicates good agreement with the data. The present work has been performed using the Alliant FX/40 minisupercomputer. With this computer the Loviisa model fulfills at present the real-time requirement with 0.5 second timestep. (orig./DG)

  16. Investment Projects Analyse Methodology

    OpenAIRE

    Simen Antoneta

    2012-01-01

    The IFIs' methodology for analysing investment projects is based on dynamic assessment methods and financial analysis, in which the predominant component is discounting the flows of revenues and expenses. The IFIs have adopted the method of analysis based on the internal rate of return (financial and economic), considered the most appropriate because it allows comparison with the appropriate cost of capital, the reference standard for financial analysis in market economies. The expression of projects efficie...
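
    The internal rate of return referred to here is the discount rate r at which the net present value of the project's cash flows is zero, NPV(r) = sum_t CF_t / (1 + r)^t = 0. A minimal sketch with made-up cash flows, solving for the IRR by bisection (the flows and rates are illustrative only):

      def npv(rate, cashflows):
          """Net present value of cashflows[0..T], cashflows[0] being the initial outlay."""
          return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

      def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-8):
          """Internal rate of return by bisection (assumes a single sign change of NPV)."""
          for _ in range(200):
              mid = (lo + hi) / 2.0
              if npv(lo, cashflows) * npv(mid, cashflows) <= 0.0:
                  hi = mid
              else:
                  lo = mid
              if hi - lo < tol:
                  break
          return (lo + hi) / 2.0

      flows = [-1000.0, 300.0, 350.0, 400.0, 450.0]   # illustrative project cash flows
      print(round(npv(0.10, flows), 2), round(irr(flows), 4))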

  17. Analyse spatiale et statistique de l’âge du Fer en France. L’exemple de la “ BaseFer ” Spatial and statistical analysis of the Iron Age in France. The example of 'basefer'

    Directory of Open Access Journals (Sweden)

    Olivier Buchsenschutz

    2009-05-01

    Full Text Available The development of Geographical Information Systems (GIS) allows information in archaeological databases to be georeferenced. It is thus possible to obtain distribution maps which can then be interpreted using statistical and spatial analyses. Maps and statistics highlight the state of research, the condition of sites, and moreover historical and cultural phenomena. Through a research programme on the Iron Age in France (Basefer), a global database was established for the entire country. This article puts forward some analyses of the general descriptive criteria represented in a corpus of 11000 sites (departments along the Mediterranean Sea coast are excluded from this test). The control and development of finer descriptors will be undertaken by an enlarged team, before the data are networked.

  18. 一种基于离散对数的强代理签名方案的分析与改进%ANALYSING A DISCRETE LOGARITHM-BASED STRONG PROXY SIGNATURE SCHEME AND ITS IMPROVEMENT

    Institute of Scientific and Technical Information of China (English)

    张兴华

    2014-01-01

    In this paper we analyse in detail the existing discrete logarithm problem-based strong proxy signature schemes, and find that these schemes cannot resist the public key substitution attack; we provide the attack methods as well. Based on the difficulty of the discrete logarithm and the Schnorr system, we present a new strong proxy signature scheme by improving the signature algorithm. We analyse in detail the new scheme's ability to resist the public key substitution attack and to limit the range of proxy signatures and the signing time. The new scheme offers stronger practicability and security.
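
    The improved scheme itself is not given in the abstract; the sketch below only illustrates the plain Schnorr signature over a discrete-logarithm group, on which, according to the abstract, the proposed scheme builds. The toy parameters p, q, g are far too small for any real security and serve solely to show the algebra.

      import hashlib, secrets

      # Toy Schnorr group: q divides p-1 and g has order q mod p (insecure toy parameters).
      p, q, g = 23, 11, 4

      def H(r, m):
          return int(hashlib.sha256(f"{r}|{m}".encode()).hexdigest(), 16) % q

      def keygen():
          x = secrets.randbelow(q - 1) + 1      # private key in [1, q-1]
          return x, pow(g, x, p)                # (x, y = g^x mod p)

      def sign(x, m):
          k = secrets.randbelow(q - 1) + 1
          r = pow(g, k, p)
          e = H(r, m)
          s = (k + x * e) % q
          return e, s

      def verify(y, m, sig):
          e, s = sig
          # r' = g^s * y^(-e) mod p; for a valid signature r' equals the original r.
          r_prime = (pow(g, s, p) * pow(y, (q - e) % q, p)) % p
          return H(r_prime, m) == e

      x, y = keygen()
      print(verify(y, "message", sign(x, "message")))   # -> True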

  19. Age and gender effects on normal regional cerebral blood flow studied using two different voxel-based statistical analyses; Effets de l'age et du genre sur la perfusion cerebrale regionale etudiee par deux methodes d'analyse statistique voxel-par-voxel

    Energy Technology Data Exchange (ETDEWEB)

    Pirson, A.S.; George, J.; Krug, B.; Vander Borght, T. [Universite Catholique de Louvain, Service de Medecine Nucleaire, Cliniques Universitaires de Mont-Godinne, Yvoir (Belgium); Van Laere, K. [Leuven Univ. Hospital, Nuclear Medicine Div. (Belgium); Jamart, J. [Universite Catholique de Louvain, Dept. de Biostatistiques, Cliniques Universitaires de Mont-Godinne, Yvoir (Belgium); D' Asseler, Y. [Ghent Univ., Medical Signal and Image Processing Dept. (MEDISIP), Faculty of applied sciences (Belgium); Minoshima, S. [Washington Univ., Dept. of Radiology, Seattle (United States)

    2009-10-15

    Fully automated analysis programs have been applied more and more to aid the reading of regional cerebral blood flow SPECT studies. They are increasingly based on the comparison of the patient study with a normal database. In this study, we evaluate the ability of Three-Dimensional Stereotactic Surface Projection (3D-SSP) to isolate effects of age and gender in a previously studied normal population. The results were also compared with those obtained using Statistical Parametric Mapping (SPM99). Methods: Eighty-nine 99mTc-ECD SPECT studies performed in carefully screened healthy volunteers (46 females, 43 males; age 20-81 years) were analysed using 3D-SSP. A multivariate analysis based on the general linear model was performed with regions as intra-subject factor, gender as inter-subject factor and age as covariate. Results: Both age and gender had a significant interaction effect with regional tracer uptake. An age-related decline (p < 0.001) was found in the anterior cingulate gyrus, left frontal association cortex and left insula. Bilateral occipital association and left primary visual cortical uptake showed a significant relative increase with age (p < 0.001). Concerning the gender effect, women showed higher uptake (p < 0.01) in the parietal and right sensorimotor cortices. An age by gender interaction (p < 0.01) was only found in the left medial frontal cortex. The results were consistent with those obtained with SPM99. Conclusion: 3D-SSP analysis of normal rCBF variability is consistent with the literature and other automated voxel-based techniques, which highlight the effects of both age and gender. (authors)

  20. The psychological status of phonological analyses

    Directory of Open Access Journals (Sweden)

    David Eddington

    2015-09-01

    Full Text Available This paper casts doubt on the psychological relevance of many phonological analyses. There are four reasons for this: (1) theoretical adequacy does not necessarily imply psychological significance; (2) most approaches are nonempirical in that they are not subject to potential spatiotemporal falsification; (3) phonological analyses are established with little or no recourse to the speakers of the language via experimental psychology; (4) the limited base of evidence which most analyses are founded on is further cause for skepticism.

  1. Chapter No.4. Safety analyses

    International Nuclear Information System (INIS)

    for NPP V-1 Bohunice and on review of the impact of the modelling of selected components to the results of calculation safety analysis (a sensitivity study for NPP Mochovce). In 2001 UJD joined a new European project Alternative Approaches to the Safety Performance Indicators. The project is aimed at the information collecting and determining of approaches and recommendations for implementation of the risk oriented indicators, identification of the impact of the safety culture level and organisational culture on safety and applying of indicators to the needs of regulators and operators. In frame of the PHARE project UJD participated in the task focused on severe accident mitigation for nuclear power plants with VVER-440/V213 units. The main results of the analyses of nuclear power plants responses to severe accidents were summarised and the state of their analytical base performed in the past was evaluated within the project. Possible severe accident mitigation and preventative measures were proposed and their applicability for the nuclear power plants with VVER-440/V213 was investigated. The obtained results will be used in assessment activities and accident management of UJD. UJD has been involved also in EVITA project which makes a part of the 5th EC Framework Programme. The project aims at validation of the European computer code ASTEC dedicated for severe accidents modelling. In 2001 the ASTEC computer code was tested on different platforms. The results of the testing are summarised in the technical report of EC issued in September 2001. Further activities within this project were focused on performing of selected accident scenarios analyses and comparison of the obtained results with the analyses realised with the help of other computer codes. The work on the project will continue in 2002. In 2001 a groundwork on establishing the Centre for Nuclear Safety in Central and Eastern Europe (CENS), the seat of which is going to be in Bratislava, has continued. The

  2. Cost-Benefit Analyses of Transportation Investments

    DEFF Research Database (Denmark)

    Næss, Petter

    2006-01-01

    environment. In addition, main input data are based on transport modelling analyses based on a misleading `local ontology' among the model makers. The ontological misconceptions translate into erroneous epistemological assumptions about the possibility of precise predictions and the validity of willingness...

  3. Functional Analyses and Treatment of Precursor Behavior

    OpenAIRE

    Najdowski, Adel C; Wallace, Michele D; Ellsworth, Carrie L; MacAleese, Alicia N; Cleveland, Jackie M

    2008-01-01

    Functional analysis has been demonstrated to be an effective method to identify environmental variables that maintain problem behavior. However, there are cases when conducting functional analyses of severe problem behavior may be contraindicated. The current study applied functional analysis procedures to a class of behavior that preceded severe problem behavior (precursor behavior) and evaluated treatments based on the outcomes of the functional analyses of precursor behavior. Responding fo...

  4. Computational Analyses of Arabic Morphology

    CERN Document Server

    Kiraz, G A

    1994-01-01

    This paper demonstrates how a (multi-tape) two-level formalism can be used to write two-level grammars for Arabic non-linear morphology using a high level, but computationally tractable, notation. Three illustrative grammars are provided based on CV-, moraic- and affixational analyses. These are complemented by a proposal for handling the hitherto computationally untreated problem of the broken plural. It will be shown that the best grammars for describing Arabic non-linear morphology are moraic in the case of templatic stems, and affixational in the case of a-templatic stems. The paper will demonstrate how the broken plural can be derived under two-level theory via the `implicit' derivation of the singular.

  5. Analysis of climatically relevant processes in the troposphere using ground-based remote measuring methods (windprofiler/RASS). Final report; Analyse klimatisch relevanter Prozesse in der Troposphaere mit Hilfe bodengebundener Fernerkundungsmethoden (Windprofiler/RASS). Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Steinhagen, H.; Christoph, A.; Engelbart, D.; Goersdorf, U.; Hirsch, L.; Lippmann, J.; Neisser, J.; Wergen, W.

    1995-09-01

    In the framework of the present research project the Meteorological Observatory of Lindenberg (MOL) was equipped with the scientific and technical means necessary for the future operational application at the German weather service of ground-based remote sounding technologies such as wind profiler radar and the radio-acoustic sounding system (RASS). Several case studies were used to demonstrate the multifarious possibilities of analysing mesoscale tropospheric structures by means of wind profiler radar and RASS. Besides this, further information such as mixing layer height and heat flux was derived from wind profiler and RASS measurements and the applied algorithms were tried on case examples. (orig./AKF)

  6. Cost-effectiveness and cost-utility analyses of hospital-based home care compared to hospital-based care for children diagnosed with type 1 diabetes; a randomised controlled trial; results after two years’ follow-up

    OpenAIRE

    Tiberg, Irén; Lindgren, Björn; Carlsson, Annelie; Hallström, Inger

    2016-01-01

    Background Practices regarding hospitalisation of children at diagnosis of type 1 diabetes vary both within countries and internationally, and high-quality evidence of best practice is scarce. The objective of this study was to close some of the gaps in evidence by comparing two alternative regimens for children diagnosed with type 1 diabetes: hospital-based care and hospital-based home care (HBHC), referring to specialist care in a home-based setting. Methods A randomised controlled trial, i...

  7. Ultrasensitive flow cytometric analyses

    Energy Technology Data Exchange (ETDEWEB)

    Jett, J.H.; Cram, L.S.; Keller, R.A.; Martin, J.C.; Saunders, G.C.; Sklar, L.A.; Steinkamp, J.A.

    1993-01-01

    New techniques and approaches to cellular analysis being developed at the Los Alamos National Flow Cytometry Resource can be divided into those that improve sensitivity and those that move the technology into new areas by refining existing approaches. An example of the first category is a flow cytometric system, now being assembled, that is capable of measuring the phase shift of fluorescence emitted by fluorophors bound to cells. This phase-sensitive cytometer will be capable of quantifying fluorescence lifetime on a cell-by-cell basis as well as using phase-sensitive detection to separate fluorescence emissions that overlap spectrally but have different lifetimes. A Fourier transform flow cytometer capable of measuring the fluorescence emission spectrum of individual labeled cells at rates approaching several hundred per second is also in the new technology category. The current implementation is capable of resolving the visible region of the spectrum into 8 bands. With this instrument, it is possible to resolve the contributions of fluorophors with overlapping emission spectra and to determine the emission spectra of dyes such as calcium concentration indicators that are sensitive to the physiological environment. Flow cytometric techniques have been refined to the point that it is possible to detect individual fluorescent molecules in solution as they flow past a laser beam. This capability has led to a rapid DNA sequencing project. The goal of the project is to develop a technique that is capable of sequencing long strands of DNA (40,000 kb) at a rate of between 100 and 1,000 bases per second.

  8. Micromechanical Analyses of Sturzstroms

    Science.gov (United States)

    Imre, Bernd; Laue, Jan; Springman, Sarah M.

    2010-05-01

    have been made observable and reproducible within a physical and a distinct element numerical modelling environment (DEM). A deterministic fractal analytical comminution model (Sammis et al., 1987; Steacy and Sammis, 1991) serves as the link between field evidence gained from the deposits of natural sturzstroms, the physical model within the ETH Geotechnical Drum Centrifuge (Springman et al., 2001) and the numerical model PFC-3D (Cundall and Strack, 1979; Itasca, 2005). This approach allowed the effects of dynamic fragmentation within sturzstroms to be studied at true (macro) scale within the distinct element model, while at the same time allowing a micro-mechanical, distinct-particle-based and cyclic description of fragmentation, without losing significant computational efficiency. These experiments indicate rock mass and boundary conditions which allow an alternating fragmenting and dilating dispersive regime to evolve and to be sustained long enough to replicate the spreading and runout of sturzstroms. The fragmenting spreading model supported here is able to explain the runout of a dry granular flow, beyond the travel distance predicted by a Coulomb frictional sliding model, without resorting to explanations by mechanics that can only be valid for certain specific boundary conditions. The implications derived suggest that a sturzstrom, because of its strong relation to internal fractal fragmentation and other inertial effects, constitutes a landslide category of its own. Its mechanics differ significantly from all other gravity-driven mass flows. This proposition does not exclude the possible appearance of frictionites, Toma hills or suspension flows etc., but it considers them as secondary features. The application of a fractal comminution model to describe natural and experimental sturzstrom deposits turned out to be a useful tool for sturzstrom research. Implemented within the DEM, it allows simulating the key features of sturzstrom successfully and
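
    The deterministic fractal comminution model cited above (Sammis et al., 1987) implies a power-law fragment size distribution, N(>r) proportional to r^(-D), with D the fractal dimension. The sketch below shows one conventional way such a D could be estimated from fragment sizes by a log-log fit; the data are synthetic and all numbers are illustrative, not values from the study.

      import numpy as np

      # Synthetic fragment radii whose survival function follows N(>r) ~ r^-D, D = 2.6 (assumed).
      rng = np.random.default_rng(1)
      D_true = 2.6
      r = 0.01 * rng.pareto(D_true, 5000) + 0.01   # shifted Pareto gives the power-law tail

      # Empirical survival function N(>r) on a logarithmic grid of sizes.
      sizes = np.logspace(np.log10(r.min()), np.log10(r.max() * 0.5), 30)
      n_greater = np.array([(r > s).sum() for s in sizes])

      # Fractal dimension from the slope of log N(>r) versus log r.
      slope, intercept = np.polyfit(np.log(sizes), np.log(n_greater), 1)
      print("estimated D:", round(-slope, 2))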

  9. ANALYSING AND EVALUATING I/O PERFORMANCE OF OBJECT-BASED FILE SYSTEMS%基于对象的文件系统I/O分析与评测

    Institute of Scientific and Technical Information of China (English)

    胡永奎; 杜祝平; 方圆

    2011-01-01

    Currently, most object-based storage devices are managed with general-purpose file systems such as EXT2 and EXT3, but these face the problem of storing and managing massive data efficiently. The object-based file systems organise objects in disk regions according to their size and introduce hashing and worst-fit allocation mechanisms, which improve disk utilisation and at the same time significantly raise disk throughput. In this paper we test and analyse the I/O performance of EXT2, EXT3 and object-based file systems. Experimental results show that the I/O performance of the object-based file systems is two to three times that of EXT2 and EXT3.
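
    The worst-fit mechanism mentioned above always places an object in the largest free extent that can hold it, which tends to leave large usable remainders. A minimal sketch of that placement policy over a free-extent list follows; it is a simplification for illustration, not the paper's on-disk layout.

      def worst_fit_allocate(free_extents, size):
          """free_extents: list of (offset, length). Returns (offset, new_free_extents) or None."""
          # Pick the largest free extent that can hold the request (worst fit).
          candidates = [e for e in free_extents if e[1] >= size]
          if not candidates:
              return None
          offset, length = max(candidates, key=lambda e: e[1])
          remaining = [e for e in free_extents if e != (offset, length)]
          if length > size:                      # keep the unused tail as a new free extent
              remaining.append((offset + size, length - size))
          return offset, remaining

      free = [(0, 64), (128, 512), (1024, 256)]
      addr, free = worst_fit_allocate(free, 100)
      print(addr, free)    # allocates inside the 512-unit extent at offset 128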

  10. A gamma model for DNA mixture analyses

    OpenAIRE

    Cowell, R. G.; Lauritzen, S L; Mortera, J.

    2007-01-01

    We present a new methodology for analysing forensic identification problems involving DNA mixture traces where several individuals may have contributed to the trace. The model used for identification and separation of DNA mixtures is based on a gamma distribution for peak area values. In this paper we illustrate the gamma model and apply it to several real examples from forensic casework.
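
    In gamma models of this kind the peak area observed for an allele is typically treated as gamma distributed, with a shape parameter proportional to the amount of DNA carrying that allele and a common scale. The toy log-likelihood below follows that general idea; the parameterisation, the parameter names rho and eta, and all numbers are illustrative assumptions, not the exact model of the paper.

      from scipy.stats import gamma

      def mixture_loglik(areas, allele_counts, fractions, rho=2.0, eta=50.0):
          """areas[a]: observed peak area for allele a.
          allele_counts[i][a]: copies of allele a carried by contributor i (0, 1 or 2).
          fractions[i]: mixture proportion of contributor i (sums to 1).
          Gamma shape for allele a: rho * sum_i fractions[i] * allele_counts[i][a]."""
          ll = 0.0
          for a, area in areas.items():
              shape = rho * sum(f * counts[a] for f, counts in zip(fractions, allele_counts))
              ll += gamma.logpdf(area, a=shape, scale=eta)
          return ll

      areas = {"16": 420.0, "17": 260.0, "18": 150.0}
      allele_counts = [{"16": 2, "17": 0, "18": 0},   # contributor 1, genotype 16,16
                       {"16": 0, "17": 1, "18": 1}]   # contributor 2, genotype 17,18
      print(mixture_loglik(areas, allele_counts, fractions=[0.7, 0.3]))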

  11. Comparing functional annotation analyses with Catmap

    Directory of Open Access Journals (Sweden)

    Krogh Morten

    2004-12-01

    Full Text Available Abstract Background Ranked gene lists from microarray experiments are usually analysed by assigning significance to predefined gene categories, e.g., based on functional annotations. Tools performing such analyses are often restricted to a category score based on a cutoff in the ranked list and a significance calculation based on random gene permutations as null hypothesis. Results We analysed three publicly available data sets, in each of which samples were divided in two classes and genes ranked according to their correlation to class labels. We developed a program, Catmap (available for download at http://bioinfo.thep.lu.se/Catmap), to compare different scores and null hypotheses in gene category analysis, using Gene Ontology annotations for category definition. When a cutoff-based score was used, results depended strongly on the choice of cutoff, introducing an arbitrariness in the analysis. Comparing results using random gene permutations and random sample permutations, respectively, we found that the assigned significance of a category depended strongly on the choice of null hypothesis. Compared to sample label permutations, gene permutations gave much smaller p-values for large categories with many coexpressed genes. Conclusions In gene category analyses of ranked gene lists, a cutoff-independent score is preferable. The choice of null hypothesis is very important; random gene permutations do not work well as an approximation to sample label permutations.
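
    A cutoff-independent category score of the kind recommended above can be as simple as the mean rank of the category's genes in the ranked list, with significance taken from a null distribution of random gene sets of the same size, i.e. the gene-permutation null that the study finds can differ strongly from the sample-permutation null. The sketch below is an assumed minimal illustration, not Catmap itself.

      import numpy as np

      def category_p_value(ranked_genes, category, n_perm=10000, seed=0):
          """Mean-rank score for `category` against a random-gene-set (gene permutation) null."""
          rng = np.random.default_rng(seed)
          rank_of = {g: i for i, g in enumerate(ranked_genes)}   # 0 = most significant gene
          in_cat = [rank_of[g] for g in category if g in rank_of]
          score = np.mean(in_cat)
          null = np.array([
              rng.choice(len(ranked_genes), size=len(in_cat), replace=False).mean()
              for _ in range(n_perm)
          ])
          # One-sided p-value: how often a random gene set ranks at least as high (small mean rank).
          return (np.sum(null <= score) + 1) / (n_perm + 1)

      genes = [f"g{i}" for i in range(500)]
      print(category_p_value(genes, category=["g3", "g10", "g25", "g47"]))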

  12. Modelling and Analysing Socio-Technical Systems

    DEFF Research Database (Denmark)

    Aslanyan, Zaruhi; Ivanova, Marieta Georgieva; Nielson, Flemming;

    2015-01-01

    and assessing attacks. In our work we model all relevant levels of socio-technical systems, and propose evaluation techniques for analysing the security properties of the model. Our approach simplifies the identification of possible attacks and provides qualified assessment and ranking of attacks based...

  13. Analyse

    DEFF Research Database (Denmark)

    Dubgaard, Alex

    2009-01-01

    Restrictions on agriculture are a good business. On the other hand, it does not pay to reduce greenhouse gas emissions...

  14. Protocruzia,a highly ambiguous ciliate(Protozoa;Ciliophora):Very likely an ancestral form for Heterotrichea,Colpodea or Spirotrichea? With reevaluation of its evolutionary position based on multigene analyses

    Institute of Scientific and Technical Information of China (English)

    Thorsten; STOECK; Shin; Mann; Kyoon; Al-Rasheid; Khaled; A.; S.; Al-Khedhairy; Abdulaziz; A.

    2010-01-01

    The ciliate genus Protocruzia belongs to one of the most ambiguous taxa considering its systematic position, possibly as a member of the classes Heterotrichea, Spirotrichea or Karyorelictea, which is tentatively placed into Spirotrichea in Lynn's 2008 system. To test these hypotheses, multigene trees (Bayesian inference, evolutionary distance, maximum parsimony, and maximum likelihood) were constructed using the small subunit rRNA (SSU rRNA) gene, internal transcribed spacer 2 (ITS2) and a protein coding gene (histone H4). All analyses agree that: (1) four morphotypes of Protocruzia from different geographical origins group together and form a monophyletic clade, which cannot be assigned to any of the eleven described ciliate classes; (2) it is invariably positioned on an isolated branch separated from the class Spirotrichea, suggesting that this clade should be clearly removed from Spirotrichea; (3) this leads us to hypothesize that this taxon may indeed represent a lineage of class rank. Based on the fact that it is, both morphologically and in molecular features, closely related to the heterotrichs, Colpodea and Oligohymenophorea, Protocruziida might be an ancestral form for the subphylum Intramacronucleata in the evolutionary line from the class Heterotrichea (subphylum Postciliodesmatophora) to higher taxa.

  15. The application analyses for primary spectrum pyrometer

    Institute of Scientific and Technical Information of China (English)

    FU; TaiRan

    2007-01-01

    In the applications of primary spectrum pyrometry, the application issues, such as the measurement range and the measurement partition, were investigated through theoretical analyses based on the dynamic range and the minimum sensitivity of the sensor. For a developed primary spectrum pyrometer, theoretical predictions of the measurement range and the distributions of the measurement partition were presented through numerical simulations. Measurement experiments with a high-temperature blackbody and a standard temperature lamp were performed to further verify the above theoretical analyses and numerical results. The research in this paper therefore provides helpful support for the applications of the primary spectrum pyrometer and other radiation pyrometers.

  16. FAME: Software for analysing rock microstructures

    Science.gov (United States)

    Hammes, Daniel M.; Peternell, Mark

    2016-05-01

    Determination of rock microstructures leads to a better understanding of the formation and deformation of polycrystalline solids. Here, we present FAME (Fabric Analyser based Microstructure Evaluation), an easy-to-use MATLAB®-based software for processing datasets recorded by an automated fabric analyser microscope. FAME is provided as a MATLAB®-independent Windows® executable with an intuitive graphical user interface. Raw data from the fabric analyser microscope can be automatically loaded, filtered and cropped before analysis. Accurate and efficient rock microstructure analysis is based on an advanced user-controlled grain labelling algorithm. The preview and testing environments simplify the determination of appropriate analysis parameters. Various statistic and plotting tools allow a graphical visualisation of the results such as grain size, shape, c-axis orientation and misorientation. The FAME2elle algorithm exports fabric analyser data to an elle (modelling software)-supported format. FAME supports batch processing for multiple thin section analysis or large datasets that are generated for example during 2D in-situ deformation experiments. The use and versatility of FAME is demonstrated on quartz and deuterium ice samples.

  17. Fouling analyses of heat exchangers for PSR

    International Nuclear Information System (INIS)

    Fouling of heat exchangers is caused by water-borne deposits, commonly known as foulants, including particulate matter from the air; migrated corrosion products; silt, clays, and sand suspended in water; organic contaminants; and boron-based deposits in plants. This fouling is known to interfere with normal flow characteristics and reduce the thermal efficiency of heat exchangers. This paper focuses on fouling analyses for six heat exchangers of two primary systems in two nuclear power plants: the regenerative heat exchangers of the chemical and volume control system and the component cooling water heat exchangers of the component cooling water system. To analyze the fouling of the heat exchangers, a fouling factor was introduced based on the ASME O and M codes and TEMA standards. Based on the results of the fouling analyses, the present thermal performances and fouling levels for the six heat exchangers were predicted
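
    The fouling factor used in such analyses is conventionally the added thermal resistance relative to the clean condition, R_f = 1/U_fouled - 1/U_clean, where U is the overall heat-transfer coefficient. A small sketch of that relation with assumed numbers (not plant data):

      def fouling_factor(u_clean, u_fouled):
          """Fouling resistance R_f = 1/U_fouled - 1/U_clean, in m^2*K/W."""
          return 1.0 / u_fouled - 1.0 / u_clean

      u_clean = 1200.0    # W/(m^2*K), assumed clean (design) overall coefficient
      u_fouled = 950.0    # W/(m^2*K), assumed value inferred from current thermal performance
      rf = fouling_factor(u_clean, u_fouled)
      print(f"R_f = {rf:.2e} m^2*K/W")   # compare against the design fouling allowance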

  18. Conjoint-Analyse und Marktsegmentierung

    OpenAIRE

    Steiner, Winfried J.; Baumgartner, Bernhard

    2003-01-01

    Market segmentation ranks, alongside new product planning and pricing, among the main fields of application of conjoint analysis. In addition to the traditionally used two-stage approaches, in which conjoint analysis and segmentation are carried out in two separate steps, newer developments such as clusterwise regression or mixture models are now available that allow simultaneous segmentation and preference estimation. The article gives an overvie...

  19. Association of airborne moisture-indicating microorganisms withbuilding-related symptoms and water damage in 100 U.S. office buildings:Analyses of the U.S. EPA BASE data

    Energy Technology Data Exchange (ETDEWEB)

    Mendell, Mark J.; Lei, Quanhong; Cozen, Myrna O.; Shendell, DerekG.; Macher, Janet M.; Tsai, Feng C.

    2003-10-01

    Metrics of culturable airborne microorganisms for either total organisms or suspected harmful subgroups have generally not been associated with symptoms among building occupants. However, the visible presence of moisture damage or mold in residences and other buildings has consistently been associated with respiratory symptoms and other health effects. This relationship is presumably caused by adverse but uncharacterized exposures to moisture-related microbiological growth. In order to assess this hypothesis, we studied relationships in U.S. office buildings between the prevalence of respiratory and irritant symptoms, the concentrations of airborne microorganisms that require moist surfaces on which to grow, and the presence of visible water damage. For these analyses we used data on buildings, indoor environments, and occupants collected from a representative sample of 100 U.S. office buildings in the U.S. Environmental Protection Agency's Building Assessment Survey and Evaluation (EPA BASE) study. We created 19 alternate metrics, using scales ranging from 3-10 units, that summarized the concentrations of airborne moisture-indicating microorganisms (AMIMOs) as indicators of moisture in buildings. Two were constructed to resemble a metric previously reported to be associated with lung function changes in building occupants; the others were based on another metric from the same group of Finnish researchers, concentration cutpoints from other studies, and professional judgment. We assessed three types of associations: between AMIMO metrics and symptoms in office workers, between evidence of water damage and symptoms, and between water damage and AMIMO metrics. We estimated (as odds ratios (ORs) with 95% confidence intervals) the unadjusted and adjusted associations between the 19 metrics and two types of weekly, work-related symptoms--lower respiratory and mucous membrane--using logistic regression models. Analyses used the original AMIMO metrics and were

  20. Associations of indoor carbon dioxide concentrations, VOCS, environmental susceptibilities with mucous membrane and lower respiratory sick building syndrome symptoms in the BASE study: Analyses of the 100 building dataset

    Energy Technology Data Exchange (ETDEWEB)

    Apte, M.G.; Erdmann, C.A.

    2002-10-01

    Using the 100 office-building Building Assessment Survey and Evaluation (BASE) Study dataset, we performed multivariate logistic regression analyses to quantify the associations between indoor minus outdoor CO2 (dCO2) concentrations and mucous membrane (MM) and lower respiratory system (Lresp) Sick Building Syndrome (SBS) symptoms, adjusting for age, sex, smoking status, presence of carpet in workspace, thermal exposure, relative humidity, and a marker for entrained automobile exhaust. Using principal components analysis we identified a number of possible sources of 73 measured volatile organic compounds in the office buildings, and assessed the impact of these VOCs on the probability of presenting the SBS symptoms. Additionally we included analysis adjusting for the risks for predisposition of having SBS symptoms associated with the allergic, asthmatic, and environmentally sensitive subpopulations within the office buildings. Adjusted odds ratios (ORs) for statistically significant, dose-dependent associations (p<0.05) for dry eyes, sore throat, nose/sinus congestion, and wheeze symptoms with 100-ppm increases in dCO2 ranged from 1.1 to 1.2. These results suggest that increases in the ventilation rates per person among typical office buildings will, on average, significantly reduce the prevalence of several SBS symptoms, up to 80%, even when these buildings meet the existing ASHRAE ventilation standards for office buildings. VOC sources were observed to play a role in direct association with mucous membrane and lower respiratory irritation, and possibly to be indirectly involved in indoor chemical reactions with ozone that produce irritating compounds associated with SBS symptoms. O-xylene, possibly emitted from furniture coatings, was associated with shortness of breath (OR at the maximum concentration = 8, p < 0.05). The environmental sensitivities of a large subset of the office building population add to the overall risk of SBS symptoms (ORs
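
    An adjusted odds ratio expressed per 100-ppm increase in dCO2, as reported above, scales multiplicatively with the size of the increase under the logistic model: OR(d) = OR_100^(d/100). The sketch below shows that arithmetic and the implied change in symptom probability from an assumed baseline prevalence; the specific numbers are illustrative, not results from the study.

      def odds_ratio_for_increment(or_per_100ppm, delta_ppm):
          """Logistic-model odds ratio for an arbitrary dCO2 increment, given the per-100-ppm OR."""
          return or_per_100ppm ** (delta_ppm / 100.0)

      def prevalence_after(baseline_prev, odds_ratio):
          """Convert baseline prevalence to odds, apply the OR, convert back to a probability."""
          odds = baseline_prev / (1.0 - baseline_prev) * odds_ratio
          return odds / (1.0 + odds)

      or_100 = 1.15    # assumed per-100-ppm OR, within the reported 1.1-1.2 range
      print(odds_ratio_for_increment(or_100, 300))                       # OR for 300 ppm higher dCO2
      print(prevalence_after(0.10, odds_ratio_for_increment(or_100, 300)))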

  1. A Problematic Family Reunion of a Chinese-American in China: Issues of Face

    Institute of Scientific and Technical Information of China (English)

    曹凤琴

    2012-01-01

    As one of the heated topics in the intercultural communication studies, face issues have aroused world-wide attention in the academic field. This paper analyses three critical incidents based on two face theories proposed by Brown and Levinson (1987) and Ting-Toomey and Kurogi(1998) respectively. Lastly, by focusing on the relationship between the rapport-threatening behavior and the face, this paper intends to cultivate our awareness of face and highlight the harmonious interpersonal relationship between different cultures.

  2. MR-based tridirectional flow imaging. Acquisition and 3D analysis of flows in the thoracic aorta; MRT-basierte tridirektionale Flussbildgebung. Aufnahme und 3D-Analyse von Stroemungen in der thorakalen Aorta

    Energy Technology Data Exchange (ETDEWEB)

    Unterhinninghofen, R. [Universitaet Karlsruhe, Institut fuer Technische Informatik, Karlsruhe (Germany); Deutsches Krebsforschungszentrum Heidelberg, Abteilung Radiologie (E010), Heidelberg (Germany); Ley, S. [Deutsches Krebsforschungszentrum Heidelberg, Abteilung Radiologie (E010), Heidelberg (Germany); Universitaetskinderklinik Heidelberg, Paediatrische Radiologie, Heidelberg (Germany); Frydrychowicz, A.; Markl, M. [Universitaetsklinikum Freiburg, Abteilung Roentgendiagnostik, Medizin Physik, Freiburg (Germany)

    2007-11-15

    Tridirectional MR flow imaging is a novel method that extends the well-established technique of phase-contrast flow measurement by vectorial velocity encoding, i.e., by encoding in all three spatial directions. Modern sequence protocols allow the acquisition of velocity vector fields with high spatial resolutions of 1-3 mm and temporal resolutions of 20-50 ms over the heart cycle. Using navigator techniques, data on the entire thoracic aorta can be acquired within about 20 min in free breathing. The subsequent computer-based data processing includes automatic correction of aliasing effects, eddy currents, gradient field inhomogeneities, and Maxwell terms. The data can be visualized in three dimensions using vector arrows, streamlines, or particle traces. The parallel visualization of morphological slices and of the surface of the vascular lumen in 3D enhances spatial and anatomical orientation. Furthermore, quantitative values such as blood flow velocity and volume, vorticity, and vessel wall shear stress can be determined. Modern software systems support the integrated flow-based analysis of typical aortic pathologies such as aneurysms and aortic insufficiency. To what extent this additional information will help us in making better therapeutic decisions needs to be studied in clinical trials. (orig.)

  3. A model-based analysis for the improvement of electrical energy supply of future offshore windparks by means of biogas technology; Modellbasierte Analyse zur Verbesserung der elektrischen Energiebereitstellung zukuenftiger Offshore-Windparks mittels Biogastechnologie

    Energy Technology Data Exchange (ETDEWEB)

    Tigges, Martin

    2010-09-06

    Since the beginning of the industrial revolution, energy supply in Germany has been based on fossil fuels. Climate change, the greenhouse effect and population growth on the one hand, the shortage of fossil fuels on the other, and the will to be independent of resource imports from politically unstable countries call for serious changes. The modernisation of the current energy supply needs to target a sustainable, generation-spanning solution; the use of renewable fuels is necessary. Wind energy is one of the most promising alternatives within short- to medium-term environmental policy planning. As nearly all potential onshore sites in Germany are already tapped, significant development potential is seen at offshore sites in the North Sea as well as the Baltic Sea. Germany aims to install up to 25 GW of offshore wind capacity in the North and Baltic Seas by 2025. Electricity generation by wind energy plants is subject to the volatile characteristics of the wind. Within the planned large offshore wind farms, comprising more than 80 plants, changes in wind velocity as well as direction will lead to fluctuations in energy supply. Up to now, plant resource scheduling has had to react to user demand. The further expansion of wind energy and the differences between forecasted and actual wind energy feed-in do and will hinder this resource scheduling. New approaches need to be found to ensure the security of energy supply in Germany. These approaches should be sustainable and therefore be based on renewable fuels. Within this dissertation, possible solutions as well as constraints of using biogas as a nearly CO2-neutral fuel, in combination with micro gas turbines adapted for use with biogas plants, to balance the differences between forecasted and actual wind energy supply for a future offshore wind farm are analysed. This combination allows an

  4. La Interseccionalidad como Instrumento Analítico de Interpelación en la Violencia de Género (Intersectionality, a Methodological Tool for Analysing and Addressing Gender-based Violence

    Directory of Open Access Journals (Sweden)

    Raquel Guzman Ordaz

    2015-06-01

    Full Text Available This paper reviews peripheral feminism reflections upon the need to move beyond classic gender violence studies' epistemological assumptions. The gender-based analysis (GBA) conceptualization represented a considerable advance in recognising patriarchy as a structural system of domination and its relevance in studying processes of violence against women. Nonetheless this approach exhibits limitations when we try to investigate beyond the simple analytical dichotomies so often deployed when discussing gender violence. Consequently we highlight the importance of relating gender to other intersectional inequality axes such as social class, age, sexual identities different from heteronormativity, functional diversity, ethnic group/race, or citizenship. An encompassing analytical perspective allows for a multidimensional framework more suitable for the study of such a complex phenomenon as violence against women. In this manner we are able to visualize and analyse experiences that are usually marginalised and excluded from hegemonic definitions of gender violence.

  5. Automated Quality Assurance of Online NIR Analysers

    Directory of Open Access Journals (Sweden)

    Kari Aaljoki

    2005-01-01

    Full Text Available Modern NIR analysers produce valuable data for closed-loop process control and optimisation practically in real time. Thus it is highly important to keep them in the best possible shape. Quality assurance (QA) of NIR analysers is an interesting and complex issue because it is not only the instrument and sample handling that has to be monitored. At the same time, validity of prediction models has to be assured. A system for fully automated QA of NIR analysers is described. The system takes care of collecting and organising spectra from various instruments, relevant laboratory, and process management system (PMS) data. Validation of spectra is based on simple diagnostics values derived from the spectra. Predictions are validated against laboratory (LIMS) or other online analyser results (collected from PMS). The system features automated alarming, reporting, trending, and charting functions for major key variables for easy visual inspection. Various textual and graphical reports are sent to maintenance people through email. The software was written with Borland Delphi 7 Enterprise. Oracle and PMS ODBC interfaces were used for accessing LIMS and PMS data using appropriate SQL queries. It will be shown that it is possible to take actions even before the quality of predictions is seriously affected, thus maximising the overall uptime of the instrument.
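
    The prediction-validation step described above amounts to tracking the bias between the online NIR predictions and the corresponding laboratory (LIMS) results and alarming when it drifts outside control limits. A minimal, assumed sketch of such a check follows; the limits and data are illustrative, not the system's actual configuration.

      import statistics

      def check_prediction_bias(pairs, warn_limit=0.5, alarm_limit=1.0):
          """pairs: list of (nir_prediction, lab_result) for the same samples.
          Returns the mean bias and a status string for a simple control-chart style QA check."""
          residuals = [pred - lab for pred, lab in pairs]
          bias = statistics.mean(residuals)
          if abs(bias) >= alarm_limit:
              return bias, "ALARM: recalibrate or inspect sample handling"
          if abs(bias) >= warn_limit:
              return bias, "WARNING: bias approaching the control limit"
          return bias, "OK"

      recent = [(92.1, 91.8), (88.4, 88.9), (95.0, 94.2), (90.3, 90.1)]   # invented pairs
      print(check_prediction_bias(recent))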

  6. Mitogenomic analyses from ancient DNA

    DEFF Research Database (Denmark)

    Paijmans, Johanna L.A.; Gilbert, M Thomas P; Hofreiter, Michael

    2013-01-01

    . To date, at least 124 partially or fully assembled mitogenomes from more than 20 species have been obtained, and, given the rapid progress in sequencing technology, this number is likely to dramatically increase in the future. The increased information content offered by analysing full mitogenomes...

  7. Analysing student teachers’ lesson plans

    DEFF Research Database (Denmark)

    Carlsen, Louise Meier

    2015-01-01

    I investigate 17 mathematics student teachers’ productions, in view of examining the synergy and interaction between their mathematical and didactical knowledge. The concrete data material consists in lesson plans elaborated for the final exam of a unit on “numbers, arithmetic and algebra”. The...... anthropological theory of the didactic is used as a framework to analyse these components of practical and theoretical knowledge....

  8. Beskrivende analyse af mekaniske systemer

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup; Hansen, Claus Thorp

    Descriptive analysis is the activity, where a given product is analysed for obtaining insight into different aspects, leading to an explicit description of each of these aspects. This textbook is worked out for course 72101 Produktanalyse (Analysis of products) given at DTU....

  9. Phylogenetic analyses of Andromedeae (Ericaceae subfam. Vaccinioideae).

    Science.gov (United States)

    Kron, K A; Judd, W S; Crayn, D M

    1999-09-01

    Phylogenetic relationships within the Andromedeae and closely related taxa were investigated by means of cladistic analyses based on phenotypic (morphology, anatomy, chromosome number, and secondary chemistry) and molecular (rbcL and matK nucleotide sequences) characters. An analysis based on combined molecular and phenotypic characters indicates that the tribe is composed of two major clades-the Gaultheria group (incl. Andromeda, Chamaedaphne, Diplycosia, Gaultheria, Leucothoë, Pernettya, Tepuia, and Zenobia) and the Lyonia group (incl. Agarista, Craibiodendron, Lyonia, and Pieris). Andromedeae are shown to be paraphyletic in all analyses because the Vaccinieae link with some or all of the genera of the Gaultheria group. Oxydendrum is sister to the clade containing the Vaccinieae, Gaultheria group, and Lyonia group. The monophyly of Agarista, Lyonia, Pieris, and Gaultheria (incl. Pernettya) is supported, while that of Leucothoë is problematic. The close relationship of Andromeda and Zenobia is novel and was strongly supported in the molecular (but not morphological) analyses. Diplycosia, Tepuia, Gaultheria, and Pernettya form a well-supported clade, which can be diagnosed by the presence of fleshy calyx lobes and methyl salicylate. Recognition of Andromedeae is not reflective of our understanding of geneological relationships and should be abandoned; the Lyonia group is formally recognized at the tribal level. PMID:10487817

  10. Workload analyse of assembling process

    Science.gov (United States)

    Ghenghea, L. D.

    2015-11-01

    The workload is the most important indicator for managers responsible for industrial technological processes, no matter whether these are automated, mechanized or simply manual; in each case, machines or workers will be the focus of workload measurements. The paper deals with workload analyses made on a largely manual assembling technology for a roller bearing assembling process, executed in a big company with integrated bearing manufacturing processes. In these analyses the delay sampling technique has been used to identify and divide all bearing assemblers' activities and to get information about the parts of the 480-minute working day that workers allot to each activity. The developed study shows some ways to increase the process productivity without supplementary investments and also indicates that process automation could be the solution to gain maximum productivity.
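
    Delay (work) sampling of the kind used here estimates the share of the working day spent on each activity from the proportion of random-instant observations in which that activity was seen, p_i = n_i / N, and converts it to minutes of the 480-minute day. The sketch below uses invented observation counts purely for illustration.

      import math

      def work_sampling(observations, day_minutes=480):
          """observations: dict mapping activity -> number of random-instant observations."""
          total = sum(observations.values())
          result = {}
          for activity, n in observations.items():
              p = n / total
              minutes = p * day_minutes
              # Approximate 95% confidence half-width for the proportion estimate.
              half_width = 1.96 * math.sqrt(p * (1.0 - p) / total)
              result[activity] = (round(minutes, 1), round(half_width, 3))
          return result

      obs = {"assembling": 310, "handling": 95, "waiting": 45, "other": 30}   # invented counts
      print(work_sampling(obs))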

  11. Analyse du discours et archive

    OpenAIRE

    Maingueneau, Dominique

    2007-01-01

    Research that claims the label of "discourse analysis" is undergoing considerable development throughout the world; by contrast, the "French school of discourse analysis" (AD) has been going through an identity crisis since the beginning of the 1980s. In this paper we explore the reasons for this crisis and then clarify the concept of the archive which, in our view, makes it possible to extend the path opened at the end of the 1960s. But this is only one of the possible paths, given that, as...

  12. Reliability of chemical analyses of water samples

    Energy Technology Data Exchange (ETDEWEB)

    Beardon, R.

    1989-11-01

    Ground-water quality investigations require reliable chemical analyses of water samples. Unfortunately, laboratory analytical results are often unreliable. The Uranium Mill Tailings Remedial Action (UMTRA) Project's solution to this problem was to establish a two-phase quality assurance program for the analysis of water samples. In the first phase, eight laboratories analyzed three solutions of known composition. The analytical accuracy of each laboratory was ranked and three laboratories were awarded contracts. The second phase consists of on-going monitoring of the reliability of the selected laboratories. The following conclusions are based on two years' experience with the UMTRA Project's Quality Assurance Program. The reliability of laboratory analyses should not be taken for granted. Analytical reliability may be independent of the prices charged by laboratories. Quality assurance programs benefit both the customer and the laboratory.

  13. Analysing Protocol Stacks for Services

    DEFF Research Database (Denmark)

    Gao, Han; Nielson, Flemming; Nielson, Hanne Riis

    2011-01-01

    We show an approach, CaPiTo, to model service-oriented applications using process algebras such that, on the one hand, we can achieve a certain level of abstraction without being overwhelmed by the underlying implementation details and, on the other hand, we respect the concrete industrial standa...... financial case study taken from Chapter 0-3. Finally, we develop a static analysis to analyse the security properties as they emerge at the level of concrete industrial protocols....

  14. Tematisk analyse af amerikansk hiphop

    OpenAIRE

    Tranberg-Hansen, Katrine; Bøgh Larsen, Cecilie; Jeppsson,Louise Emilie; Lindberg Kirkegaard, Nanna; Funch Madsen, Signe; Bülow Bach, Maria

    2013-01-01

    This paper examines the possible development in the function of American hiphop. It focuses on specific themes like ghetto, freedom, rebellion, and racial discrimination in hiphop music. To investigate this possible development two text analysis methods are used: a pragmatic and a stylistic text analysis, and a historical method is used: a source criticism. A minimal amount of literature has been published on how hiphop culture arose. These studies, however, make it possible to analyse...

  15. Materials characterization of radioactive waste forms using a multi-element detection method based on the instrumental neutron activation analysis. MEDINA; Stoffliche Charakterisierung radioaktiver Abfallprodukte durch ein Multi-Element-Analyseverfahren basierend auf der instrumentellen Neutronen-Aktivierungs-Analyse. MEDINA

    Energy Technology Data Exchange (ETDEWEB)

    Havenith, Andreas Wilhelm

    2015-07-01

    the identification and quantification of toxic elements in radioactive waste forms. The physical basis of MEDINA is the Prompt- and Delayed-Gamma-Neutron-Activation-Analysis (P and DGNAA). The neutron activation analysis of material samples in the gram range is state of the art in science and technology when thermal or cold neutrons at research reactors are used. The nuclear data retrieved in this way and the results of the feasibility study for the characterization of large-volume samples up to a volume of 50 l /1-5/ are the scientific basis of the present dissertation. With a newly developed test facility and an innovative algorithm for rotation-dependent analysis, the element quantification of larger inhomogeneous samples can be performed while taking the gamma and neutron self-shielding into account for the first time. A test facility for the chemical characterisation of 200-l drums was built and several homogeneous and inhomogeneous samples with a waste matrix of concrete were analysed to validate the measurement technique. The conceptual design of the MEDINA test facility is based on stochastic simulation studies with the computer code MCNP. For a measurement the drum of interest is positioned on a turntable inside an irradiation chamber made exclusively of graphite, acting as neutron moderator and reflector. The drum is irradiated with 14 MeV neutrons produced by a deuterium-tritium (D-T) neutron generator operating in pulse mode. The prompt and delayed gamma rays, induced by neutron reactions occurring at different times after the neutron pulses, are measured with a high-purity germanium (HPGe) detector placed in a wall of the irradiation chamber perpendicular to the neutron generator. The HPGe detector signals are processed by appropriate nuclear electronics. The gamma-ray spectra are recorded for each discrete drum rotation, which allows the sample homogeneity to be investigated. The developed algorithm for the element quantification is based on the

  16. Fouling analyses for heat exchangers of NPP

    International Nuclear Information System (INIS)

    Fouling of heat exchangers is generated by water-borne deposits, commonly known as foulants, including particulate matter from the air, migrated corrosion products; silt, clays, and sand suspended in water; organic contaminants; and boron-based deposits in plants. This fouling is known to interfere with normal flow characteristics and reduce the thermal efficiencies of heat exchangers. In order to analyze the fouling for heat exchangers of a nuclear power plant, the fouling factor is introduced based on the ASME O and M codes and TEMA standards. This paper focuses on the fouling analyses for the heat exchangers of several primary systems: the RHR heat exchanger of the residual heat removal system, the letdown heat exchanger of the chemical and volume control system, and the CCW heat exchanger of the component cooling water system. Based on the results, fouling levels for the three heat exchangers are assumed
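
    In the ASME/TEMA sense, the fouling factor introduced above is simply the extra thermal resistance contributed by the deposit layer. The plant-specific procedure of the paper is not reproduced here; the short Python sketch below, with hypothetical heat-transfer coefficients, shows the textbook relation R_f = 1/U_fouled - 1/U_clean.

```python
# Textbook fouling-resistance relation (illustrative values, not plant data):
# the fouling factor is the extra thermal resistance of the deposit layer.
def fouling_resistance(u_clean: float, u_fouled: float) -> float:
    """Overall coefficients in W/(m^2*K); returns R_f in m^2*K/W."""
    return 1.0 / u_fouled - 1.0 / u_clean

u_clean, u_fouled = 2000.0, 1600.0     # hypothetical water-to-water exchanger
print(f"R_f = {fouling_resistance(u_clean, u_fouled):.2e} m^2*K/W")
```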

  17. New insights into domestication of carrot from root transcriptome analyses

    NARCIS (Netherlands)

    Rong, J.; Lammers, Y.; Strasburg, J.L.; Schidlo, N.S.; Ariyurek, Y.; Jong, de T.J.; Klinkhamer, P.G.L.; Smulders, M.J.M.; Vrieling, K.

    2014-01-01

    Background - Understanding the molecular basis of domestication can provide insights into the processes of rapid evolution and crop improvement. Here we demonstrated the processes of carrot domestication and identified genes under selection based on transcriptome analyses. Results - The root transcr

  18. Analyses of containment structures with corrosion damage

    International Nuclear Information System (INIS)

    Corrosion damage to a nuclear power plant containment structure can degrade the pressure capacity of the vessel. For the low-carbon, low-strength steels used in containments, the effect of corrosion on material properties is discussed. Strain-to-failure tests, in uniaxial tension, have been performed on corroded material samples. Results were used to select strain-based failure criteria for corroded steel. Using the ABAQUS finite element analysis code, the capacity of a typical PWR Ice Condenser containment with corrosion damage has been studied. Multiple analyses were performed with the location of the corrosion in the containment and the amount of corrosion varied in each analysis

  19. A stratigraphical-geochemical study on the Chaco Paraná continental rift basin - An approach study based on regional sedimentology and drill-hole core analyses, South America

    Institute of Scientific and Technical Information of China (English)

    Roberto Torra

    2006-01-01

    This paper is focused on a geologic "regional rift basin system pattern" and its stratigraphical-geochemical relationship. It is mainly based on the paleogeography and deposits of the littoral shallow marine sedimentary successions. These successions characterize the large extensional intracratonic Chaco rift basin system, which evolved from the Upper Cretaceous (Late Campanian-Senonian-Maastrichtian-Early Paleocene) to Quaternary time. The siliciclastic littoral shallow marine successions were deposited from the Early Senonian-Maastrichtian to the Late Miocene during three main successive littoral shallow marine transgressions of continental extension. These transgressions spread over the wide pediplanized terrains of South America. These lands lie west of the more positive areas, between the Brazilian Shield and the foreland massifs that were located in the more westerly areas. Later, these regional foreland massifs were coupled and uplifted into the Andean Orogen Belt during the last 5 million years. The extensive intracratonic pediplanized areas of low topographic relief were the reservoirs of siliciclastic littoral shallow marine succession deposits during the three successive, widespread continental littoral shallow marine transgressions. The first transgression began in Latest Campanian-Senonian and/or Early Maastrichtian time. After this episode, the sedimentary depositional systems continued during the Cenozoic until the Latest Miocene. These successions constitute a major allostratigraphic unit. Its limit with the underlying units is the regional unconformity between the regional volcanic event (Jurassic-Cretacic, with interleaved eolianite sandstones) at the base and the undifferentiated Quaternary sediments (known as the Pampeano and Post-Pampeano Formations sensu lato). Based on numerous facies analyses, different levels of eustatic sea-level variation were identified within the allostratigraphic unit. Three major stages of extensional climax were recognized and

  20. Analysing ESP Texts, but How?

    Directory of Open Access Journals (Sweden)

    Borza Natalia

    2015-03-01

    English as a second language (ESL) teachers instructing general English and English for specific purposes (ESP) in bilingual secondary schools face various challenges when it comes to choosing the main linguistic foci of language preparatory courses enabling non-native students to study academic subjects in English. ESL teachers intending to analyse English-language subject textbooks written for secondary school students, with the aim of gaining information about what bilingual secondary school students need to know in terms of language to process academic textbooks, cannot avoid dealing with a dilemma. It needs to be decided which way is most appropriate to analyse the texts in question. Handbooks of English applied linguistics are not immensely helpful with regard to this problem, as they tend not to give recommendations as to which major text-analytical approaches are advisable to follow in a pre-college setting. The present theoretical research aims to address this lacuna. Respectively, the purpose of this pedagogically motivated theoretical paper is to investigate two major approaches of ESP text analysis, register analysis and genre analysis, in order to find the more suitable one for exploring the language use of secondary school subject texts from the point of view of an English as a second language teacher. Comparing and contrasting the merits and limitations of the two contrastive approaches allows for a better understanding of the nature of the two different perspectives of text analysis. The study examines the goals, the scope of analysis, and the achievements of the register perspective and those of the genre approach alike. The paper also investigates and reviews in detail the starkly different methods of ESP text analysis applied by the two perspectives. Discovering text analysis from a theoretical and methodological angle supports a practical aspect of English teaching, namely making an informed choice when setting out to analyse

  1. HGCal Simulation Analyses for CMS

    CERN Document Server

    Bruno, Sarah Marie

    2015-01-01

    This summer, I approached the topic of fast-timing detection of photons from Higgs decays via simulation analyses, working under the supervision of Dr. Adolf Bornheim of the California Institute of Technology. My specific project focused on simulating the high granularity calorimeter for the Compact Muon Solenoid (CMS) experiment. CMS detects particles using calorimeters. The Electromagnetic Calorimeter (ECal) is arranged cylindrically to form a barrel section and two “endcaps.” Previously, both the barrel and endcap have employed lead tungstate crystal detectors, known as the “shashlik” design. The crystal detectors, however, rapidly degrade from exposure to radiation. This effect is most pronounced in the endcaps. To avoid the high expense of frequently replacing degraded detectors, it was recently decided to eliminate the endcap crystals in favor of an arrangement of silicon detectors known as the “High Granularity Calorimeter” (HGCal), while leaving the barrel detector technology unchanged. T...

  2. Uncertainty and Sensitivity Analyses Plan

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.
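
    As a generic illustration of what such an uncertainty and sensitivity workflow involves - sampling uncertain inputs, propagating them through a model, and ranking the inputs by their influence on the output - the Python sketch below runs a toy three-parameter dose model. It is not the HEDR model, and every distribution in it is invented.

```python
# Generic uncertainty/sensitivity sketch with a toy dose model (not HEDR):
# sample uncertain inputs, propagate, and rank inputs by Spearman correlation.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n = 10_000

release     = rng.lognormal(mean=np.log(1.0), sigma=0.5, size=n)  # source term
dispersion  = rng.uniform(0.5, 1.5, size=n)                       # transport factor
dose_factor = rng.normal(1.0, 0.1, size=n)                        # dose conversion

dose = release * dispersion * dose_factor                         # toy model

print(f"median dose = {np.median(dose):.2f}, "
      f"5th-95th percentile = {np.percentile(dose, 5):.2f}-{np.percentile(dose, 95):.2f}")
for name, x in [("release", release), ("dispersion", dispersion),
                ("dose_factor", dose_factor)]:
    rho, _ = spearmanr(x, dose)
    print(f"rank correlation of {name} with dose: {rho:+.2f}")
```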

  3. Economical analyses in interventional radiology

    International Nuclear Information System (INIS)

    Considerations about the relation between benefit and expenses are also gaining increasing importance in interventional radiology. This review aims at providing a survey about the published data concerning economical analyses of some of the more frequently employed interventions in radiology excluding neuroradiological and coronary interventions. Because of the relative scarcity of literature in this field, all identified articles (n=46) were included without selection for methodological quality. For a number of radiological interventions the cost-effectiveness has already been demonstrated, e.g., PTA of femoropopliteal and iliac artery stenoses, stenting of renal artery stenoses, placement of vena-cava filters, as well as metal stents in malignant biliary and esophageal obstructions. Conflicting data exist for the treatment of abdominal aortic aneurysms. So far, no analysis could be found that directly compares bypass surgery versus PTA+stent in iliac arteries. (orig.)

  4. Uncertainty and Sensitivity Analyses Plan

    International Nuclear Information System (INIS)

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project

  5. Analyse des besoins des usagers

    OpenAIRE

    KHOUDOUR,L; LANGLAIS,A; Charpentier, C.; MOTTE,C; PIAN,C

    2002-01-01

    The aim is to extend the video surveillance of the metro premises to the interior of the trains. The captured images record the events taking place inside the vehicles, in particular in order to improve the safety of the transported passengers. It is possible to store the images of the few moments preceding a passenger incident, to analyse these images off-line, and to better understand in real time the behaviour of passengers faced with events or...

  6. Analysing the Wrongness of Killing

    DEFF Research Database (Denmark)

    Di Nucci, Ezio

    2016-01-01

    This article provides an in-depth analysis of the wrongness of killing by comparing different versions of three influential views: the traditional view that killing is always wrong; the liberal view that killing is wrong if and only if the victim does not want to be killed; and Don Marquis' future of value account of the wrongness of killing. In particular, I illustrate the advantages that a basic version of the liberal view and a basic version of the future of value account have over competing alternatives. Still, ultimately none of the views analysed here are satisfactory; but the different reasons why those competing views fail provide important insights into the ethics of killing.

  7. Isotopic signatures by bulk analyses

    International Nuclear Information System (INIS)

    Los Alamos National Laboratory has developed a series of measurement techniques for identification of nuclear signatures by analyzing bulk samples. Two specific applications for isotopic fingerprinting to identify the origin of anthropogenic radioactivity in bulk samples are presented. The first example is the analyses of environmental samples collected in the US Arctic to determine the impact of dumping of radionuclides in this polar region. Analyses of sediment and biota samples indicate that for the areas sampled the anthropogenic radionuclide content of sediments was predominantly the result of the deposition of global fallout. The anthropogenic radionuclide concentrations in fish, birds and mammals were very low. It can be surmised that marine food chains are presently not significantly affected. The second example is isotopic fingerprinting of water and sediment samples from the Rocky Flats Facility (RFP). The largest source of anthropogenic radioactivity presently affecting surface-waters at RFP is the sediments that are currently residing in the holding ponds. One gram of sediment from a holding pond contains approximately 50 times more plutonium than 1 liter of water from the pond. Essentially 100% of the uranium in Ponds A-1 and A-2 originated as depleted uranium. The largest source of radioactivity in the terminal Ponds A-4, B-5 and C-2 was naturally occurring uranium and its decay product radium. The uranium concentrations in the waters collected from the terminal ponds contained 0.05% or less of the interim standard calculated derived concentration guide for uranium in waters available to the public. All of the radioactivity observed in soil, sediment and water samples collected at RFP was naturally occurring, the result of processes at RFP or the result of global fallout. No extraneous anthropogenic alpha, beta or gamma activities were detected. The plutonium concentrations in Pond C-2 appear to vary seasonally

  8. London Education and Inclusion Project (LEIP): Exploring Negative and Null Effects of a Cluster-Randomised School-Intervention to Reduce School Exclusion—Findings from Protocol-Based Subgroup Analyses

    Science.gov (United States)

    Obsuth, Ingrid; Cope, Aiden; Sutherland, Alex; Pilbeam, Liv; Murray, Aja Louise; Eisner, Manuel

    2016-01-01

    This paper presents subgroup analyses from the London Education and Inclusion Project (LEIP). LEIP was a cluster-randomised controlled trial of an intervention called Engage in Education-London (EiE-L) which aimed to reduce school exclusions in those at greatest risk of exclusion. Pupils in the control schools attended an hour-long employability seminar. Minimisation was used to randomly assign schools to treatment and control following baseline data collection. The study involved 36 schools (17 in treatment, 373 pupils; 19 in control, 369 pupils) with >28% free school meal eligibility across London and drew on pupil self-reports, teacher reports as well as official records to assess the effectiveness of EiE-L. Due to multiple data sources, sample sizes varied according to analysis. Analyses of pre-specified subgroups revealed null and negative effects on school exclusion following the intervention. Our findings suggest that the design and implementation of EiE-L may have contributed to the negative outcomes for pupils in the treatment schools when compared to those in the control schools. These findings call into question the effectiveness of bolt-on short-term interventions with pupils, particularly those at the highest risk of school exclusion and when they are faced with multiple problems. This is especially pertinent given the possibility of negative outcomes. Trial Registration: Controlled Trials: ISRCTN23244695 PMID:27045953

  9. Learner as Statistical Units of Analyses

    Directory of Open Access Journals (Sweden)

    Vivek Venkatesh

    2011-01-01

    Educational psychologists have researched the generality and specificity of metacognitive monitoring in the context of college-level multiple-choice tests, but fairly little is known as to how learners monitor their performance on more complex academic tasks. Even less is known about how monitoring proficiencies such as discrimination and bias might be related to key self-regulatory processes associated with task understanding. This quantitative study explores the relationship between monitoring proficiencies and task understanding in 39 adult learners tackling ill-structured writing tasks for a graduate “theories of e-learning” course. Using the learner as unit of analysis, the generality of monitoring is confirmed through intra-measure correlation analyses while facets of its specificity stand out due to the absence of inter-measure correlations. Unsurprisingly, learner-based correlational and repeated-measures analyses did not reveal how monitoring proficiencies and task understanding might be related. However, using the essay as unit of analysis, ordinal and multinomial regressions reveal how monitoring influences different levels of task understanding. Results are interpreted not only in light of the novel procedures undertaken in calculating performance prediction capability but also in the application of essay-based, intra-sample statistical analysis that reveals heretofore unseen relationships between academic self-regulatory constructs.
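
    In the calibration literature, monitoring bias and discrimination are typically computed from item-level confidence judgements and correctness; the study's exact formulas may differ. A minimal Python sketch with invented data:

```python
# Common calibration-literature definitions of monitoring bias and
# discrimination, applied to invented item-level data.
import numpy as np

confidence = np.array([0.9, 0.8, 0.6, 0.7, 0.4, 0.95, 0.5, 0.3])  # judgements, 0-1
correct    = np.array([1,   1,   0,   1,   0,   1,    1,   0  ])  # scored items

bias = confidence.mean() - correct.mean()              # + = over-confidence
discrimination = (confidence[correct == 1].mean()
                  - confidence[correct == 0].mean())   # higher = better monitoring

print(f"bias = {bias:+.2f}, discrimination = {discrimination:+.2f}")
```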

  10. Advanced handbook for accident analyses of German nuclear power plants

    International Nuclear Information System (INIS)

    The advanced handbook of safety analyses (HSA) comprises a comprehensive electronic collection of knowledge for the compilation and conduct of safety analyses in the area of reactor, plant and containment behaviour, as well as results of existing safety analyses (performed by GRS in the past) with characteristic specifications and further background information. In addition, know-how from the analysis software development and validation process is presented and relevant rules and regulations with regard to safety demonstration are provided. The HSA comprehensively covers the topic of thermo-hydraulic safety analyses (except natural hazards, man-made hazards and malicious acts) for German pressurized and boiling water reactors for power and non-power operational states. In principle, the structure of the HSA content reflects the analytical approach used in safety analyses and in applying the knowledge from safety analyses to technical support services. On the basis of a multilevel preparation of information on the topics "compilation of safety analyses", "compilation of data bases", "assessment of safety analyses", "performed safety analyses", "rules and regulation" and "ATHLET-validation", the HSA addresses users with different backgrounds, allowing them to enter the HSA at different levels. Moreover, the HSA serves as a reference book, which is designed to be future-oriented, freely configurable with respect to its content, completely integrated into the GRS internal portal and prepared to be used by a growing user group.

  11. Assessment of anti-inflammatory and anti-arthritic properties of Acmella uliginosa (Sw.) Cass. based on experiments in arthritic rat models and qualitative gas chromatography-mass spectrometry analyses

    Science.gov (United States)

    Paul, Subhashis; Sarkar, Sudeb; Dutta, Tanmoy; Bhattacharjee, Soumen

    2016-01-01

    Aim: The principle objective of the study was to explore the anti-arthritic properties of Acmella uliginosa (AU) (Sw.) Cass. flower in a rat model and to identify potential anti-inflammatory compounds derived from flower extracts. The synergistic role played by a combination of AU flower and Aloe vera (AV) gel crude extracts was also investigated. Materials and Methods: Male Wistar rats induced with Freund’s complete adjuvant (FCA) were used as a disease model of arthritic paw swelling. There were three experimental and two control groups, each consisting of five rats. Paw circumference and serum biochemical parameters were evaluated to investigate the role of the flower extracts in disease amelioration through a feeding schedule spanning 21 days. Gas chromatography/mass spectrometry (GC/MS) analyses were performed to search for the presence of anti-inflammatory compounds in the ethanolic and n-hexane solvent extracts of the flower. Results: As a visual cue to the experimental outcomes, FCA-induced paw swelling decreased to the normal level; and hemoglobin, serum protein, and albumin levels were significantly increased in the treated animals. The creatinine level was estimated to be normal in the experimental rats after the treatment. The combination of AU and AV showed the best recovery potential in all the studied parameters, confirming the synergistic efficacy of the herbal formulation. GC/MS analyses revealed the presence of at least 5 anti-inflammatory compounds including 9-octadecenoic acid (Z)-, phenylmethyl ester, astaxanthin, à-N-Normethadol, fenretinide that have reported anti-inflammatory/anti-arthritic properties. Conclusion: Our findings indicated that the crude flower homogenate of AU contains potential anti-inflammatory compounds which could be used as an anti-inflammatory/anti-arthritic medication. PMID:27366352

  12. Liberalisation in network based industries. An economic analysis by case studies of railway, telecommunication and energy utilities; Liberalisierung von Netzindustrien. Eine oekonomische Analyse am Beispiel der Eisenbahn, der Telekommunikation und der Leitungsgebundenen Energieversorgung

    Energy Technology Data Exchange (ETDEWEB)

    Schulze, A.

    2006-07-01

    The liberalisation of network-based industries is an economic problem that raises, on the one hand, a multitude of theoretically unresolved questions and for which, on the other hand, practical experience with German economic policy now exists. The causes of the economic problems lie not only in certain industry characteristics of the network-based industries, but also in the lack of special treatment of the network-based economic sectors by economic policy in the past. Competition in network-based industries nevertheless reaches its limits, because the infrastructure necessary for the production of network-based services typically constitutes a non-contestable natural monopoly in the hands of an established, vertically integrated supplier. This creates extensive possibilities for discriminating against competitors, from which the need for competition policy action follows. The present work analyses these issues and discusses alternative solutions to the discrimination problem. (orig.)

  13. Reliability analyses used by maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Rusek, S.; Gono, R.; Kral, V.; Kratky, M. [VSB Technical Univ. of Ostrava, Poruba (Czech Republic)

    2008-07-01

    A series of studies has been conducted to analyze failures experienced by most power distribution companies in the Czech Republic and by one company in the Slovak Republic. The purpose was to find ways to optimize the maintenance of distribution network devices. Data were compiled to enable a comparison of results and to create a statistically more significant database. Since the number of failures in the area of electrical power engineering has been rather small, results on element reliability will only become available in several more years. The main challenge with reliability analysis is to find reliable and updated input data. As such, the primary task is to change the existing structure of the databases of power distribution companies. These databases must be adjusted to provide the input data for the calculation functions of reliability centred maintenance (RCM). This paper describes the programs designed for analyses of reliability indices and for the optimization of maintenance of distribution system equipment, which will provide basic data for responsible and logical decisions regarding maintenance, for the preparation of an effective maintenance schedule and for the creation of a feedback system. 7 refs., 4 figs.

  14. Evaluation of Model Operational Analyses during DYNAMO

    Science.gov (United States)

    Ciesielski, Paul; Johnson, Richard

    2013-04-01

    A primary component of the observing system in the DYNAMO-CINDY2011-AMIE field campaign was an atmospheric sounding network comprised of two sounding quadrilaterals, one north and one south of the equator over the central Indian Ocean. During the experiment a major effort was undertaken to ensure the real-time transmission of these data onto the GTS (Global Telecommunication System) for dissemination to the operational centers (ECMWF, NCEP, JMA, etc.). Preliminary estimates indicate that ~95% of the soundings from the enhanced sounding network were successfully transmitted and potentially used in their data assimilation systems. Because of the wide use of operational and reanalysis products (e.g., in process studies, initializing numerical simulations, construction of large-scale forcing datasets for CRMs, etc.), their validity will be examined by comparing a variety of basic and diagnosed fields from two operational analyses (ECMWF and NCEP) to similar analyses based solely on sounding observations. Particular attention will be given to the vertical structures of apparent heating (Q1) and drying (Q2) from the operational analyses (OA), which are strongly influenced by cumulus parameterizations, a source of model infidelity. Preliminary results indicate that the OA products did a reasonable job at capturing the mean and temporal characteristics of convection during the DYNAMO enhanced observing period, which included the passage of two significant MJO events during the October-November 2011 period. For example, temporal correlations between Q2-budget derived rainfall from the OA products and that estimated from the TRMM satellite (i.e., the 3B42V7 product) were greater than 0.9 over the Northern Sounding Array of DYNAMO. However closer inspection of the budget profiles show notable differences between the OA products and the sounding-derived results in low-level (surface to 700 hPa) heating and drying structures. This presentation will examine these differences and
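
    For reference, the apparent heat source Q1 and apparent moisture sink Q2 discussed above are conventionally defined from the large-scale budgets of dry static energy and moisture (a Yanai-type formulation); the presentation's exact notation may differ slightly.

```latex
% Conventional budget definitions, with s = c_p T + gz the dry static energy,
% q the specific humidity and L the latent heat of vaporization.
\begin{align}
  Q_1 &= \frac{\partial \bar{s}}{\partial t}
        + \bar{\mathbf{V}}\cdot\nabla \bar{s}
        + \bar{\omega}\,\frac{\partial \bar{s}}{\partial p},\\[4pt]
  Q_2 &= -L\left(\frac{\partial \bar{q}}{\partial t}
        + \bar{\mathbf{V}}\cdot\nabla \bar{q}
        + \bar{\omega}\,\frac{\partial \bar{q}}{\partial p}\right).
\end{align}
```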

  15. IDEA: Interactive Display for Evolutionary Analyses

    Directory of Open Access Journals (Sweden)

    Carlton Jane M

    2008-12-01

    Background: The availability of complete genomic sequences for hundreds of organisms promises to make obtaining genome-wide estimates of substitution rates, selective constraints and other molecular evolution variables of interest an increasingly important approach to addressing broad evolutionary questions. Two of the programs most widely used for this purpose are codeml and baseml, parts of the PAML (Phylogenetic Analysis by Maximum Likelihood) suite. A significant drawback of these programs is their lack of a graphical user interface, which can limit their user base and considerably reduce their efficiency. Results: We have developed IDEA (Interactive Display for Evolutionary Analyses), an intuitive graphical input and output interface which interacts with PHYLIP for phylogeny reconstruction and with codeml and baseml for molecular evolution analyses. IDEA's graphical input and visualization interfaces eliminate the need to edit and parse text input and output files, reducing the likelihood of errors and improving processing time. Further, its interactive output display gives the user immediate access to results. Finally, IDEA can process data in parallel on a local machine or computing grid, allowing genome-wide analyses to be completed quickly. Conclusion: IDEA provides a graphical user interface that allows the user to follow a codeml or baseml analysis from parameter input through to the exploration of results. Novel options streamline the analysis process, and post-analysis visualization of phylogenies, evolutionary rates and selective constraint along protein sequences simplifies the interpretation of results. The integration of these functions into a single tool eliminates the need for lengthy data handling and parsing, significantly expediting access to global patterns in the data.
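
    Independent of IDEA's graphical front end, a codeml run is driven by a plain-text control file (conventionally codeml.ctl). The Python sketch below writes a minimal control file and launches codeml through subprocess; the file names are placeholders, only a few of the many available options are shown, and a working PAML installation is assumed.

```python
# Writes a minimal codeml control file and runs codeml on it.
# File names are placeholders; PAML must be installed and on the PATH.
import subprocess
import textwrap

ctl = textwrap.dedent("""\
    seqfile   = alignment.phy   * codon alignment (placeholder name)
    treefile  = species.tre     * tree topology (placeholder name)
    outfile   = results.txt

    seqtype   = 1               * 1 = codon sequences
    model     = 0               * one omega ratio for all branches
    NSsites   = 0               * single omega across sites
    fix_omega = 0
    omega     = 0.4             * initial dN/dS guess
""")

with open("codeml.ctl", "w") as fh:
    fh.write(ctl)

subprocess.run(["codeml", "codeml.ctl"], check=True)
```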

  16. Comparison of elastic and inelastic analyses

    International Nuclear Information System (INIS)

    The use of inelastic analysis methods instead of the traditional elastic analysis methods in the design of radioactive material (RAM) transport packagings leads to a better understanding of the response of the package to mechanical loadings. Thus, a better assessment of the containment, thermal protection, and shielding integrity of the package after a structural accident event can be made. A more accurate prediction of the package response can lead to enhanced safety and also allow for a more efficient use of materials, possibly leading to a package with higher capacity or lower weight. This paper discusses the advantages and disadvantages of using inelastic analysis in the design of RAM shipping packages. The use of inelastic analysis presents several problems to the package designer. When using inelastic analysis the entire nonlinear response of the material must be known, including the effects of temperature changes and strain rate. Another problem is that there is currently no acceptance criterion for this type of analysis that is approved by regulatory agencies. Inelastic analysis acceptance criteria based on failure stress, failure strain, or plastic energy density could be developed. For both elastic and inelastic analyses it is also important to include other sources of stress in the analyses, such as fabrication stresses, thermal stresses, stresses from bolt preloading, and contact stresses at material interfaces. Offsetting these added difficulties is the improved knowledge of the package behavior. This allows for incorporation of a more uniform margin of safety, which can result in weight savings and a higher level of confidence in the post-accident configuration of the package. In this paper, comparisons between elastic and inelastic analyses are made for a simple ring structure and for a package to transport a large quantity of RAM by rail (rail cask) with lead gamma shielding to illustrate the differences in the two analysis techniques

  17. Capitalisation et partage de connaissances d'analyse de traces numériques d'activités : Assister le suivi de l'activité dans les environnements de formation à base de simulateur pleine échelle

    OpenAIRE

    Champalle, Olivier

    2014-01-01

    Our research takes place in the field of knowledge engineering. In particular, we focus our study on capitalizing and sharing knowledge of observation and analysis of digital traces. In this context, we base our approach on the concept of modeled trace (M-Trace) developed by the SILEX team. Our approach gives the possibility to exploit low-level digital traces in order to extract higher-level knowledge through rule-based transformations. These rules model the knowledge of observation and ...

  18. Capitalisation et partage de connaissances d’analyse de traces numériques d’activités : assister le suivi de l'activité dans les environnements de formation à base de simulateur pleine échelle

    OpenAIRE

    Champalle, Olivier

    2014-01-01

    Our research takes place in the field of knowledge engineering. In particular, we focus our study on capitalizing and sharing knowledge of observation and analysis of digital traces. In this context, we base our approach on the concept of modeled trace (M-Trace) developed by the SILEX team. Our approach gives the possibility to exploit low-level digital traces in order to extract higher-level knowledge through rule-based transformations. These rules model the knowledge of observation and ...

  19. Severe Accident Recriticality Analyses (SARA)

    International Nuclear Information System (INIS)

    Recriticality in a BWR has been studied for a total loss of electric power accident scenario. In a BWR, the B4C control rods would melt and relocate from the core before the fuel during core uncovery and heat-up. If electric power returns during this time-window unborated water from ECCS systems will start to reflood the partly control rod free core. Recriticality might take place for which the only mitigating mechanisms are the Doppler effect and void formation. In order to assess the impact of recriticality on reactor safety, including accident management measures, the following issues have been investigated in the SARA project: 1. the energy deposition in the fuel during super-prompt power burst, 2. the quasi steady-state reactor power following the initial power burst and 3. containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core state initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both superprompt power bursts and quasi steady-state power generation - for the studied range of parameters, i. e. with core uncovery and heat-up to maximum core temperatures around 1800 K and water flow rates of 45 kg/s to 2000 kg/s injected into the downcomer. Since the recriticality takes place in a small fraction of the core the power densities are high which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal/g, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding rate

  20. Severe Accident Recriticality Analyses (SARA)

    Energy Technology Data Exchange (ETDEWEB)

    Frid, W. [Swedish Nuclear Power Inspectorate, Stockholm (Sweden); Hoejerup, F. [Risoe National Lab. (Denmark); Lindholm, I.; Miettinen, J.; Puska, E.K. [VTT Energy, Helsinki (Finland); Nilsson, Lars [Studsvik Eco and Safety AB, Nykoeping (Sweden); Sjoevall, H. [Teoliisuuden Voima Oy (Finland)

    1999-11-01

    Recriticality in a BWR has been studied for a total loss of electric power accident scenario. In a BWR, the B{sub 4}C control rods would melt and relocate from the core before the fuel during core uncovery and heat-up. If electric power returns during this time-window unborated water from ECCS systems will start to reflood the partly control rod free core. Recriticality might take place for which the only mitigating mechanisms are the Doppler effect and void formation. In order to assess the impact of recriticality on reactor safety, including accident management measures, the following issues have been investigated in the SARA project: 1. the energy deposition in the fuel during super-prompt power burst, 2. the quasi steady-state reactor power following the initial power burst and 3. containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core state initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both superprompt power bursts and quasi steady-state power generation - for the studied range of parameters, i. e. with core uncovery and heat-up to maximum core temperatures around 1800 K and water flow rates of 45 kg/s to 2000 kg/s injected into the downcomer. Since the recriticality takes place in a small fraction of the core the power densities are high which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal/g, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding

  1. NOx analyser interference from alkenes

    Science.gov (United States)

    Bloss, W. J.; Alam, M. S.; Lee, J. D.; Vazquez, M.; Munoz, A.; Rodenas, M.

    2012-04-01

    Nitrogen oxides (NO and NO2, collectively NOx) are critical intermediates in atmospheric chemistry. NOx abundance controls the levels of the primary atmospheric oxidants OH, NO3 and O3, and regulates the ozone production which results from the degradation of volatile organic compounds. NOx are also atmospheric pollutants in their own right, and NO2 is commonly included in air quality objectives and regulations. In addition to their role in controlling ozone formation, NOx levels affect the production of other pollutants such as the lachrymator PAN, and the nitrate component of secondary aerosol particles. Consequently, accurate measurement of nitrogen oxides in the atmosphere is of major importance for understanding our atmosphere. The most widely employed approach for the measurement of NOx is chemiluminescent detection of NO2* from the NO + O3 reaction, combined with NO2 reduction by either a heated catalyst or photoconvertor. The reaction between alkenes and ozone is also chemiluminescent; therefore alkenes may contribute to the measured NOx signal, depending upon the instrumental background subtraction cycle employed. This interference has been noted previously, and indeed the effect has been used to measure both alkenes and ozone in the atmosphere. Here we report the results of a systematic investigation of the response of a selection of NOx analysers, ranging from systems used for routine air quality monitoring to atmospheric research instrumentation, to a series of alkenes ranging from ethene to the biogenic monoterpenes, as a function of conditions (co-reactants, humidity). Experiments were performed in the European Photoreactor (EUPHORE) to ensure common calibration, a common sample for the monitors, and to unequivocally confirm the alkene (via FTIR) and NO2 (via DOAS) levels present. The instrument responses ranged from negligible levels up to 10 % depending upon the alkene present and conditions used. Such interferences may be of substantial importance

  2. Analyses of containment structures with corrosion damage

    Energy Technology Data Exchange (ETDEWEB)

    Cherry, J.L. [Sandia National Labs., Albuquerque, NM (United States)

    1997-01-01

    Corrosion damage that has been found in a number of nuclear power plant containment structures can degrade the pressure capacity of the vessel. This has prompted concerns regarding the capacity of corroded containments to withstand accident loadings. To address these concerns, finite element analyses have been performed for a typical PWR Ice Condenser containment structure. Using ABAQUS, the pressure capacity was calculated for a typical vessel with no corrosion damage. Multiple analyses were then performed with the location of the corrosion and the amount of corrosion varied in each analysis. Using a strain-based failure criterion, a "lower bound", "best estimate", and "upper bound" failure level was predicted for each case. These limits were established by: determining the amount of variability that exists in material properties of typical containments, estimating the amount of uncertainty associated with the level of modeling detail and modeling assumptions, and estimating the effect of corrosion on the material properties.
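
    The strain-based capacity check described above amounts to reading off, for each assumed failure strain (lower bound, best estimate, upper bound), the pressure at which the computed peak strain first exceeds that limit. The Python sketch below uses invented numbers to illustrate the bookkeeping; it is not the ABAQUS post-processing used in the study.

```python
# Schematic strain-based capacity check (all numbers hypothetical).
FAILURE_STRAIN = {"lower bound": 0.02, "best estimate": 0.05, "upper bound": 0.08}

# (internal pressure [MPa], peak membrane strain) pairs from a notional analysis
history = [(0.2, 0.004), (0.4, 0.012), (0.5, 0.023), (0.6, 0.041), (0.7, 0.085)]

for label, limit in FAILURE_STRAIN.items():
    exceeded = next((p for p, eps in history if eps >= limit), None)
    if exceeded is None:
        print(f"{label:>13}: no failure within the analysed pressure range")
    else:
        print(f"{label:>13}: predicted failure pressure ~ {exceeded} MPa")
```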

  3. Analyses of containment structures with corrosion damage

    International Nuclear Information System (INIS)

    Corrosion damage that has been found in a number of nuclear power plant containment structures can degrade the pressure capacity of the vessel. This has prompted concerns regarding the capacity of corroded containments to withstand accident loadings. To address these concerns, finite element analyses have been performed for a typical PWR Ice Condenser containment structure. Using ABAQUS, the pressure capacity was calculated for a typical vessel with no corrosion damage. Multiple analyses were then performed with the location of the corrosion and the amount of corrosion varied in each analysis. Using a strain-based failure criterion, a "lower bound", "best estimate", and "upper bound" failure level was predicted for each case. These limits were established by: determining the amount of variability that exists in material properties of typical containments, estimating the amount of uncertainty associated with the level of modeling detail and modeling assumptions, and estimating the effect of corrosion on the material properties

  4. What can we do about exploratory analyses in clinical trials?

    Science.gov (United States)

    Moyé, Lem

    2015-11-01

    The research community has alternatively embraced then repudiated exploratory analyses since the inception of clinical trials in the middle of the twentieth century. After a series of important but ultimately unreproducible findings, these non-prospectively declared evaluations were relegated to hypothesis generating. Since the majority of evaluations conducted in clinical trials with their rich data sets are exploratory, the absence of their persuasive power adds to the inefficiency of clinical trial analyses in an atmosphere of fiscal frugality. However, the principal argument against exploratory analyses is not based in statistical theory, but pragmatism and observation. The absence of any theoretical treatment of exploratory analyses postpones the day when their statistical weaknesses might be repaired. Here, we introduce an examination of the characteristics of exploratory analyses from a probabilistic and statistical framework. Setting the obvious logistical concerns aside (i.e., the absence of planning produces poor precision), exploratory analyses do not appear to suffer from estimation theory weaknesses. The problem appears to be a difficulty in what is actually reported as the p-value. The use of Bayes Theorem provides p-values that are more in line with confirmatory analyses. This development may inaugurate a body of work that would lead to the readmission of exploratory analyses to a position of persuasive power in clinical trials. PMID:26390962
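
    The abstract invokes Bayes Theorem without spelling out the construction; for orientation, the generic form of the theorem for a null hypothesis H0 with prior probability pi_0 given data D is written below. The paper's specific adaptation to exploratory analyses may differ.

```latex
% Generic Bayes' theorem for a null hypothesis H_0 with prior \pi_0,
% given observed data D (the paper's specific construction may differ).
\begin{equation}
  P(H_0 \mid D) =
  \frac{P(D \mid H_0)\,\pi_0}
       {P(D \mid H_0)\,\pi_0 + P(D \mid H_1)\,(1 - \pi_0)}
\end{equation}
```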

  5. Reconstructing Meat Consumption through Biomarker Analyses of Paleofeces

    OpenAIRE

    Jenna M. Battillo; Abigail E. Fisher

    2015-01-01

    This mini-review outlines three underutilized approaches for studying meat-based biomarkers in archaeological paleofeces that we expect will increase in significance within the field. Myoglobin, stable isotope, and aDNA analyses all have untapped potential to inform meat-based dietary constituents.

  6. Communication analyses of plant operator crews

    International Nuclear Information System (INIS)

    Elucidation of crew communication aspects is required to improve the man-man interface which supports operators' diagnoses and decisions. Experiments to clarify operator performance under abnormal conditions were evaluated by protocol analyses, interviews, etc. using a training simulator. We had the working hypothesis, based on experimental observations, that operator performance can be evaluated by analysis of crew communications. The following four approaches were tried to evaluate operator performance. (1) Crew performance was quantitatively evaluated by the number of tasks undertaken by an operator crew. (2) The group thinking process was clarified by cognition-communication flow. (3) The group response process was clarified by movement flow. (4) Quantitative indexes for evaluating crew performance were considered to be represented by the amount of information effectively exchanged among operators. (author)

  7. Analysing Attrition in Outsourced Software Project

    Directory of Open Access Journals (Sweden)

    Umesh Rao Hodeghatta

    2015-01-01

    Information systems (IS) outsourcing has grown as a major business phenomenon, and is widely accepted as a business tool. Software outsourcing companies provide expertise, knowledge and capabilities to their clients by taking up projects both onsite and offsite. These companies face numerous challenges, including attrition of project members. Attrition is a major challenge experienced by the outsourcing companies as it has a severe impact on business, revenues and profitability. In this paper, attrition data of a major software outsourcing company was analysed and an attempt to find the reason for attrition is also made. The data analysis was based on the data collected by an outsourcing company over a period of two years for a major client. The results show that client-initiated attrition can have an impact on the project and that members quit the outsourcing company due to client-initiated ramp-down without revealing the reason.

  8. Liver volume, intrahepatic fat and body weight in the course of a lifestyle interventional study. Analysis with quantitative MR-based methods; Lebervolumen, Leberfettanteil und Koerpergewicht im Verlauf einer Lebensstilinterventionsstudie. Eine Analyse mit quantitativen MR-basierten Methoden

    Energy Technology Data Exchange (ETDEWEB)

    Bongers, M.N. [Klinikum der Eberhard-Karls-Universitaet Tuebingen, Abteilung fuer Diagnostische und Interventionelle Radiologie, Tuebingen (Germany); Universitaetsklinikum Tuebingen, Sektion fuer Experimentelle Radiologie der Abteilung fuer Diagnostische und Interventionelle Radiologie, Tuebingen (Germany); Stefan, N.; Fritsche, A.; Haering, H.U. [Universitaetsklinikum Tuebingen, Innere Medizin IV - Endokrinologie und Diabetologie, Angiologie, Nephrologie und Klinische Chemie, Tuebingen (Germany); Helmholtz-Zentrum Muenchen an der Universitaet Tuebingen, Institut fuer Diabetes-Forschung und Metabolische Erkrankungen (IDM), Tuebingen (Germany); Nikolaou, K. [Klinikum der Eberhard-Karls-Universitaet Tuebingen, Abteilung fuer Diagnostische und Interventionelle Radiologie, Tuebingen (Germany); Schick, F. [Universitaetsklinikum Tuebingen, Sektion fuer Experimentelle Radiologie der Abteilung fuer Diagnostische und Interventionelle Radiologie, Tuebingen (Germany); Machann, J. [Universitaetsklinikum Tuebingen, Sektion fuer Experimentelle Radiologie der Abteilung fuer Diagnostische und Interventionelle Radiologie, Tuebingen (Germany); Helmholtz-Zentrum Muenchen an der Universitaet Tuebingen, Institut fuer Diabetes-Forschung und Metabolische Erkrankungen (IDM), Tuebingen (Germany); Deutsches Zentrum fuer Diabetesforschung (DZD), Neuherberg (Germany)

    2015-04-01

    The aim of this study was to investigate potential associations between changes in liver volume, the amount of intrahepatic lipids (IHL) and body weight during lifestyle interventions. In a prospective study 150 patients with an increased risk for developing type 2 diabetes mellitus were included who followed a caloric restriction diet for 6 months. In the retrospective analysis 18 women and 9 men (age range 22-71 years) with an average body mass index (BMI) of 32 kg/m² were enrolled. The liver volume was determined at the beginning and after 6 months by three-dimensional magnetic resonance imaging (3D-MRI, echo gradient, opposed-phase) and IHLs were quantified by volume-selective MR spectroscopy in single voxel stimulated echo acquisition mode (STEAM). Univariable and multivariable correlation analyses between changes of liver volume (Δliver volume), intrahepatic lipids (ΔIHL) and body weight (ΔBW) were performed. Univariable correlation analysis in the whole study cohort showed associations between ΔIHL and ΔBW (r = 0.69; p < 0.0001), ΔIHL and Δliver volume (r = 0.66; p = 0.0002) as well as ΔBW and Δliver volume (r = 0.5; p = 0.0073). Multivariable correlation analysis revealed that changes of liver volume are primarily determined by changes in IHL independent of changes in body weight (β = 0.0272; 95 % CI: 0.0155-0.034; p < 0.0001). Changes of liver volume during lifestyle interventions are, independently of changes of body weight, primarily determined by changes of IHL. These results show the reversibility of augmented liver volume in steatosis if it is possible to reduce IHLs during lifestyle interventions. (orig.) [German original, translated] Can associations between changes in liver volume, the proportion of intrahepatic lipids and body weight be identified during a lifestyle intervention? In a prospective intervention study, 150 subjects at increased risk of diabetes underwent, for 6 months, a dietary ...
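
    The analysis pattern reported above - pairwise correlations followed by a multivariable fit of the liver-volume change on ΔIHL and ΔBW - can be reproduced generically as in the Python sketch below. The numbers are synthetic, not the study data, and the record does not say which statistical software the authors used.

```python
# Illustrative re-analysis pattern with synthetic data (not the study's):
# pairwise Pearson correlations, then a multivariable least-squares fit of
# the liver-volume change on the IHL change and the body-weight change.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n = 27                                               # same sample size as the study

d_ihl = rng.normal(-5.0, 3.0, n)                     # change in IHL (percentage points)
d_bw  = 0.5 * d_ihl + rng.normal(-3.0, 2.0, n)       # change in body weight (kg)
d_vol = 0.03 * d_ihl + 0.002 * d_bw + rng.normal(0.0, 0.02, n)  # change in liver volume (L)

for name, (x, y) in [("dIHL vs dVol", (d_ihl, d_vol)),
                     ("dBW  vs dVol", (d_bw, d_vol)),
                     ("dIHL vs dBW ", (d_ihl, d_bw))]:
    r, p = pearsonr(x, y)
    print(f"{name}: r = {r:+.2f}, p = {p:.4f}")

# Multivariable fit: dVol ~ dIHL + dBW + intercept.
X = np.column_stack([d_ihl, d_bw, np.ones(n)])
beta, *_ = np.linalg.lstsq(X, d_vol, rcond=None)
print(f"beta_dIHL = {beta[0]:+.4f} L per %, beta_dBW = {beta[1]:+.4f} L per kg")
```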

  9. Uncertainty analyses in systems modeling process

    International Nuclear Information System (INIS)

    In the context of Probabilistic Safety Assessment (PSA), uncertainty analyses play an important role. The objective is to ensure the qualitative evaluation and quantitative estimation of PSA level 1 results (the core damage frequency, the accident sequence frequencies, the top event probabilities, etc.). An application that enables uncertainty calculations by propagating probability distributions through the fault tree model has been developed. It uses the moment method and the Monte Carlo method. The application has been integrated into a computer program that allocates the reliability data, quantifies the human errors and uniquely labels the components. The reliability data used at the Institute for Nuclear Research (INR) Pitesti for the Cernavoda Probabilistic Safety Evaluation (CPSE) studies is a generic database. Taking into account the status of the reliability database and the cases in which an error factor for a lognormal failure rate distribution is calculated, the database has been completed with an error factor for each record. This paper presents the module that performs the uncertainty analysis and an example of uncertainty analysis at the fault tree level. (authors)
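
    With each basic-event probability described by a median and a lognormal error factor (EF, the ratio of the 95th percentile to the median, so that sigma = ln(EF)/1.645), Monte Carlo propagation through a fault tree is straightforward. The Python sketch below uses an invented three-event tree, not the CPSE model or the moment method mentioned in the record.

```python
# Monte Carlo propagation of lognormal basic-event uncertainties through a
# toy fault tree: top = (pump AND valve) OR operator error. All numbers are
# invented; EF is the 95th-percentile-to-median ratio.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

def sample(median, ef):
    sigma = np.log(ef) / 1.645
    draws = rng.lognormal(mean=np.log(median), sigma=sigma, size=n)
    return np.minimum(draws, 1.0)   # keep samples valid as probabilities

pump_fails   = sample(1e-3, 3.0)
valve_fails  = sample(5e-4, 5.0)
operator_err = sample(1e-2, 10.0)

top = 1 - (1 - pump_fails * valve_fails) * (1 - operator_err)

print(f"mean top-event probability = {top.mean():.2e}")
print("5th / 50th / 95th percentiles = "
      + " / ".join(f"{np.percentile(top, q):.2e}" for q in (5, 50, 95)))
```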

  10. Pawnee Nation Energy Option Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Matlock, M.; Kersey, K.; Riding In, C.

    2009-07-21

    Pawnee Nation of Oklahoma Energy Option Analyses. In 2003, the Pawnee Nation leadership identified the need for the tribe to comprehensively address its energy issues. During a strategic energy planning workshop a general framework was laid out and the Pawnee Nation Energy Task Force was created to work toward further development of the tribe's energy vision. The overarching goals of the “first steps” project were to identify the most appropriate focus for its strategic energy initiatives going forward, and to provide information necessary to take the next steps in pursuit of the “best fit” energy options. Description of Activities Performed: The research team reviewed existing data pertaining to the availability of biomass (focusing on woody biomass, agricultural biomass/bio-energy crops, and methane capture), solar, wind and hydropower resources on the Pawnee-owned lands. Using these data, combined with assumptions about costs and revenue streams, the research team performed preliminary feasibility assessments for each resource category. The research team also reviewed available funding resources and made recommendations to Pawnee Nation highlighting those resources with the greatest potential for financially-viable development, both in the near-term and over a longer time horizon. Findings and Recommendations: Due to a lack of financial incentives for renewable energy, particularly at the state level, combined with mediocre renewable energy resources, renewable energy development opportunities are limited for Pawnee Nation. However, near-term potential exists for development of solar hot water at the gym, and an exterior wood-fired boiler system at the tribe's main administrative building. Pawnee Nation should also explore options for developing LFGTE resources in collaboration with the City of Pawnee. Significant potential may also exist for development of bio-energy resources within the next decade. Pawnee Nation representatives should closely monitor

  11. La Interseccionalidad como Instrumento Analítico de Interpelación en la Violencia de Género (Intersectionality, a Methodological Tool for Analysing and Addressing Gender-based Violence)

    OpenAIRE

    Raquel Guzman Ordaz; María Luisa Jiménez Rodrigo

    2015-01-01

    This paper reviews reflections from peripheral feminisms on the need to move beyond the epistemological assumptions of classic gender-violence studies. The conceptualization of gender-based analysis (GBA) represented a considerable advance in recognising patriarchy as a structural system of domination and its relevance for studying processes of violence against women. Nonetheless, this approach exhibits limitations as we try to investigate beyond the simple analytical dichotomies so often deployed when...

  12. 10 CFR 436.24 - Uncertainty analyses.

    Science.gov (United States)

    2010-01-01

    10 CFR 436.24 (Energy), ... Procedures for Life Cycle Cost Analyses, § 436.24 Uncertainty analyses: If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank...

  13. Virus-Induced Gene Silencing-Based Functional Analyses Revealed the Involvement of Several Putative Trehalose-6-Phosphate Synthase/Phosphatase Genes in Disease Resistance against Botrytis cinerea and Pseudomonas syringae pv. tomato DC3000 in Tomato

    Science.gov (United States)

    Zhang, Huijuan; Hong, Yongbo; Huang, Lei; Liu, Shixia; Tian, Limei; Dai, Yi; Cao, Zhongye; Huang, Lihong; Li, Dayong; Song, Fengming

    2016-01-01

    Trehalose and its metabolism have been demonstrated to play important roles in the control of plant growth, development, and stress responses. However, direct genetic evidence supporting the functions of trehalose and its metabolism in the defense response against pathogens is lacking. In the present study, genome-wide characterization of putative trehalose-related genes identified 11 SlTPSs for trehalose-6-phosphate synthase, 8 SlTPPs for trehalose-6-phosphate phosphatase and one SlTRE1 for trehalase in the tomato genome. Nine SlTPSs, 4 SlTPPs, and SlTRE1 were selected for functional analyses to explore their involvement in tomato disease resistance. Some of the selected SlTPSs, SlTPPs, and SlTRE1 responded with distinct expression induction patterns to Botrytis cinerea and Pseudomonas syringae pv. tomato (Pst) DC3000 as well as to defense signaling hormones (e.g., salicylic acid, jasmonic acid, and a precursor of ethylene). Virus-induced gene silencing-mediated silencing of SlTPS3, SlTPS4, or SlTPS7 led to deregulation of ROS accumulation and attenuated the expression of defense-related genes upon pathogen infection, and thus deteriorated the resistance against B. cinerea or Pst DC3000. By contrast, silencing of SlTPS5 or SlTPP2 led to an increased expression of the defense-related genes upon pathogen infection and conferred an increased resistance against Pst DC3000. Silencing of SlTPS3, SlTPS4, SlTPS5, SlTPS7, or SlTPP2 affected trehalose levels in tomato plants with or without infection of B. cinerea or Pst DC3000. These results demonstrate that SlTPS3, SlTPS4, SlTPS5, SlTPS7, and SlTPP2 play roles in resistance against B. cinerea and Pst DC3000, implying the importance of trehalose and its metabolism in the regulation of the defense response against pathogens in tomato. PMID:27540389

  14. Multi-component pre-stack time-imaging and migration-based velocity analysis in transversely isotropic media; Imagerie sismique multicomposante et analyse de vitesse de migration en milieu transverse isotrope

    Energy Technology Data Exchange (ETDEWEB)

    Gerea, C.V.

    2001-06-01

    Complementary to the recording of compressional (P-) waves, the observation of P-S converted waves has recently been receiving specific attention. This is mainly due to their tremendous potential as a tool for fracture and lithology characterization, imaging sediments in gas-saturated rocks, and imaging shallow sediments with higher resolution than conventional P-P data. In a conventional marine seismic survey, we cannot record P-to-S converted-wave energy since the fluids cannot support shear-wave strain. Thus, to capture the converted-wave energy, we need to record it at the water bottom using an ocean-bottom cable (OBC). The S-waves recorded at the seabed are mainly converted from P to S (i.e., PS-waves or C-waves) at the subsurface reflectors. The most accurate way to image seismic data is pre-stack depth migration. In this thesis, I develop a numerically efficient 2.5-D true-amplitude elastic Kirchhoff pre-stack migration algorithm designed to handle OBC data gathered along a single line. All the kinematic and dynamic elastic Green's functions required in the computation of the true-amplitude weight term of the Kirchhoff summation are based on the explicit non-hyperbolic approximations of P- and SV-wave travel-times in layered transversely isotropic (VTI) media. Hence, this elastic imaging algorithm is very well-suited for migration-based velocity analysis techniques, for which fast, robust and iterative pre-stack migration is desired. In this thesis, I also address the topic of anisotropic velocity model building for elastic pre-stack time-imaging and propose an original methodology for joint PP-PS migration-based velocity analysis (MVA) in layered VTI anisotropic media. Tests on elastic synthetic and real OBC seismic data confirm the validity of the pre-stack migration algorithm and of the velocity analysis methodology. (author)
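
    The abstract relies on explicit non-hyperbolic travel-time approximations for VTI media. As a generic illustration of what such an approximation looks like, the sketch below evaluates the widely used Alkhalifah-Tsvankin non-hyperbolic moveout form; this is shown only as an example and is not necessarily the exact approximation used in the thesis, and all parameter values are hypothetical.

```python
# Illustrative non-hyperbolic P-wave moveout in a VTI medium
# (Alkhalifah-Tsvankin form); parameter values are hypothetical.
import numpy as np

def vti_moveout(offset, t0, v_nmo, eta):
    """Two-way travel time t(x) for offset x, zero-offset time t0,
    NMO velocity v_nmo and anellipticity parameter eta."""
    x2 = offset ** 2
    t2 = (t0 ** 2
          + x2 / v_nmo ** 2
          - (2.0 * eta * x2 ** 2)
          / (v_nmo ** 2 * (t0 ** 2 * v_nmo ** 2 + (1.0 + 2.0 * eta) * x2)))
    return np.sqrt(t2)

offsets = np.linspace(0.0, 3000.0, 7)                      # offsets in metres
t = vti_moveout(offsets, t0=1.0, v_nmo=2000.0, eta=0.1)    # t0 in s, v_nmo in m/s
for x, ti in zip(offsets, t):
    print(f"offset {x:6.0f} m -> t = {ti:.3f} s")
```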

  15. BWR stability analyses at BNL

    International Nuclear Information System (INIS)

    The March 9, 1988 instability at the LaSalle County-2 boiling water reactor power plant at Seneca, IL was simulated with Brookhaven National Laboratory's (BNL's) Engineering Plant Analyzer (EPA) for the purpose of demonstrating that the EPA is suitable for simulating large-amplitude, limit-cycle power and flow oscillations. It was shown in the fall of 1988, by comparing all the available plant data from the STARTREC recording system of LaSalle-2 with EPA simulation results, that the EPA reproduces the LaSalle-2 oscillations without the use of stabilizing or destabilizing model or parameter modifications. The power vs. flow map of the LaSalle-2 plant was also reproduced along five lines of constant control rod position. The LaSalle-2 stability boundary was established with the EPA and confirmed within ±15% accuracy by comparing the EPA results with the results of the frequency-domain code LAPUR of Oak Ridge National Laboratory. Comparisons of EPA simulation results with plant data from three Peach Bottom stability tests show an agreement, based on mean and standard deviation, of -10±28%, -1±40% and +28±52% (low power) in the gain of the pressure-to-power transfer functions. This demonstrates that the time-domain code HIPA in the EPA is capable of simulating instabilities

  16. Zamak samples analyses using EDXRF

    International Nuclear Information System (INIS)

    Zamak is a family of alloys with a base metal of zinc and alloying elements of aluminium, magnesium and copper. Among all non-ferrous metal alloys, Zamak is one of the most widely applied, owing to its physical and mechanical properties and its suitability for electrodeposition. It has good resistance to corrosion, traction, shock and wear. Its low melting point (approximately 400 deg C) extends the durability of the mold, allowing larger production runs of cast parts. Zamak is used in many areas, for example residential and industrial locks, construction and carpentry components, refrigerator hinges and so on. In some cases, however, the quality of these products is not very good; the problem may lie in the quality of the Zamak alloy purchased by the manufacturers. One technique that can be used to investigate the quality of these alloys is energy-dispersive X-ray fluorescence (EDXRF). In this paper we present EDXRF results for eight Zamak alloy samples; with this technique it was possible to classify the Zamak alloys and to identify some irregularities in them. (author)

  17. Zamak samples analyses using EDXRF

    Energy Technology Data Exchange (ETDEWEB)

    Assis, J.T. de; Lima, I.; Monin, V., E-mail: joaquim@iprj.uerj.b, E-mail: inaya@iprj.uerj.b, E-mail: monin@iprj.uerj.b [Universidade do Estado do Rio de Janeiro (UERJ), Nova Friburgo, RJ (Brazil). Inst. Politecnico. Dept. de Engenharia Mecanica e Energia; Anjos, M. dos; Lopes, R.T., E-mail: ricardo@lin.ufrj.b [Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Lab. de Instrumentacao Nuclear; Alves, H., E-mail: marcelin@uerj.b, E-mail: haimon.dlafis@gmail.co [Universidade do Estado do Rio de Janeiro (UERJ), Rio de Janeiro, RJ (Brazil). Inst. de Fisica. Dept. de Fisica Aplicada e Termodinamica

    2009-07-01

    Zamak is a family of alloys with a base metal of zinc and alloying elements of aluminium, magnesium and copper. Among all non-ferrous metal alloys, Zamak is one of the most widely applied, owing to its physical and mechanical properties and its suitability for electrodeposition. It has good resistance to corrosion, traction, shock and wear. Its low melting point (approximately 400 deg C) extends the durability of the mold, allowing larger production runs of cast parts. Zamak is used in many areas, for example residential and industrial locks, construction and carpentry components, refrigerator hinges and so on. In some cases, however, the quality of these products is not very good; the problem may lie in the quality of the Zamak alloy purchased by the manufacturers. One technique that can be used to investigate the quality of these alloys is energy-dispersive X-ray fluorescence (EDXRF). In this paper we present EDXRF results for eight Zamak alloy samples; with this technique it was possible to classify the Zamak alloys and to identify some irregularities in them. (author)

  18. Spectroscopic analyses on interaction of o-Vanillin-D-Phenylalanine, o-Vanillin-L-Tyrosine and o-Vanillin-L-Levodopa Schiff Bases with bovine serum albumin (BSA)

    Science.gov (United States)

    Gao, Jingqun; Guo, Yuwei; Wang, Jun; Wang, Zhiqiu; Jin, Xudong; Cheng, Chunping; Li, Ying; Li, Kai

    2011-04-01

    In this work, three o-Vanillin Schiff Bases (o-VSB: o-Vanillin-D-Phenylalanine (o-VDP), o-Vanillin-L-Tyrosine (o-VLT) and o-Vanillin-L-Levodopa (o-VLL)) with alanine constituents were synthesized by a direct reflux method in ethanol solution and then used to study their interaction with bovine serum albumin (BSA) molecules by fluorescence spectroscopy. Based on the fluorescence quenching calculations, the bimolecular quenching constant (Kq), apparent quenching constant (Ksv), effective binding constant (KA) and corresponding dissociation constant (KD), as well as the number of binding sites (n), were obtained. In addition, the binding distance (r) was calculated according to Förster's non-radiative energy transfer theory. The results show that all three o-VSB can efficiently bind to BSA molecules, with binding affinity in the order o-VDP-BSA > o-VLT-BSA > o-VLL-BSA. Synchronous fluorescence spectroscopy indicates that o-VDP is more accessible to the tryptophan (Trp) residues of BSA molecules than to the tyrosine (Tyr) residues, whereas o-VLT and o-VLL are more accessible to Tyr residues than to Trp residues.
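
    The quenching and binding parameters listed above are conventionally obtained from the Stern-Volmer and double-logarithmic treatments of the quenching data; the generic textbook forms of these relations (not reproduced from the paper itself) are:

```latex
% Standard Stern-Volmer and double-log binding relations (generic forms,
% not taken from the paper itself).
\begin{align}
  \frac{F_0}{F} &= 1 + K_{\mathrm{SV}}[Q] = 1 + k_q \tau_0 [Q], \\
  \log\!\frac{F_0 - F}{F} &= \log K_A + n \log[Q]
\end{align}
% F_0, F : fluorescence intensities without / with the quencher Q
% K_SV   : apparent (Stern-Volmer) quenching constant
% k_q    : bimolecular quenching rate constant; tau_0 : fluorophore lifetime
% K_A, n : effective binding constant and number of binding sites
```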

  19. Sources of Groundwater Based on Helium Analyses in and near the Freshwater/Saline-Water Transition Zone of the San Antonio Segment of the Edwards Aquifer, South-Central Texas, 2002-03

    Science.gov (United States)

    Hunt, Andrew G.; Lambert, Rebecca B.; Fahlquist, Lynne

    2010-01-01

    This report evaluates dissolved noble gas data, specifically helium-3 and helium-4, collected by the U.S. Geological Survey, in cooperation with the San Antonio Water System, during 2002-03. Helium analyses are used to provide insight into the sources of groundwater in the freshwater/saline-water transition zone of the San Antonio segment of the Edwards aquifer. Sixty-nine dissolved gas samples were collected from 19 monitoring wells (categorized as fresh, transitional, or saline on the basis of dissolved solids concentration in samples from the wells or from fluid-profile logging of the boreholes) arranged in five transects, with one exception, across the freshwater/saline-water interface (the 1,000-milligrams-per-liter dissolved solids concentration threshold) of the Edwards aquifer. The concentration of helium-4 (the dominant isotope in atmospheric and terrigenic helium) in samples ranged from 63 microcubic centimeters per kilogram at standard temperature (20 degrees Celsius) and pressure (1 atmosphere) in a well in the East Uvalde transect to 160,587 microcubic centimeters per kilogram at standard temperature and pressure in a well in the Kyle transect. Helium-4 concentrations in the 10 saline wells generally increase from the western transects to the eastern transects. The increase in helium-4 concentrations from southwest to northeast in the transition zone, which indicates increasing residence time of groundwater from southwest to northeast, is consistent with the longstanding conceptualization of the Edwards aquifer in which water recharges in the southwest, flows generally northeasterly (including in the transition zone, although more slowly than in the fresh-water zone), and discharges at major springs in the northeast. Excess helium-4 was greater than 1,000 percent for 60 of the 69 samples, indicating that terrigenic helium is largely present and that most of the excess helium-4 comes from sources other than the atmosphere. The helium data of this report cannot be

  20. Analyse und Gestaltung von Distributionsprozessen im Felddaten-Wertstrom (Analysis and Design of Distribution Processes in the Field Data Value Stream)

    OpenAIRE

    Wenholt, Andreas

    2007-01-01

    Field-data-based knowledge generated over the product life cycle is used for product and process improvement. However, the unsystematic way of acquiring and processing these data leads to high costs for information-based decision making. In this dissertation a model for analysing and designing field data distribution processes is developed. This model is based on two pillars. The first pillar is the field data value stream design, a methodical way of field data value stream visualisa...